ML813: Topics in Dimensionality Reduction and Manifold Learning
TB = Textbook or Required reading
REF = Reference or supplemental reading
| Type | Title | eBook | Call Number |
| --- | --- | --- | --- |
| TB | B. Ghojogh, M. Crowley, F. Karray, and A. Ghodsi, *Elements of Dimensionality Reduction and Manifold Learning*. Cham, Switzerland: Springer Nature, 2023. | Springer | On Order |
| TB | M. Mohri, A. Rostamizadeh, and A. Talwalkar, *Foundations of Machine Learning*. Cambridge, MA: MIT Press, 2018. | Yes | Q325.5 .M64 2018 |
| REF | Chen, Ting, et al. "A simple framework for contrastive learning of visual representations." International Conference on Machine Learning. PMLR, 2020. | Open Access | NA |
| REF | Higgins, Irina, et al. "beta-VAE: Learning basic visual concepts with a constrained variational framework." International Conference on Learning Representations. 2016. | | NA |
| REF | Khemakhem, Ilyes, et al. "Variational autoencoders and nonlinear ICA: A unifying framework." International Conference on Artificial Intelligence and Statistics. PMLR, 2020. | | NA |
| REF | Khosla, Prannay, et al. "Supervised contrastive learning." Advances in Neural Information Processing Systems 33 (2020): 18661-18673. | | NA |
| REF | Radford, Alec, et al. "Learning transferable visual models from natural language supervision." International Conference on Machine Learning. PMLR, 2021. | | NA |
| REF | Kipf, Thomas N., and Max Welling. "Semi-supervised classification with graph convolutional networks." arXiv preprint arXiv:1609.02907, 2016. | | NA |
| REF | Vaswani, Ashish, et al. "Attention is all you need." Advances in Neural Information Processing Systems 30 (2017). | | NA |