Semester.ly

Johns Hopkins University | EN.600.692

Unsupervised Learning: from Big Data to Low-Dimensional Representations

3.0 credits

Average Course Rating: 3.85

In the era of data deluge, the development of methods for discovering structure in high-dimensional data is becoming increasingly important. This course will cover state-of-the-art methods from algebraic geometry, sparse and low-rank representations, and statistical learning for modeling and clustering high-dimensional data. The first part of the course will cover methods for modeling data with a single low-dimensional subspace, such as PCA, Robust PCA, Kernel PCA, and manifold learning techniques. The second part of the course will cover methods for modeling data with multiple subspaces, such as algebraic, statistical, sparse, and low-rank subspace clustering techniques. The third part of the course will cover applications of these methods in image processing, computer vision, and biomedical imaging. Requisites include Linear Algebra, Optimization, and prior exposure to Machine Learning.
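As a rough illustration of the single-subspace modeling covered in the first part of the course, the sketch below fits a low-dimensional subspace with PCA via the SVD. It is a minimal Python/NumPy example, not course material; the synthetic data, dimensions, and function name are assumptions made purely for illustration.

import numpy as np

def pca(X, d):
    """Project n samples in R^D (rows of X) onto their top-d principal subspace."""
    mu = X.mean(axis=0)                       # center the data
    Xc = X - mu
    # SVD of the centered data: right singular vectors span the principal directions
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:d].T                              # D x d orthonormal basis for the subspace
    Y = Xc @ W                                # low-dimensional coordinates of each sample
    return Y, W, mu

# Illustrative usage: 200 points lying near a 2-D subspace of R^10, plus small noise
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2)) @ rng.standard_normal((2, 10)) \
    + 0.01 * rng.standard_normal((200, 10))
Y, W, mu = pca(X, d=2)

The later parts of the course (Robust PCA, Kernel PCA, and the various subspace clustering methods) generalize this basic idea to corrupted data, nonlinear manifolds, and unions of multiple subspaces.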

Spring 2014

Professor: Rene Vidal

(3.85)

Many students enrolled in this course said that the instructor presented ideas clearly and was knowledgeable about the material. The course emphasized theory, with some practical implementation in MATLAB. However, the homework assignments were demanding and challenging, and feedback was non-existent. Students also found the TA unhelpful and noted that a background in linear algebra was needed to succeed. Suggestions for improvement include a better TA, a reduced workload, review of the homework, and making course materials available. Prospective students should know that the course requires a lot of work that is not indicated on the syllabus.