Random Matrix Theory in Data Science and Statistics
3.0 credits
A first course in random matrix theory (RMT), the study of the eigenvalues and eigenvectors of matrices with random entries, which is foundational to high-dimensional statistics and data science. Topics include eigenvalue distributions of specially structured ensembles of random matrices, matrix concentration inequalities, and asymptotic and finite-sample results for broad classes of matrices. A core focus is probability in high dimensions: concentration of measure, the geometry of high-dimensional spaces and convex sets, Gaussian measure, and sharp transitions and threshold phenomena. Applications will feature principal component analysis, network inference, randomized algorithms, optimization, and neural networks, and will also be drawn from students' own interests and backgrounds. Prerequisites: Linear Algebra AND (EITHER Real Analysis, 110.405 or equivalent, OR Probability Theory I, 553.720 or equivalent). In particular, familiarity with eigenvalues, eigenvectors, and singular value decompositions; formal definitions of limits, convergence, and open and closed sets; and laws of large numbers, central limit theorems, and conditional expectation.
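
As an informal illustration of the first topic listed above (eigenvalue distributions of structured random ensembles), the minimal Python sketch below compares the empirical spectrum of a Gaussian Wigner matrix with the semicircle law on [-2, 2]. It is not drawn from the course materials; the matrix size, random seed, and choice of ensemble are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000  # illustrative matrix size

    # Symmetric Gaussian Wigner matrix whose off-diagonal entries have variance 1/n.
    A = rng.standard_normal((n, n)) / np.sqrt(n)
    W = (A + A.T) / np.sqrt(2)

    eigenvalues = np.linalg.eigvalsh(W)

    def semicircle_mass(a, b):
        # Probability the semicircle law assigns to [a, b] (support is [-2, 2]),
        # using the closed-form antiderivative of sqrt(4 - x^2) / (2*pi).
        F = lambda x: x * np.sqrt(4.0 - x**2) / (4.0 * np.pi) + np.arcsin(x / 2.0) / np.pi
        return F(b) - F(a)

    # Empirical spectral distribution vs. the semicircle law on a few intervals.
    for a, b in [(-2.0, -1.0), (-1.0, 0.0), (0.0, 1.0), (1.0, 2.0)]:
        empirical = np.mean((eigenvalues >= a) & (eigenvalues < b))
        print(f"[{a:+.0f}, {b:+.0f}):  empirical {empirical:.3f}   semicircle {semicircle_mass(a, b):.3f}")

For n this large, the empirical and limiting interval masses agree to roughly two decimal places, which is the kind of asymptotic statement (and its finite-sample refinements) that the course develops.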