Thursday, December 12, 2013 - 3:05pm
1 hour (actually 50 minutes)
Motivated by the ubiquity of signal-plus-noise models in high-dimensional statistical signal processing and machine learning, we consider the eigenvalues and eigenvectors of finite, low-rank perturbations of large random matrices. The applications we have in mind are as diverse as radar, sonar, wireless communications, spectral clustering, bioinformatics, and Gaussian mixture cluster analysis in machine learning. We provide an application-independent approach that brings into sharp focus a fundamental informational limit of high-dimensional eigen-analysis. Building on this result, we highlight the random matrix origin of this informational limit, the connection with "free" harmonic analysis, and discuss how to exploit these insights to improve low-rank signal matrix denoising relative to the truncated SVD.
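The informational limit mentioned above can be seen numerically in the simplest rank-one case. The sketch below (an illustration of the standard BBP-type phase transition, not the speaker's own code; the GOE normalization and threshold value are assumptions of this toy setup) builds M = θ·uuᵀ + W with W a Wigner matrix whose bulk spectrum fills [-2, 2]. When θ > 1 the top eigenvalue of M separates from the bulk near θ + 1/θ and the top eigenvector correlates with the signal; when θ < 1 the spike is invisible to eigen-analysis.

```python
# Toy rank-one spiked Wigner model illustrating the eigenvalue phase
# transition (a sketch under standard GOE assumptions, not the talk's code).
import numpy as np

rng = np.random.default_rng(0)
n = 2000

def spiked_top(theta):
    """Top eigenvalue and signal overlap of theta*u u^T + GOE noise."""
    A = rng.standard_normal((n, n))
    W = (A + A.T) / np.sqrt(2 * n)       # GOE scaling: bulk spectrum -> [-2, 2]
    u = np.ones(n) / np.sqrt(n)          # unit-norm signal direction
    M = theta * np.outer(u, u) + W
    vals, vecs = np.linalg.eigh(M)
    return vals[-1], abs(vecs[:, -1] @ u)

top_super, ov_super = spiked_top(3.0)    # above threshold: eigenvalue ~ 3 + 1/3
top_sub, ov_sub = spiked_top(0.5)        # below threshold: stuck at bulk edge ~ 2

print(top_super, ov_super**2)            # detectable spike, eigenvector overlap ~ 1 - 1/9
print(top_sub, ov_sub**2)                # spike undetectable from the spectrum
```

Below the threshold, no eigenvalue-based method can detect the signal, which is the kind of fundamental limit the talk makes precise; above it, the predictable eigenvalue/eigenvector bias is what shrinkage-based denoising corrects relative to the truncated SVD.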