- Series
- High-Dimensional Phenomena in Statistics and Machine Learning Seminar
- Time
- Tuesday, February 21, 2012 - 4:00pm for 1.5 hours (actually 80 minutes)
- Location
- Skiles 006
- Speaker
- Karim Lounici – Georgia Institute of Technology, School of Mathematics
- Organizer
- Karim Lounici
This presentation is based on the papers by D. Paul and I. Johnstone (2007) and V.Q. Vu and J. Lei (2012). Here is the abstract of the second paper. We study sparse principal component analysis in the high-dimensional setting, where $p$ (the number of variables) can be much larger than $n$ (the number of observations). We prove optimal, non-asymptotic lower and upper bounds on the minimax estimation error for the leading eigenvector when it belongs to an $l_q$ ball for $q\in [0,1]$. Our bounds are sharp in $p$ and $n$ for all $q\in[0,1]$ over a wide class of distributions. The upper bound is obtained by analyzing the performance of $l_q$-constrained PCA. In particular, our results provide convergence rates for $l_1$-constrained PCA.
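As a rough illustration of the kind of estimator the abstract refers to, the sketch below implements a simple heuristic for $l_1$-constrained PCA: a power iteration on the sample covariance matrix, with each iterate projected onto an $l_1$ ball (via the standard Euclidean projection) and then renormalized. This is not the exact estimator analyzed in the paper, and the function names (`l1_constrained_pca`, `project_l1`) and the choice of radius are hypothetical; it only conveys how an $l_1$ constraint induces a sparse leading-eigenvector estimate.

```python
import numpy as np

def project_l1(w, radius):
    # Euclidean projection of w onto the l1 ball of the given radius
    # (standard sort-and-threshold algorithm).
    if np.abs(w).sum() <= radius:
        return w
    u = np.sort(np.abs(w))[::-1]          # sorted magnitudes, descending
    css = np.cumsum(u)
    ks = np.arange(1, len(w) + 1)
    rho = np.nonzero(u - (css - radius) / ks > 0)[0][-1]
    theta = (css[rho] - radius) / (rho + 1)
    return np.sign(w) * np.maximum(np.abs(w) - theta, 0.0)

def l1_constrained_pca(S, radius, n_iter=200):
    # Heuristic sketch: power iteration on the covariance matrix S,
    # projecting each iterate onto an l1 ball before renormalizing.
    # radius controls the sparsity of the estimated eigenvector.
    p = S.shape[0]
    v = np.ones(p) / np.sqrt(p)           # uniform starting vector
    for _ in range(n_iter):
        w = S @ v                          # power step
        w = project_l1(w, radius)          # enforce l1 constraint
        v = w / np.linalg.norm(w)          # renormalize to the unit sphere
    return v
```

On a spiked covariance $\Sigma = I_p + 5\, v v^\top$ with a sparse leading eigenvector $v$, the iteration recovers a unit vector closely aligned with $v$, with the off-support coordinates thresholded to zero by the projection step.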