- Series
- Job Candidate Talk
- Time
- Thursday, February 1, 2018 - 11:00am (50 minutes)
- Location
- Skiles 006
- Speaker
- Anderson Ye Zhang – Yale
- Organizer
- Heinrich Matzinger
Mean field variational inference is widely used in statistics and
machine learning to approximate posterior distributions. Despite its
popularity, there exists remarkably little fundamental theoretical
justification for it. The success of variational inference rests
mainly on its iterative algorithm, which, to the best of our
knowledge, has never been investigated for any high-dimensional or
complex model. In this talk, we establish computational and statistical
guarantees for mean field variational inference. Using the
community detection problem as a test case, we show that its iterative
algorithm converges linearly to the optimal statistical accuracy
within log n iterations. We are optimistic that these results extend
beyond community detection to mean field inference under a general
class of latent variable models. In addition, the techniques we develop
can be extended to analyze the Expectation-Maximization algorithm and
the Gibbs sampler.
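To make the setting concrete, here is a minimal sketch (not the speaker's implementation) of mean field variational inference for community detection on a two-community stochastic block model. All parameters (`n`, `p`, `q`, the initialization, and the use of simultaneous rather than one-coordinate-at-a-time updates) are illustrative assumptions; the model assumes the connection probabilities are known, and each sweep updates every node's soft label from the other nodes' current soft labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate a two-community stochastic block model (SBM). ---
# Hypothetical parameters for illustration only.
n, p, q = 200, 0.30, 0.05          # n nodes, within/between edge probabilities
z = np.repeat([0, 1], n // 2)      # ground-truth community labels
prob = np.where(z[:, None] == z[None, :], p, q)
A = (rng.random((n, n)) < prob).astype(float)
A = np.triu(A, 1)
A = A + A.T                        # symmetric adjacency matrix, no self-loops

# --- Mean field variational inference (batch version of the updates). ---
# xi[i] approximates Q(z_i = 1); p and q are assumed known here.
lp = np.log(p / q)                 # log-likelihood ratio for an edge
lq = np.log((1 - p) / (1 - q))     # log-likelihood ratio for a non-edge
xi = np.clip(0.5 + 0.01 * rng.standard_normal(n), 0.01, 0.99)
for _ in range(int(np.ceil(np.log(n)))):   # ~log n sweeps, echoing the talk
    s = 2 * xi - 1                         # E_Q[2 z_j - 1] for each node j
    logits = lp * (A @ s) + lq * ((1 - A) @ s - s)  # "- s" drops the j = i term
    xi = 1.0 / (1.0 + np.exp(-logits))     # logistic (sigmoid) update

z_hat = (xi > 0.5).astype(int)
# Community labels are identifiable only up to a global swap.
acc = max(np.mean(z_hat == z), np.mean(z_hat != z))
print(f"agreement with true labels: {acc:.0%}")
```

With this well-separated (dense) parameter regime the soft labels typically lock onto the two communities within a handful of sweeps, which is the behavior the talk's linear-convergence guarantee formalizes.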