- Series
- Applied and Computational Mathematics Seminar
- Time
- Monday, September 16, 2019 - 1:55pm for 1 hour (actually 50 minutes)
- Location
- Skiles 005
- Speaker
- Andre Wibisono – Georgia Tech – wibisono@gatech.edu
- Organizer
- Molei Tao
Sampling is a fundamental algorithmic task. Many modern applications require sampling from complicated probability distributions in high-dimensional spaces. While the setting of logconcave target distributions is well-studied, it is important to understand sampling beyond the logconcavity assumption. We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability distribution on R^n under isoperimetry conditions. We show a convergence guarantee in Kullback-Leibler (KL) divergence assuming the target distribution satisfies a log-Sobolev inequality and the log density has bounded Hessian. Notably, we do not assume convexity or bounds on higher derivatives. We also show convergence guarantees in Rényi divergence assuming the limit of ULA satisfies either a log-Sobolev or a Poincaré inequality. Joint work with Santosh Vempala (arXiv:1903.08568).
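
For readers unfamiliar with the algorithm, a minimal sketch of the ULA iterate x_{k+1} = x_k - η ∇f(x_k) + sqrt(2η) ξ_k, targeting a density proportional to e^{-f(x)}, follows. The standard-Gaussian target, step size, and iteration counts here are illustrative assumptions, not parameters from the talk.

```python
import numpy as np

def ula_sample(grad_f, x0, step, n_iters, rng):
    """One ULA trajectory: x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2*step) * xi_k."""
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        xi = rng.standard_normal(x.shape)  # fresh Gaussian noise each step
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * xi
    return x

# Illustrative target: standard Gaussian on R^2, so f(x) = ||x||^2 / 2 and grad_f(x) = x.
rng = np.random.default_rng(0)
samples = np.array(
    [ula_sample(lambda x: x, np.zeros(2), step=0.1, n_iters=200, rng=rng)
     for _ in range(1000)]
)
print(samples.mean(axis=0), samples.var(axis=0))
```

Note that ULA has no Metropolis correction, so its stationary distribution is biased away from the target by an amount controlled by the step size η; this bias is exactly what the KL and Rényi guarantees in the abstract quantify.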