Rapid Convergence of the Unadjusted Langevin Algorithm: Isoperimetry Suffices

Series
High Dimensional Seminar
Time
Wednesday, October 23, 2019 - 3:00pm for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Andre Wibisono – Georgia Tech
Organizer
Galyna Livshyts

Sampling is a fundamental algorithmic task. Many modern applications require sampling from complicated probability distributions in high-dimensional spaces. While the setting of a logconcave target distribution is well studied, it is important to understand sampling beyond the logconcavity assumption. We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability distribution on R^n under isoperimetry conditions. We show a convergence guarantee in Kullback-Leibler (KL) divergence assuming the target distribution satisfies a log-Sobolev inequality and the log density has bounded Hessian. Notably, we do not assume convexity or bounds on higher derivatives. We also show convergence guarantees in Rényi divergence assuming the limit of ULA satisfies either a log-Sobolev or a Poincaré inequality. Joint work with Santosh Vempala (arXiv:1903.08568).
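For readers unfamiliar with ULA: each step takes a gradient step on the negative log density and adds Gaussian noise (the Euler-Maruyama discretization of the Langevin diffusion). Below is a minimal sketch; the standard Gaussian target, step size, and function names here are illustrative choices, not details from the talk.

```python
import numpy as np

def ula(grad_f, x0, step, n_steps, rng):
    """Unadjusted Langevin Algorithm for sampling from density proportional to exp(-f).

    One iteration: x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * xi,
    where xi is standard Gaussian noise. Returns the full trajectory.
    """
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        xi = rng.standard_normal(x.shape)
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * xi
        samples[k] = x
    return samples

# Illustrative run on a standard Gaussian target, f(x) = ||x||^2 / 2, so grad_f(x) = x.
rng = np.random.default_rng(0)
samples = ula(lambda x: x, np.zeros(2), step=0.1, n_steps=20000, rng=rng)
kept = samples[1000:]  # discard burn-in; remaining samples approximate the target
```

Note that ULA has a bias: for a fixed step size the chain converges to a stationary distribution near, but not equal to, the target (for this Gaussian example the stationary variance is 1/(1 - step/2) rather than 1), which is why the talk's guarantees are stated for the law of the iterates relative to the target.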