- Dissertation Defense
- Friday, April 23, 2021 - 3:00pm (80 minutes)
- Ruilin Li – Georgia Institute of Technology – email@example.com
Please Note: Meeting link: https://bluejeans.com/7708995345
Langevin dynamics-based sampling algorithms are arguably among the most widely used Markov chain Monte Carlo (MCMC) methods. Two main directions in the modern study of MCMC methods are (i) scaling MCMC methods to big-data applications, and (ii) tight, non-asymptotic convergence analysis of MCMC algorithms, with explicit dependence on characteristics of the target distribution.
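To fix ideas, the simplest Langevin dynamics-based sampler is the unadjusted Langevin algorithm, which discretizes overdamped Langevin dynamics. The following is a minimal illustrative sketch for a 1-D standard Gaussian target; all function names and parameter values here are for illustration and are not taken from the thesis.

```python
import numpy as np

def grad_log_density(x):
    # For the standard Gaussian N(0, 1), grad log p(x) = -x.
    return -x

def ula_sample(n_steps, step_size, x0, rng):
    """Unadjusted Langevin algorithm (Euler-Maruyama discretization of
    overdamped Langevin dynamics):
        x_{k+1} = x_k + h * grad log p(x_k) + sqrt(2h) * N(0, 1)."""
    x = x0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        noise = rng.standard_normal()
        x = x + step_size * grad_log_density(x) + np.sqrt(2 * step_size) * noise
        samples[k] = x
    return samples

rng = np.random.default_rng(0)
samples = ula_sample(n_steps=50_000, step_size=0.05, x0=3.0, rng=rng)
burned = samples[5_000:]  # discard burn-in
print(burned.mean(), burned.var())  # should be close to the target's 0 and 1
```

The finite step size introduces a discretization bias in the stationary distribution; quantifying and removing such bias is precisely what the non-asymptotic analyses mentioned above are about.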
This thesis continues these two lines of work and consists of three parts. In the first part, we study stochastic gradient MCMC methods for large-scale applications. We propose a non-uniform gradient-subsampling scheme that approximately matches the transition kernel of a base MCMC method using the full gradient, aiming for better sample quality. The demonstration is based on underdamped Langevin dynamics.
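The general idea of non-uniform subsampling can be sketched as follows: data points are drawn with unequal probabilities and the per-point gradients are importance-weighted so the stochastic gradient remains unbiased. This toy example (a Gaussian posterior, minibatch size one, arbitrary weights) is only an illustration of the mechanism, not the specific scheme developed in the thesis.

```python
import numpy as np

def sgld_nonuniform(data, probs, n_steps, step_size, x0, rng):
    """Stochastic-gradient Langevin with non-uniform subsampling.
    Target posterior: p(x) proportional to exp(-sum_i (x - y_i)^2 / 2),
    i.e. N(mean(y), 1/n). Drawing index i with probability probs[i] and
    reweighting by 1/probs[i] keeps the gradient estimate unbiased."""
    n = len(data)
    idx = rng.choice(n, size=n_steps, p=probs)  # non-uniform index draws
    x = x0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        i = idx[k]
        # Unbiased estimate of the full gradient sum_j (y_j - x):
        grad_est = (data[i] - x) / probs[i]
        x = x + step_size * grad_est \
            + np.sqrt(2 * step_size) * rng.standard_normal()
        samples[k] = x
    return samples

rng = np.random.default_rng(0)
data = rng.standard_normal(10)        # toy observations y_1, ..., y_n
weights = np.abs(data) + 0.5          # illustrative non-uniform weights
probs = weights / weights.sum()
samples = sgld_nonuniform(data, probs, n_steps=100_000, step_size=0.002,
                          x0=0.0, rng=rng)
print(samples[10_000:].mean(), data.mean())  # sampler mean vs posterior mean
```

Choosing the probabilities well (rather than arbitrarily, as here) is what reduces the variance of the gradient estimate and brings the transition kernel closer to that of the full-gradient method.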
In the second part, we consider a sampling analogue of Nesterov's accelerated algorithm in optimization. From a high-resolution ordinary differential equation description of Nesterov's algorithm, we derive a dynamics termed Hessian-Free High-Resolution (HFHR) dynamics. We then quantify the acceleration of HFHR over underdamped Langevin dynamics at both the continuous-dynamics level and the discrete-algorithm level.
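Underdamped (kinetic) Langevin dynamics, the baseline that HFHR is compared against, augments the position with a velocity variable and adds friction. A minimal Euler-Maruyama sketch for a 1-D standard Gaussian target is below; the friction coefficient and step size are illustrative choices, and this is the baseline dynamics only, not the HFHR dynamics itself.

```python
import numpy as np

def underdamped_langevin(n_steps, step_size, gamma, rng):
    """Euler-Maruyama discretization of underdamped Langevin dynamics:
        dx = v dt
        dv = (grad log p(x) - gamma * v) dt + sqrt(2 * gamma) dW
    for the 1-D standard Gaussian target, where grad log p(x) = -x."""
    x, v = 3.0, 0.0
    xs = np.empty(n_steps)
    for k in range(n_steps):
        x_new = x + step_size * v
        v_new = v + step_size * (-x - gamma * v) \
                + np.sqrt(2 * gamma * step_size) * rng.standard_normal()
        x, v = x_new, v_new
        xs[k] = x
    return xs

rng = np.random.default_rng(0)
xs = underdamped_langevin(n_steps=100_000, step_size=0.05, gamma=2.0, rng=rng)
burned = xs[10_000:]  # discard burn-in
print(burned.mean(), burned.var())  # should approximate the target's 0 and 1
```

The momentum variable is what gives underdamped dynamics (and, per the thesis, HFHR) faster mixing than overdamped Langevin dynamics on ill-conditioned targets.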
In the third part, we study a broad family of sampling algorithms based on bounded, contractive SDEs via mean-square analysis. We show how to extend the applicability of classical mean-square analysis from finite time to infinite time, and we also characterize iteration complexity in the 2-Wasserstein distance; applied to the Langevin Monte Carlo algorithm, this yields an improved iteration complexity bound.
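For intuition about the metric in which these complexity bounds are stated: in one dimension, the 2-Wasserstein distance between two empirical distributions of equal size is simply the L2 distance between their sorted samples. A small sketch (illustrative, not from the thesis):

```python
import numpy as np

def wasserstein2_1d(a, b):
    """2-Wasserstein distance between the empirical distributions of two
    equal-size 1-D samples: in 1-D the optimal coupling matches sorted
    order, so W2 is the root-mean-square gap between sorted samples."""
    a, b = np.sort(a), np.sort(b)
    return np.sqrt(np.mean((a - b) ** 2))

rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
y = rng.standard_normal(100_000) + 0.5  # shifted Gaussian, true W2 = 0.5
print(wasserstein2_1d(x, x.copy()))     # 0: identical samples
print(wasserstein2_1d(x, y))            # close to 0.5
```

An iteration complexity bound in this distance then states how many sampler steps are needed before the law of the iterate is within a prescribed W2 tolerance of the target.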