Generalization and sampling from the dynamics perspective

Series
Applied and Computational Mathematics Seminar
Time
Monday, February 27, 2023 - 2:00pm for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/98355006347
Speaker
Prof. Nisha Chandramoorthy – GT CSE
Organizer
Molei Tao

Please Note: Speaker will present in person

In this talk, we obtain new computational insights into two classical areas of statistics: generalization and sampling. In the first part, we study generalization: the performance of a learning algorithm on unseen data. Via the stability of loss statistics, we define a notion of generalization for non-converging training with local descent approaches. This notion yields generalization bounds in a manner similar to classical algorithmic stability. We then show that additional information from the training dynamics provides clues to generalization performance.
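As a rough illustration only (a sketch in standard notation, not the speaker's exact definitions): classical uniform stability bounds the expected generalization gap by the algorithm's stability constant, and the notion above replaces the terminal loss with statistics of the loss along the (possibly non-converging) training trajectory, for example a time average:

    Classical uniform stability (illustrative): if \sup_z |\ell(A(S), z) - \ell(A(S'), z)| \le \beta
    for all neighboring datasets S, S', then |\mathbb{E}[L_\mu(A(S)) - L_S(A(S))]| \le \beta.

    Trajectory version (assumed form): for iterates \theta_1, \dots, \theta_T of a local descent method,
    consider \bar{L}_S = \tfrac{1}{T} \sum_{t=1}^{T} L_S(\theta_t) and require stability of such loss
    statistics across neighboring datasets, which then controls the gap between \bar{L}_\mu and \bar{L}_S.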

In the second part, we discuss a new method for constructing transport maps. Transport maps are transformations from the sample space of a source distribution (which is generally easy to sample) to that of a target (typically non-Gaussian) probability distribution. The new construction arises from an infinite-dimensional generalization of Newton's method for finding the zero of a "score operator". We define this score operator as the difference between the score (the gradient of the log-density) of a transported distribution and the target score. The new construction is iterative, enjoys fast convergence under smoothness assumptions, and does not make a parametric ansatz on the transport map.
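To fix ideas in symbols, here is a minimal sketch in standard notation; the operator G and the function spaces involved are assumptions for illustration, not necessarily the exact construction presented in the talk. Write s_q = \nabla \log q for the score of a density q, let \rho be the source, \pi the target, and T_\# \rho the pushforward of \rho under a map T:

    Score operator:  G(T) = s_{T_\# \rho} - s_\pi,
    so that G(T) = 0 precisely when the transported distribution has the target score.

    Newton step:  solve DG(T_k)[v_k] = -G(T_k) for the update v_k, then set T_{k+1} = T_k + v_k,
    where DG(T_k) denotes the derivative of G with respect to the map, taken in an appropriate function space.

Under smoothness assumptions, such Newton iterations typically converge rapidly near the solution, and no parametric form is imposed on T.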