Efficient, Robust, and Agnostic Generative Modeling with Group Symmetry and Regularized Divergences

Series
Applied and Computational Mathematics Seminar
Time
Monday, November 25, 2024 - 2:00pm for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/94954654170
Speaker
Ziyu Chen – University of Massachusetts Amherst – ziyuchen@umass.edu – https://sites.google.com/view/ziyu-chen
Organizer
Wei Zhu

In this talk, I will discuss our recent theoretical advances in generative modeling. The first part of the presentation will focus on learning distributions with symmetry. I will present results on the sample complexity of empirical estimation of probability divergences for group-invariant distributions, together with performance guarantees for GANs and score-based generative models that incorporate symmetry. Notably, I will offer the first quantitative comparison between data augmentation and directly embedding symmetry into models, highlighting the latter as the more fundamental approach to efficient learning. These findings underscore how incorporating symmetry into generative models can significantly enhance learning efficiency, particularly in data-limited settings.

The second part will cover $\alpha$-divergences with Wasserstein-1 regularization, which can be interpreted as $\alpha$-divergences whose variational form is constrained to Lipschitz test functions. I will demonstrate how, with these divergences as objective functionals, generative learning can be made agnostic to assumptions on the target distribution, including heavy tails and low-dimensional or fractal supports. I will outline conditions under which these divergences are finite under minimal assumptions on the target distribution, and present their variational derivatives and the associated gradient-flow formulation. This framework provides guarantees for various machine learning algorithms that optimize over this class of divergences.
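As a concrete illustration of the contrast drawn in the first part, the following minimal PyTorch-style sketch (not taken from the talk) shows the two strategies for a simple finite group, the 90-degree image rotations: averaging a discriminator over the group so that it is exactly invariant, versus augmenting the data with rotated copies. The names SymmetrizedDiscriminator and augment_with_rotations are hypothetical, and the talk's constructions are more general (arbitrary compact groups, and score-based models as well as GANs).

import torch
import torch.nn as nn

class SymmetrizedDiscriminator(nn.Module):
    # Wraps any base discriminator so that its output is exactly invariant under
    # the cyclic group C4 of 90-degree image rotations, by averaging the base
    # network over all group elements (group averaging / Reynolds operator).
    def __init__(self, base: nn.Module):
        super().__init__()
        self.base = base

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, channels, height, width); average over the 4 rotations.
        outputs = [self.base(torch.rot90(x, k, dims=(-2, -1))) for k in range(4)]
        return torch.stack(outputs, dim=0).mean(dim=0)

def augment_with_rotations(x: torch.Tensor) -> torch.Tensor:
    # Data-augmentation alternative: enlarge the training batch with all four
    # rotated copies instead of building the symmetry into the network itself.
    return torch.cat([torch.rot90(x, k, dims=(-2, -1)) for k in range(4)], dim=0)

Roughly, the symmetrized critic restricts the variational optimization to group-invariant test functions, whereas augmentation only enlarges the empirical sample; the abstract's comparison concerns which of these two mechanisms yields better learning guarantees.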
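For the second part, one common way to write a Lipschitz-constrained (Wasserstein-1-regularized) variational form of an $f$-divergence is sketched below; conventions (shifts and normalizations) vary across the literature, so this should be read as a schematic form rather than the talk's exact definition. Here $f^{*}_\alpha$ denotes the convex conjugate of $f_\alpha$ and $\mathrm{Lip}_L$ the set of $L$-Lipschitz test functions:

$$
D^{\mathrm{Lip}_L}_{\alpha}(P \,\|\, Q) \;=\; \sup_{\phi \in \mathrm{Lip}_L} \Big\{ \mathbb{E}_{x\sim P}\big[\phi(x)\big] \;-\; \mathbb{E}_{y\sim Q}\big[f^{*}_{\alpha}\big(\phi(y)\big)\big] \Big\}, \qquad f_\alpha(t) \;=\; \frac{t^{\alpha}-1}{\alpha(\alpha-1)}.
$$

Restricting the supremum to Lipschitz functions is what allows the objective to remain finite and well behaved even when the target distribution has heavy tails or a low-dimensional or fractal support.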