When machine learning meets dynamics - a few examples

CDSNS Colloquium
Friday, November 12, 2021 - 1:00pm (50 minutes)
Online via Zoom
Molei Tao – Georgia Tech – mtao@gatech.edu – https://people.math.gatech.edu/~mtao8/
Alex Blumenthal

Please Note: Zoom link: https://us06web.zoom.us/j/83392531099?pwd=UHh2MDFMcGErbzFtMHBZTmNZQXM0dz09

This talk will report some of our progress in showing how dynamics can be a useful mathematical tool for machine learning. Three demonstrations will be given: how dynamics help design (and analyze) optimization algorithms, how dynamics help quantitatively understand nontrivial observations in deep learning practice, and how deep learning can in turn help dynamics (or, more broadly, AI for sciences).

In part 1 (dynamics for algorithms), I will talk about how to add momentum to gradient descent on a class of manifolds known as Lie groups. The treatment is based on geometric mechanics and an interplay between continuous- and discrete-time dynamics, and it leads to accelerated optimization.

Part 2 (dynamics for understanding deep learning) will be devoted to better understanding the nontrivial effects of large learning rates. I will describe how large learning rates can deterministically lead to chaotic escapes from local minima, an alternative mechanism to the commonly known noisy escapes driven by stochastic gradients. I will also mention another example: an implicit regularization effect of large learning rates, which is to favor flatter minimizers.

Part 3 (AI for sciences) will be on data-driven prediction of mechanical dynamics, where I will demonstrate one strong benefit of hard-wiring physics into deep learning models: how to make symplectic predictions, and how that generically improves the accuracy of long-time predictions.
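The momentum method of part 1 lives on Lie groups and is derived from geometric mechanics; that construction is beyond a short sketch. As a toy Euclidean analogue only (not the speaker's method), classical heavy-ball momentum on an ill-conditioned quadratic already illustrates the acceleration effect that motivates the talk:

```python
import numpy as np

def gd(x0, grad, lr, steps):
    """Plain gradient descent."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def heavy_ball(x0, grad, lr, beta, steps):
    """Gradient descent with heavy-ball momentum (Polyak)."""
    x, v = x0, np.zeros_like(x0)
    for _ in range(steps):
        v = beta * v - lr * grad(x)
        x = x + v
    return x

# Ill-conditioned quadratic f(x) = 0.5 x^T A x, condition number 100.
A = np.diag([1.0, 100.0])
grad = lambda x: A @ x
x0 = np.array([1.0, 1.0])
lr = 1.0 / 100.0  # step size 1/L, L = largest eigenvalue

x_gd = gd(x0, grad, lr, 200)            # slow along the flat direction
x_hb = heavy_ball(x0, grad, lr, 0.9, 200)  # momentum damps both directions
```

With the same step size and budget, the momentum iterate is orders of magnitude closer to the minimizer at the origin, because momentum turns the slow geometric contraction in the flat direction into an underdamped oscillation with a much smaller modulus.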
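The chaotic-escape phenomenon of part 2 can be previewed in one dimension. The following sketch (an illustrative toy, not the talk's analysis) runs plain deterministic gradient descent on the double-well f(x) = (x^2 - 1)^2 / 4: a small learning rate converges into the starting basin, while a large one bounces chaotically between basins with no stochastic gradient noise involved.

```python
def trajectory(x0, lr, steps):
    """Gradient descent iterates on f(x) = (x^2 - 1)^2 / 4.

    grad f(x) = x^3 - x; minima at x = +1 and x = -1.
    """
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - lr * (x**3 - x))
    return xs

# Small step: stays in the basin of x = +1 and converges there.
traj_small = trajectory(0.9, 0.1, 200)

# Large step: the update map x -> 3x - 2x^3 (lr = 2) is conjugate,
# via x = sqrt(2) sin(theta), to the angle-tripling map theta -> 3 theta
# (sin 3t = 3 sin t - 4 sin^3 t), so iterates are chaotic yet bounded
# and cross deterministically between the two wells.
traj_large = trajectory(0.9, 2.0, 200)
```

The large-step trajectory changes sign (visits both wells) while remaining bounded, a deterministic escape mechanism with no injected noise.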
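The benefit of symplectic prediction in part 3 has a classical numerical-integration counterpart, shown here as a minimal sketch (standard integrators, not the talk's learned models): on a harmonic oscillator with Hamiltonian H = (p^2 + q^2)/2, non-symplectic explicit Euler steadily pumps energy into the system, while the symplectic leapfrog (Störmer–Verlet) scheme keeps the energy bounded over long times.

```python
def explicit_euler(q, p, h, steps):
    """Non-symplectic: energy grows by a factor (1 + h^2) per step."""
    for _ in range(steps):
        q, p = q + h * p, p - h * q
    return q, p

def leapfrog(q, p, h, steps):
    """Symplectic Störmer–Verlet (kick-drift-kick): energy stays
    within O(h^2) of its initial value for all time."""
    for _ in range(steps):
        p = p - 0.5 * h * q
        q = q + h * p
        p = p - 0.5 * h * q
    return q, p

energy = lambda q, p: 0.5 * (q * q + p * p)

q0, p0, h, steps = 1.0, 0.0, 0.1, 10_000
e0 = energy(q0, p0)                              # 0.5
e_euler = energy(*explicit_euler(q0, p0, h, steps))  # blows up
e_leap = energy(*leapfrog(q0, p0, h, steps))         # stays near 0.5
```

This is the generic long-time advantage alluded to in the abstract: preserving the symplectic structure constrains the qualitative behavior of predictions, rather than just their one-step accuracy.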