Dynamic Stability in Stochastic Gradient Descent
- Series
- CDSNS Colloquium
- Time
- Friday, May 24, 2024 - 15:30
- Location
- Skiles 254
- Speaker
- Dennis Chemnitz – FU Berlin – dennis@zedat.fu-berlin.de
Streaming via Zoom: https://gatech.zoom.us/j/91390791493?pwd=QnpaWHNEOHZTVXlZSXFkYTJ0b0Q0UT0...
Most modern machine learning applications are based on overparameterized neural networks trained by variants of stochastic gradient descent. To explain the performance of these networks from a theoretical perspective (in particular the so-called "implicit bias"), it is necessary to understand the random dynamics of the optimization algorithms. Mathematically this amounts to the study of random dynamical systems with manifolds of equilibria. In this talk, I will give a brief introduction to machine learning theory and explain how almost-sure Lyapunov exponents and moment Lyapunov exponents can be used to characterize the set of possible limit points for stochastic gradient descent.
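As a concrete illustration of the kind of question the abstract raises, here is a minimal sketch (a toy construction of mine, not taken from the talk; the least-squares problem, the learning rates, and the sign-based stability reading are all illustrative assumptions). It estimates the almost-sure Lyapunov exponent of SGD linearized at an interpolating minimum of an overparameterized least-squares problem, where the zero-loss set forms a manifold of minima and the sign of the exponent indicates whether SGD can stay near that minimum.

```python
import numpy as np

# Toy example (illustrative, not from the talk): overparameterized least
# squares with per-sample loss 0.5*(x_i @ w - y_i)^2. With dim > n_samples
# the zero-loss set is a manifold of minima. At any such minimum the
# Hessian of loss_i is x_i x_i^T, so one SGD step on sample i has Jacobian
#   J_i = I - eta * x_i x_i^T.
# The almost-sure Lyapunov exponent of the random product J_{i_n}...J_{i_1}
# governs whether the minimum is dynamically stable for SGD.

rng = np.random.default_rng(0)
n_samples, dim = 5, 20                 # dim > n_samples: manifold of minima
X = rng.normal(size=(n_samples, dim))

def lyapunov_exponent(eta, n_steps=100_000):
    """Estimate lim (1/n) log ||J_{i_n} ... J_{i_1} v|| for i.i.d. uniform i_k."""
    # Start in the row space of X, which is transverse to the manifold and
    # invariant under every J_i; this excludes the flat directions along
    # the manifold, whose exponent is trivially zero.
    v = X.T @ rng.normal(size=n_samples)
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for _ in range(n_steps):
        i = rng.integers(n_samples)
        v = v - eta * X[i] * (X[i] @ v)   # apply J_i without forming the matrix
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm                          # renormalize to avoid over/underflow
    return log_growth / n_steps

for eta in (0.02, 0.2):                    # learning rates chosen for illustration
    lam = lyapunov_exponent(eta)
    verdict = "stable: SGD can settle here" if lam < 0 else "unstable: SGD escapes"
    print(f"eta={eta}: lambda ~ {lam:+.3f} -> {verdict}")
```

On this toy problem a small learning rate yields a negative exponent (the minimum attracts the SGD dynamics) while a larger one yields a positive exponent (SGD is driven away, although the point remains a minimum of the loss). The moment Lyapunov exponents mentioned in the abstract would instead average powers of these same norms over many independent runs, rather than taking a single-trajectory time average.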