Insights on gradient-based algorithms in high-dimensional non-convex learning

Series
School of Mathematics Colloquium
Time
Thursday, November 12, 2020 - 11:00am for 50 minutes
Location
https://us02web.zoom.us/j/89107379948
Speaker
Lenka Zdeborová – EPFL
Organizer
Cheng Mao

Gradient descent algorithms and their noisy variants, such as Langevin dynamics or multi-pass SGD, are at the center of attention in machine learning. Yet their behaviour remains perplexing, in particular in the high-dimensional non-convex setting. In this talk, I will present several high-dimensional, non-convex statistical learning problems in which the performance of gradient-based algorithms can be analysed down to a constant. The common point of these settings is that the data come from a probabilistic generative model, leading to problems for which, in the high-dimensional limit, statistical physics provides exact closed-form solutions for the performance of the gradient-based algorithms. The covered settings include the spiked mixed matrix-tensor model and phase retrieval.
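For readers unfamiliar with the setup, here is a minimal sketch (not from the talk) of one of the objects the abstract names: Langevin dynamics, i.e., gradient descent plus Gaussian noise, run on a phase-retrieval loss with a planted signal. All dimensions, step sizes, and the temperature below are illustrative assumptions, not parameters from the speaker's analysis.

```python
import numpy as np

# Illustrative sketch: Langevin dynamics on a phase-retrieval loss.
# Measurements y_i = (a_i . x*)^2 give a non-convex objective in x.
rng = np.random.default_rng(0)
d, n = 100, 500                           # dimension and sample count (assumed)
x_star = rng.standard_normal(d)
x_star /= np.linalg.norm(x_star)          # planted unit-norm signal
A = rng.standard_normal((n, d))           # Gaussian sensing vectors
y = (A @ x_star) ** 2                     # phaseless measurements

def grad(x):
    """Gradient of L(x) = (1/4n) * sum_i ((a_i . x)^2 - y_i)^2."""
    z = A @ x
    return A.T @ ((z ** 2 - y) * z) / n

eta, temperature, steps = 0.01, 1e-4, 5000
x = rng.standard_normal(d) / np.sqrt(d)   # random initialization
for _ in range(steps):
    # Langevin update: gradient step plus sqrt(2*eta*T)-scaled noise
    x = x - eta * grad(x) + np.sqrt(2 * eta * temperature) * rng.standard_normal(d)

# Overlap |<x, x*>| / |x| measures recovery (1 = perfect, up to sign)
print(f"overlap with planted signal: {abs(x @ x_star) / np.linalg.norm(x):.3f}")
```

In the regime the talk addresses, one would track how such an overlap evolves as the dimension grows with the sample ratio n/d fixed; the statistical-physics analysis mentioned in the abstract characterizes that evolution exactly in the high-dimensional limit.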