Theoretical guarantees of machine learning methods for statistical sampling and PDEs in high dimensions

Series
Applied and Computational Mathematics Seminar
Time
Monday, November 2, 2020 - 4:00pm for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/884917410
Speaker
Yulong Lu – University of Massachusetts Amherst – lu@math.umass.edu – https://sites.google.com/site/yulongmath/
Organizer
Wenjing Liao

Neural network-based machine learning methods, including, most notably, deep learning, have achieved extraordinary success in numerous fields. Despite the rapid development of learning algorithms based on neural networks, their mathematical analysis is far from complete. In particular, it has remained a mystery why neural network-based methods work so well for solving high dimensional problems.

In this talk, I will demonstrate the power of neural network methods for solving two classes of high dimensional problems: statistical sampling and PDEs. In the first part of the talk, I will present a universal approximation theorem of deep neural networks for representing high dimensional probability distributions. In the second part of the talk, I will discuss a generalization error bound of the Deep Ritz Method for solving high dimensional elliptic problems. For both problems, our theoretical results show that neural network-based methods can overcome the curse of dimensionality.
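To give a flavor of the variational idea behind the Deep Ritz Method discussed in the talk, here is a minimal illustrative sketch on a one-dimensional model problem. It is not the speaker's method: instead of a deep network it uses a small sine-series ansatz (the choice of ansatz, step size, and sample counts are all assumptions made for this toy example), but it follows the same recipe of minimizing the Ritz energy of an elliptic problem by stochastic gradient descent with Monte Carlo estimates of the integrals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model problem: -u'' = f on (0,1) with u(0) = u(1) = 0,
# f(x) = pi^2 sin(pi x), so the exact solution is u(x) = sin(pi x).
f = lambda x: np.pi**2 * np.sin(np.pi * x)

# Hypothetical ansatz u(x) = sum_k c_k sin(k pi x), standing in for the
# neural network in the Deep Ritz Method; it satisfies the boundary
# conditions automatically.
K = 5
c = np.zeros(K)
k = np.arange(1, K + 1)

def grad_energy(c, n_samples=2000):
    """Monte Carlo estimate of the gradient of the Ritz energy
    E[u] = integral of (0.5 * u'(x)^2 - f(x) * u(x)) over (0,1)."""
    x = rng.uniform(0.0, 1.0, n_samples)
    phi = np.sin(np.outer(x, k) * np.pi)                # basis values, (n, K)
    dphi = k * np.pi * np.cos(np.outer(x, k) * np.pi)   # basis derivatives
    du = dphi @ c                                       # u'(x) at the samples
    # dE/dc_k = integral of (u' * phi_k' - f * phi_k), estimated by averaging
    return (du[:, None] * dphi - f(x)[:, None] * phi).mean(axis=0)

# Plain gradient descent on the sampled energy, as in the Deep Ritz Method
lr = 0.01
for _ in range(3000):
    c -= lr * grad_energy(c)

xs = np.linspace(0.0, 1.0, 101)
u = np.sin(np.outer(xs, k) * np.pi) @ c
err = np.max(np.abs(u - np.sin(np.pi * xs)))
print(f"max error vs exact solution: {err:.3f}")
```

The point of the sketch is that no mesh is ever built: the energy integral is sampled at random points, which is exactly the feature that lets Deep Ritz-type methods scale to high dimensions, where the talk's generalization bound applies.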