### How to Break the Curse of Dimensionality

- Series
- Applied and Computational Mathematics Seminar
- Time
- Monday, January 31, 2022 - 14:00 (50 minutes)
- Location
- https://bluejeans.com/457724603/4379
- Speaker
- Ming-Jun Lai – University of Georgia

We first review the problem of the curse of dimensionality in approximating multivariate functions. Several approximation results due to Barron, Petrushev, Bach, and others will be explained.
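As background for the dimension-independent rates discussed in the talk, Barron's 1993 result is often stated roughly as follows (a hedged sketch of the standard formulation, not necessarily the exact form used in the talk): for a function $f$ whose Fourier transform $\hat f$ has a finite first moment

$$
C_f \;=\; \int_{\mathbb{R}^d} \lvert \omega \rvert \, \lvert \hat f(\omega) \rvert \, d\omega \;<\; \infty,
$$

there exists a one-hidden-layer sigmoidal network $f_n$ with $n$ units satisfying

$$
\| f - f_n \|_{L^2} \;\le\; \frac{c\, C_f}{\sqrt{n}},
$$

where the constant $c$ depends on the domain but not on the dimension $d$, which is why such networks are said to break the curse of dimensionality for this function class.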

Then we present two approaches to breaking the curse of dimensionality: one based on the probabilistic approach explained in Barron (1993), and the other based on a deterministic approach using the Kolmogorov superposition theorem. Since the Kolmogorov superposition theorem has been used to explain the approximation power of neural network computation, I will use it to explain why deep learning algorithms work for image classification.
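For reference, the Kolmogorov superposition theorem mentioned above states that every continuous function of $n$ variables on $[0,1]^n$ can be written as a superposition of continuous univariate functions:

$$
f(x_1,\dots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right),
$$

where the inner functions $\phi_{q,p}$ are continuous, monotone, and independent of $f$, while only the outer functions $\Phi_q$ depend on $f$. The two-level structure of this formula is what invites the comparison with a neural network with one hidden layer.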

In addition, I will introduce neural network approximation based on higher-order ReLU functions to explain the powerful approximation of multivariate functions by deep learning algorithms with multiple layers.
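By "higher-order ReLU" one typically means the powers $\mathrm{ReLU}_k(x) = \max(0, x)^k$; a minimal sketch of this activation (an illustration under that assumption, not the speaker's code) is:

```python
import numpy as np

def relu_k(x, k=1):
    """Higher-order ReLU activation: ReLU_k(x) = max(0, x)^k.

    k = 1 is the standard ReLU; larger k gives smoother
    (C^{k-1}) piecewise-polynomial activations, which is what
    makes them attractive for approximating smooth functions.
    """
    return np.maximum(0.0, np.asarray(x, dtype=float)) ** k

x = np.array([-2.0, -0.5, 0.0, 1.0, 2.0])
y1 = relu_k(x, k=1)  # piecewise linear
y2 = relu_k(x, k=2)  # piecewise quadratic, continuously differentiable
print(y1, y2)
```

Networks built from `relu_k` with $k \ge 2$ reproduce splines of the corresponding degree, which links their approximation power to classical spline theory.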