Applied differential geometry and harmonic analysis in deep learning regularization
- Series: Applied and Computational Mathematics Seminar
- Time: Monday, September 23, 2019, 13:50 (50 minutes)
- Location: Skiles 005
- Speaker: Wei Zhu – Duke University – zhu@math.duke.edu
Deep neural networks (DNNs) have revolutionized machine learning by gradually replacing traditional model-based algorithms with data-driven methods. While DNNs have proven very successful when large training sets are available, they typically have two shortcomings: first, when training data are scarce, DNNs tend to overfit; second, the generalization ability of overparameterized DNNs remains a mystery. In this talk, I will discuss two recent works that “inject” the “modeling” flavor back into deep learning to improve the generalization performance and interpretability of DNN models. This is accomplished by regularizing DNNs through applied differential geometry and harmonic analysis. In the first part of the talk, I will explain how to improve the regularity of the DNN representation by enforcing a low-dimensionality constraint on the data-feature concatenation manifold. In the second part, I will discuss how to impose scale-equivariance on the network representation by conducting joint convolutions across space and the scaling group. The stability of the equivariant representation to nuisance input deformations is also proved, under mild assumptions on the Fourier-Bessel norm of the filter expansion coefficients.
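
To make the first idea concrete, here is a minimal PyTorch sketch of a low-dimensionality penalty on the data-feature concatenation. Everything in it is an illustrative assumption rather than the speaker's method: the function name `manifold_dim_penalty`, the k-nearest-neighbor patches, and the nuclear norm as a surrogate for local manifold dimension are stand-ins; the work discussed in the talk regularizes the dimension of the concatenation manifold through a variational formulation, not this proxy.

```python
import torch

def manifold_dim_penalty(x, f, k=8):
    """Toy surrogate for the dimension of the data-feature
    concatenation manifold {(x_i, f(x_i))}: the nuclear norm of each
    point's centered k-nearest-neighbor patch, averaged over the
    batch. A small nuclear norm means the patch lies close to a
    low-dimensional subspace. Illustrative only; not the variational
    regularizer from the talk."""
    z = torch.cat([x.flatten(1), f.flatten(1)], dim=1)   # (B, d_x + d_f)
    d = torch.cdist(z, z)                                # pairwise distances
    idx = d.topk(k + 1, largest=False).indices           # k-NN (incl. self)
    patches = z[idx] - z[idx].mean(dim=1, keepdim=True)  # center each patch
    # nuclear norm = sum of singular values of each local patch
    return torch.linalg.svdvals(patches).sum(dim=1).mean()
```

In training, such a term would simply be added to the task loss, e.g. `loss = cross_entropy(logits, y) + lam * manifold_dim_penalty(x, feats)`, with `lam` a hypothetical trade-off weight.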
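The second idea can be caricatured just as briefly. The sketch below discretizes the scaling group as a few dilation factors, applies one shared filter bank at each of them, and then mixes the resulting scale axis with a small one-dimensional convolution, which is the sense in which the convolution is "joint" over space and scale. The class name `ScaleConv2d`, the dilation-based discretization, and the depthwise scale-mixing are assumptions for illustration; the construction in the talk expands filters in a Fourier-Bessel basis, which this sketch does not do.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaleConv2d(nn.Module):
    """Sketch of a joint space-scale convolution: one filter bank is
    applied at several dilations (a discretized scaling group), then a
    depthwise 1-D convolution mixes responses across the scale axis.
    Rescaling the input by a group element approximately shifts the
    output along the scale axis instead of distorting it."""
    def __init__(self, c_in, c_out, k=3, scales=(1, 2, 4), scale_k=3):
        super().__init__()
        self.scales = scales
        self.weight = nn.Parameter(torch.randn(c_out, c_in, k, k) * 0.1)
        self.scale_mix = nn.Conv1d(c_out, c_out, scale_k,
                                   padding=scale_k // 2, groups=c_out)

    def forward(self, x):                        # x: (B, c_in, H, W)
        half = self.weight.shape[-1] // 2
        # spatial convolution at every scale of the (discretized) group
        maps = [F.conv2d(x, self.weight, padding=s * half, dilation=s)
                for s in self.scales]
        y = torch.stack(maps, dim=2)             # (B, c_out, S, H, W)
        B, C, S, H, W = y.shape
        # joint part: convolve across the scale axis as well
        y = self.scale_mix(y.permute(0, 3, 4, 1, 2).reshape(-1, C, S))
        return y.reshape(B, H, W, C, S).permute(0, 3, 4, 1, 2)
```

For example, `ScaleConv2d(3, 16)(torch.randn(2, 3, 32, 32))` returns a `(2, 16, 3, 32, 32)` tensor carrying an explicit scale axis; a rough equivariance check is to compare the output of a 2x-downscaled input against the original output shifted by one position along that axis.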