- Applied and Computational Mathematics Seminar
- Monday, March 14, 2022 - 2:00pm (50 minutes)
- Zhihui Zhu – University of Denver – email@example.com – http://mysite.du.edu//~zzhu61/index.html
- Wenjing Liao
In the past decade, the revival of deep neural networks has led to dramatic success in numerous applications ranging from computer vision to natural language processing to scientific discovery and beyond. Nevertheless, the practice of deep networks remains shrouded in mystery, as our theoretical understanding of the success of deep learning is still elusive.
In this talk, we will exploit low-dimensional modeling to help understand and improve deep learning performance. We will first provide a geometric analysis for understanding neural collapse, an intriguing empirical phenomenon that persists across different neural network architectures and a variety of standard datasets, and we will use this understanding to improve training efficiency. We will then leverage principled methods for handling sparsity and sparse corruptions to address the challenge of overfitting for modern deep networks in the presence of corrupted training data. Finally, we will introduce a principled approach for robustly training deep networks with noisy labels and for robustly recovering natural images via the deep image prior.
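For readers unfamiliar with neural collapse: one of its hallmarks is that the within-class variability of last-layer features shrinks toward zero relative to the between-class variability during terminal-phase training. The sketch below computes the standard NC1 collapse metric from the neural collapse literature on synthetic features; it is an illustrative sketch, not code from the speaker, and the function name and synthetic data are purely hypothetical.

```python
import numpy as np

def nc1_metric(features, labels):
    """NC1 collapse metric: trace(Sigma_W @ pinv(Sigma_B)) / K.

    features: (n, d) array of last-layer features; labels: (n,) class labels.
    Values near 0 indicate within-class variability collapse (neural collapse).
    """
    classes = np.unique(labels)
    K = len(classes)
    n, d = features.shape
    global_mean = features.mean(axis=0)
    Sigma_W = np.zeros((d, d))  # within-class scatter
    Sigma_B = np.zeros((d, d))  # between-class scatter
    for c in classes:
        fc = features[labels == c]
        mu_c = fc.mean(axis=0)
        diffs = fc - mu_c
        Sigma_W += diffs.T @ diffs / n
        dev = (mu_c - global_mean).reshape(-1, 1)
        Sigma_B += (len(fc) / n) * (dev @ dev.T)
    return np.trace(Sigma_W @ np.linalg.pinv(Sigma_B)) / K

# Synthetic demo: features tightly clustered at class means score lower
# (more collapsed) than widely spread features.
rng = np.random.default_rng(0)
means = rng.normal(size=(3, 5))              # 3 hypothetical class means in 5-D
labels = np.repeat(np.arange(3), 100)
collapsed = means[labels] + 1e-3 * rng.normal(size=(300, 5))
spread = means[labels] + 1.0 * rng.normal(size=(300, 5))
print(nc1_metric(collapsed, labels) < nc1_metric(spread, labels))
```

Tracking this quantity per epoch is how the collapse phenomenon is typically measured empirically across architectures and datasets.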