Seminars and Colloquia Schedule

TBD by Ian Tan

Series
Algebra Seminar
Time
Monday, January 27, 2025 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Ian Tan, Auburn University

From centralized to federated learning of neural operators: Accuracy, efficiency, and reliability

Series
Applied and Computational Mathematics Seminar
Time
Monday, January 27, 2025 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/94954654170
Speaker
Lu Lu, Yale University

As an emerging paradigm in scientific machine learning, deep neural operators pioneered by us can learn nonlinear operators of complex dynamic systems via neural networks. In this talk, I will present the deep operator network (DeepONet) to learn various operators that represent deterministic and stochastic differential equations. I will also present several extensions of DeepONet, such as DeepM&Mnet for multiphysics problems, DeepONet with proper orthogonal decomposition or Fourier decoder layers, MIONet for multiple-input operators, and multifidelity DeepONet. I will demonstrate the effectiveness of DeepONet and its extensions on diverse multiphysics and multiscale problems, such as bubble growth dynamics, high-speed boundary layers, electroconvection, hypersonics, geological carbon sequestration, full waveform inversion, and astrophysics. Deep learning models are usually limited to interpolation scenarios, and I will quantify the extrapolation complexity and develop a complete workflow to address the challenge of extrapolation for deep neural operators. Moreover, I will present the first operator learning method that requires only one PDE solution, i.e., one-shot learning, by introducing a new concept of a local solution operator based on the principle of locality of PDEs. I will also present the first systematic study of federated scientific machine learning (FedSciML) for approximating functions and solving PDEs with data heterogeneity.
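For readers less familiar with the architecture named above, here is a minimal sketch of the branch/trunk structure of a DeepONet, written in PyTorch. The sensor count, layer widths, and depths below are illustrative assumptions, not the configuration used in the speaker's work.

import torch
import torch.nn as nn

class DeepONet(nn.Module):
    """Minimal DeepONet sketch: the branch net encodes the input function
    sampled at fixed sensor points, the trunk net encodes a query location,
    and the operator value is the inner product of the two feature vectors."""

    def __init__(self, n_sensors=100, width=64, p=32):
        super().__init__()
        # Branch net: u(x_1), ..., u(x_m) at m sensor points -> p features.
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        # Trunk net: query coordinate y -> p features.
        self.trunk = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u_sensors, y):
        # u_sensors: (batch, n_sensors); y: (batch, 1)
        b = self.branch(u_sensors)  # (batch, p)
        t = self.trunk(y)           # (batch, p)
        return (b * t).sum(dim=-1, keepdim=True) + self.bias  # G(u)(y)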

Recent progress on the horocycle flow on strata of translation surfaces - NEW DATE

Series
Job Candidate Talk
Time
Tuesday, January 28, 2025 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Jon Chaika, University of Utah

For about two decades the horocycle flow on strata of translation surfaces was studied, very successfully, in analogy with unipotent flows on homogeneous spaces, which, by work of Ratner, Margulis, Dani, and many others, have striking rigidity properties. In the past decade Eskin-Mirzakhani and Eskin-Mirzakhani-Mohammadi proved some analogous rigidity results for SL(2,R) and the full upper triangular subgroup acting on strata of translation surfaces. This talk will begin by introducing ergodic theory and translation surfaces. It will then describe some of the rigidity theorems mentioned above before moving on to its main goal: showing that many such rigidity results fail for the horocycle flow on strata of translation surfaces. Time permitting, we will also describe a rigidity result for special sub-objects in strata of translation surfaces. This includes joint work with Osama Khalil, John Smillie, Barak Weiss, and Florent Ygouf.

Extreme value theory for random walks in space-time random media

Series
Job Candidate Talk
Time
Wednesday, January 29, 2025 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Shalin Parekh, University of Maryland

The KPZ equation is a singular stochastic PDE arising as a scaling limit of various physically and probabilistically interesting models. Often, this equation describes the “crossover” between Gaussian and non-Gaussian fluctuation behavior in simple models of interacting particles, directed polymers, or interface growth. It is a difficult and elusive open problem to elucidate the nature of this crossover for general stochastic interface models. In this talk, I will discuss a series of recent works where we have made progress in understanding the KPZ crossover for models of random walks in dynamical random media. This was done through a tilting-based approach to study the extreme tails of the quenched probability distribution. This talk includes joint work with Sayan Das and Hindy Drillick.
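For context, the (1+1)-dimensional KPZ equation referred to above can be written, in one standard normalization, as

\[
  \partial_t h(t,x) \;=\; \nu\,\partial_x^2 h(t,x) \;+\; \frac{\lambda}{2}\bigl(\partial_x h(t,x)\bigr)^2 \;+\; \sqrt{D}\,\xi(t,x),
\]

where $h(t,x)$ is the interface height and $\xi$ is space-time white noise; the particular choice of coefficients $\nu$, $\lambda$, $D$ is a convention and is not taken from the abstract.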

Zoom link:

https://gatech.zoom.us/j/96535844666

Characterizing Submodules in $H^2(\mathbb{D}^2)$ Using the Core Function

Series
Time
Wednesday, January 29, 2025 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Victor Bailey, University of Oklahoma

It is well known that $H^2(\mathbb{D}^2)$ is an RKHS with reproducing kernel $K(\lambda, z) = \frac{1}{(1-\overline{\lambda_1}z_1)(1 - \overline{\lambda_2}z_2)}$, and that for any submodule $M \subseteq H^2(\mathbb{D}^2)$ its reproducing kernel is $K^M(\lambda, z) = P_M K(\lambda, z)$, where $P_M$ is the orthogonal projection onto $M$. Associated with any submodule $M$ are the core function $G^M(\lambda, z) = \frac{K^M(\lambda, z)}{K(\lambda, z)}$ and the core operator $C_M$, an integral transform on $H^2(\mathbb{D}^2)$ with kernel function $G^M$. The utility of these constructions for better understanding the structure of a given submodule is evident from various works over the past 20 years. In this talk, we will discuss the relationship between the rank, codimension, etc. of a given submodule and the properties of its core function and core operator. In particular, we will discuss the longstanding open question of whether we can characterize all submodules whose core function is bounded. This is a joint project with Rongwei Yang.
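As a point of reference, the core operator described above can be written as an integral transform in the abstract's notation as

\[
  (C_M f)(z) \;=\; \int_{\mathbb{T}^2} G^M(\lambda, z)\, f(\lambda)\, dm(\lambda), \qquad f \in H^2(\mathbb{D}^2),
\]

where $m$ is normalized Lebesgue measure on the torus $\mathbb{T}^2$ and $f(\lambda)$ denotes the boundary values of $f$; the exact variable convention and normalization here are standard choices rather than details quoted from the talk.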