Leveraging low-dimensional structures in structure-preserving machine learning for dynamical systems

Series
Applied and Computational Mathematics Seminar
Time
Monday, December 9, 2024 - 2:00pm (50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/94954654170
Speaker
Qi Tang – Georgia Tech CSE – qtang@gatech.edu – https://tangqi.github.io/
Organizer
Wei Zhu

In this talk I will discuss our recent effort to develop structure-preserving machine learning (ML) for time series data, focusing on both dissipative PDEs and singularly perturbed ODEs.

The first part presents a data-driven modeling method that accurately captures shocks and chaotic dynamics through a stabilized neural ODE framework. We learn the right-hand side of an ODE as the sum of the outputs of two networks, one learning a linear term and the other a nonlinear term. The architecture is inspired by the inertial manifold theorem. We apply this method to chaotic trajectories of the Kuramoto-Sivashinsky equation, where our model keeps long-term trajectories on the attractor and remains robust to noisy initial conditions.

The second part explores structure-preserving ML for singularly perturbed dynamical systems. A powerful tool for these systems is the Fenichel normal form, which significantly simplifies the fast dynamics near slow manifolds. I will discuss a novel realization of this concept using ML. Specifically, we propose a fast-slow neural network (FSNN) that enforces the existence of a trainable, attractive invariant slow manifold as a hard constraint. To illustrate the power of the FSNN, I will show a fusion-motivated example where traditional numerical integrators all fail.
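The linear-plus-nonlinear right-hand side described above can be sketched as follows. This is a minimal illustration, not the speaker's implementation: the state dimension, network sizes, and the plain NumPy "networks" (a dense matrix for the linear term, a tiny MLP for the nonlinear term) are all hypothetical placeholders, and the stabilization of the linear part that the actual framework relies on is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # state dimension (hypothetical; the talk uses a Kuramoto-Sivashinsky discretization)

# "Linear network": a trainable matrix A acting on the state.
A = rng.normal(scale=0.1, size=(d, d))

# "Nonlinear network": a tiny two-layer MLP with tanh activation.
W1 = rng.normal(scale=0.1, size=(16, d))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(d, 16))
b2 = np.zeros(d)

def rhs(u):
    """Learned ODE right-hand side: sum of a linear and a nonlinear term."""
    linear = A @ u
    nonlinear = W2 @ np.tanh(W1 @ u + b1) + b2
    return linear + nonlinear

def rk4_step(u, dt):
    """One classical Runge-Kutta step to roll the learned ODE forward in time."""
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Roll a random initial state forward a few steps.
u = rng.normal(size=d)
for _ in range(10):
    u = rk4_step(u, 0.01)
```

In the stabilized framework the linear term would additionally be constrained (e.g. to have a stable spectrum) so that long trajectories remain on the attractor; that constraint is the part this sketch leaves out.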