Latent neural dynamics for fast data assimilation with sparse observations

Series
Applied and Computational Mathematics Seminar
Time
Monday, March 31, 2025 - 2:00pm for 50 minutes
Location
Skiles 005 and https://gatech.zoom.us/j/94954654170
Speaker
Peng Chen – Georgia Tech CSE
Organizer
Wei Zhu

Data assimilation techniques are crucial for correcting model trajectories with observational data when modeling complex dynamical systems. The Latent Ensemble Score Filter (Latent-EnSF), our recently developed data assimilation method, has shown great promise on high-dimensional, nonlinear data assimilation problems with sparse observations. However, it still incurs a high computational cost, because every forecast step requires an expensive forward simulation of the full model. In this talk, we present the Latent Dynamics EnSF (LD-EnSF), a novel methodology that evolves learned neural dynamics in a low-dimensional latent space and thereby significantly accelerates the data assimilation process.
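The core idea can be illustrated with a toy sketch (this is an assumed, simplified illustration, not the speaker's implementation): an ensemble of low-dimensional latent states is advanced with a cheap surrogate map instead of the full simulator, and when an observation arrives the ensemble is nudged toward it. A simple relaxation-to-observation update stands in here for the score-based filtering step of the actual method; the decay map, gain, and dimensions are all hypothetical placeholders.

```python
import random

def latent_dynamics(z, dt=0.1):
    """Stand-in for a learned latent dynamics network: a linear decay map."""
    return [zi * (1.0 - 0.5 * dt) for zi in z]

def assimilate(ensemble, obs, gain=0.5):
    """Nudge each latent ensemble member toward the (latent-encoded) observation."""
    return [[zi + gain * (oi - zi) for zi, oi in zip(z, obs)] for z in ensemble]

random.seed(0)
dim, n_members = 4, 8
ensemble = [[random.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(n_members)]
obs = [0.0] * dim  # latent encoding of an observation

for step in range(10):
    ensemble = [latent_dynamics(z) for z in ensemble]  # cheap latent forecast
    if step % 5 == 4:                                  # observations sparse in time
        ensemble = assimilate(ensemble, obs)

mean = [sum(z[i] for z in ensemble) / n_members for i in range(dim)]
print(mean)
```

Because both the forecast and the update act on the latent vectors only, the cost per step is independent of the full state dimension, which is the source of the speedup the talk describes.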

To achieve this, we introduce a novel variant of Latent Dynamics Networks (LDNets) that effectively captures the system's dynamics within a low-dimensional latent space. Additionally, we propose a new method for encoding sparse observations into the latent space using recurrent neural networks. We demonstrate the robustness, accuracy, and efficiency of the proposed method, and discuss its limitations, on complex dynamical systems with highly sparse (in both space and time) and noisy observations, including shallow water wave propagation for tsunami modeling, FourCastNet for numerical weather prediction, and Kolmogorov flow, which exhibits chaotic and turbulent phenomena.
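The recurrent-encoder idea can also be sketched in miniature (again an assumed illustration, not the speaker's code): a recurrent cell consumes a time sequence of (value, mask) pairs, where mask = 0 marks a missing observation, and its final hidden state serves as the latent encoding of the whole sparse sequence. The single hand-set cell, its weights, and the scalar latent state are hypothetical simplifications of a trained RNN.

```python
import math

def rnn_cell(h, x, mask, w_h=0.7, w_x=0.4):
    """One recurrent step; a masked (missing) input contributes nothing."""
    return math.tanh(w_h * h + w_x * x * mask)

def encode(observations):
    """Fold a sparse observation sequence into a scalar latent state."""
    h = 0.0
    for x, mask in observations:
        h = rnn_cell(h, x, mask)
    return h

# A sparse sequence: only two of five time steps are actually observed.
seq = [(0.8, 1), (0.0, 0), (0.0, 0), (-0.3, 1), (0.0, 0)]
latent = encode(seq)
print(latent)
```

The recurrence lets arbitrarily irregular observation patterns map to a fixed-size latent vector, which is what allows the filter update to operate entirely in the latent space.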