Seminars and Colloquia by Series

Opportunities and Challenges of Neural Networks in Partial Differential Equations

Series
Applied and Computational Mathematics Seminar
Time
Monday, December 1, 2025 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/94954654170
Speaker
Yahong Yang, Georgia Tech

The use of neural networks for solving partial differential equations (PDEs) has attracted considerable attention in recent years. In this talk, I will first highlight their advantages over traditional numerical methods, including improved approximation rates and the potential to overcome the curse of dimensionality. I will then discuss the challenges that arise when applying neural networks to PDEs, particularly in training. Because training is inherently a highly nonconvex optimization problem, it can lead to poor local minima with large training errors, especially in complex PDE settings. To address these issues, I will demonstrate how incorporating mathematical insight into the design of training algorithms and network architectures can lead to significant improvements in both accuracy and robustness.
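The training objective alluded to above can be made concrete with the residual ("physics-informed") loss commonly used when fitting networks to PDEs. The following numpy sketch is purely illustrative and not from the talk: the toy Poisson problem, function names, and discretization are our own choices. It only evaluates the loss for fixed candidate functions; actual training would minimize this loss over network parameters, which is the nonconvex problem the abstract describes.

```python
import numpy as np

def pde_residual_loss(u, x, h=1e-3):
    """Collocation loss for the toy 1-D Poisson problem
       u''(x) = -pi^2 sin(pi x),  u(0) = u(1) = 0,
    whose exact solution is u(x) = sin(pi x).  The second derivative is
    estimated with a central finite difference of step h."""
    u_xx = (u(x + h) - 2.0 * u(x) + u(x - h)) / h**2
    interior = np.mean((u_xx + np.pi**2 * np.sin(np.pi * x)) ** 2)
    boundary = float(np.sum(u(np.array([0.0, 1.0])) ** 2))
    return interior + boundary

x = np.linspace(0.05, 0.95, 19)         # interior collocation points
exact = lambda t: np.sin(np.pi * t)     # true solution: loss near 0
wrong = lambda t: t * (1.0 - t)         # right boundary values, wrong PDE
```

The exact solution drives the loss to (numerically) zero, while a candidate that only satisfies the boundary conditions leaves a large interior residual; a trained network is judged by how close to zero it can push this quantity.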

Lorentzian Polynomials for Simplicial Complexes

Series
Algebra Seminar
Time
Monday, December 1, 2025 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Jonathan Leake, University of Waterloo

Please Note: There will be a pre-seminar 10:55-11:25 in Skiles 005.

In recent years, the theories of Lorentzian polynomials and combinatorial Hodge theory have been developed and utilized to resolve long-standing conjectures in matroid theory, related to log-concavity inequalities and sampling algorithms. The overarching idea in these theories is to extract the conjectured results from basic eigenvalue bounds on certain natural matrices associated to matroids. Since then, Lorentzian polynomials have been generalized beyond matroids to simplicial complexes of various types, implying old and new results on various combinatorial structures such as linear extensions of posets. That said, many questions remain open. In this talk, we will describe this generalized theory and discuss how it can be used to prove various combinatorial results. No knowledge of matroid theory will be assumed. Joint work with Kasper Lindberg and Shayan Oveis Gharan, and also with Petter Brändén.
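The "eigenvalue bounds on certain natural matrices" can be made precise through the definition of a Lorentzian polynomial. The following is a rough sketch from memory, following Brändén and Huh; it omits the M-convexity condition on the support, so consult their paper for the exact statement.

```latex
% A homogeneous polynomial f of degree d in x_1, ..., x_n with nonnegative
% coefficients is strictly Lorentzian if, for every multi-index \alpha with
% |\alpha| = d - 2, the Hessian of the quadratic \partial^\alpha f has
% exactly one positive eigenvalue:
\lambda_1\bigl(\nabla^2 \partial^\alpha f\bigr) > 0
  \;\geq\; \lambda_2 \;\geq\; \cdots \;\geq\; \lambda_n,
\qquad \text{for all } |\alpha| = d - 2.
% Lorentzian polynomials are then the limits of strictly Lorentzian ones.
```

Log-concavity statements are extracted from this one-positive-eigenvalue condition applied to generating polynomials of the combinatorial objects in question.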

Transformers for Learning Single-Task and Multi-Task Regression on Manifolds: Approximation and Generalization Insights

Series
Applied and Computational Mathematics Seminar
Time
Monday, November 24, 2025 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/94954654170
Speaker
Zhaiming Shen, Georgia Institute of Technology

Transformers serve as the foundational architecture for large language and video generation models, such as GPT, BERT, SORA, and their successors. While empirical studies have shown that real-world data and learning tasks exhibit low-dimensional geometric structures, the theoretical understanding of how transformers leverage these structures remains largely unexplored. In this talk, we present a theoretical foundation for transformers in two key scenarios: (1) regression tasks with noisy input data lying near a low-dimensional manifold, and (2) in-context learning (ICL) for regression of Hölder functions on manifolds. For the first setting, we prove approximation and generalization bounds that depend crucially on the intrinsic dimension of the manifold, demonstrating that transformers can effectively learn from data perturbed by high-dimensional noise. For the second setting, we derive generalization error bounds for ICL in terms of prompt length and the number of training tasks, revealing that transformers achieve the minimax optimal rate for Hölder regression, scaling exponentially with the intrinsic rather than the ambient dimension. Together, these results provide foundational insights into how transformers exploit low-dimensional geometric structures in learning tasks, advancing our theoretical understanding of their remarkable empirical success.

Bordered contact invariants and half Giroux torsion

Series
Geometry Topology Seminar
Time
Monday, November 24, 2025 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Konstantinos Varvarezos, UGA

Giroux torsion is an important class of contact structures on a neighborhood of a torus, which is known to obstruct symplectic fillability. Ghiggini conjectured that half Giroux torsion along a separating torus always results in a vanishing Heegaard Floer contact invariant, and hence also obstructs fillability. In this talk, we present a counterexample to that conjecture. Our main tool is a bordered contact invariant, which enables efficient computation of the contact invariant.

CANCELLED

Series
Algebra Seminar
Time
Monday, November 24, 2025 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Harold Blum, Georgia Tech

Longest Common (and Increasing) Subsequences in Random Words: Differences and Similarities

Series
Combinatorics Seminar
Time
Friday, November 21, 2025 - 15:15 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Christian Houdre, Georgia Institute of Technology

Let $LC_n$ be the length of the longest common subsequences of two independent random words whose letters are taken from a finite alphabet, and, when the alphabet is totally ordered, let $LCI_n$ be the length of the longest common and increasing subsequences of the words. Results on the asymptotic means, variances, and limiting laws of these well-known random objects will be described and compared.
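To fix the definitions, $LC_n$ for two concrete words can be computed by the classical quadratic-time dynamic program; this is a standard textbook algorithm, included here only as a reference point, not material from the talk.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of words a and b,
    via the classical O(len(a) * len(b)) dynamic program."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```

For example, `lcs_length("ABABAB", "BABA")` returns 4. Computing $LCI_n$ additionally requires the common subsequence to be increasing in the alphabet order, which a similar but more involved dynamic program handles.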

Introduction to Teichmüller theory, classical and higher rank

Series
Geometry Topology Working Seminar
Time
Friday, November 21, 2025 - 14:00 for 2 hours
Location
Skiles 006
Speaker
Mike Wolf, Georgia Tech

We give a breezy overview of Teichmüller theory, the deformation theory of Riemann surfaces. The richness of the subject comes from all the perspectives one can take on Riemann surfaces: complex analytic for sure, but also Riemannian, topological, dynamical, and algebraic. In the past 40 years or so, interest has erupted in an extension of Teichmüller theory, here thought of as a component of the character variety of surface group representations into PSL(2,ℝ), to the study of the character variety of surface group representations into higher rank Lie groups (e.g. SL(n,ℝ)). We give an even breezier discussion of that. The whole point will be to gauge interest in topics for a follow-up lecture series in the spring.

Precise Error Rates for Computationally Efficient Testing

Series
Stochastics Seminar
Time
Thursday, November 20, 2025 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Alex Wein, UC Davis

We consider one of the most basic high-dimensional testing problems: that of detecting the presence of a rank-1 "spike" in a random Gaussian (GOE) matrix. When the spike has structure such as sparsity, inherent statistical-computational tradeoffs are expected. I will discuss some precise results about the computational complexity, arguing that the so-called "linear spectral statistics" achieve the best possible tradeoff between type I & II errors among all polynomial-time algorithms, even though an exponential-time algorithm can do better. This is based on https://arxiv.org/abs/2311.00289 with Ankur Moitra which uses a version of the low-degree polynomial heuristic, as well as forthcoming work with Ansh Nagda which gives a stronger form of reduction-based hardness.
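To make the testing problem concrete, here is a small numpy simulation of the spiked GOE model. The normalization, parameter values, and the choice of test function in the linear spectral statistic are all illustrative choices of ours, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def goe(n, rng):
    """Sample an n x n GOE matrix, scaled so the bulk spectrum fills [-2, 2]."""
    a = rng.normal(size=(n, n))
    return (a + a.T) / np.sqrt(2 * n)

def lss(m, f):
    """A linear spectral statistic: sum of a fixed test function f
    over the eigenvalues of m."""
    return float(np.sum(f(np.linalg.eigvalsh(m))))

n, lam = 400, 2.0                                # lam: spike strength (illustrative)
v = rng.normal(size=n)
v /= np.linalg.norm(v)                           # unit-norm spike direction
null_m = goe(n, rng)                             # H0: pure GOE noise
spiked_m = goe(n, rng) + lam * np.outer(v, v)    # H1: rank-1 spike added

top_null = float(np.linalg.eigvalsh(null_m)[-1])      # near the bulk edge 2
top_spiked = float(np.linalg.eigvalsh(spiked_m)[-1])  # near lam + 1/lam (BBP)
```

With spike strength above the detection threshold, the top eigenvalue of the spiked matrix separates from the bulk edge at 2 (the BBP transition), while under the null it stays near 2. A linear spectral statistic instead aggregates the entire spectrum through a fixed test function, which is the family of poly-time tests the talk analyzes.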

Modular Framework for Solving Nonlinear Algebra Problems

Series
Dissertation Defense
Time
Thursday, November 20, 2025 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Hannah Mahon, Georgia Institute of Technology

Please Note: Virtual link: https://gtri.webex.com/gtri/j.php?MTID=m011cc2568fe8370921b1458aa0d5a96c

This thesis introduces a modular framework, written in Macaulay2, designed to solve nonlinear algebra problems. First, we will introduce the background for the framework, covering gates, circuits, and straight-line programs, and then we will define the gates used in the framework. The remainder of the talk will cover well-known algorithms for solving nonlinear algebra problems, such as Newton's method and Runge-Kutta, their implementation in the framework, and explicit conic problems with a comparison between different methods.
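For a flavor of the solvers discussed, here is a minimal Newton iteration on an explicit conic problem. The Python sketch and the particular system are ours for illustration only; the thesis framework itself is written in Macaulay2, and in homotopy-continuation solvers Runge-Kutta steps typically predict along a solution path while Newton iterations correct back onto it.

```python
import numpy as np

def newton(F, J, x0, tol=1e-12, max_iter=50):
    """Newton's method for a square nonlinear system F(x) = 0:
    repeatedly solve J(x) s = F(x) and update x <- x - s."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(J(x), F(x))
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# A toy conic intersection: the unit circle meets the parabola y = x^2.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[1] - v[0]**2])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]],
                        [-2.0 * v[0], 1.0]])

root = newton(F, J, [1.0, 1.0])   # converges to x ~ 0.786, y ~ 0.618
```

Here the second coordinate of the intersection solves y^2 + y - 1 = 0, so the iteration converges to y = (sqrt(5) - 1)/2, the reciprocal of the golden ratio.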
