Seminars and Colloquia Schedule

CANCELLED

Series
Algebra Seminar
Time
Monday, November 24, 2025 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Harold Blum, Georgia Tech

Bordered contact invariants and half Giroux torsion

Series
Geometry Topology Seminar
Time
Monday, November 24, 2025 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Konstantinos Varvarezos, UGA

Giroux torsion is an important class of contact structures on a neighborhood of a torus, which is known to obstruct symplectic fillability. Ghiggini conjectured that half Giroux torsion along a separating torus always results in a vanishing Heegaard Floer contact invariant, and hence also obstructs fillability. In this talk, we present a counterexample to that conjecture. Our main tool is a bordered contact invariant, which enables efficient computation of the contact invariant.

Transformers for Learning Single-Task and Multi-Task Regression on Manifolds: Approximation and Generalization Insights

Series
Applied and Computational Mathematics Seminar
Time
Monday, November 24, 2025 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/94954654170
Speaker
Zhaiming Shen, Georgia Institute of Technology

Transformers serve as the foundational architecture for large language and video generation models, such as GPT, BERT, Sora, and their successors. While empirical studies have shown that real-world data and learning tasks exhibit low-dimensional geometric structures, the theoretical understanding of how transformers leverage these structures remains largely unexplored. In this talk, we present a theoretical foundation for transformers in two key scenarios: (1) regression tasks with noisy input data lying near a low-dimensional manifold, and (2) in-context learning (ICL) for regression of Hölder functions on manifolds. For the first setting, we prove approximation and generalization bounds that depend crucially on the intrinsic dimension of the manifold, demonstrating that transformers can effectively learn from data perturbed by high-dimensional noise. For the second setting, we derive generalization error bounds for ICL in terms of prompt length and the number of training tasks, revealing that transformers achieve the minimax optimal rate for Hölder regression, with the rate scaling exponentially in the intrinsic rather than the ambient dimension. Together, these results provide foundational insights into how transformers exploit low-dimensional geometric structures in learning tasks, advancing our theoretical understanding of their remarkable empirical success.
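
For context, the minimax rate referenced above is the classical one for nonparametric regression; a minimal sketch, assuming the usual setup of a target function of Hölder smoothness $\alpha$ and $n$ samples supported on a $d$-dimensional manifold (here $d$ is the intrinsic, not the ambient, dimension):

% Classical minimax risk for Hölder-\alpha regression with intrinsic dimension d
\[
  \inf_{\hat f}\;\sup_{f \in \mathcal{H}^{\alpha}} \mathbb{E}\,\lVert \hat f - f \rVert_{L^2}^{2}
  \;\asymp\; n^{-\frac{2\alpha}{2\alpha + d}} .
\]

Under this reading, reaching error $\varepsilon$ requires on the order of $\varepsilon^{-(2 + d/\alpha)}$ samples, a cost that grows exponentially in the intrinsic dimension $d$ rather than in the possibly much larger ambient dimension, which is the sense in which the rate "scales with the intrinsic dimension."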