Monday, November 24, 2025 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Konstantinos Varvarezos – UGA
Giroux torsion refers to an important class of contact structures on a neighborhood of a torus, and its presence is known to obstruct symplectic fillability. Ghiggini conjectured that half Giroux torsion along a separating torus always results in a vanishing Heegaard Floer contact invariant, and hence also obstructs fillability. In this talk, we present a counterexample to that conjecture. Our main tool is a bordered contact invariant, which enables efficient computation of the contact invariant.
Transformers serve as the foundational architecture for large language and video generation models, such as GPT, BERT, SORA, and their successors. While empirical studies have shown that real-world data and learning tasks exhibit low-dimensional geometric structures, the theoretical understanding of how transformers leverage these structures remains largely unexplored. In this talk, we present a theoretical foundation for transformers in two key scenarios: (1) regression tasks with noisy input data lying near a low-dimensional manifold, and (2) in-context learning (ICL) for regression of Hölder functions on manifolds. For the first setting, we prove approximation and generalization bounds that depend crucially on the intrinsic dimension of the manifold, demonstrating that transformers can effectively learn from data perturbed by high-dimensional noise. For the second setting, we derive generalization error bounds for ICL in terms of prompt length and the number of training tasks, revealing that transformers achieve the minimax optimal rate for Hölder regression, scaling exponentially in the intrinsic rather than the ambient dimension. Together, these results provide foundational insights into how transformers exploit low-dimensional geometric structures in learning tasks, advancing our theoretical understanding of their remarkable empirical success.
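For context (this benchmark is the standard one for Hölder regression and is not quoted verbatim in the abstract), the classical minimax risk for estimating an s-Hölder function from n noisy samples on a d-dimensional domain behaves as

\[
\inf_{\hat f}\; \sup_{f \in \mathcal{H}^{s}}\; \mathbb{E}\,\bigl\| \hat f - f \bigr\|_{L^{2}}^{2} \;\asymp\; n^{-\frac{2s}{2s+d}} ,
\]

so attaining this rate with d equal to the intrinsic dimension of the manifold, rather than the much larger ambient dimension, is what exploiting the low-dimensional structure means quantitatively: the number of samples needed to reach a fixed accuracy grows exponentially only in the intrinsic dimension.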