Seminars and Colloquia by Series

Critical phenomena through the lens of the Ising model

Series
School of Mathematics Colloquium
Time
Friday, March 8, 2024 - 11:00 for 1 hour (actually 50 minutes)
Location
Speaker
Hugo Duminil-Copin, IHES and Université de Genève

The Ising model is one of the most classical lattice models of statistical physics undergoing a phase transition. Initially imagined as a model for ferromagnetism, it revealed itself as a very rich mathematical object and a powerful theoretical tool to understand cooperative phenomena. Over one hundred years of its history, a profound understanding of its critical phase has been obtained. While integrability and mean-field behavior led to extraordinary breakthroughs in the two-dimensional and high-dimensional cases respectively, the model in three and four dimensions remained mysterious for years. In this talk, we will present recent progress in these dimensions based on a probabilistic interpretation of the Ising model relating it to percolation models.
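As a concrete illustration (not part of the talk itself), the Ising model's Gibbs measure can be sampled with the standard Metropolis chain; the lattice size, inverse temperature, and step count below are illustrative choices.

```python
import math
import random

def ising_metropolis(n=16, beta=0.5, steps=50_000, seed=0):
    """Metropolis sampling of the 2D Ising model on an n x n torus.

    Spins take values +/-1; beta is the inverse temperature.
    """
    rng = random.Random(seed)
    spin = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        # Sum of the four nearest-neighbour spins (periodic boundary).
        nb = (spin[(i + 1) % n][j] + spin[(i - 1) % n][j]
              + spin[i][(j + 1) % n] + spin[i][(j - 1) % n])
        dE = 2 * spin[i][j] * nb  # energy change if spin (i, j) is flipped
        # Accept the flip with the Metropolis probability min(1, exp(-beta*dE)).
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spin[i][j] *= -1
    return spin
```

At large beta (low temperature) the chain tends toward aligned configurations, a finite-volume shadow of the phase transition the abstract refers to.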

Bilipschitz invariants

Series
School of Mathematics Colloquium
Time
Thursday, February 15, 2024 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Dustin Mixon, Ohio State University

Motivated by problems in data science, we study the following questions:

(1) Given a Hilbert space V and a group G of linear isometries, does there exist a bilipschitz embedding of the quotient metric space V/G into a Hilbert space?

(2) What are necessary and sufficient conditions for such embeddings?

(3) Which embeddings minimally distort the metric?

We answer these questions in a variety of settings, and we conclude with several open problems.

Staircases and cuspidal curves in symplectic four manifolds

Series
School of Mathematics Colloquium
Time
Friday, December 8, 2023 - 16:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Dusa McDuff, Barnard College, Columbia

Please Note: This colloquium will also be the starting talk for the 2023 Tech Topology Conference.

This talk will give an elementary introduction to my joint work with Kyler Siegel that shows how cuspidal curves in a symplectic manifold X such as the complex projective plane determine when an ellipsoid can be symplectically embedded into X.

Machine learning, optimization, & sampling through a geometric lens

Series
School of Mathematics Colloquium
Time
Monday, November 20, 2023 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/98355006347
Speaker
Suvrit Sra, MIT & TU Munich

Please Note: Joint School of Math Colloquium and Applied & Computational Math Seminar. Note the special time. The speaker will present in person.

Geometry arises in myriad ways within machine learning and related areas. In this talk I will focus on settings where geometry helps us understand problems in machine learning, optimization, and sampling. For instance, when sampling from densities supported on a manifold, understanding geometry and the impact of curvature are crucial; surprisingly, progress on geometric sampling theory helps us understand certain generalization properties of SGD for deep learning! Another fascinating viewpoint afforded by geometry is in non-convex optimization: geometry can help us make training algorithms more practical (e.g., in deep learning), reveal tractability despite non-convexity (e.g., via geodesically convex optimization), or simply help us understand existing methods better (e.g., SGD, eigenvector computation, etc.).

Ultimately, I hope to offer the audience some insights into geometric thinking and share with them some new tools that help us design, understand, and analyze models and algorithms. To make the discussion concrete I will recall a few foundational results arising from our research, provide several examples, and note some open problems.

––
Bio: Suvrit Sra is an Alexander von Humboldt Professor of Artificial Intelligence at the Technical University of Munich (Germany) and an Associate Professor of EECS at MIT (USA), where he is also a member of the Laboratory for Information and Decision Systems (LIDS) and of the Institute for Data, Systems, and Society (IDSS). He obtained his PhD in Computer Science from the University of Texas at Austin. Before TUM & MIT, he was a Senior Research Scientist at the Max Planck Institute for Intelligent Systems, Tübingen, Germany. He held visiting positions at UC Berkeley (EECS) and Carnegie Mellon University (Machine Learning Department) during 2013-2014. His research bridges mathematical topics such as differential geometry, matrix analysis, convex analysis, probability theory, and optimization with machine learning. He founded the OPT (Optimization for Machine Learning) series of workshops, held from OPT2008–2017 at the NeurIPS conference, and co-edited a book with the same name (MIT Press, 2011). He is also a co-founder and chief scientist of Pendulum, a global AI+logistics startup.


Exploiting low-dimensional data structures in deep learning

Series
School of Mathematics Colloquium
Time
Thursday, November 2, 2023 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Wenjing Liao, Georgia Tech

In the past decade, deep learning has made astonishing breakthroughs in various real-world applications. It is a common belief that deep neural networks are good at learning various geometric structures hidden in data sets, such as rich local regularities, global symmetries, or repetitive patterns. One of the central interests in deep learning theory is to understand why deep neural networks are successful, and how they utilize low-dimensional data structures. In this talk, I will present some statistical learning theory of deep neural networks where data exhibit low-dimensional structures, such as lying on a low-dimensional manifold. The learning tasks include regression, classification, feature representation and operator learning. When data are sampled on a low-dimensional manifold, the sample complexity crucially depends on the intrinsic dimension of the manifold instead of the ambient dimension of the data. These results demonstrate that deep neural networks are adaptive to low-dimensional geometric structures of data sets.

Incidence estimates for tubes

Series
School of Mathematics Colloquium
Time
Thursday, October 12, 2023 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Hong Wang, NYU, Courant Institute

Let P be a set of points and L a set of lines in the plane. What can we say about the number of incidences between P and L, I(P, L) := |\{ (p, \ell) \in P \times L : p \in \ell \}|?
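For small finite configurations the incidence count I(P, L) can be computed directly; the sketch below (purely illustrative, not part of the talk) represents each line by coefficients (a, b, c) of ax + by = c and uses exact rational arithmetic to test membership.

```python
from fractions import Fraction

def incidences(points, lines):
    """Count I(P, L) = #{(p, l) in P x L : p lies on l}.

    points: iterable of (x, y) pairs.
    lines:  iterable of (a, b, c) triples for the line ax + by = c.
    Fraction arithmetic avoids floating-point near-misses.
    """
    return sum(
        1
        for (x, y) in points
        for (a, b, c) in lines
        if Fraction(a) * x + Fraction(b) * y == Fraction(c)
    )

# A 3 x 3 integer grid and the three horizontal lines y = 0, 1, 2:
P = [(x, y) for x in range(3) for y in range(3)]
L = [(0, 1, c) for c in range(3)]  # 0*x + 1*y = c
print(incidences(P, L))  # each line carries 3 grid points, so 9
```

Grid-like configurations such as this one are exactly the near-extremal examples for the Szemerédi–Trotter incidence bound.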


The problem changes drastically when we consider a thickened version, i.e. when P is a set of unit balls and L is a set of tubes of radius 1. The Furstenberg set conjecture can be viewed as an incidence problem for tubes: it states that a set containing an s-dimensional subset of a line in every direction must have dimension at least (3s+1)/2 when s>0.


We will survey a sequence of results by Orponen, Shmerkin, and a recent joint work with Ren that leads to the solution of this conjecture.

Egyptian fractions: problems and progress

Series
School of Mathematics Colloquium
Time
Thursday, April 27, 2023 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Thomas Bloom, University of Oxford

The study of Egyptian fractions, representing rational numbers as sums of distinct unit fractions, is one of the oldest areas of number theory. In this talk we will discuss some fascinating problems in the area, including both open problems and some recent progress, such as the solution to the Erdős-Graham conjecture: 1 can be written as a sum of unit fractions with denominators drawn from an arbitrary set of integers of positive density.
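One classical way to produce such representations (the Fibonacci–Sylvester greedy algorithm, included here only as background for the definition) repeatedly subtracts the largest unit fraction not exceeding what remains:

```python
from fractions import Fraction

def greedy_egyptian(q):
    """Fibonacci-Sylvester greedy expansion of a rational 0 < q < 1
    into distinct unit fractions; returns the list of denominators."""
    q = Fraction(q)
    denominators = []
    while q > 0:
        # Smallest n with 1/n <= q, i.e. n = ceil(1/q), via ceiling division.
        n = -(-q.denominator // q.numerator)
        denominators.append(n)
        q -= Fraction(1, n)
    return denominators

print(greedy_egyptian(Fraction(4, 5)))  # [2, 4, 20]: 1/2 + 1/4 + 1/20 = 4/5
```

The greedy expansion always terminates because the numerator of the remainder strictly decreases at each step, though it can produce far larger denominators than an optimal representation.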
