Seminars and Colloquia by Series

Staircases and cuspidal curves in symplectic four manifolds

Series
School of Mathematics Colloquium
Time
Friday, December 8, 2023 - 16:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Dusa McDuff (Barnard College, Columbia)

Please Note: This colloquium will also be the starting talk for the 2023 Tech Topology Conference.

This talk will give an elementary introduction to my joint work with Kyler Siegel that shows how cuspidal curves in a symplectic manifold X such as the complex projective plane determine when an ellipsoid can be symplectically embedded into X.
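For orientation, the central objects can be written down concretely. Below is the standard definition of the symplectic ellipsoid and of the ellipsoid embedding function, whose graph exhibits the "staircases" of the title; the normalization here is the usual one and is my addition, not taken from the abstract:

E(a, b) := \{ (z_1, z_2) \in \mathbb{C}^2 : \frac{\pi |z_1|^2}{a} + \frac{\pi |z_2|^2}{b} \le 1 \}, \qquad c_X(x) := \inf \{ \mu > 0 : E(1, x) \text{ embeds symplectically into } \mu X \}.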

Machine learning, optimization, & sampling through a geometric lens

Series
School of Mathematics Colloquium
Time
Monday, November 20, 2023 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/98355006347
Speaker
Suvrit Sra (MIT & TU Munich)

Please Note: Joint School of Math Colloquium and Applied & Computational Math Seminar. Note the special time. The speaker will present in person.

Geometry arises in myriad ways within machine learning and related areas. In this talk I will focus on settings where geometry helps us understand problems in machine learning, optimization, and sampling. For instance, when sampling from densities supported on a manifold, understanding geometry and the impact of curvature are crucial; surprisingly, progress on geometric sampling theory helps us understand certain generalization properties of SGD for deep learning! Another fascinating viewpoint afforded by geometry is in non-convex optimization: geometry can help us make training algorithms more practical (e.g., in deep learning), reveal tractability despite non-convexity (e.g., via geodesically convex optimization), or simply help us understand existing methods better (e.g., SGD, eigenvector computation, etc.).
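To make one of these notions concrete (a standard definition, supplied here for illustration and not specific to the talk): a function f on a Riemannian manifold M is geodesically convex if it is convex along every geodesic \gamma : [0, 1] \to M, i.e.

f(\gamma(t)) \le (1 - t)\, f(\gamma(0)) + t\, f(\gamma(1)) \quad \text{for all } t \in [0, 1].

A problem that looks non-convex in Euclidean coordinates can satisfy this condition in a suitable metric, in which case local minima are global.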

Ultimately, I hope to offer the audience some insights into geometric thinking and share with them some new tools that help us design, understand, and analyze models and algorithms. To make the discussion concrete I will recall a few foundational results arising from our research, provide several examples, and note some open problems.

––
Bio: Suvrit Sra is an Alexander von Humboldt Professor of Artificial Intelligence at the Technical University of Munich (Germany) and an Associate Professor of EECS at MIT (USA), where he is also a member of the Laboratory for Information and Decision Systems (LIDS) and of the Institute for Data, Systems, and Society (IDSS). He obtained his PhD in Computer Science from the University of Texas at Austin. Before TUM & MIT, he was a Senior Research Scientist at the Max Planck Institute for Intelligent Systems in Tübingen, Germany. During 2013-2014 he held visiting positions at UC Berkeley (EECS) and Carnegie Mellon University (Machine Learning Department). His research bridges mathematical topics such as differential geometry, matrix analysis, convex analysis, probability theory, and optimization with machine learning. He founded the OPT (Optimization for Machine Learning) series of workshops, held from 2008 to 2017 at the NeurIPS conference, and has co-edited a book of the same name (MIT Press, 2011). He is also a co-founder and chief scientist of Pendulum, a global AI+logistics startup.


Exploiting low-dimensional data structures in deep learning

Series
School of Mathematics Colloquium
Time
Thursday, November 2, 2023 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Wenjing Liao (Georgia Tech)

In the past decade, deep learning has made astonishing breakthroughs in various real-world applications. It is a common belief that deep neural networks are good at learning various geometric structures hidden in data sets, such as rich local regularities, global symmetries, or repetitive patterns. One of the central interests in deep learning theory is to understand why deep neural networks are successful, and how they utilize low-dimensional data structures. In this talk, I will present some statistical learning theory of deep neural networks where data exhibit low-dimensional structures, such as lying on a low-dimensional manifold. The learning tasks include regression, classification, feature representation and operator learning. When data are sampled on a low-dimensional manifold, the sample complexity crucially depends on the intrinsic dimension of the manifold instead of the ambient dimension of the data. These results demonstrate that deep neural networks are adaptive to low-dimensional geometric structures of data sets.
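A schematic version of such a result (my paraphrase; the precise assumptions vary across the settings mentioned above): if the target function f is s-Hölder on a d-dimensional manifold embedded in \mathbb{R}^D, then a suitably sized deep network estimator \hat{f}_n trained on n samples achieves

\mathbb{E}\, \| \hat{f}_n - f \|_{L^2}^2 \lesssim n^{-2s/(2s+d)},

a rate governed by the intrinsic dimension d rather than the ambient dimension D.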

Incidence estimates for tubes

Series
School of Mathematics Colloquium
Time
Thursday, October 12, 2023 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Hong Wang (NYU, Courant Institute)

Let P be a set of points and L a set of lines in the plane. What can we say about the number of incidences between P and L, I(P, L) := |\{ (p, \ell) \in P \times L : p \in \ell \}| ?
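For context, the classical answer in this unthickened setting (a standard fact, not quoted in the abstract) is the Szemerédi-Trotter theorem, sharp up to constants:

I(P, L) = O\left( |P|^{2/3} |L|^{2/3} + |P| + |L| \right).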


The problem changes drastically when we consider a thickened version, i.e. when P is a set of unit balls and L is a set of tubes of radius 1. The Furstenberg set conjecture can be viewed as an incidence problem for tubes: it states that a set containing an s-dimensional subset of a line in every direction should have dimension at least (3s+1)/2 when s > 0.
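In symbols, one common formulation (my paraphrase of the statement above): for 0 < s \le 1, if a set E \subseteq \mathbb{R}^2 contains, for every direction e, a subset of Hausdorff dimension at least s of some line in direction e, then \dim_H E \ge (3s+1)/2.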


We will survey a sequence of results by Orponen and Shmerkin, and a recent joint work with Ren that leads to the solution of this conjecture.

Egyptian fractions: problems and progress

Series
School of Mathematics Colloquium
Time
Thursday, April 27, 2023 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Thomas Bloom (University of Oxford)

The study of Egyptian fractions, representing rational numbers as sums of distinct unit fractions, is one of the oldest areas of number theory. In this talk we will discuss some fascinating problems in the area, including both open problems and some recent progress, such as the solution to the Erdős-Graham conjecture: 1 can be written as a sum of unit fractions with denominators drawn from an arbitrary set of integers of positive density.
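As illustrative background (the classical greedy method of Fibonacci and Sylvester, not a technique from the talk), the sketch below expands a rational in (0, 1) by repeatedly subtracting the largest unit fraction that still fits; the greedy choice forces the denominators to be distinct.

from fractions import Fraction
from math import ceil

def egyptian_greedy(r):
    # Expand a rational 0 < r < 1 as a sum of distinct unit fractions,
    # greedily removing the largest unit fraction 1/d that still fits.
    assert 0 < r < 1
    denominators = []
    while r > 0:
        d = ceil(1 / r)  # smallest d with 1/d <= r
        denominators.append(d)
        r -= Fraction(1, d)
    return denominators

print(egyptian_greedy(Fraction(4, 5)))  # [2, 4, 20], i.e. 4/5 = 1/2 + 1/4 + 1/20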

Geometry and dynamics of compressible fluids

Series
School of Mathematics Colloquium
Time
Thursday, March 2, 2023 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Boris Khesin (University of Toronto)

Please Note: Live-stream link: https://gatech.zoom.us/j/93100501365?pwd=bWFEeURxek5pWG1BRjN4MHcvYllYQT09 Passcode provided in talk announcement

We describe a geometric framework to study Newton's equations on infinite-dimensional configuration spaces of diffeomorphisms and smooth probability densities. It turns out that several important PDEs of hydrodynamical origin can be described in this framework in a natural way. In particular, the so-called Madelung transform between the Schrödinger-type equations on wave functions and Newton's equations on densities turns out to be a Kähler map between the corresponding phase spaces, equipped with the Fubini-Study and Fisher-Rao information metrics. This is joint work with G. Misiolek and K. Modin.
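Schematically, and in a normalization supplied here for illustration (the paper's conventions may differ): the Madelung transform pairs a probability density \rho and a velocity potential \theta with the wave function

\psi := \sqrt{\rho}\, e^{i\theta},

under which the Schrödinger evolution of \psi corresponds to a Newton-type system for (\rho, v = \nabla\theta) with an additional "quantum pressure" term.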

From Artin monoids to Artin groups

Series
School of Mathematics Colloquium
Time
Friday, December 9, 2022 - 16:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Ruth Charney (Brandeis University)

Braid groups belong to a broad class of groups known as Artin groups, which are defined by presentations of a particular form and have played a major role in geometric group theory and low-dimensional topology in recent years. These groups fall into two classes: finite-type and infinite-type Artin groups. The former come equipped with a powerful combinatorial structure, known as a Garside structure, while the latter are much less understood and present many challenges. However, if one restricts to the Artin monoid, then much of the combinatorial structure still applies in the infinite-type case. In a joint project with Rachael Boyd, Rose Morris-Wright, and Sarah Rees, we use geometric techniques to study the relation between the Artin monoid and the Artin group.
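For reference, the "presentations of a particular form" are the following standard ones. Fix m_{ij} = m_{ji} \in \{2, 3, \ldots, \infty\} for i \ne j; the Artin group is

A = \langle s_1, \ldots, s_n \mid s_i s_j s_i \cdots = s_j s_i s_j \cdots \ (m_{ij} \text{ letters on each side, whenever } m_{ij} < \infty) \rangle,

and the Artin monoid is given by the same presentation interpreted as a monoid. The braid group is the case m_{i,i+1} = 3 and m_{ij} = 2 for |i - j| \ge 2; "finite-type" means the Coxeter group obtained by adding the relations s_i^2 = 1 is finite.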

Learning to Solve Hard Minimal Problems

Series
School of Mathematics Colloquium
Time
Thursday, October 13, 2022 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Anton Leykin (Georgia Tech)

The main result in this talk concerns a new fast algorithm to solve a minimal problem with many spurious solutions that arises as a relaxation of a geometric optimization problem. The algorithm recovers relative camera pose from points and lines in multiple views. Solvers like this are the backbone of structure-from-motion techniques that estimate 3D structures from 2D image sequences.  

Our methodology is general and applicable in areas other than computer vision. The ingredients come from algebra, geometry, numerical methods, and applied statistics. Our fast implementation relies on a homotopy continuation optimized for our setting and a machine-learned neural network.
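To convey the flavor of homotopy continuation, here is a minimal univariate sketch under simplifying assumptions (one solution path, fixed step size, no path-jumping safeguards); track_root, f, and g are illustrative names, not the authors' solver.

def track_root(f, df, g, dg, z0, steps=200, newton_iters=5):
    # Track a root of H(z, t) = (1 - t) * g(z) + t * f(z) from t = 0 to t = 1,
    # starting from a known root z0 of the start system g.
    z = complex(z0)
    dt = 1.0 / steps
    for k in range(steps):
        t = k * dt
        # Euler predictor: dz/dt = -(dH/dt) / (dH/dz), with dH/dt = f(z) - g(z).
        hz = (1 - t) * dg(z) + t * df(z)
        z -= dt * (f(z) - g(z)) / hz
        # Newton corrector: polish z as a root of H(., t + dt).
        t2 = t + dt
        for _ in range(newton_iters):
            h = (1 - t2) * g(z) + t2 * f(z)
            hz = (1 - t2) * dg(z) + t2 * df(z)
            z -= h / hz
    return z

# Example: deform a root of g(z) = z^2 - 1 into a root of f(z) = z^2 - 2.
f, df = (lambda z: z**2 - 2), (lambda z: 2 * z)
g, dg = (lambda z: z**2 - 1), (lambda z: 2 * z)
print(track_root(f, df, g, dg, z0=1.0))  # ~1.41421, i.e. sqrt(2)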

(This covers joint work with Tim Duff, Ricardo Fabbri, Petr Hruby, Kathlén Kohn, Tomas Pajdla, and others.

The talk is suitable for both professors and students.)
