Seminars and Colloquia Schedule

Geometry and the complexity of matrix multiplication

Series
Algebra Seminar
Time
Monday, November 20, 2023 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Austin Conner, Harvard University

There will be a pre-seminar (aimed toward grad students and postdocs) from 11 am to 11:30 am in Skiles 006.

Determining the computational complexity of matrix multiplication has been one of the central open problems in theoretical computer science ever since 1969, when Strassen presented an algorithm for multiplying n by n matrices using only O(n^2.81) arithmetic operations. The data describing this method is equivalent to an expression of the structure tensor of the 2 by 2 matrix algebra as a sum of 7 decomposable tensors. Any such decomposition of an n by n matrix algebra yields a Strassen-type algorithm, and Strassen showed that such algorithms are general enough to determine the exponent of matrix multiplication. Bini later showed that all of the above remains true when the decomposition is allowed to depend on a parameter and limits are taken.
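
For illustration (my sketch, not part of the abstract): the seven products below are exactly the seven decomposable tensors in Strassen's decomposition of the 2 by 2 structure tensor, applied recursively to blocks. The sketch assumes the matrices are n by n with n a power of 2.

```python
import numpy as np

def strassen(A, B):
    """Multiply square matrices via Strassen's rank-7 decomposition.

    A minimal sketch: assumes A and B are n x n with n a power of 2.
    """
    n = A.shape[0]
    if n == 1:
        return A * B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]

    # The 7 recursive products -- one per decomposable tensor in the decomposition.
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)

    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C

# 7 multiplications per level over log2(n) levels gives O(n^(log2 7)) = O(n^2.81).
A = np.random.rand(4, 4); B = np.random.rand(4, 4)
assert np.allclose(strassen(A, B), A @ B)
```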

I present a recent technique for proving lower bounds for this decomposition problem, border apolarity. Two key ideas underlie this technique: (i) consider not just the sequence of decompositions, but the sequence of ideals of the point sets determining the decompositions, and (ii) exploit the symmetry of the matrix multiplication tensor to insist that the limiting ideal has an extremely restrictive structure. I discuss applications to the matrix multiplication tensor and to other tensors potentially useful for obtaining upper bounds via Strassen's laser method. This talk discusses joint work with JM Landsberg, Alicia Harper, and Amy Huang.
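
For orientation (standard definitions added here for context, not part of the abstract): the border rank that border apolarity bounds from below is the limiting notion from Bini's result, and decompositions bound the exponent ω as follows.

```latex
% Rank vs. border rank of a tensor T, and the exponent bound (standard facts):
\[
  R(T) = \min\Bigl\{ r : T = \sum_{i=1}^{r} a_i \otimes b_i \otimes c_i \Bigr\},
  \qquad
  \underline{R}(T) = \min\bigl\{ r : T = \lim_{\varepsilon \to 0} T_\varepsilon
    \text{ with } R(T_\varepsilon) \le r \bigr\}.
\]
\[
  \underline{R}(\langle n,n,n \rangle) \le r
  \;\Longrightarrow\;
  \omega \le \log_n r,
  \qquad
  \text{e.g.\ Strassen: } R(\langle 2,2,2 \rangle) \le 7
  \;\Rightarrow\; \omega \le \log_2 7 \approx 2.81.
\]
```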

Machine learning, optimization, & sampling through a geometric lens

Series
School of Mathematics Colloquium
Time
Monday, November 20, 2023 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/98355006347
Speaker
Suvrit Sra, MIT & TU Munich

Joint School of Math Colloquium and Applied & Computational Math Seminar. Note: special time.
Speaker will present in person.

Geometry arises in myriad ways within machine learning and related areas. In this talk I will focus on settings where geometry helps us understand problems in machine learning, optimization, and sampling. For instance, when sampling from densities supported on a manifold, understanding geometry and the impact of curvature are crucial; surprisingly, progress on geometric sampling theory helps us understand certain generalization properties of SGD for deep learning! Another fascinating viewpoint afforded by geometry is in non-convex optimization: geometry can help us make training algorithms more practical (e.g., in deep learning), reveal tractability despite non-convexity (e.g., via geodesically convex optimization), or simply help us understand existing methods better (e.g., SGD, eigenvector computation, etc.).
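
As a toy illustration of the manifold viewpoint mentioned above (my sketch, not the speaker's material): computing a leading eigenvector is an optimization problem on the unit sphere, and Riemannian gradient ascent with a simple retraction solves it.

```python
import numpy as np

def top_eigenvector(A, steps=500, eta=0.1, seed=0):
    """Riemannian gradient ascent for max_{||x||=1} x^T A x.

    A toy sketch of manifold optimization: the sphere's Riemannian
    gradient is the Euclidean gradient projected onto the tangent
    space at x, and renormalization serves as the retraction.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(steps):
        g = A @ x                  # half the Euclidean gradient of x^T A x
        rgrad = g - (x @ g) * x    # project onto tangent space {v : v . x = 0}
        x = x + eta * rgrad        # step along the tangent direction...
        x /= np.linalg.norm(x)     # ...then retract back to the sphere
    return x

# Usage: agrees with the dominant eigenvector from numpy, up to sign.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5)); A = M + M.T
v = top_eigenvector(A)
w = np.linalg.eigh(A)[1][:, -1]
print(abs(v @ w))  # ~1.0 when converged
```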

Ultimately, I hope to offer the audience some insights into geometric thinking and share with them some new tools that help us design, understand, and analyze models and algorithms. To make the discussion concrete I will recall a few foundational results arising from our research, provide several examples, and note some open problems.

––
Bio: Suvrit Sra is an Alexander von Humboldt Professor of Artificial Intelligence at the Technical University of Munich (Germany) and an Associate Professor of EECS at MIT (USA), where he is also a member of the Laboratory for Information and Decision Systems (LIDS) and of the Institute for Data, Systems, and Society (IDSS). He obtained his PhD in Computer Science from the University of Texas at Austin. Before TUM & MIT, he was a Senior Research Scientist at the Max Planck Institute for Intelligent Systems in Tübingen, Germany. He held visiting positions at UC Berkeley (EECS) and Carnegie Mellon University (Machine Learning Department) during 2013-2014. His research bridges mathematical topics such as differential geometry, matrix analysis, convex analysis, probability theory, and optimization with machine learning. He founded the OPT (Optimization for Machine Learning) series of workshops, held from OPT2008 through OPT2017 at the NeurIPS conference, and has co-edited a book of the same name (MIT Press, 2011). He is also a co-founder and chief scientist of Pendulum, a global AI+logistics startup.

 


The most likely evolution of diffusing and vanishing particles: Schrödinger Bridges with unbalanced marginals

Series
PDE Seminar
Time
Tuesday, November 21, 2023 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Yongxin Chen, Georgia Tech

Stochastic flows of an advective-diffusive nature are ubiquitous in biology and the physical sciences. Of particular interest is the problem of reconciling observed marginal distributions with a given prior, posed by E. Schrödinger in 1931/32 and known as the Schrödinger Bridge Problem (SBP). It turns out that Schrödinger's problem can be viewed as a problem in large deviations, a modeling problem, as well as a control problem. Due to the fundamental significance of this problem, interest in SBP and in its deterministic (zero-noise limit) counterpart of Optimal Transport (OT) has in recent years enticed a broad spectrum of disciplines, including physics, stochastic control, computer science, and geometry.

Yet, while the mathematics and applications of SBP/OT have been developing at a considerable pace, accounting for marginals of unequal mass has received scant attention; the problem of interpolating between “unbalanced” marginals has been approached by introducing source/sink terms into the transport equations in an ad hoc manner, chiefly driven by applications in image registration. Nevertheless, losses are inherent in many physical processes, and thereby models that account for lossy transport may also need to be reconciled with observed marginals following Schrödinger's dictum; that is, to adjust the probability of trajectories of particles, including those that do not make it to the terminal observation point, so that the updated law represents the most likely way that particles may have been transported, or vanished, at some intermediate point.

Thus, the purpose of this talk is to present recent results on stochastic evolutions with losses, whereupon particles are “killed” (jump into a coffin/extinction state) according to a probabilistic law, and thereby mass is gradually lost along their stochastically driven flow. Through a suitable embedding we turn the problem into an SBP for stochastic processes that combine diffusive and jump characteristics. Then, following a large-deviations formalism in the style of Schrödinger, given a prior law that allows for losses, we explore the most probable evolution of particles along with the most likely killing rate as the particles transition between the specified marginals. Our approach differs sharply from previous work involving a Feynman-Kac multiplicative reweighing of the reference measure which, as we argue, is far from Schrödinger's quest. We develop a suitable Schrödinger system of coupled PDEs for this problem, an iterative Fortet-IPF-Sinkhorn algorithm for computations, and finally formulate and solve a related fluid-dynamic control problem for the flow of one-time marginals, where both the drift and the new killing rate play the role of control variables. Joint work with Tryphon Georgiou and Michele Pavon.
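
For readers unfamiliar with the Fortet-IPF-Sinkhorn iteration mentioned above, here is a minimal sketch of its classical, balanced-marginal form on a discrete state space (my illustration; the talk's unbalanced, lossy setting requires the extinction-state embedding described in the abstract). Given a positive reference kernel K and marginals mu, nu, alternating scalings enforce each marginal in turn.

```python
import numpy as np

def sinkhorn_bridge(K, mu, nu, iters=200):
    """Classical Fortet/IPF/Sinkhorn iteration for a discrete Schrodinger bridge.

    A minimal balanced-marginal sketch (the unbalanced setting in the talk
    needs an extra extinction state). K[i, j] is a positive reference
    kernel; mu and nu are the prescribed initial and terminal marginals.
    Returns the coupling P = diag(phi) @ K @ diag(psi) whose row and
    column sums match mu and nu.
    """
    phi = np.ones_like(mu)
    for _ in range(iters):
        psi = nu / (K.T @ phi)   # scale to fit the terminal marginal
        phi = mu / (K @ psi)     # scale to fit the initial marginal
    return phi[:, None] * K * psi[None, :]

# Usage: a Gaussian (heat-kernel-like) reference on 50 grid points.
x = np.linspace(0, 1, 50)
K = np.exp(-(x[:, None] - x[None, :])**2 / 0.02)
mu = np.exp(-(x - 0.2)**2 / 0.005); mu /= mu.sum()
nu = np.exp(-(x - 0.8)**2 / 0.005); nu /= nu.sum()
P = sinkhorn_bridge(K, mu, nu)
print(np.abs(P.sum(1) - mu).max(), np.abs(P.sum(0) - nu).max())  # both ~0
```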

Physics-inspired learning of differential equations from data

Series
CDSNS Colloquium
Time
Friday, November 24, 2023 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 249
Speaker
Matthew Golden, Georgia Tech

Seminar is in-person. Zoom link available: https://gatech.zoom.us/j/91390791493?pwd=QnpaWHNEOHZTVXlZSXFkYTJ0b0Q0UT0...

Continuum theories of physics are traditionally described by local partial differential equations (PDEs). In this talk I will discuss the Sparse Physics-Informed Discovery of Empirical Relations (SPIDER) algorithm: a general method combining the weak formulation, symmetry covariance, and sparse regression to discover quantitatively accurate and qualitatively simple PDEs directly from data. The method is applied to simulated 3D turbulence and to experimental 2D active turbulence; a complete mathematical model is found in both cases.
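
To make the sparse-regression idea concrete, here is a minimal caricature of that step (sequentially thresholded least squares over a library of candidate terms, in the style of SINDy); it is my own illustration and omits SPIDER's weak-form integrals and symmetry-covariant libraries.

```python
import numpy as np

def stlsq(Theta, b, threshold=0.1, iters=10):
    """Sequentially thresholded least squares: find sparse c with Theta @ c ~ b.

    A caricature of the sparse-regression step in PDE discovery; SPIDER
    itself additionally uses weak-form integrals and symmetry-covariant
    term libraries, which are omitted here.
    """
    c = np.linalg.lstsq(Theta, b, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(c) < threshold
        c[small] = 0.0
        big = ~small
        if big.any():  # refit the surviving terms only
            c[big] = np.linalg.lstsq(Theta[:, big], b, rcond=None)[0]
    return c

# Usage: recover u_t = 0.5 * u_xx from samples of a known solution.
# u(x, t) = exp(-x^2 / (2t)) / sqrt(t) solves the heat equation u_t = 0.5 u_xx.
x = np.linspace(-2.0, 2.0, 81); t = np.linspace(1.0, 2.0, 81)
X, T = np.meshgrid(x, t, indexing="ij")
u = np.exp(-X**2 / (2 * T)) / np.sqrt(T)
dx, dt = x[1] - x[0], t[1] - t[0]
u_t = np.gradient(u, dt, axis=1)
u_x = np.gradient(u, dx, axis=0)
u_xx = np.gradient(u_x, dx, axis=0)
Theta = np.stack([u, u_x, u_xx, u * u_x], axis=-1).reshape(-1, 4)
c = stlsq(Theta, u_t.ravel(), threshold=0.05)
print(c)  # expect roughly [0, 0, 0.5, 0]
```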