Seminars and Colloquia by Series

Brill-Noether Theory of Finite Graphs

Series
Algebra Student Seminar
Time
Friday, December 1, 2023 - 10:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Noah Solomon, Georgia Tech

Come learn about chip-firing games! While simple to define, these games provide surprisingly strong combinatorial tools for studying algebraic curves. Fueling this theory is a strong analogy between algebraic curves and finite graphs. In ways we will make more precise, many features of algebraic curves can be studied in graphs; however, certain parts of the theory don’t make it through intact. In this talk we will focus on a central question in this analogy: which graphs are the best models for algebraic curves? We will set up the background needed to ask this question as well as the tools and techniques used to study such graphs. No prior knowledge of chip-firing or algebraic geometry is needed.
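The basic chip-firing move mentioned in the abstract is easy to state: a vertex holding at least as many chips as its degree may "fire," sending one chip along each incident edge. A minimal illustrative sketch (graph and chip counts are hypothetical, not from the talk):

```python
# Minimal chip-firing sketch: a vertex with at least deg(v) chips may "fire",
# sending one chip along each incident edge to its neighbors.
def fire(chips, adjacency, v):
    """Fire vertex v; chips maps vertex -> chip count, adjacency maps vertex -> neighbor list."""
    neighbors = adjacency[v]
    assert chips[v] >= len(neighbors), "v needs at least deg(v) chips to fire"
    chips = dict(chips)               # leave the input configuration untouched
    chips[v] -= len(neighbors)
    for u in neighbors:
        chips[u] += 1
    return chips

# Triangle graph: each vertex is adjacent to the other two.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
chips = {0: 2, 1: 0, 2: 0}
chips = fire(chips, adj, 0)           # vertex 0 fires its two chips
print(chips)                          # {0: 0, 1: 1, 2: 1}
```

Iterating this move over a graph generates the equivalence classes of chip configurations that play the role of divisor classes on a curve.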

Permutation limits

Series
Stochastics Seminar
Time
Thursday, November 30, 2023 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Sumit Mukherjee, Columbia University

Permutation limit theory arises by viewing a permutation as a probability measure on the unit square. Using the theory of permutation limits (permutons), we can compute limiting properties of various permutation statistics for random permutations, such as the number of fixed points, the number of small cycles, pattern counts, and the degree distribution of permutation graphs. We can also derive large deviation principles (LDPs) for random permutations. Our results apply to many non-uniform distributions on permutations, including the celebrated Mallows model and mu-random permutations. This is based on joint work with Jacopo Borga, Sayan Das and Peter Winkler.
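The embedding in the first sentence is concrete: a permutation sigma of {1,...,n} induces the measure placing mass 1/n at each point ((i - 1/2)/n, (sigma(i) - 1/2)/n) in the unit square. A small sketch of that measure and one of the statistics mentioned (fixed points); the function names are illustrative, not from the talk:

```python
# A permutation sigma of {1,...,n} (given as a list of 1-indexed values)
# induces a probability measure on [0,1]^2 with mass 1/n at each point
# ((i - 0.5)/n, (sigma(i) - 0.5)/n).
def empirical_permuton(sigma):
    n = len(sigma)
    return [((i + 0.5) / n, (sigma[i] - 0.5) / n) for i in range(n)]

def fixed_point_fraction(sigma):
    # Mass of the empirical measure sitting exactly on the diagonal;
    # for the identity permutation this equals 1.
    return sum(1 for i, s in enumerate(sigma, start=1) if s == i) / len(sigma)

sigma = [2, 1, 3, 4]                     # swaps 1 and 2, fixes 3 and 4
print(empirical_permuton(sigma))         # four atoms of mass 1/4 in [0,1]^2
print(fixed_point_fraction(sigma))       # 0.5
```

As n grows, statistics such as pattern counts become integrals against the limiting permuton.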

Higher higher Teichmüller spaces from tilings of convex domains

Series
Geometry Topology Student Seminar
Time
Wednesday, November 29, 2023 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Alex Nolte, Rice University

A sequence of remarkable results in recent decades has shown that for a surface group H there are many Lie groups G and connected components C of Hom(H,G) consisting of discrete and faithful representations. These are known as higher Teichmüller spaces. With two exceptions, all known constructions of higher Teichmüller spaces work only for surface groups. This is an expository talk on the remarkable paper Convexes divisibles III (Benoist ’05), in which the first construction of higher Teichmüller spaces that works for some non-surface groups was discovered. The paper implies that the fundamental group H’ of any closed hyperbolic n-manifold has a higher Teichmüller space C’ in PGL(n+1,R). This is proved by showing that any element of C’ preserves a convex domain in RP^n with a group-invariant tiling.

Turán and Ramsey problems in vector spaces over finite fields

Series
Graph Theory Seminar
Time
Tuesday, November 28, 2023 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Bryce Frederickson, Emory University

Turán-type problems ask for the densest possible structure that avoids a fixed substructure H. Ramsey-type problems ask for the largest possible "complete" structure that can be decomposed into a fixed number of H-free parts. We discuss some of these problems in the context of vector spaces over finite fields. In the Turán setting, Furstenberg and Katznelson showed that any constant-density subset of the affine space AG(n,q) must contain a k-dimensional affine subspace if n is large enough. On the Ramsey side of things, a classical result of Graham, Leeb, and Rothschild implies that any red-blue coloring of the projective space PG(n-1,q) must contain a monochromatic k-dimensional projective subspace, for n large. We highlight the connection between these results and show how to obtain new bounds in the latter (projective Ramsey) problem from bounds in the former (affine Turán) problem. This is joint work with Liana Yepremyan.
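A basic quantity behind density arguments in this setting is the number of k-dimensional subspaces of an n-dimensional space over F_q, given by the Gaussian binomial coefficient. A small sketch (not part of the talk, just the standard count):

```python
def gaussian_binomial(n, k, q):
    """Number of k-dimensional subspaces of an n-dimensional vector space over F_q:
    the Gaussian binomial coefficient [n choose k]_q."""
    if k < 0 or k > n:
        return 0
    num = den = 1
    for i in range(k):
        num *= q ** (n - i) - 1
        den *= q ** (k - i) - 1
    return num // den            # the quotient is always an integer

print(gaussian_binomial(4, 2, 2))   # 35 planes through the origin in F_2^4
print(gaussian_binomial(3, 1, 2))   # 7 points of the projective space PG(2,2)
```

For k = 1 this recovers (q^n - 1)/(q - 1), the number of points of PG(n-1,q).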

Sum-Product with few primes

Series
Additional Talks and Lectures
Time
Monday, November 27, 2023 - 16:00 for 1.5 hours (actually 80 minutes)
Location
Skiles 005
Speaker
Brandon Hanson, University of Maine

This talk concerns improving sum-product exponents for sets of integers under the condition that each element of the set has a bounded number of prime factors. The argument combines combinatorics, harmonic analysis and number theory.

Generative Machine Learning Models for Uncertainty Quantification

Series
Applied and Computational Mathematics Seminar
Time
Monday, November 27, 2023 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/98355006347
Speaker
Feng Bao, Florida State University

Generative machine learning models, including variational auto-encoders (VAEs), normalizing flows (NFs), generative adversarial networks (GANs), and diffusion models, have dramatically improved the quality and realism of generated content, whether it's images, text, or audio. In science and engineering, generative models can be used as powerful tools for probability density estimation or high-dimensional sampling, which are critical capabilities in uncertainty quantification (UQ), e.g., Bayesian inference for parameter estimation. Studies on generative models for image/audio synthesis focus on improving the quality of each individual sample, which often makes the generative models complicated and difficult to train. UQ tasks, on the other hand, usually focus on accurate approximation of statistics of interest without regard for the quality of any individual sample, so direct application of existing generative models to UQ tasks may lead to inaccurate approximations or an unstable training process. To alleviate those challenges, we developed several new generative diffusion models for various UQ tasks, including diffusion-model-assisted supervised learning of generative models, a score-based nonlinear filter for recursive Bayesian inference, and a training-free ensemble score filter for tracking high-dimensional stochastic dynamical systems. We will demonstrate the effectiveness of those methods on various UQ tasks, including density estimation, learning stochastic dynamical systems, and data assimilation problems.

Chebyshev varieties

Series
Algebra Seminar
Time
Monday, November 27, 2023 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Chiara Meroni, Harvard John A. Paulson School of Engineering and Applied Sciences

Please Note: There will be a pre-seminar (aimed toward grad students and postdocs) from 11 am to 11:30 am in Skiles 006.

Chebyshev polynomials offer a natural basis for solving polynomial equations. When we switch from monomials to Chebyshev polynomials, we can replace toric varieties with Chebyshev varieties. We will introduce these objects and discuss their main properties, including equations, dimension, and degree. This is an ongoing project with Zaïneb Bel-Afia and Simon Telen.
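The switch from monomials to Chebyshev polynomials mentioned in the abstract is a change of basis; NumPy implements the conversion directly. A minimal sketch (the example polynomial is illustrative, not from the talk):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Express x^2 in the Chebyshev basis: x^2 = (T_0(x) + T_2(x)) / 2.
# poly2cheb takes monomial coefficients [c0, c1, c2] for c0 + c1*x + c2*x^2.
cheb_coeffs = C.poly2cheb([0, 0, 1])
print(cheb_coeffs)                     # approximately [0.5, 0.0, 0.5]

# Round-trip back to the monomial basis recovers x^2.
mono = C.cheb2poly(cheb_coeffs)
print(mono)                            # approximately [0.0, 0.0, 1.0]
```

Chebyshev varieties are then, roughly, the images one gets by parametrizing with Chebyshev polynomials in place of the monomial maps that define toric varieties.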

Physics-inspired learning of differential equations from data.

Series
CDSNS Colloquium
Time
Friday, November 24, 2023 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 249
Speaker
Matthew Golden, Georgia Tech

Please Note: Seminar is in-person. Zoom link available: https://gatech.zoom.us/j/91390791493?pwd=QnpaWHNEOHZTVXlZSXFkYTJ0b0Q0UT09

Continuum theories of physics are traditionally described by local partial differential equations (PDEs). In this talk I will discuss the Sparse Physics-Informed Discovery of Empirical Relations (SPIDER) algorithm: a general algorithm combining the weak formulation, symmetry covariance, and sparse regression to discover quantitatively accurate and qualitatively simple PDEs directly from data. This method is applied to simulated 3D turbulence and experimental 2D active turbulence. A complete mathematical model is found in both cases.
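The sparse-regression ingredient of such model-discovery methods is often a sequentially thresholded least-squares step: fit coefficients for a library of candidate terms, zero out the small ones, and refit. A generic sketch in that spirit (illustrative only; SPIDER's actual weak-form, symmetry-covariant pipeline is more elaborate, and the toy data below is invented):

```python
import numpy as np

def stlsq(Theta, dudt, threshold=0.1, iters=10):
    """Sequentially thresholded least squares: pick a few active terms
    from a library Theta of candidate terms to explain dudt."""
    xi, *_ = np.linalg.lstsq(Theta, dudt, rcond=None)
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0                 # prune small coefficients
        big = ~small
        if big.any():                   # refit on the surviving terms
            xi[big], *_ = np.linalg.lstsq(Theta[:, big], dudt, rcond=None)
    return xi

# Toy demo: data generated by dudt = 2*x - 0.5*x**3 plus small noise;
# library columns are [x, x**2, x**3].
rng = np.random.default_rng(0)
x = rng.normal(size=200)
Theta = np.column_stack([x, x**2, x**3])
dudt = 2 * x - 0.5 * x**3 + 1e-3 * rng.normal(size=200)
xi = stlsq(Theta, dudt)
print(xi)   # approximately [2.0, 0.0, -0.5]: the spurious x**2 term is pruned
```

The weak formulation used by SPIDER replaces pointwise derivatives with integrals against test functions before this regression step, which makes the discovery robust to noise.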

The most likely evolution of diffusing and vanishing particles: Schrodinger Bridges with unbalanced marginals

Series
PDE Seminar
Time
Tuesday, November 21, 2023 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Yongxin Chen, Georgia Tech

Stochastic flows of an advective-diffusive nature are ubiquitous in biology and the physical sciences. Of particular interest is the problem of reconciling observed marginal distributions with a given prior, posed by E. Schroedinger in 1931/32 and known as the Schroedinger Bridge Problem (SBP). It turns out that Schroedinger’s problem can be viewed as a problem in large deviations, a modeling problem, as well as a control problem. Due to the fundamental significance of this problem, interest in SBP and in its deterministic (zero-noise limit) counterpart of Optimal Transport (OT) has in recent years enticed a broad spectrum of disciplines, including physics, stochastic control, computer science, and geometry. Yet, while the mathematics and applications of SBP/OT have been developing at a considerable pace, accounting for marginals of unequal mass has received scant attention; the problem of interpolating between “unbalanced” marginals has been approached by introducing source/sink terms into the transport equations in an ad hoc manner, chiefly driven by applications in image registration. Nevertheless, losses are inherent in many physical processes and, thereby, models that account for lossy transport may also need to be reconciled with observed marginals following Schroedinger’s dictum; that is, to adjust the probability of trajectories of particles, including those that do not make it to the terminal observation point, so that the updated law represents the most likely way that particles may have been transported, or vanished, at some intermediate point. Thus, the purpose of this talk is to present recent results on stochastic evolutions with losses, whereupon particles are “killed” (jump into a coffin/extinction state) according to a probabilistic law, and thereby mass is gradually lost along their stochastically driven flow. Through a suitable embedding we turn the problem into an SBP for stochastic processes that combine diffusive and jump characteristics.
Then, following a large-deviations formalism in the style of Schroedinger, given a prior law that allows for losses, we explore the most probable evolution of particles along with the most likely killing rate as the particles transition between the specified marginals. Our approach differs sharply from previous work involving a Feynman-Kac multiplicative reweighing of the reference measure which, as we argue, is far from Schroedinger’s quest. We develop a suitable Schroedinger system of coupled PDEs for this problem, an iterative Fortet-IPF-Sinkhorn algorithm for computations, and finally formulate and solve a related fluid-dynamic control problem for the flow of one-time marginals where both the drift and the new killing rate play the role of control variables. Joint work with Tryphon Georgiou and Michele Pavon.
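In the discrete, balanced setting, the Fortet-IPF-Sinkhorn family of iterations reduces to the classical Sinkhorn algorithm: alternately rescale the rows and columns of the Gibbs kernel until the coupling matches both marginals. A minimal sketch of that balanced special case (the talk's scheme additionally handles killing/unbalanced marginals; the toy data below is invented):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, iters=500):
    """Balanced discrete Schroedinger bridge / entropic OT via Sinkhorn (IPF):
    find the coupling closest to the Gibbs kernel K = exp(-C/eps) in relative
    entropy, subject to row marginal a and column marginal b."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)      # match column marginal
        u = a / (K @ v)        # match row marginal
    return u[:, None] * K * v[None, :]   # the coupling matrix

a = np.array([0.5, 0.5])                 # initial marginal
b = np.array([0.3, 0.7])                 # terminal marginal
C = np.array([[0.0, 1.0], [1.0, 0.0]])   # transport cost
P = sinkhorn(a, b, C)
print(P.sum(axis=1))   # row sums, approximately a
print(P.sum(axis=0))   # column sums, approximately b
```

Each half-step is a projection onto one marginal constraint; unbalanced variants modify these projections to let mass vanish along the way.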

Machine learning, optimization, & sampling through a geometric lens

Series
School of Mathematics Colloquium
Time
Monday, November 20, 2023 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005 and https://gatech.zoom.us/j/98355006347
Speaker
Suvrit Sra, MIT & TU Munich

Please Note: Joint {School of Math Colloquium} and {Applied & Computational Math Seminar}. Note: *special time*. Speaker will present in person.

Geometry arises in myriad ways within machine learning and related areas. In this talk I will focus on settings where geometry helps us understand problems in machine learning, optimization, and sampling. For instance, when sampling from densities supported on a manifold, understanding geometry and the impact of curvature are crucial; surprisingly, progress on geometric sampling theory helps us understand certain generalization properties of SGD for deep learning! Another fascinating viewpoint afforded by geometry is in non-convex optimization: geometry can help us make training algorithms more practical (e.g., in deep learning), reveal tractability despite non-convexity (e.g., via geodesically convex optimization), or simply help us understand existing methods better (e.g., SGD, eigenvector computation, etc.).

Ultimately, I hope to offer the audience some insights into geometric thinking and share with them some new tools that help us design, understand, and analyze models and algorithms. To make the discussion concrete I will recall a few foundational results arising from our research, provide several examples, and note some open problems.

––
Bio: Suvrit Sra is an Alexander von Humboldt Professor of Artificial Intelligence at the Technical University of Munich (Germany) and an Associate Professor of EECS at MIT (USA), where he is also a member of the Laboratory for Information and Decision Systems (LIDS) and of the Institute for Data, Systems, and Society (IDSS). He obtained his PhD in Computer Science from the University of Texas at Austin. Before TUM & MIT, he was a Senior Research Scientist at the Max Planck Institute for Intelligent Systems in Tübingen, Germany. He held visiting positions at UC Berkeley (EECS) and Carnegie Mellon University (Machine Learning Department) during 2013-2014. His research bridges mathematical topics such as differential geometry, matrix analysis, convex analysis, probability theory, and optimization with machine learning. He founded the OPT (Optimization for Machine Learning) series of workshops, held from 2008 to 2017 at the NeurIPS conference, and has co-edited a book of the same name (MIT Press, 2011). He is also a co-founder and chief scientist of Pendulum, a global AI+logistics startup.

 
