Seminars and Colloquia by Series

Positive curvature implies existence of isoperimetric sets?

Series
Analysis Seminar
Time
Wednesday, January 24, 2024 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Federico Glaudo, Princeton University

Over the past decade, a rich theory of existence for the isoperimetric problem in spaces of nonnegative curvature has been established by multiple authors.
We will briefly review this theory, with a special focus on the reasons why one may expect the isoperimetric problem to have a solution in any nonnegatively curved space: it is true for large enough volumes, it is true if the ambient is 2-dimensional, and it is true under appropriate assumptions on the ambient space at infinity.
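Here "the isoperimetric problem has a solution" means that, for each prescribed volume, the isoperimetric profile (standard terminology, recalled for reference)

```latex
I(v) \;=\; \inf\big\{\, P(E) \;:\; E \subset M,\ \operatorname{vol}(E) = v \,\big\}
```

is attained by some set $E$ of finite perimeter, called an isoperimetric set.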

The main topic of the talk will be the presentation of a counterexample to this "intuition": a 3-dimensional manifold of positive sectional curvature without isoperimetric sets for small volumes.
This is a joint work with G. Antonelli.

Three perspectives on B_3

Series
Geometry Topology Student Seminar
Time
Wednesday, January 24, 2024 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Akash Narayanan, Georgia Tech

Braid groups are relatively simple to describe, but they have deep and intricate connections to many different areas of math. We will discuss three specific instances where the braid group on 3 strands arises in geometry and knot theory. In exploring connections between these perspectives, we will take a detour into the world of elliptic curves and their moduli space. As a result, we will see that these three perspectives are actually the same. Time permitting, we will explore generalizations of this to the braid group on n strands for n > 3.
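For reference, the braid group on 3 strands has the standard Artin presentation

```latex
B_3 \;=\; \big\langle\, \sigma_1, \sigma_2 \;\big|\; \sigma_1\sigma_2\sigma_1 = \sigma_2\sigma_1\sigma_2 \,\big\rangle,
```

and two of the connections alluded to above are classical: $B_3$ is isomorphic to the knot group of the trefoil, and its quotient by its center is $\mathrm{PSL}(2,\mathbb{Z})$, which acts on the moduli space of elliptic curves.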

Persistence of spatial analyticity in 3D hyper-dissipative Navier-Stokes models

Series
PDE Seminar
Time
Tuesday, January 23, 2024 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Zoran Grujic, University of Alabama at Birmingham

It has been known since the pioneering work of J.L. Lions in the 1960s that the 3D hyper-dissipative (HD) Navier-Stokes (NS) system exhibits global-in-time regularity as long as the hyper-diffusion exponent is greater than or equal to 5/4. One should note that at 5/4 the system is critical, i.e., the energy level and the scaling-invariant level coincide. What happens in the super-critical regime, with the hyper-diffusion exponent strictly between 1 and 5/4, has remained a mystery.

The goal of this talk is to demonstrate that as soon as the hyper-diffusion exponent is greater than 1, a class of monotone blow-up scenarios consistent with the analytic structure of the flow (prior to the possible singular time) can be ruled out (a sort of 'runaway train' scenario). The argument is in the spirit of the regularity theory of the 3D HD NS system in 'turbulent scenario' (in the super-critical regime) developed by Grujic and Xu, relying on 'dynamic interpolation' – however, it is much shorter, tailored to the class of blow-up profiles in view. This is a joint work with Aseel Farhat.
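For reference, the hyper-dissipative model in question is (up to normalization of constants) the system

```latex
\partial_t u + (u \cdot \nabla) u + \nabla p + \nu(-\Delta)^{\beta} u = 0,
\qquad \nabla \cdot u = 0,
```

where $\beta$ is the hyper-diffusion exponent; Lions' result gives global regularity for $\beta \ge 5/4$, while the talk concerns the super-critical range $1 < \beta < 5/4$.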

Topology, geometry and adaptivity in soft and living matter

Series
Job Candidate Talk
Time
Tuesday, January 23, 2024 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Vishal Patil, Stanford University

Topology and adaptivity play fundamental roles in controlling the dynamics of biological and physical systems, from chromosomal DNA and biofilms to cilia carpets and worm collectives. How topological rules govern the self-adaptive dynamics of living matter remains poorly understood. Here we investigate the interplay between topology, geometry and reconfigurability in knotted and tangled matter. We first identify topological counting rules which predict the relative mechanical stability of human-designed knots, by developing a mapping between elastic knots and long-range ferromagnetic spin systems. Building upon this framework, we then examine the adaptive topological dynamics exhibited by California blackworms, which form living tangled structures in minutes but can rapidly untangle in milliseconds. Using blackworm locomotion datasets, we construct stochastic trajectory equations that explain how the dynamics of individual active filaments controls their emergent topological state. To further understand how tangled matter, along with more general biological networks, adapt to their surroundings, we introduce a theory of adaptive elastic networks which can learn mechanical information. By identifying how topology and adaptivity produce stable yet responsive structures, these results have applications in understanding broad classes of adaptive, self-optimizing biological systems.

Zoom: https://gatech.zoom.us/j/93619173236?pwd=ZGNRZUZ2emNJbG5pRzgzMnlFL1dzQT09

Optimization in Data Science: Enhancing Autoencoders and Accelerating Federated Learning

Series
SIAM Student Seminar
Time
Monday, January 22, 2024 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Xue Feng, UC Davis

In this presentation, I will discuss my research in the field of data science, specifically in two areas: improving autoencoder interpolations and accelerating federated learning algorithms. My work combines advanced mathematical concepts with practical machine learning applications, contributing to both the theoretical and applied aspects of data science. The first part of my talk focuses on image sequence interpolation using autoencoders, which are essential tools in generative modeling, with an emphasis on the setting where only limited training data is available. By introducing a novel regularization term based on dynamic optimal transport into the autoencoder's loss function, my method generates more robust and semantically coherent interpolation results. Additionally, the trained autoencoder can be used to generate barycenters. However, computational efficiency is a bottleneck of our method, and we are working on improving it. The second part of my presentation focuses on accelerating federated learning (FL) through the application of Anderson Acceleration. Our method achieves the same level of convergence performance as state-of-the-art second-order methods such as GIANT by reweighting the local points and their gradients. However, our method only requires first-order information, making it a more practical and efficient choice for large-scale and complex training problems. Furthermore, our method is theoretically guaranteed to converge to the global minimizer at a linear rate.
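As background on the acceleration technique, here is a generic textbook sketch of Anderson acceleration for a scalar fixed-point iteration; this is not the speaker's FL algorithm, and the test problem (the fixed point of cosine) is purely illustrative.

```python
import numpy as np

def anderson_accelerate(g, x0, m=3, tol=1e-10, max_iter=100):
    """Anderson acceleration for the fixed-point iteration x = g(x).

    Keeps a short history of iterates and residuals, and mixes the last
    m evaluations of g by solving a small least-squares problem on
    residual differences.
    """
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    G, F = [], []  # histories of g(x_i) and residuals f_i = g(x_i) - x_i
    for _ in range(max_iter):
        gx = g(x)
        f = gx - x
        G.append(gx)
        F.append(f)
        if np.linalg.norm(f) < tol:
            return gx
        mk = min(m, len(F) - 1)
        if mk == 0:
            x = gx  # plain fixed-point step until history exists
            continue
        # Least squares for mixing weights, parametrized via residual
        # differences (equivalent to the sum-to-one constrained form).
        dF = np.stack([F[-1] - F[-2 - j] for j in range(mk)], axis=1)
        gamma, *_ = np.linalg.lstsq(dF, F[-1], rcond=None)
        x = G[-1] - sum(gamma[j] * (G[-1] - G[-2 - j]) for j in range(mk))
    return x

# Converges to the fixed point of cos(x), approximately 0.739085
sol = anderson_accelerate(np.cos, 0.5)
```

The same mixing idea applies componentwise to vector-valued iterations, which is what makes it attractive as a first-order acceleration wrapper.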

Max-Intersection Completeness of Neural Codes and the Neural Ideal

Series
Algebra Seminar
Time
Monday, January 22, 2024 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Alexander Ruys de Perez, Georgia Tech

Please Note: There will be a pre-seminar (aimed toward grad students and postdocs) from 11:30 am to noon in Skiles 005.

A neural code C on n neurons is a collection of subsets of {1,2,...,n} which is used to encode the intersections of subsets U_1, U_2,...,U_n of some topological space. The study of neural codes reveals the ways in which geometric or topological properties can be encoded combinatorially. A prominent example is the property of max-intersection completeness: if a code C contains every possible intersection of its maximal codewords, then one can always find a collection of open convex sets U_1, U_2,..., U_n for which C is the code. In this talk I will answer a question posed by Curto et al. (2018), which asks whether there is a way of determining max-intersection completeness from examination of the neural ideal, an algebraic counterpart to the neural code.
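As a concrete illustration of the combinatorial definition, here is a minimal sketch; the function and the toy codewords are hypothetical examples, not taken from the talk.

```python
from itertools import combinations

def is_max_intersection_complete(code):
    """Check whether a neural code (a collection of frozensets) contains
    every intersection of its maximal codewords."""
    # Maximal codewords: those not properly contained in another codeword.
    maximal = [c for c in code if not any(c < d for d in code)]
    for r in range(1, len(maximal) + 1):
        for subset in combinations(maximal, r):
            if frozenset.intersection(*subset) not in code:
                return False
    return True

# Toy codes on 3 neurons: C1 contains the intersection {1} of its maximal
# codewords {1,2} and {1,3}; C2 omits it.
C1 = {frozenset(), frozenset({1}), frozenset({1, 2}), frozenset({1, 3})}
C2 = {frozenset({1, 2}), frozenset({1, 3})}
```

Running the check returns True for C1 and False for C2, matching the definition above.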

Symmetry-Preserving Machine Learning: Theory and Applications

Series
Job Candidate Talk
Time
Thursday, January 18, 2024 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Wei Zhu, University of Massachusetts Amherst

Symmetry is prevalent in a variety of machine learning and scientific computing tasks, including computer vision and computational modeling of physical and engineering systems. Empirical studies have demonstrated that machine learning models designed to integrate the intrinsic symmetry of their tasks often exhibit substantially improved performance. Despite extensive theoretical and engineering advancements in symmetry-preserving machine learning, several critical questions remain unaddressed, presenting unique challenges and opportunities for applied mathematicians.

Firstly, real-world symmetries rarely manifest perfectly and are typically subject to various deformations. Therefore, a pivotal question arises: Can we effectively quantify and enhance the robustness of models to maintain an “approximate” symmetry, even under imperfect symmetry transformations? Secondly, although empirical evidence suggests that symmetry-preserving models require fewer training data to achieve equivalent accuracy, there is a need for more precise and rigorous quantification of this reduction in sample complexity attributable to symmetry preservation. Lastly, considering the non-convex nature of optimization in modern machine learning, can we ascertain whether algorithms like gradient descent can guide symmetry-preserving models to indeed converge to objectively better solutions compared to their generic counterparts, and if so, to what degree?

In this talk, I will provide an overview of my research addressing these intriguing questions. Surprisingly, the answers are not as straightforward as one might assume and, in some cases, are counterintuitive. My approach employs an interesting blend of applied probability, harmonic analysis, differential geometry, and optimization. However, specialized knowledge in these areas is not required. 

Hidden Convexity, Rotation Matrices, and Algebraic Topology

Series
Algebra Student Seminar
Time
Thursday, January 18, 2024 - 11:00 for 1 hour (actually 50 minutes)
Location
Clough 262
Speaker
Kevin Shu, Georgia Tech

This talk will describe connections between algebraic geometry, convex geometry, and algebraic topology. We will discuss linear projections of the special orthogonal group and when they are convex (in the sense that every pair of points in the image of the projection is connected by a line segment contained in the projection). In particular, I'll give a proof of the fact that the image of SO(n) under any linear map to R^2 is convex using some elementary homotopy theory. These kinds of questions are not only geometrically interesting but are also useful in solving some optimization problems involved in space travel.
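The objects in play can be explored numerically; the sketch below samples Haar-random rotations in SO(3) and projects them under one hypothetical linear map (chosen for illustration, not taken from the talk). By the convexity result discussed above, the resulting planar point cloud fills out a convex region.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_so3(rng):
    """Sample a Haar-uniform rotation in SO(3) via QR decomposition."""
    A = rng.standard_normal((3, 3))
    Q, R = np.linalg.qr(A)
    Q *= np.sign(np.diag(R))   # fix column signs so the draw is uniform
    if np.linalg.det(Q) < 0:   # reflect into SO(3) if we landed in O(3) \ SO(3)
        Q[:, 0] *= -1
    return Q

# Example linear map L: M -> (M[0,0], M[1,1]); its image of SO(3) is a
# convex subset of the square [-1, 1]^2.
pts = np.array([[M[0, 0], M[1, 1]]
                for M in (random_so3(rng) for _ in range(2000))])
```

Plotting `pts` (e.g. with matplotlib) gives a picture of the convex image; the sampling itself only certifies that every projected point lies in the square, not convexity, which is what the homotopy-theoretic proof supplies.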

Finite Generation of the Terms of the Johnson Filtration

Series
Geometry Topology Student Seminar
Time
Wednesday, January 17, 2024 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Dan Minahan, Georgia Tech

The Johnson filtration is a filtration of the mapping class group induced by the action of the mapping class group on the lower central series of the fundamental group of a surface.  A theorem of Johnson tells us that the first term of this filtration, called the Torelli group, is finitely generated for surfaces of genus at least 3.  We will explain work of Ershov–He and Church–Ershov–Putman, which uses Johnson's result to show that the kth term of the Johnson filtration is finitely generated for surfaces of genus g at least 2k - 1.  Time permitting, we will also discuss some extensions of these ideas.  In particular, we will explain how to show that the terms of the Johnson filtration are finitely presented assuming the Torelli group is finitely presented.
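In symbols (standard definitions, included for reference): writing $\pi = \pi_1(\Sigma_g)$ for the fundamental group of the surface, the lower central series and the Johnson filtration are

```latex
\gamma_1(\pi) = \pi, \qquad \gamma_{k+1}(\pi) = [\gamma_k(\pi), \pi],
\qquad
\mathcal{J}_k(\Sigma_g) = \ker\Big(\operatorname{Mod}(\Sigma_g) \longrightarrow \operatorname{Aut}\big(\pi/\gamma_{k+1}(\pi)\big)\Big),
```

so that $\mathcal{J}_1$, the kernel of the action on $H_1(\Sigma_g) \cong \pi/\gamma_2(\pi)$, is the Torelli group.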

Global Solutions For Systems of Quadratic Nonlinear Schrödinger Equations in 3D

Series
PDE Seminar
Time
Tuesday, January 16, 2024 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Boyang Su, University of Chicago

The existence of global solutions for the Schrödinger equation
$i\partial_t u + \Delta u = P_d(u)$,
with nonlinearity $P_d$ homogeneous of degree $d$, has been extensively studied. Most results focus on the case with gauge-invariant nonlinearity, where the solution satisfies several conservation laws. However, the problem becomes more complicated as we consider a general nonlinearity $P_d$. So far, global well-posedness for small data is known for $d$ strictly greater than the Strauss exponent. In dimension $3$, this Strauss exponent is $2$, making NLS with quadratic nonlinearity an interesting topic.
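For context, the Strauss exponent in dimension $n$ is commonly given by the formula below (stated here as background; check against the speaker's conventions), which indeed equals $2$ when $n = 3$:

```latex
\gamma(n) \;=\; \frac{n + 2 + \sqrt{\,n^2 + 12n + 4\,}}{2n},
\qquad
\gamma(3) \;=\; \frac{3 + 2 + 7}{6} \;=\; 2.
```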

In this talk, I will present a result that shows global existence and scattering for systems of quadratic NLS with small, localized data. To tackle the challenge presented by the $u\bar{u}$-type nonlinearity, we require an $\epsilon$ regularization for the terms of this type in the system.