Seminars and Colloquia by Series

Analyzing developmentally-mediated transitions in patterns of human sleep under homeostatic and circadian variation: A mathematical modeling approach

Series
Research Horizons Seminar
Time
Wednesday, September 28, 2022 - 12:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Christina Athanasouli (Georgia Institute of Technology)

Sleep and wake states are driven by interactions of neuronal populations in many areas of the human brain, such as the brainstem, midbrain, hypothalamus, and basal forebrain. The timing of human sleep is strongly modulated by the 24 h circadian rhythm and the homeostatic sleep drive, the need for sleep that depends on the history of prior awakening. The parameters dictating the evolution of the homeostatic sleep drive may vary with development or interindividual characteristics and have been identified as important parameters for generating the transition from multiple sleeps to a single sleep episode per day. Features of the mean firing rate of the neurons in the suprachiasmatic nucleus (SCN), the central pacemaker in humans, may differ with seasonality. In this talk, I will present our analysis of changes in sleep patterning under variation of homeostatic and circadian parameters using a mathematical model for human sleep-wake regulation. I will also talk about the fundamental tools we employ to understand the dynamics of the model, such as the construction of a circle map that captures the timing of sleep onsets on successive days. Analysis of the structure and bifurcations in the map reveals changes in the average number of sleep episodes per circadian day in a period-adding-like structure caused by the separate or combined effects of circadian and homeostatic variation. Time permitting, I will talk about some of our current work on modeling sleep patterns in early childhood using experimental data.
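
As a generic illustration of the circle-map idea (a standard sine circle map, not the sleep-wake model from the talk), one can iterate a one-dimensional map of onset phases and read off the average phase advance per event; mode-locked plateaus in this quantity are the analogue of the period-adding-like changes in the number of sleep episodes per circadian day. A minimal sketch in Python:

import numpy as np

def rotation_number(omega, K=0.9, n_transient=500, n_iter=2000):
    """Average phase advance per iterate of a standard-type circle map.

    Lift (not reduced mod 1): x_{n+1} = x_n + omega + (K / 2*pi) * sin(2*pi*x_n).
    In a sleep-onset map, the analogous quantity tracks how many circadian
    days elapse, on average, between successive sleep onsets.
    """
    x = 0.0
    for _ in range(n_transient):   # discard transient iterates
        x = x + omega + (K / (2 * np.pi)) * np.sin(2 * np.pi * x)
    x0 = x
    for _ in range(n_iter):
        x = x + omega + (K / (2 * np.pi)) * np.sin(2 * np.pi * x)
    return (x - x0) / n_iter

# Sweeping the parameter reveals mode-locked plateaus (a devil's staircase),
# the generic mechanism behind period-adding-like structures in circle maps.
for omega in np.linspace(0.3, 0.7, 9):
    print(f"omega = {omega:.3f}  rotation number ~ {rotation_number(omega):.4f}")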

Hardy spaces for Fourier integral operators

Series
PDE Seminar
Time
Tuesday, September 27, 2022 - 15:00 for 1 hour (actually 50 minutes)
Location
Online: https://gatech.zoom.us/j/95574359880?pwd=cGpCa3J1MFRkY0RUeU1xVFJRV0x3dz09
Speaker
Jan Rozendaal (IMPAN)

It is well known that the wave operators cos(t√(−∆)) and sin(t√(−∆)) are not bounded on L^p(R^n), for n ≥ 2 and 1 ≤ p ≤ ∞, unless p = 2 or t = 0. In fact, for 1 < p < ∞ these operators are bounded from W^{2s(p),p}(R^n) to L^p(R^n) for s(p) := ((n−1)/2)|1/p − 1/2|, and this exponent cannot be improved. This phenomenon is symptomatic of the behavior of Fourier integral operators, a class of oscillatory operators which includes wave propagators, on L^p(R^n).
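
For a quick numerical instance of the exponent (not taken from the abstract): for n = 3 and p = 4, one gets s(p) = ((3−1)/2)·|1/4 − 1/2| = 1/4, so 2s(p) = 1/2 and the fixed-time estimate maps W^{1/2,4}(R^3) into L^4(R^3), i.e. half a derivative is lost.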

In this talk, I will introduce a class of Hardy spaces H^p_FIO(R^n), for p ∈ [1, ∞], on which Fourier integral operators of order zero are bounded. These spaces also satisfy Sobolev embeddings which allow one to recover the optimal boundedness results for Fourier integral operators on L^p(R^n).

However, beyond merely recovering existing results, the invariance of these spaces under Fourier integral operators allows for iterative constructions that are not possible when working directly on L^p(R^n). In particular, we shall indicate how one can use this invariance to obtain the optimal fixed-time L^p regularity for wave equations with rough coefficients. We shall also mention the connection of these spaces to the phenomenon of local smoothing.

This talk is based on joint work with Andrew Hassell and Pierre Portal (Australian National University), and Zhijie Fan, Naijia Liu and Liang Song (Sun Yat-Sen University).

The stable cohomology of the level-l subgroup of the mapping class group (Joint Topology Seminar @ UGA)

Series
Geometry Topology Seminar
Time
Monday, September 26, 2022 - 16:30 for 1 hour (actually 50 minutes)
Location
University of Georgia (Boyd 322)
Speaker
Andrew Putman (Notre Dame)

After an introduction to how to think about the mapping class group and its cohomology, I will discuss a recent theorem of mine saying that passing to the level-l subgroup does not change the rational cohomology in a stable range.

Obstructions to reversing Lagrangian surgery (Joint Topology Seminar @ UGA)

Series
Geometry Topology Seminar
Time
Monday, September 26, 2022 - 15:00 for 1 hour (actually 50 minutes)
Location
University of Georgia (Boyd 322)
Speaker
Orsola Capovilla-Searle (UC Davis)

Given an immersed, Maslov-0, exact Lagrangian filling of a Legendrian knot, if the filling has a double point with vanishing index and action, then through Lagrangian surgery it is possible to obtain a new immersed, Maslov-0, exact Lagrangian filling with one less double point and with genus increased by one. We show that it is not always possible to reverse the Lagrangian surgery: not every immersed, Maslov-0, exact Lagrangian filling with genus g ≥ 1 and p double points can be obtained from such a Lagrangian surgery on a filling of genus g − 1 with p+1 double points. To show this, we establish the connection between the existence of an immersed, Maslov-0, exact Lagrangian filling of a Legendrian Λ that has p double points with action 0 and the existence of an embedded, Maslov-0, exact Lagrangian cobordism from p copies of a Hopf link to Λ. We then prove that a count of augmentations provides an obstruction to the existence of embedded, Maslov-0, exact Lagrangian cobordisms between Legendrian links. Joint work with Noemie Legout, Maylis Limouzineau, Emmy Murphy, Yu Pan and Lisa Traynor.

Solving decomposable sparse polynomial systems

Series
Algebra Seminar
Time
Monday, September 26, 2022 - 13:30 for 1 hour (actually 50 minutes)
Location
Clough 125 Classroom
Speaker
Thomas Yahl (TAMU)

Polynomial systems can be effectively solved by exploiting structure present in their Galois group. Esterov determined two conditions for which the Galois group of a sparse polynomial system is imprimitive, and showed that the Galois group is the symmetric group otherwise. A system with an imprimitive Galois group can be decomposed into simpler systems, which themselves may be further decomposed. Esterov's conditions give a stopping criterion for decomposing these systems and lead to a recursive algorithm for efficient solving.
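
As a toy illustration of decomposability (a univariate example, not Esterov's general criterion or the speaker's algorithm): a lacunary polynomial whose exponents lie in the sublattice 2Z has an imprimitive Galois group, and the solve splits into two simpler stages, first in the substituted variable and then by extracting square roots.

import numpy as np

# f(x) = x^4 - 5x^2 + 6 has support in 2Z, so it decomposes.
f = [1, 0, -5, 0, 6]

# Stage 1: solve g(y) = y^2 - 5y + 6 where y = x^2.
g = [1, -5, 6]
y_roots = np.roots(g)                              # roots 3 and 2

# Stage 2: for each y, solve the binomial equation x^2 = y.
x_roots = np.concatenate([np.roots([1, 0, -y]) for y in y_roots])

print(np.sort_complex(x_roots))                    # ±sqrt(2), ±sqrt(3)
print(np.sort_complex(np.roots(f)))                # same roots, solved directly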

Automated computation of slow invariant manifolds of large-scale mechanical systems.

Series
CDSNS Colloquium
Time
Friday, September 23, 2022 - 15:30 for 1 hour (actually 50 minutes)
Location
Online via Zoom
Speaker
Alessandra Vizzaccaro (University of Bristol)

Please Note: Zoom link: https://us06web.zoom.us/j/83392531099?pwd=UHh2MDFMcGErbzFtMHBZTmNZQXM0dz09

Abstract: In the field of structural dynamics, engineers heavily rely on high-fidelity models of the structure at hand to predict its dynamic response and identify potential threats to its integrity.

The structure under investigation, be it an aircraft wing or a MEMS device, is typically discretised with finite elements, giving rise to a very large system of nonlinear ODEs. Due to the high dimensionality, the solution of such systems is very expensive in terms of computational time. For this reason, a large amount of literature in this field is devoted to the development of reduced order models of much lower dimensionality that are able to accurately reproduce the original system’s dynamics. Not only does the lower dimensionality increase the computational speed, but it also provides engineers with interpretable and manageable models of complex systems, which can be easily coupled with data and uncertainty quantification, and whose parameter space can be easily explored. Slow invariant manifolds prove to be the perfect candidate for dimensionality reduction; however, their computation for large-scale systems has only been proposed in recent years (see Gonzalez et al. (2019), Haller et al. (2020), AV et al. (2019)).

In this talk, the Direct Parametrisation of Invariant Manifolds method (DPIM) will be presented. The theoretical basis of the method is provided by the results of Cabré, Fontich and de la Llave, and its algorithmic implementation relies on the parametrisation method for invariant manifolds proposed by Haro et al. The idea is to parametrise the invariant manifold around a fixed point through a power series expansion which can be solved recursively for each monomial in the reduced coordinates. The main limitation of the original algorithm is the necessity to operate in diagonal representation, which is infeasible for large finite element systems as it would require the computation of the whole eigenspectrum. The main novelty of the proposed method lies in the expression of the normal homological equation directly in physical coordinates, which is the key aspect that permits its application to large-scale systems.
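
To make the recursive, monomial-by-monomial structure of the homological equation concrete, here is a minimal sketch (not the DPIM implementation; the two-dimensional toy system is invented for illustration) that computes the leading coefficients of a slow invariant manifold with sympy:

import sympy as sp

x = sp.symbols('x')
lam_slow, lam_fast = -1, -10             # well-separated slow and fast eigenvalues
order = 4

# Toy system: xdot = -x,  ydot = -10*y + x^2.
# Parametrise the slow manifold as y = W(x) = sum_k w_k x^k, k >= 2.
w = sp.symbols(f'w2:{order + 1}')        # coefficients w2, w3, w4
W = sum(wk * x**k for k, wk in zip(range(2, order + 1), w))

f_x = lam_slow * x                       # reduced (slow) dynamics
f_y = lam_fast * W + x**2                # fast equation restricted to y = W(x)

# Invariance (homological) equation: W'(x) * xdot = ydot on the manifold.
residual = sp.expand(sp.diff(W, x) * f_x - f_y)

# Solve monomial by monomial, as in a parametrisation-method recursion.
eqs = [residual.coeff(x, k) for k in range(2, order + 1)]
sol = sp.solve(eqs, w)
print(sp.expand(W.subs(sol)))            # -> x**2/8

The same order-by-order solve underlies the parametrisation method described above; the method presented in the talk performs it directly in physical coordinates, so that large finite element models never need a full eigendecomposition.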

The talk will focus on problems in structural dynamics in both autonomous and nonautonomous settings. The accuracy of the reduction will be shown on several examples, covering phenomena like internal resonances and parametric resonances. Finally, the current limitations and future developments of the method will be discussed.

 

Embeddings of lens spaces and rational homology balls in complex projective space

Series
Geometry Topology Working Seminar
Time
Friday, September 23, 2022 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Agniva Roy (Georgia Tech)

Given a symplectic 4-manifold and a contact 3-manifold, it is natural to ask whether the latter embeds in the former as a contact-type hypersurface. We explore this question for CP^2 and lens spaces. In this talk, we will consider the background necessary for an approach to this problem. Specifically, we will survey some essential notions and terminology related to low-dimensional contact and symplectic topology. These will involve Dehn surgery, tightness, overtwistedness, concave and convex symplectic fillings, and open book decompositions. We will also look at some results about these and mention some research trends.

Determinant Maximization via Matroid Intersection Algorithms

Series
ACO Student Seminar
Time
Friday, September 23, 2022 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Aditi Laddha

The determinant maximization problem gives a general framework that models problems arising in fields as diverse as statistics, convex geometry, fair allocations, combinatorics, spectral graph theory, network design, and random processes. In an instance of the determinant maximization problem, we are given a collection of vectors $U = \{v_1, \ldots, v_n\}$ in $d$ dimensions, and the goal is to pick a subset $S$ of the given vectors to maximize the determinant of the matrix $\sum_{i \in S} v_i v_i^T$. Often, the set $S$ of picked vectors must satisfy additional combinatorial constraints, such as a cardinality constraint ($|S| \leq k$) or a matroid constraint ($S$ is a basis of a matroid defined on the vectors). In this talk, we give a polynomial-time deterministic algorithm that returns an $r^{O(r)}$-approximation for any matroid of rank $r \leq d$. Our algorithm builds on combinatorial algorithms for matroid intersection, which iteratively improve any solution by finding an alternating negative cycle in the exchange graph defined by the matroids. While the determinant function is not linear, we show that taking appropriate linear approximations at each iteration suffices to give the improved approximation algorithm.
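
As a toy illustration of exchange-based improvement (not the matroid-intersection algorithm from the talk, which handles general matroid constraints via alternating negative cycles in the exchange graph), here is a sketch of a single-swap local search for the cardinality-constrained case:

import numpy as np

def local_search_detmax(V, k, seed=0):
    """Toy single-swap local search for max det(sum_{i in S} v_i v_i^T), |S| = k.

    Only illustrates the exchange-based improvement idea; the talk's algorithm
    and its r^O(r) guarantee are not reproduced here.
    """
    n, d = V.shape
    rng = np.random.default_rng(seed)
    S = set(rng.choice(n, size=k, replace=False).tolist())

    def det_of(subset):
        return np.linalg.det(sum(np.outer(V[i], V[i]) for i in subset))

    improved = True
    while improved:
        improved = False
        for i in list(S):
            for j in set(range(n)) - S:
                T = (S - {i}) | {j}
                if det_of(T) > det_of(S) + 1e-12:   # accept any improving swap
                    S, improved = T, True
                    break
            if improved:
                break
    return sorted(S), det_of(S)

V = np.random.default_rng(1).normal(size=(8, 3))    # 8 random vectors in R^3
print(local_search_detmax(V, k=4))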

 

This talk is based on joint work with Adam Brown, Madhusudhan Pittu, Mohit Singh, and Prasad Tetali.

Efficient parameterization of invariant manifolds using deep neural networks

Series
Time
Friday, September 23, 2022 - 11:00 for 1 hour (actually 50 minutes)
Location
Online
Speaker
Shane Kepley (VU)

https://gatech.zoom.us/j/95197085752?pwd=WmtJUVdvM1l6aUJBbHNJWTVKcVdmdz09

Spectral methods are the gold standard for parameterizing manifolds of solutions for ODEs because of their high precision and amenability to computer assisted proofs. However, these methods suffer from several drawbacks. In particular, the parameterizations are costly to compute and time-stepping is far more complicated than other methods. In this talk we demonstrate how computing these parameterizations and accurately time-stepping can be reduced to a related manifold learning problem. The latter problem is solved by training a deep neural network to interpolate charts for a low dimensional manifold embedded in a high dimensional Euclidean space. This training is highly parallelizable and need only be performed once. Once the neural network is trained, it is capable of parameterizing invariant manifolds for the ODE and time-stepping with remarkable efficiency and precision.
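
As a rough sketch of the chart-learning step (not the speaker's implementation; the architecture and data below are invented for illustration and assume PyTorch is available), one can train a small autoencoder whose latent variables serve as a chart for a low-dimensional manifold sampled in a higher-dimensional space:

import torch
import torch.nn as nn

# Data: a closed curve embedded in R^3, standing in for a low-dimensional
# invariant manifold sampled in a high-dimensional Euclidean space.
theta = torch.rand(2000, 1) * 2 * torch.pi
X = torch.cat([torch.cos(theta), torch.sin(theta), 0.5 * torch.sin(2 * theta)], dim=1)

# Autoencoder: the latent layer plays the role of a learned chart
# (2D rather than 1D so the chart can respect the curve's topology).
encoder = nn.Sequential(nn.Linear(3, 64), nn.Tanh(), nn.Linear(64, 2))
decoder = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 3))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

# Training is embarrassingly parallel over samples and only done once.
for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(decoder(encoder(X)), X)
    loss.backward()
    opt.step()

print(f"reconstruction error: {loss.item():.2e}")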

Sparse Quadratic Programs via Polynomial Roots

Series
Algebra Student Seminar
Time
Friday, September 23, 2022 - 10:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Kevin Shu (Georgia Institute of Technology)

We'll talk about problems of optimizing a quadratic function subject to quadratic constraints, in addition to a sparsity constraint that requires that solutions have only a few nonzero entries. Such problems include sparse versions of linear regression and principal component analysis. We'll see that this problem can be formulated as a convex conical optimization problem over a sparse version of the positive semidefinite cone, and then see how we can approximate such problems using ideas arising from the study of hyperbolic polynomials. We'll also describe a fast algorithm for such problems, which performs well in practical situations.
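
For a concrete instance of the underlying problem (not the conic relaxation or the fast algorithm from the talk), sparse PCA with sparsity level k asks to maximize x^T A x over unit vectors with at most k nonzero entries; for tiny instances one can brute-force over supports:

import itertools
import numpy as np

def sparse_pca_bruteforce(A, k):
    """max x^T A x  s.t.  ||x|| = 1 and x has at most k nonzero entries.

    Brute force over supports (only viable for small n): for a fixed support,
    the optimum is the top eigenpair of the corresponding principal submatrix.
    """
    n = A.shape[0]
    best_val, best_x = -np.inf, None
    for support in itertools.combinations(range(n), k):
        idx = list(support)
        vals, vecs = np.linalg.eigh(A[np.ix_(idx, idx)])
        if vals[-1] > best_val:
            best_val = vals[-1]
            best_x = np.zeros(n)
            best_x[idx] = vecs[:, -1]
    return best_val, best_x

rng = np.random.default_rng(0)
B = rng.normal(size=(6, 6))
A = B @ B.T                        # a random positive semidefinite matrix
print(sparse_pca_bruteforce(A, k=2))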
