Seminars and Colloquia by Series

Exotic 7-sphere

Series
Geometry Topology Student Seminar
Time
Wednesday, April 4, 2018 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Hongyi Zhou (Hugo), GaTech
An exotic sphere is a smooth manifold that is homeomorphic, but not diffeomorphic, to the standard sphere. The simplest known examples occur in dimension 7. I will recapitulate Milnor’s construction of exotic 7-spheres: first construct a candidate bundle M_{h,l}, then show that this manifold is a topological sphere when h+l=-1. There is an 8-dimensional bundle with M_{h,l} as its boundary; if we glue an 8-disc to it to obtain a manifold without boundary, the result should possess a natural differentiable structure. The failure of this construction shows that M_{h,l} cannot be diffeomorphic to the standard 7-sphere. The main tools are Morse theory and characteristic classes.
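The obstruction at the heart of the argument can be summarized in one congruence (my own sketch, not from the abstract; sign and orientation conventions vary across sources):

```latex
% Sketch of Milnor's obstruction (from memory; conventions vary).
% Write k = h - l for the bundle M_{h,l} with h + l = \pm 1.
% Milnor's invariant, computed from the signature and p_1 of an
% 8-dimensional coboundary W, satisfies
\lambda(M_{h,l}) \;\equiv\; k^2 - 1 \pmod 7 .
% Since \lambda vanishes for the standard S^7, any k with
% k^2 \not\equiv 1 \pmod 7 (e.g. k = 3) yields an exotic 7-sphere.
```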

Truncated Heegaard Floer homology and concordance invariants

Series
Geometry Topology Seminar
Time
Monday, April 2, 2018 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Linh Truong, Columbia University
Heegaard Floer homology has proven to be a useful tool in the study of knot concordance. Ozsvath and Szabo first constructed the tau invariant using the hat version of Heegaard Floer homology and showed it provides a lower bound on the slice genus. Later, Hom and Wu constructed a concordance invariant using the plus version of Heegaard Floer homology; this provides an even better lower bound on the slice genus. In this talk, I discuss a sequence of concordance invariants derived from the truncated version of Heegaard Floer homology. These truncated Floer concordance invariants generalize the Ozsvath-Szabo and Hom-Wu invariants.

Compute Faster and Learn Better: Machine Learning via Nonconvex Optimization

Series
Applied and Computational Mathematics Seminar
Time
Monday, April 2, 2018 - 13:55 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Tuo Zhao, Georgia Institute of Technology
Nonconvex optimization naturally arises in many machine learning problems. Machine learning researchers exploit various nonconvex formulations to gain modeling flexibility, estimation robustness, adaptivity, and computational scalability. Although classical computational complexity theory has shown that solving nonconvex optimization problems is NP-hard in the worst case, practitioners have proposed numerous heuristic optimization algorithms that achieve outstanding empirical performance in real-world applications.

To bridge this gap between practice and theory, we propose a new generation of model-based optimization algorithms and theory, which incorporate statistical thinking into modern optimization. Specifically, when designing practical computational algorithms, we take the underlying statistical models into consideration. Our novel algorithms exploit hidden geometric structures behind many nonconvex optimization problems, and can obtain global optima with the desired statistical properties in polynomial time with high probability.
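The gap between worst-case hardness and empirical success can be seen on a toy problem (a minimal sketch of my own, not the speaker's method): rank-1 matrix factorization is nonconvex, yet plain gradient descent from a random start routinely finds a global optimum.

```python
# Hedged sketch (not the speaker's code): gradient descent on the
# nonconvex rank-1 factorization objective f(u, v) = ||M - u v^T||_F^2 / 2.
# Despite nonconvexity, random initialization typically reaches a global
# optimum for such statistically benign problems.
import numpy as np

rng = np.random.default_rng(0)
n = 20
u_true = rng.standard_normal(n)
v_true = rng.standard_normal(n)
M = np.outer(u_true, v_true)          # rank-1 target matrix

u = 0.1 * rng.standard_normal(n)      # small random initialization
v = 0.1 * rng.standard_normal(n)
lr = 0.01
for _ in range(2000):
    R = np.outer(u, v) - M            # residual matrix
    u, v = u - lr * R @ v, v - lr * R.T @ u   # simultaneous gradient step

print(np.linalg.norm(np.outer(u, v) - M))     # near zero: global optimum
```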

Occupation times

Series
CDSNS Colloquium
Time
Monday, April 2, 2018 - 11:15 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Manfred Heinz Denker, Penn State University
Consider a probability measure $m$ preserved by a dynamical system $T:X\to X$. The occupation time of a measurable function $f$ is the sequence $\ell_n(A,x)$ ($A\subset \mathbb R$, $x\in X$) defined as the number of $j\le n$ for which the partial sums $S_jf(x)\in A$. The talk will discuss conditions which ensure that this sequence, properly normed, converges weakly to some limit distribution. It turns out that this distribution is Mittag-Leffler; in particular, the result covers the case when $f\circ T^j$ is a fractional Gaussian noise with Hurst parameter $>3/4$.
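Written out, the standard definitions behind the abstract (my reading; normalizations vary) are:

```latex
S_j f(x) \;=\; \sum_{i=0}^{j-1} f(T^i x), \qquad
\ell_n(A, x) \;=\; \#\bigl\{\, 1 \le j \le n \;:\; S_j f(x) \in A \,\bigr\},
```

and the theorem asserts that $\ell_n(A,\cdot)/a_n$, for suitable norming constants $a_n$, converges in distribution to a Mittag-Leffler law.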

Unsupervised discovery of ensemble dynamics in the brain using deep learning techniques

Series
GT-MAP Seminar
Time
Friday, March 30, 2018 - 15:00 for 2 hours
Location
Skiles 006
Speaker
Chethan Pandarinath, GT BME
Since its inception, neuroscience has largely focused on the neuron as the functional unit of the nervous system. However, recent evidence demonstrates that populations of neurons within a brain area collectively show emergent functional properties ("dynamics"), properties that are not apparent at the level of individual neurons. These emergent dynamics likely serve as the brain’s fundamental computational mechanism. This shift compels neuroscientists to characterize emergent properties, that is, interactions between neurons, to understand computation in brain networks. Yet this introduces a daunting challenge: with millions of neurons in any given brain area, characterizing interactions within an area, and further, between brain areas, rapidly becomes intractable.

I will demonstrate a novel unsupervised tool, Latent Factor Analysis via Dynamical Systems ("LFADS"), that can accurately and succinctly capture the emergent dynamics of large neural populations from limited sampling. LFADS is built around deep learning architectures (variational sequential auto-encoders) and models an observed neural population's dynamics using a nonlinear dynamical system (a recurrent neural network). When applied to neuronal ensemble recordings (~200 neurons) from macaque primary motor cortex (M1), we find that modeling population dynamics yields accurate estimates of the state of M1, as well as accurate predictions of the animal's motor behavior, on millisecond timescales. I will also demonstrate how our approach allows us to infer perturbations to the dynamical system (i.e., unobserved inputs to the neural population), and to leverage population recordings across long timescales (months) to build more accurate models of M1's dynamics.

This approach demonstrates the power of deep learning tools to model nonlinear dynamical systems and infer accurate estimates of the states of large biological networks.
In addition, we will discuss future directions, where we aim to pry open the "black box" of the trained recurrent neural networks in order to understand the computations being performed by the modeled neural populations. Pre-print available: lfads.github.io
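The core premise, that high-dimensional population activity driven by low-dimensional latent dynamics is recoverable from ensemble recordings, can be illustrated with a toy stand-in (my own sketch; LFADS itself is a sequential variational auto-encoder, not PCA):

```python
# Hedged toy sketch (not LFADS): simulate 200 neurons whose firing rates
# are driven by a 2-dimensional latent oscillation, then show that the
# latent subspace dominates the population variance, so a low-dimensional
# model captures the ensemble dynamics.
import numpy as np

rng = np.random.default_rng(1)
T, n_neurons, n_latent = 500, 200, 2
t = np.linspace(0.0, 10.0, T)
latents = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)], axis=1)
readout = rng.standard_normal((n_latent, n_neurons))   # latent-to-neuron map
rates = latents @ readout + 0.1 * rng.standard_normal((T, n_neurons))

# Top principal components of the population recover the latent plane.
X = rates - rates.mean(axis=0)
_, s, _ = np.linalg.svd(X, full_matrices=False)
var_explained = (s[:n_latent] ** 2).sum() / (s ** 2).sum()
print(f"variance captured by 2 components: {var_explained:.3f}")
```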

Large deviation estimates for ergodic Schr\"odinger cocycles

Series
Math Physics Seminar
Time
Friday, March 30, 2018 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 202
Speaker
Rui Han, Institute for Advanced Study
This talk will focus on large deviation theory (LDT) for Schr\"odinger cocycles over a quasi-periodic or skew-shift base. We will discuss its connection to positivity and regularity of the Lyapunov exponent, as well as to localization, and also mention some open problems for the skew-shift model.

Some Corollaries about regularity of Stanley-Reisner ideals

Series
Student Algebraic Geometry Seminar
Time
Friday, March 30, 2018 - 10:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Jaewoo Jung, Georgia Tech
One way to analyze a module is to consider its minimal free resolution and look at its Betti numbers. In general, computing a minimal free resolution is not easy, but for certain modules the Betti numbers can be computed relatively easily via Hochster's formula, in terms of the associated simplicial complex. Separately, Mumford introduced the Castelnuovo-Mumford regularity. The regularity controls when the Hilbert function of a variety becomes a polynomial (in other words, it measures how far the module is from being "regular"). The regularity can be defined in terms of Betti numbers, and for certain ideals its properties can be read off from the associated simplicial complex and its homology. In this talk, I will review Stanley-Reisner ideals, (graded) Betti numbers, and Hochster's formula. I will also introduce the Castelnuovo-Mumford regularity in terms of Betti numbers, and then discuss a useful technique for analyzing the Betti table (using Hochster's formula and the Mayer-Vietoris sequence).
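For orientation, the form of Hochster's formula the talk relies on reads as follows (stated from memory, up to indexing conventions, which vary between sources):

```latex
% Hochster's formula for the graded Betti numbers of the
% Stanley-Reisner ring k[\Delta] (indexing conventions vary):
\beta_{i,\sigma}\bigl(k[\Delta]\bigr)
  \;=\; \dim_k \widetilde{H}^{\,|\sigma| - i - 1}\bigl(\Delta|_{\sigma};\, k\bigr),
  \qquad \sigma \subseteq V(\Delta),
% where \Delta|_{\sigma} is the subcomplex induced on the vertices \sigma.
% The regularity is then read off from the Betti table as
\operatorname{reg} k[\Delta] \;=\; \max\{\, j - i \;:\; \beta_{i,j} \neq 0 \,\}.
```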

On large multipartite subgraphs of H-free graphs

Series
Combinatorics Seminar
Time
Thursday, March 29, 2018 - 13:30 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Jan Volec, McGill
A long-standing conjecture of Erdős states that any n-vertex triangle-free graph can be made bipartite by deleting at most n^2/25 edges. In this talk, we study how many edges need to be removed from an H-free graph for a general graph H. By generalizing a result of Sudakov for 4-colorable graphs H, we show that if H is 6-colorable, then G can be made bipartite by deleting at most 4n^2/25+O(n) edges. In the case H=K_6, we actually prove the exact bound 4n^2/25 and show that this amount is needed only when G is a complete 5-partite graph with balanced parts. As one of the steps in the proof, we use a strengthening of a result of Füredi on a stable version of Turán's theorem. This is joint work with P. Hu, B. Lidický, T. Martins-Lopez and S. Norin.
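The extremal count in the K_6 case can be checked by a short computation (my own illustration, assuming five parts of equal size n/5): making a complete 5-partite graph bipartite amounts to merging the parts into two groups and deleting every edge inside each group, and the cheapest split deletes exactly 4n^2/25 edges.

```python
# Hedged illustration (not from the talk): for the complete balanced
# 5-partite graph on n vertices, merging the 5 parts into groups of
# sizes r and 5-r forces deletion of all edges between parts that land
# in the same group; minimizing over r recovers the 4n^2/25 bound.
from math import comb

n = 100                                    # any multiple of 5
p = n // 5                                 # common part size

def deletion_cost(r):
    # pairs of parts inside each group, times p*p edges per pair
    return (comb(r, 2) + comb(5 - r, 2)) * p * p

best = min(deletion_cost(r) for r in range(6))
print(best, 4 * n * n // 25)               # both equal 1600 for n = 100
```

The minimum is attained at the 2-vs-3 split, matching the abstract's claim that balanced complete 5-partite graphs are the unique tight case.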
