Seminars and Colloquia by Series

Friday, April 6, 2018 - 10:00 , Location: Skiles 006 , Jaewoo Jung , Georgia Tech , Organizer: Kisun Lee
H. Dao, C. Huneke, and J. Schweig provided a bound on the regularity of edge ideals in their paper “Bounds on the regularity and projective dimension of ideals associated to graphs”. In this talk, we briefly introduce their result and discuss a bound on the regularity of Stanley-Reisner ideals obtained by a similar approach.
Thursday, April 5, 2018 - 15:05 , Location: Skiles 006 , Philippe Rigollet , MIT , Organizer: Mayya Zhilova
How should one estimate a signal, given only access to noisy versions of the signal corrupted by unknown cyclic shifts? This simple problem has surprisingly broad applications, in fields from aircraft radar imaging to structural biology, with the ultimate goal of understanding the sample complexity of Cryo-EM. We describe how this model can be viewed as a multivariate Gaussian mixture model whose centers belong to an orbit of a group of orthogonal transformations. This enables us to derive matching lower and upper bounds for the optimal rate of statistical estimation for the underlying signal. These bounds show a striking dependence on the signal-to-noise ratio of the problem. We also show how a tensor-based method of moments can solve the problem efficiently. Based on joint work with Afonso Bandeira (NYU), Amelia Perry (MIT), Amit Singer (Princeton) and Jonathan Weed (MIT).
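The observation model above (a fixed signal hit by a random cyclic shift plus Gaussian noise) is easy to simulate. The sketch below is a toy illustration, not the speakers' estimator: it uses the fact that the power spectrum |FFT|^2 is invariant under cyclic shifts, so averaging it over observations and subtracting the known noise bias recovers a shift-invariant statistic of the signal. The signal, dimensions, and noise level are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, sigma = 32, 20000, 0.5               # signal length, samples, noise level
x = np.sin(2 * np.pi * np.arange(d) / d)   # a toy ground-truth signal

# Observations: random cyclic shift of x plus i.i.d. Gaussian noise.
shifts = rng.integers(0, d, size=n)
obs = np.stack([np.roll(x, s) for s in shifts]) + sigma * rng.normal(size=(n, d))

# |FFT|^2 is invariant under cyclic shifts, so its average over observations
# concentrates around the true power spectrum plus the noise bias d * sigma^2.
ps_est = np.mean(np.abs(np.fft.fft(obs, axis=1)) ** 2, axis=0) - d * sigma**2
ps_true = np.abs(np.fft.fft(x)) ** 2
print(np.max(np.abs(ps_est - ps_true)))    # small for large n
```

The power spectrum alone does not determine the signal (Fourier phases are lost); recovering those is where higher-order, tensor-valued moments such as the bispectrum enter, in the spirit of the method of moments mentioned in the abstract.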
Wednesday, April 4, 2018 - 14:00 , Location: Skiles 006 , Hongyi Zhou (Hugo) , GaTech , Organizer: Anubhav Mukherjee
An exotic sphere is a smooth manifold that is homeomorphic, but not diffeomorphic, to the standard sphere. The simplest known example occurs in dimension 7. I will recapitulate Milnor’s construction of exotic 7-spheres: first construct a candidate bundle M_{h,l}, then show that this manifold is a topological sphere when h+l=-1. There is an 8-dimensional bundle with boundary M_{h,l}; if we could glue an 8-disc to it to obtain a manifold without boundary, the result would possess a natural differentiable structure. The failure of this gluing shows that M_{h,l} cannot be mapped diffeomorphically to the standard 7-sphere. The main tools used are Morse theory and characteristic classes.
Monday, April 2, 2018 - 14:00 , Location: Skiles 006 , Linh Truong , Columbia University , Organizer: Jennifer Hom
Heegaard Floer homology has proven to be a useful tool in the study of knot concordance. Ozsvath and Szabo first constructed the tau invariant using the hat version of Heegaard Floer homology and showed that it provides a lower bound on the slice genus. Later, Hom and Wu constructed a concordance invariant using the plus version of Heegaard Floer homology; it provides an even better lower bound on the slice genus. In this talk, I will discuss a sequence of concordance invariants derived from the truncated version of Heegaard Floer homology. These truncated Floer concordance invariants generalize the Ozsvath-Szabo and Hom-Wu invariants.
Monday, April 2, 2018 - 13:55 , Location: Skiles 005 , Tuo Zhao , Georgia Institute of Technology , Organizer: Wenjing Liao
Nonconvex optimization naturally arises in many machine learning problems. Machine learning researchers exploit various nonconvex formulations to gain modeling flexibility, estimation robustness, adaptivity, and computational scalability. Although classical computational complexity theory has shown that solving nonconvex optimization problems is NP-hard in the worst case, practitioners have proposed numerous heuristic optimization algorithms that achieve outstanding empirical performance in real-world applications.

To bridge this gap between practice and theory, we propose a new generation of model-based optimization algorithms and theory, which incorporate statistical thinking into modern optimization. Specifically, when designing practical computational algorithms, we take the underlying statistical models into consideration. Our novel algorithms exploit hidden geometric structures behind many nonconvex optimization problems, and can obtain global optima with the desired statistical properties in polynomial time with high probability.
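A textbook instance of this phenomenon (a generic example, not the speaker's algorithms) is recovering a planted unit vector u from M = uuᵀ by plain gradient descent on the nonconvex objective f(x) = ¼‖xxᵀ − M‖_F². The statistical structure of the problem makes the landscape benign: the only local minimizers are ±u, so a simple first-order method finds a global optimum.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 10
u = rng.normal(size=d)
u /= np.linalg.norm(u)
M = np.outer(u, u)                 # planted rank-one "signal" matrix

# Nonconvex objective f(x) = 0.25 * ||x x^T - M||_F^2,
# with gradient grad f(x) = (x x^T - M) @ x.
x = rng.normal(size=d) * 0.1       # small random initialization
step = 0.2
for _ in range(500):
    x -= step * (np.outer(x, x) - M) @ x

# The global optima are exactly +u and -u.
err = min(np.linalg.norm(x - u), np.linalg.norm(x + u))
print(err)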
Monday, April 2, 2018 - 11:15 , Location: skiles 005 , Manfred Heinz Denker , Penn State University , Organizer: Livia Corsi
Consider a $T$-preserving probability measure $m$ on a  dynamical system $T:X\to X$. The occupation time of a measurable function is the sequence $\ell_n(A,x)$   ($A\subset \mathbb R, x\in X$) defined as the number of $j\le n$ for which the partial sums $S_jf(x)\in A$.  The talk will discuss conditions which ensure that this sequence, properly normed, converges weakly to some limit distribution. It turns out that this distribution is Mittag-Leffler and in particular the result covers the case when $f\circ T^j$ is a fractal Gaussian noise of Hurst parameter $>3/4$.
Friday, March 30, 2018 - 15:00 , Location: Skiles 202 , Rui Han , IAS , Organizer: Michael Loss
Friday, March 30, 2018 - 15:00 , Location: Skiles 006 , Chethan Pandarinath , GT BME , Organizer: Sung Ha Kang
Since its inception, neuroscience has largely focused on the neuron as the functional unit of the nervous system. However, recent evidence demonstrates that populations of neurons within a brain area collectively show emergent functional properties ("dynamics"), properties that are not apparent at the level of individual neurons. These emergent dynamics likely serve as the brain’s fundamental computational mechanism. This shift compels neuroscientists to characterize emergent properties – that is, interactions between neurons – to understand computation in brain networks. Yet this introduces a daunting challenge – with millions of neurons in any given brain area, characterizing interactions within an area, and further, between brain areas, rapidly becomes intractable.I will demonstrate a novel unsupervised tool, Latent Factor Analysis via Dynamical Systems ("LFADS"), that can accurately and succinctly capture the emergent dynamics of large neural populations from limited sampling. LFADS is based around deep learning architectures (variational sequential auto-encoders), and builds a model of an observed neural population's dynamics using a nonlinear dynamical system (a recurrent neural network). When applied to neuronal ensemble recordings (~200 neurons) from macaque primary motor cortex (M1), we find that modeling population dynamics yields accurate estimates of the state of M1, as well as accurate predictions of the animal's motor behavior, on millisecond timescales. I will also demonstrate how our approach allows us to infer perturbations to the dynamical system (i.e., unobserved inputs to the neural population), and further allows us to leverage population recordings across long timescales (months) to build more accurate models of M1's dynamics.This approach demonstrates the power of deep learning tools to model nonlinear dynamical systems and infer accurate estimates of the states of large biological networks. 
In addition, we will discuss future directions, where we aim to pry open the "black box" of the trained recurrent neural networks, in order to understand the computations being performed by the modeled neural populations.pre-print available: [] 
Friday, March 30, 2018 - 15:00 , Location: Skiles 202 , Rui Han , Institute for Advanced Study , Organizer: Michael Loss
This talk will be focused on the large deviation theory (LDT) for Schr\"odinger cocycles over a quasi-periodic or skew-shift base. We will also talk about its connection to positivity and regularity of the Lyapunov exponent, as well as localization. We will also discuss some open problems of the skew-shift model.
Friday, March 30, 2018 - 14:00 , Location: Skiles 006 , Sudipta Kolay , Georgia Tech , Organizer: Sudipta Kolay
We will give eight different descriptions of the Poincaré homology sphere, and outline the proof of equivalence of the definitions.