Seminars and Colloquia by Series

Stochastic and Convex Geometry for the Analysis of Complex Data

Series
Job Candidate Talk
Time
Thursday, February 10, 2022 - 11:00 for 1 hour (actually 50 minutes)
Location
https://gatech.bluejeans.com/532559688
Speaker
Eliza O’Reilly, California Institute of Technology

Many modern problems in data science aim to efficiently and accurately extract important features and make predictions from high-dimensional, large data sets. While there are many empirically successful methods for achieving these goals, large gaps between theory and practice remain. A geometric viewpoint is often useful for addressing these challenges, as it provides a unifying perspective on structure in data, the complexity of statistical models, and the tractability of computational methods. As a consequence, an understanding of problem geometry leads both to new insights into existing methods and to new models and algorithms that address drawbacks in existing methodology.

In this talk, I will present recent progress on two problems where the relevant model can be viewed as the projection of a lifted formulation with a simple stochastic or convex geometric description. In particular, I will first describe how the theory of stationary random tessellations in stochastic geometry can address computational and theoretical challenges of random decision forests with non-axis-aligned splits. Second, I will present a new approach to convex regression that returns non-polyhedral convex estimators compatible with semidefinite programming. These works open a number of future research directions at the intersection of stochastic and convex geometry, statistical learning theory, and optimization.
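As a toy illustration of the kind of non-axis-aligned split such tessellation-based forests use (a minimal sketch of my own, not the speaker's construction; the function name and sampling scheme are assumptions for illustration), a cell of data can be divided along a uniformly random hyperplane:

    import numpy as np

    def random_oblique_split(X, rng):
        # Split the data along a random (non-axis-aligned) hyperplane:
        # draw a uniform direction on the sphere and a threshold along it.
        n, d = X.shape
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)                   # uniform direction on S^{d-1}
        proj = X @ w                             # project data onto the direction
        t = rng.uniform(proj.min(), proj.max())  # random split threshold
        left = proj <= t
        return left, ~left, (w, t)

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    left, right, (w, t) = random_oblique_split(X, rng)

Axis-aligned forests restrict w to coordinate directions; allowing arbitrary directions is where stationary hyperplane tessellations from stochastic geometry enter.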

Understanding Statistical-vs-Computational Tradeoffs via Low-Degree Polynomials

Series
Job Candidate Talk
Time
Thursday, February 3, 2022 - 11:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/500115320/1408
Speaker
Alex Wein, UC Berkeley/Simons Institute

A central goal in modern data science is to design algorithms for statistical inference tasks such as community detection, high-dimensional clustering, sparse PCA, and many others. Ideally these algorithms would be both statistically optimal and computationally efficient. However, it often seems impossible to achieve both these goals simultaneously: for many problems, the optimal statistical procedure involves a brute force search while all known polynomial-time algorithms are statistically sub-optimal (requiring more data or higher signal strength than is information-theoretically necessary). In the quest for optimal algorithms, it is therefore important to understand the fundamental statistical limitations of computationally efficient algorithms.

I will discuss an emerging theoretical framework for understanding these questions, based on studying the class of "low-degree polynomial algorithms." This is a powerful class of algorithms that captures the best known poly-time algorithms for a wide variety of statistical tasks. This perspective has led to the discovery of many new and improved algorithms, and also many matching lower bounds: we now have tools to prove failure of all low-degree algorithms, which provides concrete evidence for inherent computational hardness of statistical problems. This line of work illustrates that low-degree polynomials provide a unifying framework for understanding the computational complexity of a wide variety of statistical tasks, encompassing hypothesis testing, estimation, and optimization.
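As a concrete illustration (my own example, not code from the talk) of what a low-degree polynomial algorithm looks like: the signed triangle count is a degree-3 polynomial in the entries of a graph's adjacency matrix and is a classic test statistic for detecting planted community structure.

    import numpy as np

    def signed_triangle_count(A, p):
        # Degree-3 polynomial of the adjacency matrix A. Under the null
        # model G(n, p) it concentrates near 0; planted community
        # structure inflates it.
        B = A - p                  # center each edge indicator
        np.fill_diagonal(B, 0.0)
        # trace(B^3) sums B_ij * B_jk * B_ki over ordered triples; with a
        # zero diagonal only genuine triangles survive, each counted 6 times.
        return np.trace(B @ B @ B) / 6.0

    rng = np.random.default_rng(0)
    n, p = 200, 0.1
    A = (rng.random((n, n)) < p).astype(float)
    A = np.triu(A, 1); A = A + A.T        # symmetric simple graph
    print(signed_triangle_count(A, p))    # near 0 under the null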

How to Break the Curse of Dimensionality

Series
Applied and Computational Mathematics Seminar
Time
Monday, January 31, 2022 - 14:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/457724603/4379
Speaker
Ming-Jun Lai, University of Georgia

We first review the problem of the curse of dimensionality when approximating multi-dimensional functions. Several approximation results, due to Barron, Petrushev, Bach, and others, will be explained.

Then we present two approaches to breaking the curse of dimensionality: one is a probabilistic approach explained in Barron (1993), and the other is a deterministic approach using the Kolmogorov superposition theorem. As the Kolmogorov superposition theorem has been used to explain the approximation power of neural networks, I will use it to explain why deep learning algorithms work for image classification.
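For orientation, the Kolmogorov superposition theorem states that every continuous function $f$ on $[0,1]^n$ admits a representation

$f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right),$

where the inner functions $\phi_{q,p}$ are continuous and do not depend on $f$, and only the continuous outer functions $\Phi_q$ depend on $f$: every multivariate continuous function is a superposition of univariate ones.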
In addition, I will introduce a neural network approximation based on higher-order ReLU functions to explain the powerful approximation of multivariate functions by deep learning algorithms with multiple layers.
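A minimal sketch of the higher-order ReLU (sometimes called a rectified power unit) and a one-hidden-layer network built from it; the function names and shapes are illustrative assumptions, not the speaker's construction:

    import numpy as np

    def relu_k(x, k=2):
        # Higher-order ReLU: max(0, x)^k. For k = 1 this is the usual
        # ReLU; k >= 2 gives smoother activations used in approximation
        # theory for multivariate functions.
        return np.maximum(0.0, x) ** k

    def shallow_relu_k_net(x, W, b, c, k=2):
        # One hidden layer with ReLU^k activations: f(x) = c . relu_k(Wx + b)
        return c @ relu_k(W @ x + b, k)

    rng = np.random.default_rng(1)
    d, m = 3, 16
    W, b, c = rng.standard_normal((m, d)), rng.standard_normal(m), rng.standard_normal(m)
    x = rng.standard_normal(d)
    print(shallow_relu_k_net(x, W, b, c, k=2))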

Stein property of complex-hyperbolic Kleinian groups

Series
Geometry Topology Seminar
Time
Monday, January 31, 2022 - 14:00 for 1 hour (actually 50 minutes)
Location
Online
Speaker
Subhadip Dey, Yale University

Let $M$ be a complex-hyperbolic $n$-manifold, i.e., a quotient of the complex-hyperbolic $n$-space $\mathbb{H}^n_\mathbb{C}$ by a torsion-free discrete group of isometries, $\Gamma = \pi_1(M)$. Suppose that $M$ is convex-cocompact, i.e., the convex core of $M$ is a nonempty compact subset. In this talk, we will discuss a sufficient condition on $\Gamma$, in terms of the growth rate of its orbits in $\mathbb{H}^n_\mathbb{C}$, under which $M$ is a Stein manifold. We will also talk about some interesting questions related to this result. This is joint work with Misha Kapovich.
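The orbital growth rate here is presumably measured by the critical exponent (stated for orientation; the precise hypothesis is given in the talk):

$\delta(\Gamma) = \limsup_{R \to \infty} \frac{1}{R} \log \#\{\gamma \in \Gamma : d(o, \gamma o) \le R\},$

where $o \in \mathbb{H}^n_\mathbb{C}$ is any basepoint and $d$ is the complex-hyperbolic distance.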

https://bluejeans.com/196544719/9518

On Gapped Ground State Phases of Quantum Lattice Models

Series
Job Candidate Talk
Time
Monday, January 31, 2022 - 11:00 for 1 hour (actually 50 minutes)
Location
ONLINE
Speaker
Amanda Young, Technical University of Munich

Quantum spin systems are many-body physical models where particles are bound to the sites of a lattice. These are widely used throughout condensed matter physics and quantum information theory, and are of particular interest in the classification of quantum phases of matter. By pinning down the properties of new exotic phases of matter, researchers have opened the door to developing new quantum technologies. One of the fundamental quantities for this classification is whether or not the Hamiltonian has a spectral gap above its ground state energy in the thermodynamic limit. Mathematically, the Hamiltonian is a self-adjoint operator and the set of possible energies is given by its spectrum, which is bounded from below. While the importance of the spectral gap is well known, very few methods exist for establishing whether a model is gapped, and the majority of known results are for one-dimensional systems. Moreover, the existence of a non-vanishing gap is generically undecidable, which makes it necessary to develop new techniques for estimating spectral gaps.

In this talk, I will discuss my work proving non-vanishing spectral gaps for key quantum spin models and developing new techniques for producing lower bounds on the gap. Two important models with longstanding spectral gap questions to which I recently contributed progress are the AKLT model on the hexagonal lattice and Haldane's pseudo-potentials for the fractional quantum Hall effect. Once a gap has been proved, a natural next question is whether it is typical of a gapped phase. This can be answered positively by showing that the gap is robust in the presence of perturbations. Ensuring the gap remains open under perturbations is also of interest, e.g., for the development of robust quantum memory. A second topic I will discuss is my research on spectral gap stability.
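In one common formulation (stated here for orientation, not verbatim from the talk), a model with finite-volume Hamiltonians $H_\Lambda$ is gapped if the difference between its two lowest energies admits a uniform positive lower bound:

$\operatorname{gap}(H_\Lambda) = E_1(\Lambda) - E_0(\Lambda) \ge \gamma > 0$

for all sufficiently large finite volumes $\Lambda$, with $\gamma$ independent of $\Lambda$; the bound then survives in the thermodynamic limit.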

URL for the talk: https://bluejeans.com/602513114/7767
