Seminars and Colloquia by Series

Large Dimensional Independent Component Analysis: Statistical Optimality and Computational Tractability

Series
Stochastics Seminar
Time
Thursday, November 17, 2022 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Ming Yuan, Columbia University

Independent component analysis (ICA) is a useful and general data analysis tool that has found great success in many applications. In recent years, however, it has been observed that many popular approaches to ICA do not scale well with the number of components. This debacle has inspired a growing number of new proposals, but it remains unclear exactly what role the number of components plays in the information-theoretic limits and computational complexity of ICA. Here I will describe our recent work that specifically addresses these questions and introduces a refined method of moments that is both computationally tractable and statistically optimal.
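
As a point of reference for moment-based ICA, the sketch below shows a classical fourth-moment method (FOBI), not the refined method of moments from the talk; the synthetic sources and the mixing matrix are illustrative assumptions.

```python
import numpy as np

def fobi_ica(X):
    """FOBI: a classical fourth-moment ICA baseline (illustrative only, not the
    refined method of moments from the talk).  X is (n_samples, d), a linear mix
    of independent non-Gaussian sources with distinct kurtoses."""
    X = X - X.mean(axis=0)
    # Whitening via the symmetric inverse square root of the covariance.
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    W = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = X @ W
    # Fourth-moment matrix E[||z||^2 z z^T]; its eigenvectors undo the rotation.
    M = (Z * np.square(Z).sum(axis=1, keepdims=True)).T @ Z / len(Z)
    _, V = np.linalg.eigh(M)
    return Z @ V                      # estimated independent components

# Toy check: mix a uniform and a Laplace source with an unknown matrix.
rng = np.random.default_rng(0)
S = np.column_stack([rng.uniform(-1, 1, 5000), rng.laplace(size=5000)])
A = rng.normal(size=(2, 2))           # hypothetical mixing matrix
S_hat = fobi_ica(S @ A.T)
print(np.abs(np.corrcoef(S.T, S_hat.T))[:2, 2:].round(2))  # ~ a permutation matrix
```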

Breaking the curse of dimensionality for boundary value PDE in high dimensions

Series
Stochastics Seminar
Time
Thursday, November 10, 2022 - 15:30 for 1 hour (actually 50 minutes)
Location
ONLINE
Speaker
Ionel Popescu, University of Bucharest and Simion Stoilow Institute of Mathematics

Zoom link to the seminar: https://gatech.zoom.us/j/91330848866

I will show how to construct a numerical scheme for solutions to linear Dirichlet-Poisson boundary value problems which does not suffer from the curse of dimensionality. In fact, we show that as the dimension increases, the complexity of the scheme grows only polynomially (with low degree) in the dimension. The key is a subtle use of the walk-on-spheres method combined with a concentration inequality. As a byproduct, we show that this result has a simple consequence for the approximation of the solution by neural networks. This is joint work with Iulian Cimpean, Arghir Zarnescu, Lucian Beznea and Oana Lupascu.
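
To make the walk-on-spheres idea concrete, here is a minimal Monte Carlo sketch for the Laplace part of a Dirichlet problem; the Poisson source term would add a Green's-function integral at each step, which is omitted. The unit-ball domain, the tolerance `eps`, and the boundary data are illustrative assumptions, not the scheme analyzed in the talk.

```python
import numpy as np

def walk_on_spheres(x0, g, dist_to_boundary, project_to_boundary,
                    eps=1e-3, n_walks=20_000, seed=0):
    """Estimate u(x0) for the Laplace/Dirichlet problem by walk on spheres:
    jump to a uniform point on the largest sphere inside the domain until
    within eps of the boundary, then read off the boundary data g."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    total = 0.0
    for _ in range(n_walks):
        x = np.array(x0, dtype=float)
        while True:
            r = dist_to_boundary(x)
            if r < eps:                        # close enough to the boundary
                total += g(project_to_boundary(x))
                break
            u = rng.normal(size=d)
            x = x + r * u / np.linalg.norm(u)  # uniform point on the sphere of radius r
    return total / n_walks

# Example: unit ball in R^10 with harmonic boundary data g(y) = y[0],
# so the exact solution is u(x) = x[0].
d = 10
x0 = np.full(d, 0.1)
est = walk_on_spheres(
    x0, g=lambda y: y[0],
    dist_to_boundary=lambda x: 1.0 - np.linalg.norm(x),
    project_to_boundary=lambda x: x / np.linalg.norm(x),
)
print(est, x0[0])
```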

Fluctuation results for size of the vacant set for random walks on discrete torus

Series
Stochastics Seminar
Time
Thursday, November 3, 2022 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Daesung Kim, Georgia Tech

We consider random walks on the $d\ge 3$ dimensional discrete torus started from vertices chosen independently and uniformly at random. In this talk, we discuss the fluctuation behavior of the size of the range of the random walk trajectories at a time proportional to the size of the torus. The proof relies on a refined analysis of tail estimates for hitting times. We also discuss related results and open problems. This is based on joint work with Partha Dey.
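
For intuition, here is a minimal Monte Carlo sketch of the quantity in question under a simplification: a single walk started uniformly at random (the talk concerns trajectories started from several independent uniform vertices). The side length `L`, dimension `d`, and time parameter `u` are illustrative choices.

```python
import numpy as np

def vacant_set_size(L, d, u, seed=0):
    """Run a simple random walk on the discrete torus (Z/LZ)^d for u * L^d
    steps, started uniformly at random, and return the number of unvisited
    vertices (the size of the vacant set)."""
    rng = np.random.default_rng(seed)
    n = L ** d
    steps = int(u * n)                      # time proportional to the torus size
    pos = rng.integers(0, L, size=d)
    visited = np.zeros((L,) * d, dtype=bool)
    visited[tuple(pos)] = True
    for _ in range(steps):
        axis = rng.integers(d)              # nearest-neighbour step on the torus
        pos[axis] = (pos[axis] + rng.choice((-1, 1))) % L
        visited[tuple(pos)] = True
    return n - visited.sum()

# Empirical fluctuations across independent runs.
sizes = [vacant_set_size(L=20, d=3, u=1.0, seed=s) for s in range(20)]
print(np.mean(sizes), np.std(sizes))
```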

Ballistic Annihilation

Series
Stochastics Seminar
Time
Thursday, October 27, 2022 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Matthew Junge, Baruch College, CUNY

In the late 20th century, statistical physicists introduced a chemical reaction model called ballistic annihilation. In it, particles are placed randomly throughout the real line and then proceed to move at independently sampled velocities. Collisions result in mutual annihilation. Many results were inferred by physicists, but it wasn’t until recently that mathematicians joined in. I will describe my trajectory through this model. Expect tantalizing open questions.
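
For a feel of the dynamics, here is a minimal event-driven simulation of ballistic annihilation on the line; the symmetric three-velocity distribution and the uniform initial positions are illustrative choices, not the talk's specific setting.

```python
import numpy as np

def ballistic_annihilation(n, velocities=(-1.0, 0.0, 1.0), probs=(0.25, 0.5, 0.25), seed=0):
    """Event-driven ballistic annihilation: particles on the line move at i.i.d.
    velocities and colliding pairs mutually annihilate.  Returns the velocities
    of the surviving particles."""
    rng = np.random.default_rng(seed)
    x = np.sort(rng.uniform(0, n, size=n))       # initial positions, left to right
    v = rng.choice(velocities, size=n, p=probs)  # i.i.d. velocities
    alive = list(range(n))                       # indices of surviving particles

    while True:
        # Next collision = earliest crossing time among adjacent surviving pairs.
        best_t, best_k = np.inf, None
        for k in range(len(alive) - 1):
            i, j = alive[k], alive[k + 1]
            if v[i] > v[j]:                      # they approach each other
                t = (x[j] - x[i]) / (v[i] - v[j])
                if t < best_t:
                    best_t, best_k = t, k
        if best_k is None:                       # no more collisions ever
            return v[alive]
        del alive[best_k:best_k + 2]             # mutual annihilation of the pair

survivors = ballistic_annihilation(1000)
print("surviving fraction:", len(survivors) / 1000)
```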

Statistical Tensor Learning in 2020s: Methodology, Theory, and Applications

Series
Stochastics Seminar
Time
Thursday, October 20, 2022 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Anru Zhang, Duke University

The analysis of tensor data, i.e., multi-dimensional arrays, has become an active research topic in the era of big data. Datasets in the form of tensors arise from a wide range of scientific applications. Tensor methods also provide unique perspectives on many high-dimensional problems, where the observations are not necessarily tensors. Problems involving high-dimensional tensors generally possess distinct characteristics that pose great challenges to the data science community.

In this talk, we discuss several recent advances in statistical tensor learning and their applications in computational imaging, social networks, and generative models. We also illustrate how we develop statistically optimal methods and computationally efficient algorithms that interact with the modern theories of computation, high-dimensional statistics, and non-convex optimization.
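
As a generic illustration of the low-rank structure that tensor methods exploit (not a specific method from the talk), here is a short truncated higher-order SVD sketch; the ranks and the synthetic rank-(2,2,2) signal are assumptions made for the example.

```python
import numpy as np

def hosvd_truncate(T, ranks):
    """Truncated higher-order SVD: estimate factor subspaces from each mode's
    unfolding, then project the tensor onto them to get a small core."""
    factors = []
    for mode, r in enumerate(ranks):
        mat = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)   # mode unfolding
        factors.append(np.linalg.svd(mat, full_matrices=False)[0][:, :r])
    core = T
    for mode, U in enumerate(factors):                             # mode products U^T x T
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# Noisy rank-(2,2,2) tensor: projecting onto the estimated subspaces denoises it.
rng = np.random.default_rng(0)
A, B, C = (rng.normal(size=(30, 2)) for _ in range(3))
signal = np.einsum('ir,jr,kr->ijk', A, B, C)
noisy = signal + 0.5 * rng.normal(size=signal.shape)
core, (U, V, W) = hosvd_truncate(noisy, (2, 2, 2))
approx = np.einsum('abc,ia,jb,kc->ijk', core, U, V, W)
print(np.linalg.norm(approx - signal) / np.linalg.norm(signal))    # relative error
```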

Efficient and Near-Optimal Online Portfolio Selection

Series
Stochastics Seminar
Time
Friday, October 14, 2022 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Dmitrii M. Ostrovskii, University of Southern California

In the problem of online portfolio selection as formulated by Cover (1991), the trader repeatedly distributes her capital over $ d $ assets in each of $ T > 1 $ rounds, with the goal of maximizing the total return. Cover proposed an algorithm called Universal Portfolios that performs nearly as well as the best (in hindsight) static assignment of a portfolio, with an $ O(d\log(T)) $ regret in terms of the logarithmic return. Without imposing any restrictions on the market, this guarantee is known to be worst-case optimal, and no other algorithm attaining it has been discovered so far. Unfortunately, Cover's algorithm crucially relies on computing an expectation over a certain log-concave density in $ \mathbb{R}^d $, so in a practical implementation this expectation has to be approximated via sampling, which is computationally challenging. In particular, the fastest known implementation, proposed by Kalai and Vempala in 2002, runs in $ O( d^4 (T+d)^{14} ) $ time per round, which rules out any practical application scenario. Proposing a practical algorithm with a near-optimal regret is a long-standing open problem.

We propose an algorithm for online portfolio selection with a near-optimal regret guarantee of $ O( d \log(T+d) ) $ and a runtime of only $ O( d^2 (T+d) ) $ per round. In a nutshell, our algorithm is a variant of the follow-the-regularized-leader scheme, with a time-dependent regularizer given by the volumetric barrier for the sum of observed losses. Thus, our result gives a fresh perspective on the volumetric barrier, initially proposed in the context of cutting-plane methods and interior-point methods by Vaidya (1989) and Nesterov and Nemirovski (1994), respectively. Our side contribution, of independent interest, is deriving the volumetrically regularized portfolio as a variational approximation of the universal portfolio: namely, we show that it minimizes Gibbs's free energy functional with accuracy of order $ O( d \log(T+d) ) $. This is joint work with Remi Jezequel and Pierre Gaillard.
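
To fix the online protocol and the logarithmic-return objective, here is a toy sketch that uses an exponentiated-gradient learner as a stand-in; it implements neither the volumetric-barrier follow-the-regularized-leader method of the talk nor Cover's Universal Portfolios. The synthetic market, the step size `eta`, and the best-single-asset benchmark are illustrative assumptions.

```python
import numpy as np

def online_portfolio_eg(returns, eta=0.05):
    """Online portfolio selection loop with exponentiated-gradient updates
    on the simplex, tracking cumulative logarithmic return.

    returns: (T, d) array of price relatives r_t (per-round asset value ratios).
    """
    T, d = returns.shape
    w = np.full(d, 1.0 / d)          # start from the uniform portfolio
    log_wealth = 0.0
    for t in range(T):
        r = returns[t]
        gain = w @ r                  # wealth multiplier in round t
        log_wealth += np.log(gain)
        grad = r / gain               # gradient of log(w . r) with respect to w
        w = w * np.exp(eta * grad)    # multiplicative (EG) update
        w /= w.sum()                  # project back onto the simplex
    return log_wealth

# Synthetic market: i.i.d. price relatives fluctuating around 1.
rng = np.random.default_rng(0)
R = np.exp(rng.normal(0.0, 0.01, size=(1000, 5)))
alg = online_portfolio_eg(R)
best_single_asset = np.log(R).sum(axis=0).max()   # weak hindsight benchmark
print(f"algorithm log-return: {alg:.3f}, best single asset: {best_single_asset:.3f}")
```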

A stochastic approach for noise stability on the hypercube

Series
Stochastics Seminar
Time
Thursday, October 6, 2022 - 15:30 for 1 hour (actually 50 minutes)
Location
https://us02web.zoom.us/j/86578123009
Speaker
Dan Mikulincer, MIT

Please Note: Recording: https://us02web.zoom.us/rec/share/cIdTfvS0tjar04MWv9ltWrVxAcmsUSFvDznprSBT285wc0VzURfB3X8jR0CpWIWQ.Sz557oNX3k5L1cpN

We revisit the notion of noise stability in the hypercube and show how one can replace the usual heat semigroup with more general stochastic processes. We will then introduce a re-normalized Brownian motion, embedding the discrete hypercube into the Wiener space, and analyze the noise stability along its paths. Our approach leads to a new quantitative form of the 'Majority is Stablest' theorem from Boolean analysis and to progress on the 'most informative bit' conjecture of Kumar and Courtade.
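
For concreteness, noise stability on the hypercube can be estimated directly by Monte Carlo; the sketch below does this for the majority function and compares with Sheppard's classical limit formula. The choice of function, dimension, and correlation level are illustrative; this is not the renormalized-Brownian-motion construction from the talk.

```python
import numpy as np

def noise_stability(f, n, rho, samples=50_000, seed=0):
    """Monte Carlo estimate of Stab_rho(f) = E[f(x) f(y)] on {-1,1}^n,
    where each bit of y independently agrees with x with probability (1+rho)/2."""
    rng = np.random.default_rng(seed)
    x = rng.choice([-1, 1], size=(samples, n))
    flip = rng.random((samples, n)) < (1 - rho) / 2
    y = np.where(flip, -x, x)                     # rho-correlated copy of x
    return np.mean(f(x) * f(y))

majority = lambda z: np.sign(z.sum(axis=1))       # n odd, so the sum is never zero
n, rho = 101, 0.4
est = noise_stability(majority, n, rho)
# Sheppard's formula: Stab_rho(Maj_n) -> 1 - (2/pi) * arccos(rho) as n -> infinity.
print(est, 1 - 2 / np.pi * np.arccos(rho))
```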
