Seminars and Colloquia by Series

Random Neural Networks with applications to Image Recovery

Series
Stochastics Seminar
Time
Thursday, April 11, 2019 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Paul Hand, Northeastern University
Neural networks have led to new, state-of-the-art approaches for image recovery. They provide a contrast to standard image processing methods based on the ideas of sparsity and wavelets. In this talk, we will study two different random neural networks. One acts as a model for a learned neural network that is trained to sample from the distribution of natural images. The other acts as an unlearned model which can be used to process natural images without any training data. In both cases we will use high-dimensional concentration estimates to establish theory for the performance of random neural networks in imaging problems.
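
As a toy illustration of the kind of object in question (my own construction, not the speaker's), one can form an untrained two-layer ReLU network with i.i.d. Gaussian weights and use it to map a low-dimensional code to a higher-dimensional signal; the latent dimension and layer widths below are arbitrary.

```python
# A toy construction (not from the talk): an untrained two-layer ReLU network
# with i.i.d. Gaussian weights, mapping a low-dimensional latent code to a
# higher-dimensional signal. Latent dimension and layer widths are illustrative.
import numpy as np

rng = np.random.default_rng(0)
k, d1, d2 = 20, 200, 1000                        # latent dim and layer widths
W1 = rng.standard_normal((d1, k)) / np.sqrt(k)   # Gaussian weights, standard scaling
W2 = rng.standard_normal((d2, d1)) / np.sqrt(d1)

def G(z):
    """Forward pass of the untrained (random) expansive ReLU network."""
    return np.maximum(W2 @ np.maximum(W1 @ z, 0.0), 0.0)

z = rng.standard_normal(k)
x = G(z)
print(x.shape)  # (1000,): a signal-sized output produced from a 20-dimensional code
```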

Constructive regularization of the random matrix norm

Series
Stochastics Seminar
Time
Thursday, March 28, 2019 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Liza Rebrova, Mathematics, UCLA

I will talk about the structure of large square random matrices with centered i.i.d. heavy-tailed entries (only two finite moments are assumed). In our previous work with R. Vershynin we showed that the operator norm of such a matrix A can be reduced to the optimal sqrt(n) order with high probability by zeroing out a small submatrix of A, but we did not describe the structure of this "bad" submatrix, nor provide a constructive way to find it. Now we can give a very simple description of this small "bad" subset: it is enough to zero out a small fraction of the rows and columns of A with the largest L2 norms to bring its operator norm to the almost optimal sqrt(n*loglog(n)) order, under the additional assumption that the entries of A are symmetrically distributed. As a corollary, one also obtains a constructive procedure for finding a small submatrix of A that can be zeroed out to achieve the same regularization.
I am planning to discuss some details of the proof, the main component of which is the development of techniques that extend constructive regularization approaches known for Bernoulli matrices (from the works of Feige and Ofek, and of Le, Levina and Vershynin) to the considerably broader class of heavy-tailed random matrices.
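
The regularization step described in the abstract can be sketched numerically as follows; the heavy-tailed distribution (symmetrized Pareto with tail exponent just above 2), the removed fraction eps, and the matrix size n are illustrative choices, not parameters from the talk.

```python
# A rough numerical sketch of the regularization step: zero out a small
# fraction of the rows and columns with the largest L2 norms and compare
# operator norms. Distribution, eps, and n are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, eps, alpha = 2000, 0.01, 2.1

signs = rng.choice([-1.0, 1.0], size=(n, n))
A = signs * rng.pareto(alpha, size=(n, n))   # symmetric entries, two finite moments

def zero_out_heavy_rows_cols(A, eps):
    """Zero out the eps-fraction of rows and columns with the largest L2 norms."""
    k = int(eps * A.shape[0])
    B = A.copy()
    heavy_rows = np.argsort(np.linalg.norm(B, axis=1))[-k:]
    heavy_cols = np.argsort(np.linalg.norm(B, axis=0))[-k:]
    B[heavy_rows, :] = 0.0
    B[:, heavy_cols] = 0.0
    return B

B = zero_out_heavy_rows_cols(A, eps)
print("||A|| / sqrt(n):", np.linalg.norm(A, 2) / np.sqrt(n))  # typically large
print("||B|| / sqrt(n):", np.linalg.norm(B, 2) / np.sqrt(n))  # typically much smaller
```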

Iterative linear solvers and random matrices: new bounds for the block Gaussian sketch and project method

Series
Stochastics Seminar
Time
Wednesday, March 27, 2019 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Liza Rebrova, UCLA

One of the most famous methods for solving large-scale over-determined linear systems is the Kaczmarz algorithm, which iteratively projects the previous approximation x_k onto the solution space of the next equation in the system. An elegant proof of the exponential convergence of this method under a suitable randomization of the process is due to Strohmer and Vershynin (2009). Many extensions and generalizations of the method have been proposed since then, including the works of Needell, Tropp, Ward, Srebro, Tan and many others. An interesting unifying view on a number of iterative solvers (including several versions of the Kaczmarz algorithm) was proposed by Gower and Richtarik in 2016. The main idea of their sketch-and-project framework is the following: the random selection of a row (or a block of rows) can be represented as a sketch, that is, a left multiplication by a random vector (or matrix) that pre-processes every iteration of the method, and each iteration is then represented by a projection onto the image of the sketch.

I will give an overview of some of these methods and talk about the role that random matrix theory plays in establishing their convergence. I will also discuss our new results with Deanna Needell on the block Gaussian sketch-and-project method.
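
A minimal sketch of a block Gaussian sketch-and-project iteration for a consistent over-determined system is given below; the dimensions, block size, and iteration count are illustrative, and the update uses a pseudoinverse rather than any particular implementation from the papers mentioned above.

```python
# A minimal sketch of block Gaussian sketch-and-project for a consistent
# over-determined system Ax = b: at each step the system is replaced by the
# sketched system S^T A x = S^T b (S Gaussian), and the current iterate is
# projected onto its solution space. Dimensions and counts are illustrative.
import numpy as np

rng = np.random.default_rng(0)
m, n, block = 500, 50, 10
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true

x = np.zeros(n)
for _ in range(200):
    S = rng.standard_normal((m, block))   # Gaussian sketch
    As = S.T @ A                          # sketched matrix (block x n)
    rs = S.T @ (A @ x - b)                # sketched residual
    x = x - np.linalg.pinv(As) @ rs       # project onto {y : As @ y = S.T @ b}

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Taking a block of size one with a coordinate (rather than Gaussian) sketch recovers the classical randomized Kaczmarz update.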

TBA

Series
Stochastics Seminar
Time
Thursday, March 14, 2019 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
TBA, SOM, GaTech

1-d parabolic Anderson model with rough spatial noise

Series
Stochastics Seminar
Time
Thursday, March 7, 2019 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Samy Tindel, Purdue University
In this talk I will first recall some general facts about the parabolic Anderson model (PAM), which can be briefly described as a simple heat equation in a random environment. The key phenomenon observed in this context is called localization. I will review some ways to express this phenomenon, and then single out the so-called eigenvector localization for the Anderson operator. This particular instance of localization motivates our study of large-time asymptotics for the stochastic heat equation. In the second part of the talk I will describe the Gaussian environment we consider, which is rougher than white noise, and then give an account of the asymptotic exponents we obtain as time goes to infinity. If time allows, I will also give some elements of the proof.
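
For orientation, a schematic form of the one-dimensional equation behind the abstract (the constants and the precise meaning of the product depend on the exact setup) is $\partial_t u(t,x) = \frac{1}{2}\,\partial_{xx} u(t,x) + u(t,x)\,\dot{W}(x)$, where $\dot{W}$ is a spatial Gaussian noise, here rougher than white noise, and one studies the exponential growth of $u(t,x)$ as $t \to \infty$.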

On the reconstruction error of PCA

Series
Stochastics Seminar
Time
Tuesday, March 5, 2019 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 168
Speaker
Martin Wahl, Humboldt University, Berlin

We identify principal component analysis (PCA) as an empirical risk minimization problem with respect to the reconstruction error and prove non-asymptotic upper bounds for the corresponding excess risk. These bounds unify and improve existing upper bounds from the literature. In particular, they give oracle inequalities under mild eigenvalue conditions. We also discuss how our results can be transferred to the subspace distance and, for instance, how our approach leads to a sharp $\sin \Theta$ theorem for empirical covariance operators. The proof is based on a novel contraction property, in contrast to previous spectral perturbation approaches. This talk is based on joint works with Markus Reiß and Moritz Jirak.
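
In the standard formulation (notation mine, not necessarily the speaker's), the reconstruction error of a rank-$d$ orthogonal projection $P$ is $R(P) = \mathbb{E}\|X - PX\|^2$, PCA minimizes the empirical counterpart $R_n(P) = \frac{1}{n}\sum_{i=1}^{n}\|X_i - PX_i\|^2$ over rank-$d$ projections (equivalently, it projects onto the top $d$ eigenvectors of the empirical covariance), and the excess risk studied here is $R(\hat{P}_d) - \min_{P} R(P)$.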

Joint distribution of Busemann functions for the corner growth model

Series
Stochastics Seminar
Time
Thursday, February 28, 2019 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Wai Tong (Louis) Fan, Indiana University, Bloomington
We present the joint distribution of the Busemann functions, in all directions of growth, of the exactly solvable corner growth model (CGM). This gives a natural coupling of all stationary CGMs and leads to new results about geodesics. Properties of this joint distribution are accessed by identifying it as the unique invariant distribution of a multiclass last passage percolation model. This is joint work with Timo Seppäläinen.

Wiener-Hopf Factorization for Markov Processes

Series
Stochastics Seminar
Time
Tuesday, February 26, 2019 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 168
Speaker
R. Gong, Illinois Institute of Technology

Wiener-Hopf factorization (WHf) encompasses several important results in probability and stochastic processes, as well as in operator theory. The importance of the WHf stems not only from its theoretical appeal, manifested in part through the probabilistic interpretation of analytical results, but also from its practical applications in a wide range of fields, such as fluctuation theory, insurance, and finance. The various existing forms of the WHf for Markov chains, strong Markov processes, Lévy processes, and Markov additive processes have been obtained only in the time-homogeneous case. However, there are abundant real-life dynamical systems that are modeled in terms of time-inhomogeneous processes, and yet the corresponding Wiener-Hopf factorization theory is not available for this important class of models. In this talk, I will first provide a survey of the development of Wiener-Hopf factorization for time-homogeneous Markov chains, Lévy processes, and Markov additive processes. Then, I will discuss our recent work on the WHf for time-inhomogeneous Markov chains. To the best of our knowledge, this study is the first attempt to investigate the WHf for time-inhomogeneous Markov processes.

Stationary coalescing walks on the lattice

Series
Stochastics Seminar
Time
Thursday, February 21, 2019 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Arjun Krishnan, University of Rochester
Consider a measurable dense family of semi-infinite nearest-neighbor paths on the integer lattice in d dimensions. If the measure on the paths is translation invariant, we completely classify their collective behavior in d=2 under mild assumptions. We use our theory to classify the behavior of families of semi-infinite geodesics in first- and last-passage percolation that come from Busemann functions. For d>=2, we describe the behavior of bi-infinite trajectories, and show that they carry an invariant measure. We also construct several examples displaying unexpected behavior. One of these examples lets us answer a question of C. Hoffman's from 2016. (joint work with Jon Chaika)

A tight net with respect to a random matrix norm and applications to estimating singular values

Series
Stochastics Seminar
Time
Thursday, February 14, 2019 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
G. Livshyts, SOM, GaTech
In this talk we construct a net around the unit sphere with strong properties. We show that, with exponentially high probability, the value of |Ax| on the sphere can be approximated well using this net, where A is a random matrix with independent columns. We apply this to study the smallest singular value of random matrices under very mild assumptions, and obtain sharp small-ball behavior. As a particular case, we estimate (essentially optimally) the smallest singular value for matrices of arbitrary aspect ratio with i.i.d. mean-zero, variance-one entries. Further, in the square case we show an estimate that holds under only the assumptions of independent entries with bounded concentration functions and appropriately bounded expected Hilbert-Schmidt norm. A key aspect of our results is the absence of structural requirements such as mean zero and equal variance of the entries.
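
As a quick numerical illustration of the scale involved (not the proof technique of the talk), one can check that for an $n \times n$ matrix with i.i.d. mean-zero, variance-one entries the smallest singular value is of order $n^{-1/2}$; the sizes and number of trials below are arbitrary.

```python
# Numerical check (illustrative only): for an n x n matrix with i.i.d.
# mean-zero, variance-one entries, the smallest singular value is expected
# to be of order n^{-1/2}, so s_min * sqrt(n) should stay of constant order.
import numpy as np

rng = np.random.default_rng(0)
for n in (200, 400, 800):
    ratios = [np.linalg.svd(rng.standard_normal((n, n)), compute_uv=False)[-1] * np.sqrt(n)
              for _ in range(20)]
    print(n, "median of s_min * sqrt(n):", round(float(np.median(ratios)), 3))
```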
