Seminars and Colloquia by Series

Wednesday, October 30, 2019 - 15:00 , Location: Skiles 006 , Yair Shenfeld , Princeton University , yairs@princeton.edu , Organizer: Galyna Livshyts
Thursday, October 17, 2019 - 15:00 , Location: Skiles 006 , Samantha Petti , Georgia Tech , spetti@gatech.edu , Organizer: Galyna Livshyts

TBA

Wednesday, October 2, 2019 - 15:00 , Location: Skiles 006 , Masha Gordina , University of Connecticut , maria.gordina@uconn.edu , Organizer: Galyna Livshyts
Wednesday, April 17, 2019 - 15:00 , Location: Skiles 006 , Galyna Livshyts , Georgia Tech , glivshyts6@math.gatech.edu , Organizer: Galyna Livshyts

We discuss the asymptotic value of the maximal perimeter of a convex set in n-dimensional space with respect to certain classes of measures. First, we derive a lower bound for this quantity for a large class of probability distributions; the lower bound depends only on the moments. This lower bound is sharp in the case of the Gaussian measure (as was shown by Nazarov in 2001) and, more generally, in the case of rotation-invariant log-concave measures (as I showed in 2014). We discuss another class of measures for which this bound is sharp. For isotropic log-concave measures, the value of the lower bound is at least n^{1/8}.

In addition, we show a uniform upper bound of Cn||f||^{1/n}_{\infty} for all log-concave measures in a special position, a bound that is attained by the uniform distribution on the cube. We further bound the maximal perimeter of isotropic log-concave measures by n^2.
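For orientation, in one standard notation (a convention assumed here rather than quoted from the talk), for a measure $\mu$ on $\mathbb{R}^n$ with density $f$ the perimeter of a convex set $K$ and the maximal perimeter discussed above can be written as
\[
\mu^+(\partial K)=\int_{\partial K} f(x)\,d\mathcal{H}^{n-1}(x),
\qquad
\operatorname{Per}(\mu)=\sup_{K\subset\mathbb{R}^n\ \mathrm{convex}} \mu^+(\partial K).
\]
In this notation, the sharp Gaussian case mentioned above reads $\operatorname{Per}(\gamma_n)\asymp n^{1/4}$ for the standard Gaussian measure $\gamma_n$.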

Wednesday, April 10, 2019 - 15:00 , Location: Skiles 006 , Vladimir Koltchinskii , Georgia Tech , vladimir.koltchinskii@math.gatech.edu , Organizer: Galyna Livshyts

We discuss a general approach to a problem of estimation of a smooth function $f(\theta)$ of a high-dimensional parameter $\theta$
of statistical models. In particular, in the case of $n$ i.i.d. Gaussian observations $X_1,\dots, X_n$ with mean $\mu$ and covariance
matrix $\Sigma,$ the unknown parameter is $\theta = (\mu, \Sigma)$ and our approach yields an estimator of $f(\theta)$
for a function $f$ of smoothness $s>0$ with mean squared error of the order $(\frac{1}{n} \vee (\frac{d}{n})^s) \wedge 1$
(provided that the Euclidean norm of $\mu$ and operator norms of $\Sigma,\Sigma^{-1}$ are uniformly bounded),
with the error rate being minimax optimal up to a log factor (joint result with Mayya Zhilova). The construction of optimal estimators
crucially relies on a new bias reduction method in high-dimensional problems
and the bounds on the mean squared error are based on controlling finite differences of smooth functions along certain Markov chains
in high-dimensional parameter spaces as well as on concentration inequalities.
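A back-of-the-envelope reading of this rate (elementary algebra only, say for smoothness $s \ge 1$):
\[
\Big(\tfrac{1}{n} \vee \big(\tfrac{d}{n}\big)^{s}\Big)\wedge 1 \;\asymp\;
\begin{cases}
\tfrac{1}{n}, & d \lesssim n^{1-1/s},\\
(d/n)^{s}, & n^{1-1/s} \lesssim d \lesssim n,\\
1, & d \gtrsim n,
\end{cases}
\]
so the classical parametric rate $1/n$ persists as long as $d \lesssim n^{1-1/s}$, and nontrivial estimation is possible only when $d$ is of smaller order than $n$.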

Wednesday, April 3, 2019 - 15:00 , Location: Skiles 006 , Sean O'Rourke , University of Colorado Boulder , sean.d.orourke@colorado.edu , Organizer: Konstantin Tikhomirov

Computing the eigenvalues and eigenvectors of a large matrix is a basic task in high dimensional data analysis with many applications in computer science and statistics. In practice, however, data is often perturbed by noise. A natural question is the following: How much does a small perturbation to the matrix change the eigenvalues and eigenvectors? In this talk, I will consider the case where the perturbation is random. I will discuss perturbation results for the eigenvalues and eigenvectors as well as for the singular values and singular vectors.  This talk is based on joint work with Van Vu, Ke Wang, and Philip Matchett Wood.
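As a toy numerical illustration of the question posed above (a generic Python experiment, not the speaker's construction; the matrix size, rank, and noise level are arbitrary choices):

import numpy as np

rng = np.random.default_rng(0)
n = 200

# A symmetric "signal" matrix with a few large eigenvalues.
U = rng.standard_normal((n, 3))
A = U @ np.diag([50.0, 30.0, 10.0]) @ U.T / n

# A small symmetric random perturbation (scaled Gaussian Wigner-type noise).
G = rng.standard_normal((n, n))
E = 0.1 * (G + G.T) / np.sqrt(2 * n)

# Compare the spectra before and after the perturbation.
eigvals_clean = np.linalg.eigvalsh(A)
eigvals_noisy = np.linalg.eigvalsh(A + E)
print("shift of the largest eigenvalue:", abs(eigvals_clean[-1] - eigvals_noisy[-1]))

# Alignment of the top eigenvectors before and after the perturbation.
v_clean = np.linalg.eigh(A)[1][:, -1]
v_noisy = np.linalg.eigh(A + E)[1][:, -1]
print("|<v_clean, v_noisy>| =", abs(v_clean @ v_noisy))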

Wednesday, March 27, 2019 - 15:00 , Location: Skiles 006 , Liza Rebrova , UCLA , rebrova@math.ucla.edu , Organizer: Galyna Livshyts

One of the most famous methods for solving large-scale overdetermined linear systems is the Kaczmarz algorithm, which iteratively projects the previous approximation x_k onto the solution space of the next equation in the system. An elegant proof of the exponential convergence of this method under a suitable randomization of the process is due to Strohmer and Vershynin (2009). Many extensions and generalizations of the method have been proposed since then, including the works of Needell, Tropp, Ward, Srebro, Tan and many others. An interesting unifying view of a number of iterative solvers (including several versions of the Kaczmarz algorithm) was proposed by Gower and Richtarik in 2016. The main idea of their sketch-and-project framework is the following: the random selection of a row (or a block of rows) can be represented as a sketch, that is, a left multiplication by a random vector (or matrix), which pre-processes every iteration of the method; the iteration itself is then represented by a projection onto the image of the sketch.
I will give an overview of some of these methods and talk about the role that random matrix theory plays in showing their convergence. I will also discuss our new results with Deanna Needell on the block Gaussian sketch-and-project method.
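For concreteness, a minimal Python sketch of the randomized Kaczmarz iteration in the Strohmer–Vershynin form, with rows sampled proportionally to their squared norms (a generic illustration only; roughly speaking, the block Gaussian sketch-and-project method of the talk replaces the single sampled row by a random Gaussian sketch of a block of rows):

import numpy as np

def randomized_kaczmarz(A, b, iters=5000, seed=0):
    """Solve a consistent overdetermined system A x = b by randomized Kaczmarz.

    At each step a row a_i is drawn with probability ||a_i||^2 / ||A||_F^2 and
    the current iterate is projected onto the hyperplane {x : <a_i, x> = b_i}.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum('ij,ij->i', A, A)
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        a = A[i]
        x += (b[i] - a @ x) / row_norms_sq[i] * a
    return x

# Consistent random system: x_true should be recovered up to small error.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 50))
x_true = rng.standard_normal(50)
b = A @ x_true
x_hat = randomized_kaczmarz(A, b)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))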

 

Wednesday, March 13, 2019 - 15:00 , Location: Skiles 006 , Hoi Nguyen , Ohio State University , nguyen.1261@math.osu.edu , Organizer: Konstantin Tikhomirov

We will try to address a few universality questions about the behavior of large random matrices over finite fields, and then present some modest progress on one of these questions.

Wednesday, February 27, 2019 - 15:00 , Location: Skiles 006 , Anna Skripka , University of New Mexico , skripka@math.umn.edu , Organizer: Galyna Livshyts

Linear Schur multipliers, which act on matrices by entrywise multiplications, as well as their generalizations have been studied for over a century and successfully applied in perturbation theory (as demonstrated in the previous talk). In this talk, we will discuss estimates for finite dimensional multilinear Schur multipliers underlying these applications.
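A small numerical sketch of the entrywise action mentioned above (illustrative only; the symbol M and the variable names are ad hoc choices):

import numpy as np

# A linear Schur multiplier with symbol M acts on a matrix X entrywise:
# (S_M X)_{jk} = M_{jk} X_{jk}, i.e. the Hadamard product of M and X.
M = np.array([[1.0, 0.5, 0.25],
              [0.5, 1.0, 0.5],
              [0.25, 0.5, 1.0]])
X = np.arange(1.0, 10.0).reshape(3, 3)

S_M_of_X = M * X  # NumPy's elementwise product is exactly this entrywise action
print(S_M_of_X)

# In perturbation theory the symbol M is typically a matrix of divided
# differences of a scalar function evaluated at eigenvalues, but any fixed
# symbol defines a Schur multiplier in the same way.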
