Seminars and Colloquia by Series

Reducing Isotropy to KLS: An $n^3\psi^2$ Volume Algorithm

Series
High Dimensional Seminar
Time
Wednesday, September 16, 2020 - 15:15 for 1 hour (actually 50 minutes)
Location
https://us02web.zoom.us/j/88203571169
Speaker
Santosh Vempala, Georgia Tech

Please Note: The preceding talk will be given on Tuesday September 15 at 10:30 am via https://technion.zoom.us/j/99202255210. More info here: http://people.math.gatech.edu/~glivshyts6/AGAonline.html

In this follow-up to the talk at the AGA seminar, we will discuss some aspects of a new algorithm for rounding and volume computation, including its proof, an efficient implementation for polytopes, and open questions. We will begin with a recap of the algorithm.

Joint work with He Jia, Aditi Laddha and Yin Tat Lee. 
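The algorithm from the talk is not reproduced here, but as a rough illustration of what "rounding" means, the following Python sketch puts a polytope $\{x : Ax \le b\}$ into near-isotropic position using approximately uniform samples from a hit-and-run walk. The example polytope, walk lengths, and tolerances are arbitrary choices for the sketch, not the speaker's algorithm or implementation.

```python
import numpy as np

def hit_and_run(A, b, x0, n_samples, burn_in=500, rng=None):
    """Approximately uniform samples from a bounded polytope {x : A x <= b},
    via the hit-and-run random walk started at an interior point x0."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = []
    for step in range(burn_in + n_samples):
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)
        Ad, slack = A @ d, b - A @ x
        with np.errstate(divide="ignore", invalid="ignore"):
            t = slack / Ad
        t_max = t[Ad > 1e-12].min()   # chord endpoint in direction +d
        t_min = t[Ad < -1e-12].max()  # chord endpoint in direction -d
        x = x + rng.uniform(t_min, t_max) * d
        if step >= burn_in:
            samples.append(x.copy())
    return np.array(samples)

def rounding_transform(samples):
    """Estimate an affine map y = T (x - mu) that makes the body near-isotropic,
    from the sample mean and the inverse square root of the sample covariance."""
    mu = samples.mean(axis=0)
    w, V = np.linalg.eigh(np.cov(samples, rowvar=False))
    T = V @ np.diag(1.0 / np.sqrt(np.maximum(w, 1e-12))) @ V.T
    return mu, T

# Illustrative polytope: a skewed box {0 <= x_i <= s_i} in R^5.
n = 5
s = np.linspace(1.0, 10.0, n)
A = np.vstack([np.eye(n), -np.eye(n)])
b = np.concatenate([s, np.zeros(n)])

pts = hit_and_run(A, b, x0=s / 2, n_samples=3000)
mu, T = rounding_transform(pts)

# Sanity check: sample from the transformed polytope {y : A inv(T) y <= b - A mu}
# and verify that its covariance is close to the identity.
A2, b2 = A @ np.linalg.inv(T), b - A @ mu
fresh = hit_and_run(A2, b2, x0=np.zeros(n), n_samples=3000)
print("eigenvalues of the covariance after rounding (should be near 1):")
print(np.round(np.linalg.eigvalsh(np.cov(fresh, rowvar=False)), 2))
```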

Link to attend: https://us02web.zoom.us/j/88203571169

Singularity of sparse Bernoulli matrices with $p$ close to $\log(n)/n$

Series
High Dimensional Seminar
Time
Wednesday, September 9, 2020 - 15:00 for 1 hour (actually 50 minutes)
Location
Join Zoom Meeting https://us02web.zoom.us/j/88203571169 Meeting ID: 882 0357 1169
Speaker
Han Huang, Georgia Tech

It has been conjectured that for sufficiently large $n$ and $p = p_n \in [\log(n)/n, 1/2)$, the probability that an $n\times n$ Bernoulli($p$) matrix $A$ is singular equals the probability that $A$ contains a zero row or a zero column, up to a negligible error.

This conjecture was recently proved by Litvak and Tikhomirov, using a new tool they developed, in the regime $C\log(n)/n < p < 1/C$ for some universal constant $C>1$. For $p = (1+o(1))\log(n)/n$ it also holds, by a result of Basak and Rudelson. In this talk, we will discuss how to extend these results to fill the gap between the two regimes ($1\le pn/\log(n) <\infty$).
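For small $n$ the conjectured equality can be probed numerically. The Python sketch below is an illustration only (with $n$, $p$, and the number of trials chosen arbitrarily, far from the asymptotic regime); it compares the empirical frequency of singularity with that of a zero row or column.

```python
import numpy as np

def singular_vs_zero_line(n, p, trials, rng=None):
    """Monte Carlo comparison of P(A singular) and P(A has a zero row or column)
    for an n x n Bernoulli(p) matrix A."""
    rng = np.random.default_rng(0) if rng is None else rng
    singular = zero_line = 0
    for _ in range(trials):
        A = (rng.random((n, n)) < p).astype(float)
        if np.linalg.matrix_rank(A) < n:
            singular += 1
        if (A.sum(axis=0) == 0).any() or (A.sum(axis=1) == 0).any():
            zero_line += 1
    return singular / trials, zero_line / trials

# Illustrative parameters: n is small so both events occur often enough to estimate.
n = 30
p = np.log(n) / n          # sparse regime at the boundary of the conjectured range
ps, pz = singular_vs_zero_line(n, p, trials=2000)
print(f"P(singular) ~ {ps:.3f}   P(zero row or column) ~ {pz:.3f}")
```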

Robustly Clustering a Mixture of Gaussians

Series
High Dimensional Seminar
Time
Wednesday, March 11, 2020 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Santosh Vempala, Georgia Tech

We give an efficient algorithm for robustly clustering a mixture of two arbitrary Gaussians, a central open problem in the theory of computationally efficient robust estimation, assuming only that the means of the component Gaussians are well-separated or their covariances are well-separated. Our algorithm and analysis extend naturally to robustly clustering mixtures of well-separated logconcave distributions. The mean separation required is close to the smallest possible to guarantee that most of the measure of the component Gaussians can be separated by some hyperplane (for covariances, it is the same condition in the degree-two polynomial kernel). Our main tools are a new identifiability criterion based on isotropic position, and a corresponding Sum-of-Squares convex programming relaxation. This is joint work with He Jia.
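The robust Sum-of-Squares algorithm from the talk is not reproduced here. The Python sketch below, with made-up means and covariances, only illustrates the separation condition in the easiest, corruption-free case: for two Gaussians with well-separated means, the top eigenvector of the empirical covariance already yields a separating hyperplane.

```python
import numpy as np

rng = np.random.default_rng(1)
d, m = 20, 2000

# Two illustrative Gaussian components with well-separated means (no corruption here).
mu1, mu2 = np.zeros(d), np.zeros(d)
mu2[0] = 8.0                       # separation along the first coordinate
X = np.vstack([rng.multivariate_normal(mu1, np.eye(d), m),
               rng.multivariate_normal(mu2, 2.0 * np.eye(d), m)])
labels = np.repeat([0, 1], m)

# Center the data and take the top eigenvector of the empirical covariance:
# for a mean-separated mixture it aligns with the direction between the means.
Xc = X - X.mean(axis=0)
w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
v = V[:, -1]                       # eigenvector of the largest eigenvalue

# Cluster by the sign of the projection, i.e. by a hyperplane orthogonal to v.
pred = (Xc @ v > 0).astype(int)
acc = max(np.mean(pred == labels), np.mean(pred != labels))  # cluster labels are defined up to swap
print(f"clustering accuracy with a single hyperplane: {acc:.3f}")
```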

The A. D. Aleksandrov problem of existence of convex hypersurfaces in space with given integral Gaussian curvature and optimal transport on the sphere

Series
High Dimensional Seminar
Time
Wednesday, March 4, 2020 - 15:00 for 1 hour (actually 50 minutes)
Location
Speaker
Vladimir Oliker, Emory University

In his book Convex Polyhedra, ch. 7 (end of subsection 2), A. D. Aleksandrov raised the general question of finding variational statements and proofs of existence of convex polytopes with given geometric data. As an example of a geometric problem in which a variational approach was successfully applied, Aleksandrov cites the Minkowski problem. He also mentions the Weyl problem of isometric embedding, for which a variational approach was proposed (but not fully developed or completed) by W. Blaschke and G. Herglotz. The first goal of this talk is to give a variational formulation and solution of the problem of existence and uniqueness of a closed convex hypersurface in Euclidean space with prescribed integral Gaussian curvature (also posed by Aleksandrov, who solved it using topological methods). The second goal of this talk is to show that in variational form the Aleksandrov problem is closely connected with the theory of optimal mass transport on the sphere, with a cost function and constraints arising naturally from geometric considerations.

Gaussian methods in randomized Dvoretzky theorem

Series
High Dimensional Seminar
Time
Wednesday, February 26, 2020 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Petros Valettas, University of Missouri, Columbia

The cornerstone of the local theory of Banach spaces is Dvoretzky’s theorem, which asserts that almost Euclidean structure is locally present in any high-dimensional normed space. The random version of this remarkable phenomenon was put forth by V. Milman in the 70’s, who employed the concentration of measure on the sphere. The purpose of the talk is to present how Gaussian tools from high-dimensional probability (e.g., Gaussian convexity, hypercontractivity, superconcentration) can be exploited to obtain optimal results in random forms of Dvoretzky’s theorem. Based on joint work(s) with Grigoris Paouris and Konstantin Tikhomirov.
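As a rough numerical companion (not from the talk; all dimensions and sample sizes below are arbitrary), the Python sketch estimates how close to Euclidean the $\ell_\infty$ norm is on a random $k$-dimensional subspace of $\mathbb{R}^n$, which is the kind of quantity the random Dvoretzky theorem controls.

```python
import numpy as np

def linf_ratio_on_random_subspace(n, k, n_dirs=2000, rng=None):
    """Draw a random k-dimensional subspace of R^n and return the max/min ratio of the
    l_infinity norm over random unit vectors of that subspace; a ratio near 1 means the
    subspace looks nearly Euclidean for l_infinity."""
    rng = np.random.default_rng(0) if rng is None else rng
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))    # orthonormal basis of the subspace
    U = rng.standard_normal((n_dirs, k))
    U /= np.linalg.norm(U, axis=1, keepdims=True)       # random unit vectors (subspace coordinates)
    norms = np.abs(U @ Q.T).max(axis=1)                 # l_inf norms of the embedded unit vectors
    return norms.max() / norms.min()

# The ratio degrades as k grows past the order-log(n) Dvoretzky dimension of l_infinity.
n = 2000
for k in (3, 20, 200):
    print(f"n={n}, k={k}: max/min l_inf ratio ~ {linf_ratio_on_random_subspace(n, k):.2f}")
```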
