School of Mathematics Colloquium
Thursday, February 12, 2015 - 11:00am
1 hour (actually 50 minutes)
Dvoretzky's theorem tells us that if we put an arbitrary norm on n-dimensional Euclidean space, no matter what that normed space is like, if we pass to subspaces of dimension about log(n), the space looks pretty much Euclidean. A related measure-theoretic phenomenon has long been observed: the (one-dimensional) marginals of many natural high-dimensional probability distributions look approximately Gaussian. A natural question is whether this phenomenon persists for k-dimensional marginals with k growing with n, and if so, for how large a k? In this talk I will discuss a result showing that the phenomenon does indeed persist if k < 2log(n)/log(log(n)), and that this bound is sharp (even the 2!). The talk will not assume much background beyond basic probability and analysis; in particular, no prior knowledge of Dvoretzky's theorem is needed.
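A minimal numerical sketch (not part of the talk) of the one-dimensional case of the marginal phenomenon: a coordinate of a point drawn uniformly from the sphere of radius sqrt(n) in R^n is approximately standard Gaussian for large n. The distribution, sample sizes, and tolerances below are illustrative choices, not taken from the abstract.

```python
# Illustrative sketch: one-dimensional marginals of a natural
# high-dimensional distribution (uniform on the sphere of radius
# sqrt(n)) look approximately Gaussian. All parameters are
# arbitrary choices for demonstration.
import numpy as np

rng = np.random.default_rng(0)
n, samples = 1000, 50_000

# Sample uniformly from sqrt(n) * S^{n-1} by normalizing Gaussian vectors.
g = rng.standard_normal((samples, n))
points = np.sqrt(n) * g / np.linalg.norm(g, axis=1, keepdims=True)

# The one-dimensional marginal along the first coordinate direction.
marginal = points[:, 0]

# Its first two moments are close to those of a standard Gaussian.
mean, var = marginal.mean(), marginal.var()
```

With n = 1000, the empirical mean is near 0 and the variance near 1, consistent with the marginal being close to N(0, 1); the result discussed in the talk concerns how far this persists for k-dimensional marginals.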