- Series
- High Dimensional Seminar
- Time
- Wednesday, April 10, 2019 - 3:00pm (50 minutes)
- Location
- Skiles 006
- Speaker
- Vladimir Koltchinskii – Georgia Tech – vladimir.koltchinskii@math.gatech.edu
- Organizer
- Galyna Livshyts

We discuss a general approach to the problem of estimating a smooth function $f(\theta)$ of a high-dimensional parameter $\theta$
of statistical models. In particular, in the case of $n$ i.i.d. Gaussian observations $X_1,\dots, X_n$ with mean $\mu$ and covariance
matrix $\Sigma,$ the unknown parameter is $\theta = (\mu, \Sigma)$ and our approach yields an estimator of $f(\theta)$
for a function $f$ of smoothness $s>0$ with mean squared error of the order $(\frac{1}{n} \vee (\frac{d}{n})^s) \wedge 1$
(provided that the Euclidean norm of $\mu$ and operator norms of $\Sigma,\Sigma^{-1}$ are uniformly bounded),
with the error rate being minimax optimal up to a log factor (joint result with Mayya Zhilova). The construction of optimal estimators
crucially relies on a new bias reduction method in high-dimensional problems,
and the bounds on the mean squared error are based on controlling finite differences of smooth functions along certain Markov chains
in high-dimensional parameter spaces as well as on concentration inequalities.
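As a rough numerical illustration (not part of the talk itself), the stated error rate $(\frac{1}{n} \vee (\frac{d}{n})^s) \wedge 1$ can be evaluated directly, recalling that $\vee$ denotes maximum and $\wedge$ minimum. The helper below is a hypothetical sketch, with `mse_rate` a name introduced here for illustration only.

```python
def mse_rate(n, d, s):
    """Order of the mean squared error, (1/n v (d/n)^s) ^ 1,
    where v is max and ^ is min.  Hypothetical illustration of the
    rate stated in the abstract; not code from the talk."""
    return min(max(1.0 / n, (d / n) ** s), 1.0)

# In the regime d << n the parametric rate 1/n dominates; as d grows
# toward n the term (d/n)^s takes over, and the rate is capped at 1.
```

For instance, with $n = 10$ and $d = 100$ the rate saturates at the trivial bound $1$, reflecting that the stated guarantee is only informative when the dimension is not too large relative to the sample size.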