- Series
- Research Horizons Seminar
- Time
- Wednesday, March 28, 2018 - 12:10pm for 1 hour (actually 50 minutes)
- Location
- Skiles 006
- Speaker
- Wenjing Liao – Georgia Tech – wliao60@gatech.edu – http://people.math.gatech.edu/~wliao60/
- Organizer
- Adrian Perez Bustamante
Many data sets in image analysis and signal processing lie in a high-dimensional space but exhibit a low-dimensional structure. We are interested in building efficient representations of these data for the purposes of compression and inference. In the setting where a data set in $R^D$ consists of samples from a probability measure concentrated on or near an unknown $d$-dimensional manifold with $d$ much smaller than $D$, we consider two sets of problems: low-dimensional geometric approximations to the manifold and regression of a function on the manifold. In the first case, we construct multiscale low-dimensional empirical approximations to the manifold and give finite-sample performance guarantees. In the second case, we exploit these empirical geometric approximations of the manifold to construct multiscale approximations to the function. We prove finite-sample guarantees showing that we attain the same learning rates as if the function were defined on a Euclidean domain of dimension $d$. In both cases, our approximations can adapt to the regularity of the manifold or of the function, even when that regularity varies across scales or locations.
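The following is a minimal sketch, not the speaker's implementation, of the two ingredients the abstract describes: a piecewise-linear approximation of data near a $d$-dimensional manifold built from local PCA on a partition at several scales, and local linear regression of a function in the resulting chart coordinates. The circle data set, the k-means partition standing in for the multiscale construction, and all parameters are illustrative assumptions.

```python
# Hedged sketch: multiscale local-PCA approximation of data near a manifold
# in R^D, plus local linear regression of a function on the manifold.
# Everything below (data, partition, parameters) is an illustrative assumption.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic data: a circle (d = 1) isometrically embedded in R^D, with small noise.
D, d, n = 10, 1, 2000
t = rng.uniform(0.0, 2.0 * np.pi, n)
circle = np.column_stack([np.cos(t), np.sin(t)])        # 1-d manifold in R^2
Q, _ = np.linalg.qr(rng.standard_normal((D, 2)))        # random orthonormal embedding R^2 -> R^D
X = circle @ Q.T + 0.01 * rng.standard_normal((n, D))   # samples near the manifold in R^D
f = np.sin(3.0 * t)                                     # a smooth function on the manifold

def local_pca(points, d):
    """Return the cell's center and its top-d principal directions."""
    c = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - c, full_matrices=False)
    return c, Vt[:d]

for j in range(1, 6):                                   # scale j: partition into 2^j cells
    k = 2 ** j
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).labels_
    geom_err = reg_err = 0.0
    for cell in range(k):
        idx = labels == cell
        if idx.sum() <= d + 1:                          # skip degenerate cells
            continue
        pts, vals = X[idx], f[idx]
        c, V = local_pca(pts, d)
        coords = (pts - c) @ V.T                        # local chart coordinates in R^d
        proj = c + coords @ V                           # projection onto the local affine piece
        geom_err += np.sum((pts - proj) ** 2)
        A = np.column_stack([np.ones(idx.sum()), coords])
        beta, *_ = np.linalg.lstsq(A, vals, rcond=None) # local linear fit of f in the chart
        reg_err += np.sum((vals - A @ beta) ** 2)
    print(f"scale j={j} ({k} cells): geometric MSE {geom_err / n:.2e}, "
          f"regression MSE {reg_err / n:.2e}")
```

On this toy example both errors should shrink as the scale is refined, until the noise level is reached. The adaptivity the abstract refers to concerns choosing the scale locally, with guarantees, rather than using one fixed partition as this sketch does.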