Learning functions varying along an active subspace

SIAM Student Seminar
Friday, February 7, 2020 - 2:00pm (50 minutes)
Skiles 005
Hao Liu – GT Math
Jiaqi Yang

Many functions of interest live in a high-dimensional space but exhibit low-dimensional structures. This work studies regression of an $s$-Hölder function $f$ in $\mathbb{R}^D$ which varies along an active subspace of dimension $d$, where $d\ll D$. A direct approximation of $f$ in $\mathbb{R}^D$ to accuracy $\varepsilon$ requires a number of samples $n$ of order $\varepsilon^{-(2s+D)/s}$. In this work, we modify the Generalized Contour Regression (GCR) algorithm to estimate the active subspace and use piecewise polynomials for function approximation. GCR is among the best estimators of the active subspace, but its sample complexity is an open question. Our modified GCR improves the efficiency of the original GCR and, when $n$ is sufficiently large, achieves a mean squared estimation error of $O(n^{-1})$ for the active subspace. The mean squared regression error of $f$ is proved to be of order $\left(n/\log n\right)^{-\frac{2s}{2s+d}}$, where the exponent depends on the dimension $d$ of the active subspace rather than on the ambient dimension $D$. This result demonstrates that GCR is effective in learning functions with low-dimensional active subspaces. The convergence rate is validated through several numerical experiments.

This is a joint work with Wenjing Liao.
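To illustrate the setup, here is a minimal sketch of active subspace estimation. It is not the modified GCR algorithm of the talk; it uses the simpler gradient outer-product approach (PCA on finite-difference gradients), assuming a hypothetical test function $f(x) = g(A^\top x)$ whose active subspace $A$ we construct ourselves.

```python
import numpy as np

# Sketch (NOT the talk's modified GCR): estimate an active subspace by
# PCA on finite-difference gradients, assuming f(x) = g(A^T x).
rng = np.random.default_rng(0)
D, d, n = 10, 2, 2000          # ambient dim, active dim, sample size

# Hypothetical active subspace: f varies only along the first two axes.
A = np.zeros((D, d))
A[0, 0] = A[1, 1] = 1.0

def f(x):
    z = x @ A                   # project onto the active subspace
    return np.sin(z[..., 0]) + z[..., 1] ** 2

# Central finite-difference gradients at n random sample points.
X = rng.standard_normal((n, D))
h = 1e-5
G = np.empty((n, D))
for j in range(D):
    e = np.zeros(D)
    e[j] = h
    G[:, j] = (f(X + e) - f(X - e)) / (2 * h)

# Top-d eigenvectors of the average gradient outer product span the
# active subspace (eigh returns eigenvalues in ascending order).
C = G.T @ G / n
eigvals, eigvecs = np.linalg.eigh(C)
U = eigvecs[:, -d:]

# Subspace recovery error as distance between projection matrices
# (0 means exact recovery of span(A)).
err = np.linalg.norm(U @ U.T - A @ A.T)
print(f"subspace recovery error: {err:.2e}")
```

With exact gradients this recovers the active subspace up to numerical noise; GCR, in contrast, is designed for the regression setting where only noisy samples of $f$ are available, not its gradients.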