
Learning functions varying along a central subspace (2001.07883v3)

Published 22 Jan 2020 in math.ST, stat.ML, and stat.TH

Abstract: Many functions of interest are in a high-dimensional space but exhibit low-dimensional structures. This paper studies regression of an $s$-H\"{o}lder function $f$ in $\mathbb{R}^D$ which varies along a central subspace of dimension $d$, where $d\ll D$. A direct approximation of $f$ in $\mathbb{R}^D$ with an $\varepsilon$ accuracy requires the number of samples $n$ to be in the order of $\varepsilon^{-(2s+D)/s}$. In this paper, we analyze the Generalized Contour Regression (GCR) algorithm for the estimation of the central subspace and use piecewise polynomials for function approximation. GCR is among the best estimators for the central subspace, but its sample complexity is an open question. We prove that GCR leads to a mean squared estimation error of $O(n^{-1})$ for the central subspace, if a variance quantity is exactly known. The estimation error of this variance quantity is also given in this paper. The mean squared regression error of $f$ is proved to be in the order of $\left(n/\log n\right)^{-\frac{2s}{2s+d}}$, where the exponent depends on the dimension of the central subspace $d$ instead of the ambient space $D$. This result demonstrates that GCR is effective in learning the low-dimensional central subspace. We also propose a modified GCR with improved efficiency. The convergence rate is validated through several numerical experiments.
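
To make the rate comparison in the abstract concrete, here is a minimal back-of-the-envelope sketch (not from the paper): it plugs illustrative values s = 1, d = 2, D = 100 and a target accuracy ε = 0.1 into the two sample-complexity orders quoted above, treating "ε accuracy" as a mean squared error of ε² and ignoring log factors and constants.

```python
# Hypothetical values chosen only for illustration; they do not come from the paper.
s, d, D = 1.0, 2.0, 100.0   # smoothness, central-subspace dimension, ambient dimension
eps = 0.1                   # target accuracy

# Direct nonparametric regression in R^D: n on the order of eps^{-(2s+D)/s}.
n_ambient = eps ** (-(2 * s + D) / s)

# Rate exploiting the central subspace: MSE ~ (n / log n)^{-2s/(2s+d)},
# so reaching MSE ~ eps^2 needs roughly n ~ eps^{-(2s+d)/s}, up to log factors.
n_subspace = eps ** (-(2 * s + d) / s)

print(f"ambient-space sample size:    ~ {n_ambient:.2e}")   # ~1e102
print(f"central-subspace sample size: ~ {n_subspace:.2e}")  # ~1e4
```

The contrast illustrates the point of the main theorem: the exponent in the rate depends on the central-subspace dimension d rather than the ambient dimension D.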

Authors (2)
  1. Hao Liu (497 papers)
  2. Wenjing Liao (42 papers)
Citations (3)
