$L^p$ sampling numbers for the Fourier-analytic Barron space

(2208.07605)
Published Aug 16, 2022 in math.FA, cs.LG, and stat.ML

Abstract

In this paper, we consider Barron functions $f : [0,1]^d \to \mathbb{R}$ of smoothness $\sigma > 0$, which are functions that can be written as
\[
f(x) = \int_{\mathbb{R}^d} F(\xi) \, e^{2 \pi i \langle x, \xi \rangle} \, d\xi
\quad \text{with} \quad
\int_{\mathbb{R}^d} |F(\xi)| \cdot (1 + |\xi|)^{\sigma} \, d\xi < \infty .
\]
For $\sigma = 1$, these functions play a prominent role in machine learning, since they can be efficiently approximated by (shallow) neural networks without suffering from the curse of dimensionality. For these functions, we study the following question: Given $m$ point samples $f(x_1), \dots, f(x_m)$ of an unknown Barron function $f : [0,1]^d \to \mathbb{R}$ of smoothness $\sigma$, how well can $f$ be recovered from these samples, for an optimal choice of the sampling points and the reconstruction procedure? Denoting the optimal reconstruction error measured in $L^p$ by $s_m(\sigma; L^p)$, we show that
\[
m^{- \frac{1}{\max \{ p,2 \}} - \frac{\sigma}{d}}
\lesssim s_m(\sigma; L^p)
\lesssim (\ln (e + m))^{\alpha(\sigma,d) / p} \cdot m^{- \frac{1}{\max \{ p,2 \}} - \frac{\sigma}{d}} ,
\]
where the implied constants only depend on $\sigma$ and $d$ and where $\alpha(\sigma,d)$ stays bounded as $d \to \infty$.
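As a concrete illustration (our own sketch, not part of the paper), the short Python snippet below evaluates the polynomial decay exponent $\frac{1}{\max\{p,2\}} + \frac{\sigma}{d}$ that appears on both sides of the bound above; the function name rate_exponent and the chosen parameter values are ours.

    def rate_exponent(p: float, sigma: float, d: int) -> float:
        # Exponent r such that s_m(sigma; L^p) decays like m^{-r},
        # up to the logarithmic factor (ln(e + m))^{alpha(sigma, d) / p}.
        return 1.0 / max(p, 2.0) + sigma / d

    # The machine-learning-relevant case sigma = 1: the leading term
    # 1/max{p,2} is dimension-free; the dimension d enters only through
    # the additive term sigma/d.
    for d in (1, 10, 100):
        for p in (1.0, 2.0, 4.0):
            print(f"d = {d:>3}, p = {p}: rate m^(-{rate_exponent(p, 1.0, d):.3f})")

For $\sigma = 1$ this makes the abstract's point visible: as $d \to \infty$ the exponent tends to $\frac{1}{\max\{p,2\}}$ rather than to $0$, so the optimal sampling rate does not collapse with growing dimension.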
