
Worst-case recovery guarantees for least squares approximation using random samples

(arXiv:1911.10111)
Published Nov 22, 2019 in math.NA, cs.NA, and math.FA

Abstract

We construct a least squares approximation method for the recovery of complex-valued functions from a reproducing kernel Hilbert space on $D \subset \mathbb{R}^d$. The nodes are drawn at random for the whole class of functions and the error is measured in $L_2(D,\varrho_D)$. We prove worst-case recovery guarantees by explicitly controlling all the involved constants. This leads to new preasymptotic recovery bounds with high probability for the error of Hyperbolic Fourier Regression on multivariate data. In addition, we further investigate its counterpart Hyperbolic Wavelet Regression, also based on least squares, to recover non-periodic functions from random samples. Finally, we reconsider the analysis of a cubature method based on plain random points with optimal weights and reveal near-optimal worst-case error bounds with high probability. It turns out that this simple method can compete with the quasi-Monte Carlo methods in the literature that are based on lattices and digital nets.
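As a rough illustration of the least squares recovery scheme described above (not the paper's exact algorithm), the following sketch recovers a smooth 1-periodic function from i.i.d. random nodes by solving a least squares problem in a truncated Fourier basis — a one-dimensional, simplified analogue of Hyperbolic Fourier Regression. The target function, the number of modes, and the oversampling factor are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # A smooth 1-periodic test function (lies in the span of the basis below).
    return np.cos(2 * np.pi * x) + 0.5 * np.sin(4 * np.pi * x)

m = 17                      # number of Fourier modes k = -8, ..., 8 (illustrative)
n = 200                     # number of random nodes; oversampling n >> m
ks = np.arange(-(m // 2), m // 2 + 1)

# Nodes drawn i.i.d. uniformly on [0, 1), mirroring "plain random points".
x = rng.random(n)
A = np.exp(2j * np.pi * np.outer(x, ks))   # design matrix A[j, l] = e^{2*pi*i*k_l*x_j}
y = target(x)

# Least squares estimate of the Fourier coefficients: minimize ||A c - y||_2.
c, *_ = np.linalg.lstsq(A, y.astype(complex), rcond=None)

# Empirical L2([0,1]) error of the reconstruction on a fine grid.
grid = np.linspace(0, 1, 2048, endpoint=False)
approx = np.real(np.exp(2j * np.pi * np.outer(grid, ks)) @ c)
err = np.sqrt(np.mean((approx - target(grid)) ** 2))
print(f"empirical L2 error: {err:.2e}")
```

Since the test function lies exactly in the span of the chosen modes, the recovery error here is near machine precision; the paper's contribution is controlling the worst-case error, with high probability, over a whole function class when this is not the case.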
