A sharp upper bound for sampling numbers in $L_{2}$

(2204.12621)
Published Apr 26, 2022 in math.NA, cs.NA, and math.FA

Abstract

For a class $F$ of complex-valued functions on a set $D$, we denote by $g_n(F)$ its sampling numbers, i.e., the minimal worst-case error on $F$, measured in $L_2$, that can be achieved with a recovery algorithm based on $n$ function evaluations. We prove that there is a universal constant $c\in\mathbb{N}$ such that, if $F$ is the unit ball of a separable reproducing kernel Hilbert space, then
\[
g_{cn}(F)^2 \,\le\, \frac{1}{n}\sum_{k\geq n} d_k(F)^2,
\]
where $d_k(F)$ are the Kolmogorov widths (or approximation numbers) of $F$ in $L_2$. We also obtain similar upper bounds for more general classes $F$, including all compact subsets of the space of continuous functions on a bounded domain $D\subset \mathbb{R}^d$, and show that these bounds are sharp by providing examples where the converse inequality holds up to a constant. The results rely on the solution to the Kadison-Singer problem, which we extend to the subsampling of a sum of infinite rank-one matrices.
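
To see what the main bound yields in a typical situation, here is an illustrative computation (the polynomial decay model $d_k(F) \le C\,k^{-\alpha}$ with $\alpha > 1/2$ is an assumption made for this example, not part of the paper's statement). Estimating the tail sum by an integral comparison gives, for $n \ge 1$,
\[
% Illustrative consequence under the assumed decay d_k(F) <= C k^{-alpha}, alpha > 1/2.
g_{cn}(F)^2
\;\le\; \frac{1}{n}\sum_{k\geq n} d_k(F)^2
\;\le\; \frac{C^2}{n}\sum_{k\geq n} k^{-2\alpha}
\;\le\; \frac{C^2}{n}\left(n^{-2\alpha} + \int_{n}^{\infty} t^{-2\alpha}\,dt\right)
\;=\; \frac{C^2}{n}\left(n^{-2\alpha} + \frac{n^{1-2\alpha}}{2\alpha-1}\right)
\;\le\; \frac{2\alpha\,C^2}{2\alpha-1}\, n^{-2\alpha}.
\]
In other words, under this assumed decay, recovery from $cn$ well-chosen function evaluations attains the same polynomial rate $n^{-\alpha}$ in $L_2$ as the Kolmogorov widths, losing only a constant factor; the condition $\alpha > 1/2$ is exactly what makes the tail $\sum_{k\geq n} d_k(F)^2$ finite.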
