A sharp upper bound for sampling numbers in $L_{2}$ (2204.12621v2)
Abstract: For a class $F$ of complex-valued functions on a set $D$, we denote by $g_n(F)$ its sampling numbers, i.e., the minimal worst-case error on $F$, measured in $L_2$, that can be achieved with a recovery algorithm based on $n$ function evaluations. We prove that there is a universal constant $c\in\mathbb{N}$ such that, if $F$ is the unit ball of a separable reproducing kernel Hilbert space, then \[ g_{cn}(F)^2 \,\le\, \frac{1}{n}\sum_{k\geq n} d_k(F)^2, \] where $d_k(F)$ are the Kolmogorov widths (or approximation numbers) of $F$ in $L_2$. We also obtain similar upper bounds for more general classes $F$, including all compact subsets of the space of continuous functions on a bounded domain $D\subset \mathbb{R}^d$, and show that these bounds are sharp by providing examples where the converse inequality holds up to a constant. The results rely on the solution to the Kadison-Singer problem, which we extend to the subsampling of a sum of infinite rank-one matrices.
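To see what the main bound yields in a standard situation, here is a short worked consequence (an illustration under an assumed decay of the widths, not a claim taken from the abstract): if $d_k(F)\le C\,k^{-\alpha}$ for some $\alpha>1/2$, then comparing the tail sum with an integral gives
\[
g_{cn}(F)^2 \,\le\, \frac{1}{n}\sum_{k\ge n} d_k(F)^2
\,\le\, \frac{C^2}{n}\Big(n^{-2\alpha}+\int_n^\infty t^{-2\alpha}\,dt\Big)
\,\le\, \frac{2\alpha}{2\alpha-1}\,C^2\,n^{-2\alpha},
\]
so under this assumption the sampling numbers decay at the same polynomial rate $n^{-\alpha}$ as the Kolmogorov widths, losing only a constant factor.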