Emergent Mind

Weighted sampling recovery of functions with mixed smoothness

(2405.16400)
Published May 26, 2024 in math.NA and cs.NA

Abstract

We study sparse-grid linear sampling algorithms and their optimality for the approximate recovery of functions with mixed smoothness on $\mathbb{R}^d$ from a set of $n$ of their sampled values, in two different settings: (i) the functions to be recovered belong to weighted Sobolev spaces $W^r_{p,w}(\mathbb{R}^d)$ of mixed smoothness and the approximation error is measured in the norm of the weighted Lebesgue space $L_{q,w}(\mathbb{R}^d)$, and (ii) the functions to be recovered belong to Sobolev spaces with measure $W^r_p(\mathbb{R}^d; \mu_w)$ of mixed smoothness and the approximation error is measured in the norm of the Lebesgue space with measure $L_q(\mathbb{R}^d; \mu_w)$. Here the function $w$, a tensor-product Freud-type weight, is the weight in setting (i) and the density function of the measure $\mu_w$ in setting (ii). The optimality of linear sampling algorithms is investigated in terms of the relevant sampling $n$-widths. We construct sparse-grid linear sampling algorithms, completely different for settings (i) and (ii), which give upper bounds on the corresponding sampling $n$-widths. We prove that in the one-dimensional case these algorithms realize the right convergence rate of the sampling $n$-widths. In setting (ii), for the high-dimensional case ($d\ge 2$), we also establish the right convergence rate of the sampling $n$-widths for $1\le q \le 2 \le p \le \infty$ via a non-constructive method.
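For orientation, a hedged sketch of the central quantity (the paper's exact normalizations may differ): the linear sampling $n$-width of a function class $\mathbf{F}$ measured in a normed space $X$ is commonly defined as

$$\varrho_n(\mathbf{F}, X) := \inf_{x^1,\dots,x^n;\ \varphi_1,\dots,\varphi_n} \ \sup_{f \in \mathbf{F}} \Big\| f - \sum_{i=1}^n f(x^i)\,\varphi_i \Big\|_X,$$

where the infimum runs over sample points $x^i \in \mathbb{R}^d$ and functions $\varphi_i \in X$, so that $S_n f := \sum_{i=1}^n f(x^i)\,\varphi_i$ is a linear sampling algorithm using $n$ sampled values. A standard univariate example of a Freud-type weight is $w(x) = \exp(-a|x|^\lambda)$ with $a > 0$ and $\lambda > 1$ (the Gaussian case is $\lambda = 2$), and the tensor-product weight on $\mathbb{R}^d$ is $w(\mathbf{x}) = \prod_{j=1}^d w(x_j)$; in setting (ii) the measure has density $w$, i.e. $d\mu_w(\mathbf{x}) = w(\mathbf{x})\,d\mathbf{x}$. These formulas are a standard illustration only, not the paper's precise definitions.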
