Abstract

Besides standard Lagrange interpolation, i.e., interpolation of target functions from scattered point evaluations, positive definite kernel functions are well-suited for the solution of more general reconstruction problems. This is due to the intrinsic structure of the underlying reproducing kernel Hilbert space (RKHS). In fact, kernel-based interpolation has been applied to the reconstruction of bivariate functions from scattered Radon samples in computerized tomography (cf. Iske, 2018) and, moreover, to the numerical solution of elliptic PDEs (cf. Wenzel et al., 2022). As shown in various previous contributions, numerical algorithms and theoretical results from kernel-based Lagrange interpolation can be transferred to more general interpolation problems. In particular, greedy point selection methods were studied in Wenzel et al. (2022) for the special case of Sobolev kernels. In this paper, we aim to develop and analyze more general kernel-based interpolation methods for less restrictive settings. To this end, we first provide convergence results for generalized interpolation under minimalistic assumptions on both the selected kernel and the target function. We then prove convergence of popular greedy data selection algorithms for totally bounded sets of functionals. Supporting numerical results are provided for illustration.
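
To make the two ingredients of the abstract concrete, the following is a minimal sketch (not the paper's method or code) of kernel Lagrange interpolation combined with a simple P-greedy point selection loop. The Gaussian kernel, the bivariate target function, the candidate cloud, and all parameters are illustrative assumptions chosen for the example only; the paper treats more general kernels, functionals, and greedy schemes.

```python
# Illustrative sketch: kernel interpolation on greedily selected points.
# All choices below (Gaussian kernel, target f, shape parameter eps, number of
# points) are assumptions for demonstration, not taken from the paper.
import numpy as np

def gauss_kernel(X, Y, eps=3.0):
    # K(x, y) = exp(-eps^2 * ||x - y||^2), a standard positive definite kernel
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps**2 * d2)

def f(X):
    # illustrative smooth bivariate target on [0, 1]^2
    return np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])

rng = np.random.default_rng(0)
candidates = rng.random((2000, 2))   # scattered candidate points in [0, 1]^2

# P-greedy selection: repeatedly pick the candidate maximizing the power
# function, i.e. the point where the current interpolant is least reliable.
selected = [0]                       # start from an arbitrary candidate
for _ in range(49):
    Xs = candidates[selected]
    K_ss = gauss_kernel(Xs, Xs)
    K_cs = gauss_kernel(candidates, Xs)
    # squared power function: K(x, x) - k_s(x)^T K_ss^{-1} k_s(x), with K(x, x) = 1
    coeff = np.linalg.solve(K_ss, K_cs.T)
    power2 = 1.0 - np.einsum('ij,ji->i', K_cs, coeff)
    selected.append(int(np.argmax(power2)))

# Kernel interpolant on the selected points, evaluated on the full cloud.
Xs = candidates[selected]
alpha = np.linalg.solve(gauss_kernel(Xs, Xs), f(Xs))
s = gauss_kernel(candidates, Xs) @ alpha
print("max error on candidates:", np.abs(s - f(candidates)).max())
```

In a generalized-interpolation setting as discussed in the abstract, the point evaluations f(Xs) would be replaced by other linear functionals (e.g. Radon samples or PDE residuals), and the greedy criterion would be taken over a set of functionals rather than a point cloud.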
