
Approximation of Functions on Manifolds in High Dimension from Noisy Scattered Data

(2012.13804)
Published Dec 26, 2020 in math.NA and cs.NA

Abstract

In this paper, we consider the fundamental problem of approximating functions on a low-dimensional manifold embedded in a high-dimensional space, with noise affecting both the data locations and the function values. Due to the curse of dimensionality, as well as the presence of noise, the classical approximation methods applicable in low dimensions are less effective in the high-dimensional case. We propose a new approximation method that leverages the advantages of the Manifold Locally Optimal Projection (MLOP) method (introduced by Faigenbaum-Golovin and Levin in 2020) and the strengths of the method of Radial Basis Functions (RBF). The method is parametrization free, requires no knowledge of the manifold's intrinsic dimension, can handle noise and outliers in both the function values and the locations of the data points, and is applied directly in the high-dimensional space. We show that the complexity of the method is linear in the dimension of the manifold and squared-logarithmic in the dimension of the codomain of the function. Subsequently, we demonstrate the effectiveness of our approach by considering different manifold topologies and show the robustness of the method to various noise levels.
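To make the RBF building block referenced in the abstract concrete, below is a minimal sketch of regularized RBF approximation from noisy scattered data. This is not the authors' MLOP-RBF scheme; the Gaussian kernel, the shape parameter eps, the regularization parameter lam, and the toy data are all assumptions made for illustration only.

```python
import numpy as np

def rbf_fit(X, y, eps=1.0, lam=1e-3):
    """Fit Gaussian-RBF coefficients to (possibly noisy) samples y at points X."""
    # Pairwise squared distances between the scattered sample points.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    A = np.exp(-eps * d2)  # Gaussian kernel matrix
    # Ridge-style regularization damps the effect of noise in y (assumed choice).
    return np.linalg.solve(A + lam * np.eye(len(X)), y)

def rbf_eval(X_train, coeffs, X_new, eps=1.0):
    """Evaluate the fitted RBF approximant at new points X_new."""
    d2 = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    return np.exp(-eps * d2) @ coeffs

# Toy usage: noisy samples of f = sin(t) on a curve embedded in R^3,
# with perturbed point locations; purely illustrative, not from the paper.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, size=50)
X = np.stack([t, np.cos(t), np.sin(t)], axis=1) + 0.01 * rng.normal(size=(50, 3))
y = np.sin(t) + 0.05 * rng.normal(size=50)

coeffs = rbf_fit(X, y, eps=0.5, lam=1e-2)
y_hat = rbf_eval(X, coeffs, X, eps=0.5)
print("max training residual:", np.max(np.abs(y_hat - y)))
```

In a plain RBF sketch like this, the cost of forming and solving the kernel system grows with the ambient dimension and the number of samples; the paper's contribution is to combine RBF approximation with MLOP so the method works directly in high dimensions with complexity governed by the manifold's intrinsic dimension.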
