
Deep ReLU network approximation of functions on a manifold

(1908.00695)
Published Aug 2, 2019 in stat.ML and cs.LG

Abstract

Whereas recovery of the manifold from data is a well-studied topic, approximation rates for functions defined on manifolds are less known. In this work, we study a regression problem with inputs on a $d^*$-dimensional manifold that is embedded into a space with potentially much larger ambient dimension. It is shown that sparsely connected deep ReLU networks can approximate a Hölder function with smoothness index $\beta$ up to error $\epsilon$ using on the order of $\epsilon^{-d^*/\beta}\log(1/\epsilon)$ many non-zero network parameters. As an application, we derive statistical convergence rates for the estimator minimizing the empirical risk over all possible choices of bounded network parameters.
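To give a rough sense of how the stated parameter bound scales, the following Python sketch evaluates the quantity $\epsilon^{-d^*/\beta}\log(1/\epsilon)$ for a few choices of intrinsic dimension $d^*$ and smoothness $\beta$. The constants hidden in the "order of" statement are omitted, so the numbers only indicate relative growth, not actual network sizes; the function name and sample values are illustrative, not from the paper.

```python
import math

def relu_param_bound(eps: float, d_star: int, beta: float) -> float:
    """Order-of-magnitude proxy for the number of non-zero parameters
    needed to approximate a beta-Hölder function on a d*-dimensional
    manifold to accuracy eps (multiplicative constants are dropped)."""
    return eps ** (-d_star / beta) * math.log(1.0 / eps)

# The bound depends on the intrinsic dimension d*, not on the
# (potentially much larger) ambient dimension of the embedding space.
for eps in (1e-1, 1e-2, 1e-3):
    for d_star, beta in ((2, 1.0), (2, 2.0), (4, 2.0)):
        print(f"eps={eps:.0e}, d*={d_star}, beta={beta}: "
              f"~{relu_param_bound(eps, d_star, beta):.3g}")
```

Note how, for fixed $\epsilon$, increasing the smoothness $\beta$ shrinks the bound while increasing $d^*$ inflates it, which is the sense in which the rate is governed by the intrinsic rather than the ambient dimension.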

