Deep Learning in High Dimension: Neural Network Approximation of Analytic Functions in $L^2(\mathbb{R}^d,\gamma_d)$

(2111.07080)
Published Nov 13, 2021 in math.NA, cs.NA, math.PR, and stat.ML

Abstract

For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$ where $d\in\mathbb{N}\cup\{\infty\}$. Here $\gamma_d$ denotes the Gaussian product probability measure on $\mathbb{R}^d$. We consider in particular ReLU and ReLU${}^k$ activations for integer $k\geq 2$. For $d\in\mathbb{N}$, we show exponential convergence rates in $L^2(\mathbb{R}^d,\gamma_d)$. In case $d=\infty$, under suitable smoothness and sparsity assumptions on $f:\mathbb{R}^{\mathbb{N}}\to\mathbb{R}$, with $\gamma_\infty$ denoting an infinite (Gaussian) product measure on $\mathbb{R}^{\mathbb{N}}$, we prove dimension-independent expression rate bounds in the norm of $L^2(\mathbb{R}^{\mathbb{N}},\gamma_\infty)$. The rates only depend on quantified holomorphy of (an analytic continuation of) the map $f$ to a product of strips in $\mathbb{C}^d$. As an application, we prove expression rate bounds of deep ReLU-NNs for response surfaces of elliptic PDEs with log-Gaussian random field inputs.
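
To make the error metric of the abstract concrete, the sketch below (not from the paper) estimates the Gaussian-weighted error $\|f-\Phi\|_{L^2(\mathbb{R}^d,\gamma_d)}$ by Monte Carlo sampling from $\gamma_d=N(0,I_d)$, for a one-hidden-layer ReLU network $\Phi$ fitted by least squares on random ReLU features. The target function, the network width, and the fitting procedure are illustrative assumptions, not the paper's construction.

```python
# A minimal sketch, assuming an illustrative analytic target and a randomly
# initialized one-hidden-layer ReLU network whose output weights are fitted
# by least squares; only the Gaussian-weighted L^2 error metric comes from
# the paper's setting.
import numpy as np

rng = np.random.default_rng(0)
d, width, n_train, n_test = 4, 256, 20_000, 100_000

def f(x):
    # An entire (hence analytic) target function on R^d (illustrative choice).
    return np.exp(x.sum(axis=1) / d)

def relu_features(x, W, b):
    # ReLU ridge features max(0, x W + b); one column per hidden unit.
    return np.maximum(0.0, x @ W + b)

# Random first-layer weights and biases; only the output layer is fitted,
# so Phi(x) = sum_j c_j * ReLU(w_j . x + b_j) is a ReLU network.
W = rng.normal(size=(d, width))
b = rng.normal(size=width)

x_train = rng.normal(size=(n_train, d))  # samples from gamma_d = N(0, I_d)
c, *_ = np.linalg.lstsq(relu_features(x_train, W, b), f(x_train), rcond=None)

# Monte Carlo estimate of ||f - Phi||^2_{L^2(R^d, gamma_d)}
#   = E_{x ~ N(0, I_d)} |f(x) - Phi(x)|^2.
x_test = rng.normal(size=(n_test, d))
err2 = np.mean((f(x_test) - relu_features(x_test, W, b) @ c) ** 2)
print(f"estimated L2(gamma_d) error: {np.sqrt(err2):.3e}")
```

Increasing the width in such an experiment gives a rough empirical counterpart to the expression rate bounds proved in the paper, which quantify how fast this error can decay with network size for analytic $f$.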
