Deep Learning in High Dimension: Neural Network Approximation of Analytic Functions in $L^2(\mathbb{R}^d,\gamma_d)$ (2111.07080v1)
Abstract: For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$ where $d\in\mathbb{N}\cup\{\infty\}$. Here $\gamma_d$ denotes the Gaussian product probability measure on $\mathbb{R}^d$. We consider in particular ReLU and ReLU${}^k$ activations for integer $k\geq 2$. For $d\in\mathbb{N}$, we show exponential convergence rates in $L^2(\mathbb{R}^d,\gamma_d)$. In case $d=\infty$, under suitable smoothness and sparsity assumptions on $f:\mathbb{R}^{\mathbb{N}}\to\mathbb{R}$, with $\gamma_\infty$ denoting an infinite (Gaussian) product measure on $\mathbb{R}^{\mathbb{N}}$, we prove dimension-independent expression rate bounds in the norm of $L^2(\mathbb{R}^{\mathbb{N}},\gamma_\infty)$. The rates only depend on quantified holomorphy of (an analytic continuation of) the map $f$ to a product of strips in $\mathbb{C}^d$. As an application, we prove expression rate bounds of deep ReLU-NNs for response surfaces of elliptic PDEs with log-Gaussian random field inputs.
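To make the approximation setting concrete, the sketch below illustrates (for a small, finite $d$) a ReLU${}^k$ network approximating an analytic function and a Monte Carlo estimate of its error in the $L^2(\mathbb{R}^d,\gamma_d)$ norm. The target function, the random-feature network, and the least-squares fit are illustrative assumptions, not the constructive proofs or rates from the paper.

```python
# Minimal sketch (assumptions, not the paper's construction): fit a two-layer
# ReLU^k random-feature network to an analytic target f on R^d and estimate the
# approximation error in L^2(R^d, gamma_d) by sampling the Gaussian measure.
import numpy as np

rng = np.random.default_rng(0)
d, k, width, n_train, n_test = 4, 2, 64, 4000, 20000

def f(x):
    # Illustrative analytic target f: R^d -> R (not from the paper).
    return np.exp(-0.5 * np.sum(x**2, axis=1))

def relu_k(z, k):
    # ReLU^k activation: max(0, z)^k, with k >= 2 as in the abstract.
    return np.maximum(z, 0.0) ** k

# Fixed random inner weights; only the linear output layer is fitted.
W = rng.standard_normal((d, width)) / np.sqrt(d)
b = rng.standard_normal(width)

def features(x):
    return relu_k(x @ W + b, k)

# Training samples drawn from the Gaussian product measure gamma_d.
x_train = rng.standard_normal((n_train, d))
coef, *_ = np.linalg.lstsq(features(x_train), f(x_train), rcond=None)

# Monte Carlo estimate of the L^2(R^d, gamma_d) error of the network.
x_test = rng.standard_normal((n_test, d))
err = np.sqrt(np.mean((features(x_test) @ coef - f(x_test)) ** 2))
print(f"estimated L2(gamma_{d}) error: {err:.3e}")
```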