
Neural Network Approximation: Three Hidden Layers Are Enough

(2010.14075)
Published Oct 25, 2020 in cs.LG, cs.NE, and stat.ML

Abstract

A three-hidden-layer neural network with super approximation power is introduced. This network is built with the floor function ($\lfloor x\rfloor$), the exponential function ($2^x$), the step function ($1_{x\geq 0}$), or their compositions as the activation function in each neuron, and hence we call such networks Floor-Exponential-Step (FLES) networks. For any width hyper-parameter $N\in\mathbb{N}^+$, it is shown that FLES networks with width $\max\{d,N\}$ and three hidden layers can uniformly approximate a Hölder continuous function $f$ on $[0,1]^d$ with an exponential approximation rate $3\lambda (2\sqrt{d})^{\alpha} 2^{-\alpha N}$, where $\alpha\in(0,1]$ and $\lambda>0$ are the Hölder order and constant, respectively. More generally, for an arbitrary continuous function $f$ on $[0,1]^d$ with a modulus of continuity $\omega_f(\cdot)$, the constructive approximation rate is $2\omega_f(2\sqrt{d})2^{-N}+\omega_f(2\sqrt{d}\,2^{-N})$. Moreover, we extend this result to general bounded continuous functions on a bounded set $E\subseteq\mathbb{R}^d$. As a consequence, this new class of networks overcomes the curse of dimensionality in approximation power when the variation of $\omega_f(r)$ as $r\rightarrow 0$ is moderate (e.g., $\omega_f(r)\lesssim r^{\alpha}$ for Hölder continuous functions), since the major term of concern in our approximation rate is essentially $\sqrt{d}$ times a function of $N$ independent of $d$ within the modulus of continuity. Finally, we extend our analysis to derive similar approximation results in the $L^p$-norm for $p\in[1,\infty)$ by replacing Floor-Exponential-Step activation functions with continuous activation functions.
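To make the rate concrete, here is a minimal sketch (not code from the paper; the function names and the toy loop are illustrative assumptions) that evaluates the three FLES activations and the Hölder-case bound $3\lambda (2\sqrt{d})^{\alpha} 2^{-\alpha N}$:

```python
import math

# The three FLES activations named in the abstract (function names are ours):
def floor_act(x: float) -> float:   # floor function, ⌊x⌋
    return float(math.floor(x))

def exp_act(x: float) -> float:     # exponential function, 2^x
    return 2.0 ** x

def step_act(x: float) -> float:    # step function, 1_{x >= 0}
    return 1.0 if x >= 0 else 0.0

def holder_rate(d: int, N: int, alpha: float, lam: float) -> float:
    """Uniform approximation rate for a Hölder(alpha, lam) function on [0,1]^d,
    as stated in the abstract: 3 * lam * (2*sqrt(d))^alpha * 2^(-alpha*N)."""
    return 3.0 * lam * (2.0 * math.sqrt(d)) ** alpha * 2.0 ** (-alpha * N)

# The dimension d enters only through the polynomial factor (2*sqrt(d))^alpha,
# while the decay in the width hyper-parameter N is exponential:
for d in (1, 10, 100, 1000):
    print(f"d={d:4d}  rate={holder_rate(d, N=20, alpha=1.0, lam=1.0):.3e}")
```

Under these assumptions, doubling $d$ multiplies the bound only by $2^{\alpha/2}$ (since $(2\sqrt{d})^{\alpha}\propto d^{\alpha/2}$), whereas each unit increase in $N$ multiplies it by $2^{-\alpha}$, which is the sense in which the rate avoids the curse of dimensionality.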
