Minimum Width of Leaky-ReLU Neural Networks for Uniform Universal Approximation

(2305.18460)
Published May 29, 2023 in cs.LG, cs.NA, and math.NA

Abstract

The study of universal approximation properties (UAP) for neural networks (NN) has a long history. When the network width is unlimited, only a single hidden layer is sufficient for UAP. In contrast, when the depth is unlimited, the width for UAP needs to be at least the critical width $w^*_{\min}=\max(d_x, d_y)$, where $d_x$ and $d_y$ are the dimensions of the input and output, respectively. Recently, \cite{cai2022achieve} showed that a leaky-ReLU NN with this critical width can achieve UAP for $L^p$ functions on a compact domain $K$, \emph{i.e.,} the UAP for $L^p(K,\mathbb{R}^{d_y})$. This paper examines the uniform UAP for the function class $C(K,\mathbb{R}^{d_y})$ and gives the exact minimum width of the leaky-ReLU NN as $w_{\min}=\max(d_x, d_y)+\Delta(d_x, d_y)$, where $\Delta(d_x, d_y)$ is the number of additional dimensions needed for approximating continuous functions with diffeomorphisms via embedding. To obtain this result, we propose a novel lift-flow-discretization approach that shows that the uniform UAP has a deep connection with topological theory.
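To make the architecture in question concrete, here is a minimal NumPy sketch (not from the paper) of the forward pass of a deep, narrow leaky-ReLU network whose hidden layers all have width $w = \max(d_x, d_y) + \Delta$. The weights are random placeholders, and the function name `narrow_leaky_relu_nn` and the parameters `depth` and `delta` are illustrative choices, not the authors' construction.

```python
import numpy as np

def leaky_relu(z, alpha=0.1):
    """Leaky-ReLU activation, applied elementwise."""
    return np.where(z > 0, z, alpha * z)

def narrow_leaky_relu_nn(x, d_y, depth=8, delta=1, rng=None):
    """Forward pass of a deep, narrow leaky-ReLU network.

    Every hidden layer has width w = max(d_x, d_y) + delta, matching the
    form of the minimum width w_min = max(d_x, d_y) + Delta(d_x, d_y).
    Random weights are used purely to illustrate the shape of the network.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    d_x = x.shape[-1]
    w = max(d_x, d_y) + delta

    # Lift the input into the w-dimensional hidden space.
    h = x @ rng.standard_normal((d_x, w))
    # Alternate affine maps and leaky-ReLU at constant width w.
    for _ in range(depth):
        h = leaky_relu(h @ rng.standard_normal((w, w)) + rng.standard_normal(w))
    # Project down to the d_y-dimensional output.
    return h @ rng.standard_normal((w, d_y))

# Example: d_x = 3, d_y = 2, so every hidden layer has width max(3, 2) + 1 = 4.
y = narrow_leaky_relu_nn(np.ones((5, 3)), d_y=2)
print(y.shape)  # (5, 2)
```

The point of the sketch is that depth, not width, carries the approximation power: the width stays fixed at $w$ in every hidden layer, and only the number of layers grows with the desired accuracy.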
