Emergent Mind

Exponential ReLU Neural Network Approximation Rates for Point and Edge Singularities

(2010.12217)
Published Oct 23, 2020 in math.NA, cs.LG, and cs.NA

Abstract

We prove exponential expressivity with stable ReLU Neural Networks (ReLU NNs) in $H^1(\Omega)$ for weighted analytic function classes in certain polytopal domains $\Omega$, in space dimension $d=2,3$. Functions in these classes are locally analytic on open subdomains $D\subset \Omega$, but may exhibit isolated point singularities in the interior of $\Omega$ or corner and edge singularities at the boundary $\partial \Omega$. The exponential expression rate bounds proved here imply uniform exponential expressivity by ReLU NNs of solution families for several elliptic boundary and eigenvalue problems with analytic data. The exponential approximation rates are shown to hold in space dimension $d = 2$ on Lipschitz polygons with straight sides, and in space dimension $d=3$ on Fichera-type polyhedral domains with plane faces. The constructive proofs indicate in particular that NN depth and size increase poly-logarithmically with respect to the target NN approximation accuracy $\varepsilon>0$ in $H^1(\Omega)$. The results cover in particular solution sets of linear, second order elliptic PDEs with analytic data and certain nonlinear elliptic eigenvalue problems with analytic nonlinearities and singular, weighted analytic potentials as arise in electron structure models. In the latter case, the functions correspond to electron densities that exhibit isolated point singularities at the positions of the nuclei. Our findings provide in particular a mathematical foundation for recently reported, successful uses of deep neural networks in variational electron structure algorithms.
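A standard building block behind exponential ReLU approximation rates of this kind (and a useful intuition for the poly-logarithmic depth/size scaling in the abstract) is Yarotsky's sawtooth construction, which approximates $x \mapsto x^2$ on $[0,1]$ to uniform accuracy $4^{-(m+1)}$ with a ReLU network of depth proportional to $m$. The sketch below is not the paper's construction; it is a minimal NumPy illustration of that classical building block, with all function names our own.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Triangular "sawtooth" unit on [0, 1], realized with three ReLUs:
    # hat(x) = 2x on [0, 1/2], 2(1 - x) on [1/2, 1], and 0 outside [0, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def square_approx(x, m):
    # Depth-m ReLU approximation of x**2 on [0, 1]:
    #   f_m(x) = x - sum_{s=1}^{m} g_s(x) / 4**s,
    # where g_s is hat composed with itself s times. f_m is the
    # piecewise-linear interpolant of x**2 on a uniform grid of step
    # 2**(-m), so the uniform error is 4**(-(m+1)).
    g = np.array(x, dtype=float, copy=True)
    out = np.array(x, dtype=float, copy=True)
    for s in range(1, m + 1):
        g = hat(g)
        out -= g / 4**s
    return out

# Uniform error on a fine grid: halves twice per extra sawtooth layer,
# i.e. accuracy improves exponentially in the network depth.
xs = np.linspace(0.0, 1.0, 100001)
errors = [np.max(np.abs(xs**2 - square_approx(xs, m))) for m in range(1, 7)]
```

Since products can be written via $xy = \tfrac{1}{2}\big((x+y)^2 - x^2 - y^2\big)$, this squaring unit yields ReLU approximations of multiplication, and from there of polynomials and analytic functions, with network size growing only poly-logarithmically in $1/\varepsilon$.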
