Emergent Mind
Error bounds for deep ReLU networks using the Kolmogorov--Arnold superposition theorem
(1906.11945)
Published Jun 27, 2019 in math.NA, cs.LG, and cs.NA
Abstract
We prove a theorem concerning the approximation of multivariate functions by deep ReLU networks, for which the curse of dimensionality is lessened. Our theorem is based on a constructive proof of the Kolmogorov--Arnold superposition theorem, and on a subset of multivariate continuous functions whose outer superposition functions can be efficiently approximated by deep ReLU networks.
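For context, the Kolmogorov--Arnold superposition theorem referenced in the abstract states that every continuous multivariate function on the unit cube can be written as a finite composition of continuous univariate functions and addition. In its standard form:

```latex
f(x_1, \dots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \psi_{q,p}(x_p) \right),
```

where the inner functions $\psi_{q,p} \colon [0,1] \to \mathbb{R}$ are continuous and independent of $f$, and the outer functions $\Phi_q \colon \mathbb{R} \to \mathbb{R}$ are continuous and depend on $f$. The abstract's "outer superposition functions" refers to the $\Phi_q$: the paper's error bound applies to the subset of continuous functions whose $\Phi_q$ admit efficient deep ReLU approximations.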