Complexity, Statistical Risk, and Metric Entropy of Deep Nets Using Total Path Variation (1902.00800v2)

Published 2 Feb 2019 in stat.ML and cs.LG

Abstract: For any ReLU network there is a representation in which the sum of the absolute values of the weights into each node is exactly $1$, and the input layer variables are multiplied by a value $V$ coinciding with the total variation of the path weights. Implications are given for Gaussian complexity, Rademacher complexity, statistical risk, and metric entropy, all of which are shown to be proportional to $V$. There is no dependence on the number of nodes per layer, except for the number of inputs $d$. For estimation with sub-Gaussian noise, the mean square generalization error bounds that can be obtained are of order $V \sqrt{L + \log d}/\sqrt{n}$, where $L$ is the number of layers and $n$ is the sample size.
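The normalization claimed in the first sentence follows from the positive homogeneity of ReLU, $\mathrm{ReLU}(cz) = c\,\mathrm{ReLU}(z)$ for $c \ge 0$: a scale factor can be pulled out of each node and pushed into the next layer, until every node's incoming weight vector has $\ell_1$ norm exactly $1$ and the accumulated factor $V$ is the total path variation, i.e. the sum over all input-to-output paths of the absolute products of weights along the path. Below is a minimal NumPy sketch of this rescaling for a bias-free feedforward net with a single output; the helper names `forward` and `normalize` are illustrative, not from the paper, and the sketch absorbs $V$ at the output, which by homogeneity is equivalent to multiplying the inputs by $V$ as in the abstract.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def forward(weights, x):
    """Evaluate a bias-free feedforward ReLU net; `weights` is a list of matrices."""
    h = x
    for W in weights[:-1]:
        h = relu(W @ h)
    return weights[-1] @ h  # final layer is linear

def normalize(weights):
    """Rescale so the weights into each node sum to 1 in absolute value.

    Returns the rescaled matrices and the factor V that must multiply the
    output (equivalently, by positive homogeneity, the inputs) to recover
    the original function. V equals the total path variation: the sum over
    all input-to-output paths of the absolute products of weights.
    """
    new_weights, scale = [], np.ones(weights[0].shape[1])
    for W in weights:
        W = W * scale[None, :]             # absorb the scale pushed up from below
        r = np.abs(W).sum(axis=1)          # incoming l1 norm at each node
        r_safe = np.where(r == 0, 1.0, r)  # dead nodes: leave their weights at zero
        new_weights.append(W / r_safe[:, None])
        scale = r                          # push the scale into the next layer
    return new_weights, scale              # one entry per output node
```

A quick numerical check of the two identities the construction rests on, using hypothetical layer sizes:

```python
rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 4)),
           rng.standard_normal((8, 8)),
           rng.standard_normal((1, 8))]
x = rng.standard_normal(4)

nw, V = normalize(weights)
# The rescaled net computes the same function up to the factor V ...
assert np.allclose(forward(weights, x), V * forward(nw, x))
# ... and scaling the inputs by V is the same as scaling the output by V.
assert np.allclose(V * forward(nw, x), forward(nw, V * x))
```

After this normalization, the complexity of the network is carried entirely by the single number $V$ (plus the depth $L$ and input dimension $d$), which is why the Gaussian and Rademacher complexity, risk, and metric entropy bounds in the abstract scale with $V$ and not with the layer widths.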

Citations (29)
