
A lattice-based approach to the expressivity of deep ReLU neural networks (1902.11294v2)

Published 28 Feb 2019 in cs.LG, cs.IT, math.IT, and stat.ML

Abstract: We present new families of continuous piecewise linear (CPWL) functions in $\mathbb{R}^n$ having a number of affine pieces growing exponentially in $n$. We show that these functions can be seen as the high-dimensional generalization of the triangle wave function used by Telgarsky in 2016. We prove that they can be computed by ReLU networks with quadratic depth and linear width in the space dimension. We also investigate the approximation error of one of these functions by shallower networks and prove a separation result. The main difference between our functions and other constructions is their practical interest: they arise in the scope of channel coding. Hence, computing such functions amounts to performing a decoding operation.
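The one-dimensional baseline the abstract refers to is Telgarsky's triangle (hat) function: a single CPWL map computable by one ReLU layer whose $k$-fold composition has $2^k$ affine pieces on $[0,1]$, i.e. exponentially many pieces with linear depth. A minimal sketch (the function names are illustrative, not from the paper) that builds the triangle from ReLUs and verifies the piece count numerically:

```python
def relu(x):
    return max(x, 0.0)

def triangle(x):
    # Telgarsky's triangle wave as a ReLU combination:
    # t(x) = 2x on [0, 1/2], t(x) = 2(1 - x) on [1/2, 1], 0 outside.
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def count_linear_pieces(k, m=1 << 15):
    # Evaluate the k-fold composition t(t(...t(x))) on a fine dyadic
    # grid of [0, 1]; every breakpoint (a multiple of 2^-k) lands on a
    # grid point, so counting slope sign changes counts affine pieces.
    xs = [i / m for i in range(m + 1)]
    ys = []
    for x in xs:
        for _ in range(k):
            x = triangle(x)
        ys.append(x)
    slopes = [ys[i + 1] - ys[i] for i in range(m)]
    changes = sum(1 for i in range(m - 1)
                  if (slopes[i] > 0) != (slopes[i + 1] > 0))
    return changes + 1
```

Calling `count_linear_pieces(k)` returns `2**k` for small `k`, matching the exponential-pieces-from-composition phenomenon that the paper's lattice-based constructions generalize to $\mathbb{R}^n$.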

Authors (4)
  1. Vincent Corlay (28 papers)
  2. Joseph J. Boutros (20 papers)
  3. Philippe Ciblat (25 papers)
  4. Loic Brunel (3 papers)
Citations (3)
