
Semialgebraic Optimization for Lipschitz Constants of ReLU Networks (2002.03657v4)

Published 10 Feb 2020 in math.OC and cs.LG

Abstract: The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of a multilayer deep neural network. The novelty is to combine a polynomial lifting of the derivatives of ReLU functions with a weak generalization of Putinar's positivity certificate. This idea could also apply to other nearly sparse polynomial optimization problems in machine learning. We empirically demonstrate that our method provides a trade-off relative to the state-of-the-art linear programming approach, and in some cases we obtain better bounds in less time.
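The abstract compresses the construction, so here is a minimal sketch of the quantity being bounded, not the paper's SDP hierarchy. For a one-hidden-layer ReLU network f(x) = W2 relu(W1 x), wherever f is differentiable its Jacobian is W2 diag(t) W1 with t = relu'(W1 x) in {0,1}^n. Replacing the input-dependent t by a free vector in [0,1]^n is the "polynomial lifting" idea: the Lipschitz constant is then upper-bounded by a polynomial optimization problem over the hypercube, which Putinar-type (moment-SOS) hierarchies relax via semidefinite programming. The toy code below (the network sizes, the l2 norm, and brute-force enumeration are all illustrative assumptions, not the paper's setup) computes that lifted upper bound exactly by enumerating activation patterns and compares it with the loose norm-product bound; the paper's contribution is obtaining comparable bounds at scale without enumeration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_hidden, n_in = 8, 4            # tiny sizes so enumeration is feasible
W1 = rng.standard_normal((n_hidden, n_in))
W2 = rng.standard_normal((1, n_hidden))

# Loose but valid upper bound: product of spectral norms,
# since ||diag(t)||_2 <= 1 for t in [0,1]^n.
norm_product = np.linalg.norm(W2, 2) * np.linalg.norm(W1, 2)

# Lifted bound: max over t in [0,1]^n of ||W2 diag(t) W1||_2.
# The objective is convex in t, so the max is attained at a vertex
# of the hypercube; enumerating t in {0,1}^n is therefore exact
# for the *lifted* problem (it still upper-bounds the true
# Lipschitz constant, since not every pattern t is realizable).
lifted = 0.0
for bits in range(2 ** n_hidden):
    t = np.array([(bits >> i) & 1 for i in range(n_hidden)], dtype=float)
    lifted = max(lifted, np.linalg.norm((W2 * t) @ W1, 2))  # W2 * t == W2 @ diag(t)

print(f"norm-product bound: {norm_product:.4f}")
print(f"lifted bound:       {lifted:.4f}")
```

Enumeration is exponential in the hidden width, which is exactly why the paper replaces it with an SDP relaxation hierarchy that exploits the near-sparsity of the lifted polynomial problem.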

Authors (4)
  1. Tong Chen (200 papers)
  2. Jean-Bernard Lasserre (71 papers)
  3. Victor Magron (92 papers)
  4. Edouard Pauwels (50 papers)
Citations (3)
