
On the generalization of Bayesian deep nets for multi-class classification (2002.09866v1)

Published 23 Feb 2020 in cs.LG and stat.ML

Abstract: Generalization bounds, which assess the difference between the true risk and the empirical risk, have been studied extensively. However, to obtain bounds, current techniques rely on strict assumptions such as a uniformly bounded or a Lipschitz loss function. To avoid these assumptions, in this paper we propose a new generalization bound for Bayesian deep nets by exploiting the contractivity of the Log-Sobolev inequalities. Using these inequalities adds an additional loss-gradient norm term to the generalization bound, which is intuitively a surrogate of the model complexity. Empirically, we analyze the effect of this loss-gradient norm term using different deep nets.
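The loss-gradient norm term that the abstract describes can be made concrete with a small sketch. The snippet below is purely illustrative and not the paper's implementation: it computes the mean per-sample L2 norm of the cross-entropy loss gradient for a toy linear softmax classifier (the model, data, and function names are hypothetical), which is the kind of quantity one could track empirically as a complexity surrogate.

```python
import numpy as np

def softmax(z):
    """Numerically stable row-wise softmax."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mean_loss_grad_norm(W, X, y, num_classes):
    """Mean L2 norm of the per-sample cross-entropy loss gradient
    w.r.t. the weights W of a linear softmax model (toy surrogate)."""
    probs = softmax(X @ W)                       # (n, C) class probabilities
    onehot = np.eye(num_classes)[y]              # (n, C) one-hot labels
    diffs = probs - onehot                       # (n, C)
    # Per-sample gradient of the loss w.r.t. W is the outer product
    # x_i (p_i - y_i)^T, so its shape is (d, C) for each sample.
    grads = X[:, :, None] * diffs[:, None, :]    # (n, d, C)
    norms = np.linalg.norm(grads.reshape(len(X), -1), axis=1)
    return norms.mean()

# Tiny synthetic example: 8 samples, 3 features, 4 classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))
y = rng.integers(0, 4, size=8)
W = np.zeros((3, 4))
print(mean_loss_grad_norm(W, X, y, 4))
```

For a deep net one would compute the same quantity with automatic differentiation over all parameters; the closed-form outer product here only holds for the linear model.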

Authors (4)
  1. Yossi Adi (96 papers)
  2. Yaniv Nemcovsky (6 papers)
  3. Alex Schwing (13 papers)
  4. Tamir Hazan (39 papers)
Citations (1)