Strengthened Information-theoretic Bounds on the Generalization Error

(1903.03787)
Published Mar 9, 2019 in cs.IT and math.IT

Abstract

The following problem is considered: given a joint distribution $P_{XY}$ and an event $E$, bound $P_{XY}(E)$ in terms of $P_X P_Y(E)$ (where $P_X P_Y$ is the product of the marginals of $P_{XY}$) and a measure of dependence of $X$ and $Y$. Such bounds have direct applications in the analysis of the generalization error of learning algorithms, where $E$ represents a large error event and the measure of dependence controls the degree of overfitting. Herein, bounds are demonstrated using several information-theoretic metrics, in particular: mutual information, lautum information, maximal leakage, and $J_\infty$. The mutual information bound can outperform comparable bounds in the literature by an arbitrarily large factor.
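
To make the setup concrete, the sketch below computes the quantities named in the abstract for a small discrete joint distribution: $P_{XY}(E)$, $P_X P_Y(E)$, and $I(X;Y)$. It also evaluates the classical baseline bound obtained by applying the data-processing inequality for KL divergence to the indicator of $E$, namely $d\big(P_{XY}(E) \,\|\, P_X P_Y(E)\big) \le D(P_{XY} \,\|\, P_X P_Y) = I(X;Y)$, which implies $P_{XY}(E) \le \big(I(X;Y) + \log 2\big) / \log\!\big(1/P_X P_Y(E)\big)$. This is a standard comparison bound of the kind the paper improves on, not the paper's strengthened bounds themselves; the particular joint distribution and event $E$ are arbitrary toy choices.

```python
import numpy as np

# Toy numeric illustration (not the paper's strengthened bounds): for a small
# discrete joint pmf P_{XY}, compute P_{XY}(E), P_X P_Y(E), I(X;Y), and the
# classical baseline bound from the data-processing inequality for KL
# divergence applied to the indicator of E:
#     d( P_{XY}(E) || P_X P_Y(E) ) <= D( P_{XY} || P_X P_Y ) = I(X;Y)
#     =>  P_{XY}(E) <= ( I(X;Y) + log 2 ) / log( 1 / P_X P_Y(E) ).
# The joint distribution and the event E below are arbitrary choices.

k = 4
# Strongly correlated joint: mixture of the "X = Y" distribution and uniform.
P_xy = 0.8 * np.eye(k) / k + 0.2 * np.ones((k, k)) / k**2

# Marginals and their product P_X P_Y.
P_x = P_xy.sum(axis=1)
P_y = P_xy.sum(axis=0)
P_prod = np.outer(P_x, P_y)

# Mutual information I(X;Y) = D(P_{XY} || P_X P_Y), in nats.
I_xy = float(np.sum(P_xy * np.log(P_xy / P_prod)))

# An example event E, encoded as a boolean mask over (x, y) pairs.
E = np.zeros((k, k), dtype=bool)
E[0, 0] = True                      # E = {X = 0, Y = 0}

p_joint = P_xy[E].sum()             # P_{XY}(E)
p_prod = P_prod[E].sum()            # P_X P_Y(E)

# Baseline bound (nontrivial only when P_X P_Y(E) is small).
baseline = (I_xy + np.log(2)) / np.log(1.0 / p_prod)

print(f"I(X;Y)         = {I_xy:.4f} nats")
print(f"P_XY(E)        = {p_joint:.4f}")
print(f"P_X P_Y(E)     = {p_prod:.4f}")
print(f"baseline bound = {baseline:.4f}")
```

With these toy choices the joint probability of the event exceeds its product-of-marginals probability, and the baseline bound sits above both, illustrating the kind of gap the paper's strengthened bounds are meant to tighten.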
