Relative Deviation Learning Bounds and Generalization with Unbounded Loss Functions
(arXiv:1310.5796)
Published Oct 22, 2013 in cs.LG
Abstract
We present an extensive analysis of relative deviation bounds, including detailed proofs of two-sided inequalities and their implications. We also give detailed proofs of two-sided generalization bounds that hold in the general case of unbounded loss functions, under the assumption that some moment of the loss is bounded. These bounds are useful in the analysis of importance weighting and other learning tasks such as unbounded regression.
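To ground the terminology: a relative deviation bound controls the gap between the true risk R(h) and the empirical risk \widehat{R}_S(h) normalized by a power of the risk itself, which yields bounds that tighten when the empirical error is small. The display below is a schematic of the classical bounded-loss form in the style of Vapnik, not a quote of this paper's statements; the constants and the growth-function notation \Pi_H are the commonly cited ones rather than the paper's exact values.

% Schematic relative deviation bound for a loss taking values in [0, 1].
% R(h): true risk; \widehat{R}_S(h): empirical risk on a sample S of size m;
% \Pi_H(2m): growth function of the hypothesis set H on 2m points.
\[
\Pr\left[\,\sup_{h \in H} \frac{R(h) - \widehat{R}_S(h)}{\sqrt{R(h)}} > \epsilon\,\right]
\;\le\; 4\,\Pi_H(2m)\,\exp\!\left(-\frac{m\epsilon^2}{4}\right).
\]

Solving the schematic inequality for \epsilon gives a bound of the form R(h) \le \widehat{R}_S(h) + O(\sqrt{\widehat{R}_S(h)\,c_m} + c_m), where c_m collects the complexity and confidence terms. The paper's setting replaces the boundedness assumption with the condition that some moment \mathbb{E}[L(h, z)^\alpha] is finite, which is what makes the results applicable to importance weighting, where the reweighted loss is typically unbounded.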