Abstract

We generalize the family of $\alpha$-divergences using a pair of strictly comparable weighted means. In particular, we obtain the $1$-divergence in the limit case $\alpha\rightarrow 1$ (a generalization of the Kullback-Leibler divergence) and the $0$-divergence in the limit case $\alpha\rightarrow 0$ (a generalization of the reverse Kullback-Leibler divergence). We state the condition for a pair of quasi-arithmetic means to be strictly comparable, and report the formula for the quasi-arithmetic $\alpha$-divergences and their subfamily of bipower homogeneous $\alpha$-divergences, which belong to the class of Csiszár's $f$-divergences. Finally, we show that these generalized quasi-arithmetic $1$-divergences and $0$-divergences can be decomposed as the sum of generalized cross-entropies minus entropies, and rewritten as conformal Bregman divergences using monotone embeddings.
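The construction described in the abstract can be illustrated numerically. A weighted quasi-arithmetic mean is defined by a strictly monotone generator $f$ as $M_f(x, y; \alpha) = f^{-1}((1-\alpha) f(x) + \alpha f(y))$, and an $\alpha$-divergence can be built from a pair of comparable means (one dominating the other pointwise) by integrating their gap. The sketch below is an illustrative assumption, not the paper's exact formulation: choosing the identity generator (arithmetic mean) and the logarithm generator (geometric mean) recovers the standard $\alpha$-divergence between discrete positive measures, $D_\alpha(p:q) = \frac{1}{\alpha(1-\alpha)} \sum_i \left[(1-\alpha) p_i + \alpha q_i - p_i^{1-\alpha} q_i^{\alpha}\right]$.

```python
import math

def quasi_arithmetic_mean(f, finv, x, y, alpha):
    # Weighted quasi-arithmetic mean: M_f(x, y; alpha) = f^{-1}((1-alpha) f(x) + alpha f(y))
    return finv((1 - alpha) * f(x) + alpha * f(y))

def generalized_alpha_divergence(p, q, alpha,
                                 f=lambda t: t, finv=lambda t: t,
                                 g=math.log, ginv=math.exp):
    # Sketch of an alpha-divergence induced by a pair of comparable
    # quasi-arithmetic means M_f >= M_g (function names are hypothetical).
    # Defaults: f = identity (arithmetic mean), g = log (geometric mean),
    # a strictly comparable pair by the AM-GM inequality, recovering the
    # standard alpha-divergence between positive discrete measures.
    assert 0 < alpha < 1, "limit cases alpha -> 0, 1 need a separate formula"
    gap = 0.0
    for pi, qi in zip(p, q):
        M = quasi_arithmetic_mean(f, finv, pi, qi, alpha)  # dominating mean
        N = quasi_arithmetic_mean(g, ginv, pi, qi, alpha)  # dominated mean
        gap += M - N  # nonnegative term-by-term since M >= N
    return gap / (alpha * (1 - alpha))
```

Because the arithmetic mean dominates the geometric mean, each summand is nonnegative and the divergence vanishes iff $p = q$; the limit cases $\alpha \to 0$ and $\alpha \to 1$ mentioned in the abstract yield the generalized reverse Kullback-Leibler and Kullback-Leibler divergences, respectively.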
