Emergent Mind

Properties of a Generalized Divergence Related to Tsallis Relative Entropy

(1810.09503)
Published Oct 22, 2018 in cs.IT and math.IT

Abstract

In this paper, we investigate the partition inequality, joint convexity, and Pinsker's inequality for a divergence that generalizes the Tsallis relative entropy and the Kullback-Leibler divergence. The generalized divergence is defined in terms of a deformed exponential function, which replaces the Tsallis $q$-exponential. We also construct a family of probability distributions related to the generalized divergence. We find necessary and sufficient conditions for the partition inequality to be satisfied, and establish a sufficient condition for joint convexity. We prove that the generalized divergence satisfies the partition inequality and is jointly convex if, and only if, it coincides with the Tsallis relative entropy. As an application of the partition inequality, we derive a criterion for Pinsker's inequality.
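The abstract's central objects can be made concrete with the standard Tsallis relative entropy, $D_q(p\|r) = \frac{1}{q-1}\sum_i p_i\left[(p_i/r_i)^{q-1} - 1\right]$, which recovers the Kullback-Leibler divergence in the limit $q \to 1$. The sketch below is illustrative only (the function name and the discrete finite-support setting are assumptions, and it uses the standard Tsallis form rather than the paper's deformed-exponential generalization):

```python
import numpy as np

def tsallis_relative_entropy(p, r, q):
    """Standard Tsallis relative entropy D_q(p || r) for discrete distributions.

    D_q(p||r) = sum_i p_i * ((p_i/r_i)**(q-1) - 1) / (q - 1).
    As q -> 1 this recovers the Kullback-Leibler divergence
    sum_i p_i * log(p_i / r_i) (natural logarithm).

    Assumes p and r are strictly positive and sum to 1.
    (This is the classical q-exponential case, not the paper's
    generalized deformed-exponential divergence.)
    """
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    if np.isclose(q, 1.0):
        # KL divergence limit
        return float(np.sum(p * np.log(p / r)))
    return float(np.sum(p * ((p / r) ** (q - 1) - 1.0)) / (q - 1.0))

if __name__ == "__main__":
    p = np.array([0.2, 0.3, 0.5])
    r = np.array([0.4, 0.4, 0.2])
    for q in (0.5, 0.999, 1.0):
        print(f"q = {q}: D_q(p||r) = {tsallis_relative_entropy(p, r, q):.6f}")
```

For q near 1 the value approaches the KL divergence, and the divergence vanishes when p equals r, matching the behavior the generalized divergence is designed to extend.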
