Two Measures of Dependence

(arXiv:1607.02330)
Published Jul 8, 2016 in cs.IT and math.IT

Abstract

Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order $\alpha$ and the relative $\alpha$-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order $\alpha$ is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
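To make the limiting behavior concrete, below is a minimal numerical sketch (not the authors' code). It implements the standard Rényi divergence $D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \sum_x P(x)^\alpha Q(x)^{1-\alpha}$ for discrete distributions and evaluates it between a joint pmf and the product of its marginals. The function names and the $2 \times 2$ example pmf are illustrative assumptions; note also that plugging in the true marginals is exact at $\alpha = 1$ (where it yields $I(X;Y)$), while a measure of the kind the abstract describes would in general minimize over all product distributions.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(p || q) in nats for discrete pmfs p, q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    if np.isclose(alpha, 1.0):
        # alpha -> 1 limit: the Kullback-Leibler divergence.
        return np.sum(p[mask] * np.log(p[mask] / q[mask]))
    return np.log(np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha))) / (alpha - 1.0)

def dependence_sketch(p_xy, alpha):
    """D_alpha between the joint pmf and the product of its marginals.

    Exact at alpha = 1 (Shannon mutual information); for other alpha,
    only a stand-in for a minimization over all product distributions.
    """
    prod = np.outer(p_xy.sum(axis=1), p_xy.sum(axis=0))
    return renyi_divergence(p_xy.ravel(), prod.ravel(), alpha)

# Joint pmf of (X, Y) on a 2x2 alphabet (an arbitrary example).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

for a in (0.5, 1.0, 2.0):
    print(f"alpha = {a}: {dependence_sketch(p_xy, a):.4f} nats")
# At alpha = 1 the printed value equals I(X;Y) for this joint pmf.
```

Running the loop shows how the quantity varies with the order $\alpha$ while passing through the mutual information at $\alpha = 1$, the common anchor point of both families described in the abstract.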

