
Two Measures of Dependence (1607.02330v4)

Published 8 Jul 2016 in cs.IT and math.IT

Abstract: Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order $\alpha$ and the relative $\alpha$-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order $\alpha$ is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
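The reduction to Shannon's mutual information at $\alpha = 1$ can be illustrated numerically. The sketch below is not the paper's exact definitions of its two measures; it only shows the underlying limit: the Rényi divergence $D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log\sum_x P(x)^\alpha Q(x)^{1-\alpha}$ between a joint distribution and the product of its marginals tends, as $\alpha \to 1$, to the Kullback–Leibler divergence, which for that pair of distributions is exactly $I(X;Y)$. The joint distribution `pxy` is an arbitrary example chosen for illustration.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(P||Q) between two discrete distributions.

    At alpha = 1 this is the Kullback-Leibler divergence (the alpha -> 1 limit).
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        mask = p > 0  # terms with p(x) = 0 contribute nothing
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Example joint distribution of a pair (X, Y) and the product of its marginals.
pxy = np.array([[0.3, 0.1],
                [0.1, 0.5]])
px = pxy.sum(axis=1)
py = pxy.sum(axis=0)
prod = np.outer(px, py)

# At alpha = 1, the divergence from the product of the marginals
# equals Shannon's mutual information I(X; Y) (in nats).
mi = renyi_divergence(pxy.ravel(), prod.ravel(), 1.0)

# For alpha near 1, the order-alpha divergence approaches I(X; Y).
near_one = renyi_divergence(pxy.ravel(), prod.ravel(), 1.001)
```

Note that this only demonstrates the divergence-level limit; the paper's two measures are built from such divergences (with additional optimizations over distributions) and each reduces to $I(X;Y)$ at $\alpha = 1$ in its own way.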

Citations (35)
