Rényi Divergence and Majorization (1001.4448v3)
Abstract: Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including its relation to some other distances. We show how Rényi divergence appears when the theory of majorization is generalized from the finite to the continuous setting. Finally, Rényi divergence plays a role in analyzing the number of binary questions required to guess the values of a sequence of random variables.
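For readers unfamiliar with the quantity, the standard definition of the order-α Rényi divergence between discrete distributions P and Q is D_α(P‖Q) = (1/(α−1)) log Σᵢ pᵢ^α qᵢ^{1−α}, with the Kullback-Leibler divergence recovered in the limit α → 1. The following minimal sketch (not taken from the paper; the function name and example distributions are illustrative) computes this quantity and shows the KL limit numerically:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Order-alpha Renyi divergence D_alpha(P||Q) for discrete distributions.

    Assumes q_i > 0 wherever p_i > 0. For alpha = 1 the Kullback-Leibler
    divergence (the alpha -> 1 limit) is returned instead.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        # KL divergence: sum over the support of P
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    # General case: (1 / (alpha - 1)) * log sum p^alpha * q^(1 - alpha)
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

# Illustrative distributions (hypothetical, not from the paper)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
for a in (0.5, 0.999, 1.0, 2.0):
    print(f"alpha = {a:>5}: D_alpha(P||Q) = {renyi_divergence(p, q, a):.6f}")
```

Running this shows that the value at α = 0.999 is close to the KL value at α = 1, consistent with the limiting behaviour mentioned above.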