Sharp Bounds Between Two Rényi Entropies of Distinct Positive Orders

(1605.00019)
Published Apr 29, 2016 in cs.IT and math.IT

Abstract

Many axiomatic definitions of entropy of a random variable, such as the Rényi entropy, are closely related to the $\ell_{\alpha}$-norm of its probability distribution. This study considers probability distributions on finite sets and examines the sharp bounds of the $\ell_{\beta}$-norm with a fixed $\ell_{\alpha}$-norm, $\alpha \neq \beta$, for $n$-dimensional probability vectors with an integer $n \ge 2$. From these results, we derive the sharp bounds of the Rényi entropy of positive order $\beta$ with a fixed Rényi entropy of another positive order $\alpha$. As applications, we investigate sharp bounds of Arimoto's mutual information of order $\alpha$ and Gallager's random coding exponents for uniformly focusing channels under the uniform input distribution.
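The norm-entropy relation underlying the abstract is the identity $H_{\alpha}(P) = \frac{\alpha}{1-\alpha} \log \lVert P \rVert_{\alpha}$ for $\alpha > 0$, $\alpha \neq 1$, where $\lVert P \rVert_{\alpha} = \left(\sum_i p_i^{\alpha}\right)^{1/\alpha}$. The following Python sketch (not from the paper; the function names are illustrative) computes a Rényi entropy through the $\ell_{\alpha}$-norm, so fixing one quantity fixes the other:

```python
import numpy as np

def l_alpha_norm(p, alpha):
    """l_alpha-norm of a probability vector p: (sum_i p_i^alpha)^(1/alpha)."""
    p = np.asarray(p, dtype=float)
    return np.sum(p ** alpha) ** (1.0 / alpha)

def renyi_entropy(p, alpha):
    """Renyi entropy of positive order alpha != 1, in nats.

    Uses the identity H_alpha(P) = (alpha / (1 - alpha)) * log ||P||_alpha,
    which follows directly from H_alpha(P) = log(sum_i p_i^alpha) / (1 - alpha).
    """
    assert alpha > 0 and alpha != 1
    return (alpha / (1.0 - alpha)) * np.log(l_alpha_norm(p, alpha))

# Example: an n-dimensional probability vector with n = 3.
p = [0.5, 0.3, 0.2]
print(renyi_entropy(p, alpha=0.5))  # order 1/2
print(renyi_entropy(p, alpha=2.0))  # order 2 (collision entropy)
```

Because the map between $\lVert P \rVert_{\alpha}$ and $H_{\alpha}(P)$ is monotone for each fixed $\alpha$, sharp bounds on the $\ell_{\beta}$-norm given the $\ell_{\alpha}$-norm translate directly into the paper's sharp bounds on $H_{\beta}$ given $H_{\alpha}$.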
