Sharp Bounds Between Two Rényi Entropies of Distinct Positive Orders (1605.00019v2)
Abstract: Many axiomatic definitions of entropy of a random variable, such as the R\'enyi entropy, are closely related to the $\ell_{\alpha}$-norm of its probability distribution. This study considers probability distributions on finite sets and examines the sharp bounds of the $\ell_{\beta}$-norm with a fixed $\ell_{\alpha}$-norm, $\alpha \neq \beta$, for $n$-dimensional probability vectors with an integer $n \ge 2$. From these results, we derive the sharp bounds of the R\'enyi entropy of positive order $\beta$ with a fixed R\'enyi entropy of another positive order $\alpha$. As applications, we investigate sharp bounds of Arimoto's mutual information of order $\alpha$ and Gallager's random coding exponents for uniformly focusing channels under the uniform input distribution.
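As a minimal numerical sketch of the relation the abstract invokes, the R\'enyi entropy of positive order $\alpha \neq 1$ can be written in terms of the $\ell_{\alpha}$-norm as $H_{\alpha}(P) = \frac{\alpha}{1-\alpha} \ln \| P \|_{\alpha}$, with the order-1 limit recovering the Shannon entropy. The function names `lp_norm` and `renyi_entropy` below are illustrative, not from the paper.

```python
import numpy as np

def lp_norm(p, alpha):
    """l_alpha-norm of a probability vector p, for alpha > 0."""
    return np.sum(p ** alpha) ** (1.0 / alpha)

def renyi_entropy(p, alpha):
    """Renyi entropy of positive order alpha (in nats).

    For alpha != 1 this equals (alpha / (1 - alpha)) * ln ||p||_alpha;
    the alpha -> 1 limit is the Shannon entropy.
    """
    if np.isclose(alpha, 1.0):
        q = p[p > 0]
        return -np.sum(q * np.log(q))
    return (alpha / (1.0 - alpha)) * np.log(lp_norm(p, alpha))

# A 3-dimensional probability vector: entropies of two distinct orders.
p = np.array([0.5, 0.3, 0.2])
print(renyi_entropy(p, 0.5), renyi_entropy(p, 1.0), renyi_entropy(p, 2.0))
```

Fixing $\| P \|_{\alpha}$ pins down $H_{\alpha}(P)$ exactly, so the paper's sharp bounds on the $\ell_{\beta}$-norm given the $\ell_{\alpha}$-norm translate directly into sharp bounds on $H_{\beta}(P)$ given $H_{\alpha}(P)$.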