Rényi entropy and variance comparison for symmetric log-concave random variables
arXiv:2108.10100

Abstract
We show that for any $\alpha>0$ the R\'enyi entropy of order $\alpha$ is minimized, among all symmetric log-concave random variables with fixed variance, either by a uniform distribution or by a two-sided exponential distribution. The first case occurs for $\alpha \in (0,\alpha^*]$ and the second for $\alpha \in [\alpha^*,\infty)$, where $\alpha^*$ satisfies the equation $\frac{1}{\alpha^*-1}\log \alpha^* = \frac12 \log 6$, that is, $\alpha^* \approx 1.241$. Using these results, we prove that the one-sided exponential distribution minimizes the R\'enyi entropy of order $\alpha \geq 2$ among all log-concave random variables with fixed variance.
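The threshold $\alpha^*$ is defined implicitly by $\frac{1}{\alpha^*-1}\log \alpha^* = \frac12 \log 6$, which can be checked numerically. A minimal sketch (the helper names and the bisection bracket are our own choices, not from the paper): the left-hand side tends to $1 > \frac12\log 6 \approx 0.896$ as $\alpha \to 1^+$ and is decreasing, so a simple bisection on $(1, 2]$ locates the unique root.

```python
import math

def f(a):
    # (1/(a-1)) * log(a) - (1/2) * log(6); the root of f is alpha*.
    return math.log(a) / (a - 1.0) - 0.5 * math.log(6.0)

def alpha_star(lo=1.0001, hi=2.0, tol=1e-12):
    # f > 0 just above 1 (the limit there is 1) and f < 0 at 2,
    # so bisection converges to the unique root in (1, 2).
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(alpha_star(), 3))  # ≈ 1.241, matching the value quoted above
```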