Approximate Discrete Entropy Monotonicity for Log-Concave Sums

(2210.06624)
Published Oct 12, 2022 in math.PR, cs.IT, math.CO, and math.IT

Abstract

It is proven that a conjecture of Tao (2010) holds true for log-concave random variables on the integers: For every $n \geq 1$, if $X_1,\ldots,X_{n+1}$ are i.i.d. integer-valued, log-concave random variables, then $$ H(X_1+\cdots+X_{n+1}) \geq H(X_1+\cdots+X_n) + \frac{1}{2}\log{\Bigl(\frac{n+1}{n}\Bigr)} - o(1) $$ as $H(X_1) \to \infty$, where $H$ denotes the (discrete) Shannon entropy. The problem is reduced to the continuous setting by showing that if $U_1,\ldots,U_n$ are independent continuous uniforms on $(0,1)$, then $$ h(X_1+\cdots+X_n + U_1+\cdots+U_n) = H(X_1+\cdots+X_n) + o(1) $$ as $H(X_1) \to \infty$, where $h$ stands for the differential entropy. Explicit bounds for the $o(1)$-terms are provided.
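As a quick numerical illustration (not from the paper), one can check the claimed entropy increment for a concrete log-concave law on the integers. The sketch below uses a geometric distribution with small success probability as an assumed example of an integer-valued log-concave variable in the large-entropy regime, with an assumed truncation of the support; it computes $H(S_n)$ for $S_n = X_1+\cdots+X_n$ by convolving PMFs and compares the increments $H(S_{n+1}) - H(S_n)$ with the lower bound $\frac{1}{2}\log\frac{n+1}{n}$ from the abstract (entropies in nats).

```python
import numpy as np

def shannon_entropy(pmf):
    """Discrete Shannon entropy in nats, ignoring zero-probability atoms."""
    p = pmf[pmf > 0]
    return -np.sum(p * np.log(p))

# Assumed example distribution: geometric on {0, 1, 2, ...}, which is
# log-concave on the integers. A small success probability puts us in the
# large-H(X_1) regime of the theorem. The support is truncated at a point
# where the neglected tail mass is numerically negligible.
p_success = 0.05
support_size = 4000
k = np.arange(support_size)
pmf_x = (1 - p_success) ** k * p_success
pmf_x /= pmf_x.sum()  # renormalize after truncation

# Entropies of the partial sums S_n = X_1 + ... + X_n, obtained by
# repeatedly convolving the PMF of X_1 with itself.
entropies = []
pmf_s = pmf_x.copy()
for n in range(1, 6):
    entropies.append(shannon_entropy(pmf_s))
    pmf_s = np.convolve(pmf_s, pmf_x)

# Compare observed increments H(S_{n+1}) - H(S_n) against the
# conjectured lower bound (1/2) * log((n+1)/n).
for n in range(1, 5):
    increment = entropies[n] - entropies[n - 1]
    bound = 0.5 * np.log((n + 1) / n)
    print(f"n={n}: H(S_{n+1}) - H(S_{n}) = {increment:.4f}, "
          f"0.5*log((n+1)/n) = {bound:.4f}")
```

In this regime the observed increments should sit at or above the bound, consistent with the statement that the deficit is only $o(1)$ as $H(X_1) \to \infty$; the specific distribution, truncation level, and range of $n$ are choices made for the sketch, not quantities taken from the paper.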
