Sharp moment-entropy inequalities and capacity bounds for log-concave distributions

(1811.00345)
Published Nov 1, 2018 in cs.IT, math.IT, and math.PR

Abstract

We show that the uniform distribution minimizes entropy among all one-dimensional symmetric log-concave distributions with fixed variance, and we prove generalizations of this fact to Rényi entropies of orders less than 1 and to moment constraints involving $p$-th absolute moments with $p \leq 2$. As consequences, we give new capacity bounds for additive noise channels with symmetric log-concave noise, as well as for timing channels involving positive signal and noise where the noise has a decreasing log-concave density. In particular, we show that the capacity of an additive noise channel with symmetric, log-concave noise under an average power constraint is at most 0.254 bits per channel use greater than the capacity of an additive Gaussian noise channel with the same noise power. Consequences for reverse entropy power inequalities and connections to the slicing problem in convex geometry are also discussed.
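To make the headline bound concrete, here is a minimal numerical sketch (the function names and the SNR value are illustrative assumptions; the 0.254-bit gap is the constant stated in the abstract). It computes the standard Gaussian-channel capacity and the paper's additive upper bound for a channel whose noise is symmetric and log-concave with the same variance:

```python
import math

def awgn_capacity(P, sigma2):
    """Shannon capacity of the additive Gaussian noise channel,
    in bits per channel use, under average power constraint P
    and noise variance sigma2: 0.5 * log2(1 + P / sigma2)."""
    return 0.5 * math.log2(1 + P / sigma2)

def log_concave_capacity_upper_bound(P, sigma2, gap_bits=0.254):
    """Upper bound from the abstract: with symmetric log-concave
    noise of variance sigma2, capacity exceeds the Gaussian-noise
    capacity by at most ~0.254 bits per channel use."""
    return awgn_capacity(P, sigma2) + gap_bits

# Illustrative example at SNR = 10 (P = 10, sigma2 = 1):
print(awgn_capacity(10.0, 1.0))                     # ~1.730 bits/use
print(log_concave_capacity_upper_bound(10.0, 1.0))  # ~1.984 bits/use
```

Note that the gap is additive and independent of SNR, so it matters most at low SNR, where 0.254 bits is a large fraction of the Gaussian capacity.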
