Strengthening the Entropy Power Inequality

(1602.03033)
Published Feb 9, 2016 in cs.IT and math.IT

Abstract

We tighten the Entropy Power Inequality (EPI) when one of the random summands is Gaussian. Our strengthening is closely connected to the concept of strong data processing for Gaussian channels and generalizes the (vector extension of) Costa's EPI. This leads to a new reverse entropy power inequality and, as a corollary, sharpens Stam's inequality relating entropy power and Fisher information. Applications to network information theory are given, including a short, self-contained proof of the rate region for the two-encoder quadratic Gaussian source coding problem. Our argument is based on weak convergence and a technique employed by Geng and Nair for establishing Gaussian optimality via rotational invariance, which traces its roots to a 'doubling trick' that has been successfully used in the study of functional inequalities.
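For reference, the classical statements the abstract builds on are recalled below. These are standard results from the literature, written in the conventional notation (h for differential entropy, N for entropy power, J for Fisher information); the paper's strengthened versions appear in the text itself, not here. The entropy power of a random vector X in R^n with a density is

    N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}.

Shannon's EPI asserts that, for independent X and Y,

    N(X + Y) \ge N(X) + N(Y),

with equality if and only if X and Y are Gaussian with proportional covariances. Costa's EPI states that, for Z \sim \mathcal{N}(0, I_n) independent of X, the map

    t \mapsto N(X + \sqrt{t}\, Z)

is concave on t \ge 0, and Stam's inequality relates entropy power to Fisher information via

    N(X)\, J(X) \ge n.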
