Information-Theoretic Perspectives on Brascamp-Lieb Inequality and Its Reverse

(arXiv:1702.06260)
Published Feb 21, 2017 in cs.IT and math.IT

Abstract

We introduce an inequality that may be viewed as a generalization of both the Brascamp-Lieb inequality and its reverse (Barthe's inequality), and prove its information-theoretic (i.e., entropic) formulation. This result yields a unified approach to functional inequalities such as the variational formula for Rényi entropy, hypercontractivity and its reverse, strong data processing inequalities, and transportation-cost inequalities, which have recently gained popularity as tools in the proofs of coding theorems. We show that our information-theoretic setting is convenient for proving properties such as data processing, tensorization, convexity (Riesz-Thorin interpolation), and Gaussian optimality. In particular, we elaborate on a "doubling trick" used by Lieb and by Geng-Nair to prove several results on Gaussian optimality. Several applications are discussed, including a generalization of the Brascamp-Lieb inequality involving Gaussian random transformations, the determination of Wyner's common information for vector Gaussian sources, and the achievable rate region of certain key generation problems with vector Gaussian sources.
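For orientation, a minimal sketch of the classical inequality and its known entropic dual (in the sense of Carlen and Cordero-Erausquin) may help; this is standard background rather than the paper's generalized statement, and the maps $B_j$, exponents $c_j$, and differential entropy $h(\cdot)$ below follow the usual conventions, not notation taken from the paper. For surjective linear maps $B_j:\mathbb{R}^n\to\mathbb{R}^{n_j}$ and exponents $c_j>0$, the Brascamp-Lieb inequality asserts, for nonnegative integrable $f_j$,

\[
\int_{\mathbb{R}^n} \prod_{j=1}^m f_j(B_j x)^{c_j}\,\mathrm{d}x
\;\le\; C \prod_{j=1}^m \left( \int_{\mathbb{R}^{n_j}} f_j \right)^{c_j},
\]

and the best constant $C$ is characterized entropically by

\[
\log C \;=\; \sup_{X} \left[\, h(X) \;-\; \sum_{j=1}^m c_j\, h(B_j X) \,\right],
\]

where the supremum runs over random vectors $X$ on $\mathbb{R}^n$ with finite differential entropies. It is this type of functional-entropic duality that the paper extends so as to also capture Barthe's reverse inequality and the applications listed above.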
