Derivatives of mutual information in Gaussian channels

(2303.02500)
Published Mar 4, 2023 in cs.IT and math.IT

Abstract

The I-MMSE formula connects two important quantities in information theory and estimation theory. It states that in a Gaussian channel, the derivative of the mutual information with respect to the signal-to-noise ratio is one-half of the minimum mean-squared error. Higher derivatives of the mutual information are related to estimation errors of higher moments; however, a general formula has remained unknown. In this paper, we derive a general formula for the derivatives of the mutual information between the inputs and outputs of multiple Gaussian channels with respect to the signal-to-noise ratios. The resulting expression bears a remarkable resemblance to the classic cumulant-moment relation.
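As a concrete illustration of the first-order I-MMSE relation in the scalar case (a minimal numerical sketch, not taken from the paper), assume a standard Gaussian input X ~ N(0, 1) observed through Y = sqrt(snr) * X + N with N ~ N(0, 1). Then I(snr) = 0.5 * log(1 + snr) and MMSE(snr) = 1 / (1 + snr), so a finite-difference derivative of I should match 0.5 * MMSE:

```python
import numpy as np

def mutual_information(snr):
    # Closed form for a standard Gaussian input over a Gaussian channel (in nats)
    return 0.5 * np.log1p(snr)

def mmse(snr):
    # Minimum mean-squared error of the conditional-mean estimator for Gaussian X
    return 1.0 / (1.0 + snr)

snr = 2.0
h = 1e-6
# Central-difference derivative of I(snr) versus the I-MMSE prediction 0.5 * MMSE(snr)
dI_numeric = (mutual_information(snr + h) - mutual_information(snr - h)) / (2 * h)
print(dI_numeric, 0.5 * mmse(snr))  # both approximately 0.1667
```

This checks only the first derivative in the simplest setting; the paper's contribution is a general formula for higher-order derivatives across multiple Gaussian channels.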
