
Abstract

As shown by Médard, the capacity of fading channels with imperfect channel-state information (CSI) can be lower-bounded by assuming a Gaussian channel input $X$ with power $P$ and by upper-bounding the conditional entropy $h(X|Y,\hat{H})$ by the entropy of a Gaussian random variable with variance equal to the linear minimum mean-square error in estimating $X$ from $(Y,\hat{H})$. We demonstrate that, using a rate-splitting approach, this lower bound can be sharpened: by expressing the Gaussian input $X$ as the sum of two independent Gaussian variables $X_1$ and $X_2$ and by applying Médard's lower bound first to bound the mutual information between $X_1$ and $Y$ while treating $X_2$ as noise, and by applying it a second time to the mutual information between $X_2$ and $Y$ while assuming $X_1$ to be known, we obtain a capacity lower bound that is strictly larger than Médard's lower bound. We then generalize this approach to an arbitrary number $L$ of layers, where $X$ is expressed as the sum of $L$ independent Gaussian random variables of respective variances $P_\ell$, $\ell = 1,\dotsc,L$, summing up to $P$. Among all such rate-splitting bounds, we determine the supremum over power allocations $P_\ell$ and total number of layers $L$. This supremum is achieved for $L\to\infty$ and gives rise to an analytically expressible capacity lower bound. For Gaussian fading, this novel bound is shown to converge to the Gaussian-input mutual information as the signal-to-noise ratio (SNR) grows, provided that the variance of the channel estimation error $H-\hat{H}$ tends to zero as the SNR tends to infinity.
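To make the two-layer mechanism concrete, the following Monte Carlo sketch (an illustration under stated assumptions, not code from the paper) compares Médard's bound $\mathbb{E}[\log_2(1 + |\hat{H}|^2 P/(\sigma_e^2 P + 1))]$ against a two-layer rate-splitting bound for Rayleigh fading with unit noise variance and estimation-error variance $\sigma_e^2$. Treating $X_2$ as noise gives the layer-1 rate $\mathbb{E}[\log_2(1 + |\hat{H}|^2 P_1/(|\hat{H}|^2 P_2 + \sigma_e^2 P + 1))]$; with $X_1$ known, the residual interference $(H-\hat{H})X_1$ has conditional variance $\sigma_e^2 |X_1|^2$, and because the resulting log term is convex in $|X_1|^2$, averaging over $|X_1|^2$ strictly exceeds the plug-in value $\sigma_e^2 P_1$ by Jensen's inequality; this is the source of the improvement. The equal power split $P_1 = P_2 = P/2$ and the parameter values are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000        # Monte Carlo sample size
P = 10.0           # total input power (noise variance normalized to 1)
sig_e2 = 0.05      # assumed variance of the channel estimation error H - Hhat
P1 = P2 = P / 2    # illustrative equal power split across the two layers

# |Hhat|^2 for Rayleigh fading: Hhat ~ CN(0, 1 - sig_e2), error independent of Hhat,
# so |Hhat|^2 is exponential with mean 1 - sig_e2.
hhat2 = rng.exponential(scale=1 - sig_e2, size=N)

# Medard's lower bound: E[ log2(1 + |Hhat|^2 P / (sig_e2 * P + 1)) ].
medard = np.mean(np.log2(1 + hhat2 * P / (sig_e2 * P + 1)))

# Layer 1: Medard's bound applied to X1 with X2 treated as additional noise.
r1 = np.mean(np.log2(1 + hhat2 * P1 / (hhat2 * P2 + sig_e2 * P + 1)))

# Layer 2: X1 known at the decoder; (H - Hhat) * X1 contributes interference of
# conditional variance sig_e2 * |X1|^2, averaged over |X1|^2 ~ Exp(mean P1).
x1_abs2 = rng.exponential(scale=P1, size=N)
r2 = np.mean(np.log2(1 + hhat2 * P2 / (sig_e2 * (x1_abs2 + P2) + 1)))

print(f"Medard bound        : {medard:.4f} bit/channel use")
print(f"Two-layer splitting : {r1 + r2:.4f} bit/channel use")
```

Replacing $|X_1|^2$ by its mean $P_1$ in the layer-2 term makes the two rates telescope back to exactly Médard's bound, so for any $\sigma_e^2 > 0$ the printed two-layer value is strictly larger; optimizing the power allocation and letting $L\to\infty$ yields the analytical bound derived in the paper.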
