Capacity Approximations for Gaussian Relay Networks

arXiv:1407.3841
Published Jul 14, 2014 in cs.IT and math.IT

Abstract

Consider a Gaussian relay network where a source node communicates to a destination node with the help of several layers of relays. Recent work has shown that compress-and-forward based strategies can achieve the capacity of this network within an additive gap. In these strategies, the relays quantize their received signals at the noise level and map them to random Gaussian codebooks. The resultant gap to capacity is independent of the channel SNRs and the network topology, but grows linearly in the total number of nodes. In this paper, we provide an improved lower bound on the rate achieved by compress-and-forward based strategies (noisy network coding in particular) in arbitrary Gaussian relay networks, whose gap to capacity depends on the network not only through the total number of nodes but also through the degrees of freedom of the min cut of the network. We illustrate that for many networks, this refined lower bound leads to a better approximation of the capacity. In particular, we demonstrate that it yields a capacity gap that is logarithmic, rather than linear, in the total number of nodes for certain classes of layered networks. The improvement comes from quantizing the received signals of the relays at a resolution that decreases with the total number of nodes in the network. This suggests that the rule of thumb in the literature of quantizing received signals at the noise level can be highly suboptimal.
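To make the scaling contrast concrete, the following is a hedged sketch of the two bounds; the constants $\kappa$ and $\kappa'$ and the exact functional forms are illustrative assumptions, not the paper's stated expressions. With $N$ total nodes and cut-set capacity $C$, quantizing at the noise level yields a noisy-network-coding rate of roughly

\[
R_{\text{noise-level}} \;\ge\; C - \kappa N ,
\]

a gap linear in $N$, whereas quantizing at a resolution that decreases with $N$ gives, for the classes of layered networks treated in the paper,

\[
R_{\text{refined}} \;\ge\; C - \kappa' \, d \log N ,
\]

where $d$ denotes the degrees of freedom of the min cut of the network. For large networks the $d \log N$ term is far smaller than $\kappa N$, which is why the coarser, $N$-dependent quantization resolution can dominate the noise-level rule of thumb.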
