
Finite-Length Scaling of Polar Codes

(1304.4778)
Published Apr 17, 2013 in cs.IT and math.IT

Abstract

Consider a binary-input memoryless output-symmetric channel $W$. Such a channel has a capacity, call it $I(W)$, and for any $R<I(W)$ and strictly positive constant $P_{\rm e}$ we know that we can construct a coding scheme that allows transmission at rate $R$ with an error probability not exceeding $P_{\rm e}$. Assume now that we let the rate $R$ tend to $I(W)$ and we ask how we have to "scale" the blocklength $N$ in order to keep the error probability fixed to $P_{\rm e}$. We refer to this as the "finite-length scaling" behavior. This question was addressed by Strassen as well as Polyanskiy, Poor and Verdu, and the result is that $N$ must grow at least as the square of the reciprocal of $I(W)-R$. Polar codes are optimal in the sense that they achieve capacity. In this paper, we are asking to what degree they are also optimal in terms of their finite-length behavior. Our approach is based on analyzing the dynamics of the un-polarized channels. The main results of this paper can be summarized as follows. Consider the sum of Bhattacharyya parameters of sub-channels chosen (by the polar coding scheme) to transmit information. If we require this sum to be smaller than a given value $P_{\rm e}>0$, then the required blocklength $N$ scales in terms of the rate $R < I(W)$ as $N \geq \frac{\alpha}{(I(W)-R)^{\underline{\mu}}}$, where $\alpha$ is a positive constant that depends on $P_{\rm e}$ and $I(W)$, and $\underline{\mu} = 3.579$. Also, we show that with the same requirement on the sum of Bhattacharyya parameters, the blocklength scales in terms of the rate like $N \leq \frac{\beta}{(I(W)-R)^{\overline{\mu}}}$, where $\beta$ is a constant that depends on $P_{\rm e}$ and $I(W)$, and $\overline{\mu}=6$.
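To make the two scaling bounds concrete, here is a minimal sketch (not from the paper) that evaluates $N \geq \alpha/\varepsilon^{\underline{\mu}}$ and $N \leq \beta/\varepsilon^{\overline{\mu}}$ as the capacity gap $\varepsilon = I(W)-R$ shrinks. The constants $\alpha$ and $\beta$ depend on $P_{\rm e}$ and $I(W)$ and are not specified in the abstract, so they are set to $1$ here purely for illustration; the point is the polynomial growth in $1/\varepsilon$.

```python
# Illustrative scaling of the blocklength bounds for polar codes.
# Exponents from the paper; alpha and beta are unknown constants
# (depending on P_e and I(W)) set to 1 for illustration only.

MU_LOWER = 3.579  # underline-mu: exponent in the lower bound on N
MU_UPPER = 6.0    # overline-mu: exponent in the upper bound on N

def blocklength_lower(eps: float, alpha: float = 1.0) -> float:
    """Lower bound N >= alpha / eps**MU_LOWER (alpha illustrative)."""
    return alpha / eps ** MU_LOWER

def blocklength_upper(eps: float, beta: float = 1.0) -> float:
    """Upper bound N <= beta / eps**MU_UPPER (beta illustrative)."""
    return beta / eps ** MU_UPPER

# Halving the gap to capacity multiplies the required blocklength by
# roughly 2**3.579 ~ 12x (lower bound) up to 2**6 = 64x (upper bound).
for eps in (0.1, 0.05, 0.01):
    print(f"eps={eps}: N_lower ~ {blocklength_lower(eps):.3e}, "
          f"N_upper ~ {blocklength_upper(eps):.3e}")
```

Note that the upper-bound exponent $\overline{\mu}=6$ is larger, so for small gaps the guaranteed-sufficient blocklength grows much faster than the necessary one; closing that gap between $3.579$ and $6$ is exactly the open question the paper frames.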

