
Abstract

Consider the transmission of a polar code of block length $N$ and rate $R$ over a binary memoryless symmetric channel $W$, and let $P_e$ be the block error probability under successive cancellation decoding. In this paper, we develop new bounds that characterize the relationship between the parameters $R$, $N$, $P_e$, and the quality of the channel $W$, quantified by its capacity $I(W)$ and its Bhattacharyya parameter $Z(W)$. In previous work, two main regimes were studied. In the error exponent regime, the channel $W$ and the rate $R<I(W)$ are fixed, and it was proved that the error probability $P_e$ scales roughly as $2^{-\sqrt{N}}$. In the scaling exponent approach, the channel $W$ and the error probability $P_e$ are fixed, and it was proved that the gap to capacity $I(W)-R$ scales as $N^{-1/\mu}$. Here, $\mu$ is called the scaling exponent, and it depends on the channel $W$. A heuristic computation for the binary erasure channel (BEC) gives $\mu=3.627$, and it was shown that, for any channel $W$, $3.579 \le \mu \le 5.702$. Our contributions are as follows. First, we provide the tighter upper bound $\mu \le 4.714$, valid for any $W$. With the same technique, we obtain $\mu \le 3.639$ for the case of the BEC, which is very close to its heuristically derived value. Second, we develop a trade-off between the gap to capacity $I(W)-R$ and the error probability $P_e$ as functions of the block length $N$. In other words, we consider a moderate deviations regime in which we study how fast both quantities, as functions of the block length $N$, simultaneously go to $0$. Third, we prove that polar codes are not affected by error floors. To do so, we fix a polar code of block length $N$ and rate $R$. Then, we vary the channel $W$ and show that the error probability $P_e$ scales as the Bhattacharyya parameter $Z(W)$ raised to a power that grows roughly like $\sqrt{N}$.
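To give a concrete feel for the scaling exponent regime, the sketch below numerically tracks the Bhattacharyya parameters (erasure probabilities) of the synthetic channels obtained by polarizing a BEC and estimates $\mu$ from how fast the fraction of still-unpolarized channels decays with $N$. This is an illustrative simulation of the standard heuristic for the BEC, not code from the paper; the function names, the threshold `delta`, and the range of polarization levels are assumptions chosen for the example, and the finite-size estimate of $\mu$ is only rough.

```python
import numpy as np

def bec_polarize(eps0=0.5, n_levels=18):
    """Return the Bhattacharyya parameters (erasure probabilities) of the
    2^n_levels synthetic channels obtained by polarizing a BEC(eps0)."""
    z = np.array([eps0])
    for _ in range(n_levels):
        # One polarization step: each BEC with parameter z splits into a
        # 'minus' channel with parameter 2z - z^2 and a 'plus' channel with z^2.
        z = np.concatenate([2 * z - z ** 2, z ** 2])
    return z

def unpolarized_fraction(z, delta=1e-3):
    """Fraction of synthetic channels that are neither almost noiseless
    (z < delta) nor almost useless (z > 1 - delta)."""
    return np.mean((z > delta) & (z < 1 - delta))

if __name__ == "__main__":
    points = []
    for n in range(10, 21):
        frac = unpolarized_fraction(bec_polarize(0.5, n))
        points.append((2 ** n, frac))
        print(f"N = 2^{n:2d}: unpolarized fraction = {frac:.4e}")
    # The fraction should decay roughly like N^{-1/mu}; estimate mu from the
    # slope of a log-log fit (expected to be in the vicinity of 3.627).
    logN = np.log([N for N, _ in points])
    logf = np.log([f for _, f in points])
    slope, _ = np.polyfit(logN, logf, 1)
    print(f"estimated scaling exponent mu ~ {-1.0 / slope:.3f}")
```

The fraction of unpolarized channels is a proxy for the gap to capacity at a fixed target error probability, which is why its decay rate in $N$ recovers the scaling exponent; pushing `n_levels` higher tightens the estimate at the cost of memory, since all $2^n$ parameters are stored explicitly here.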
