Pseudo-Codeword Performance Analysis for LDPC Convolutional Codes

(arXiv cs/0609148)
Published Sep 27, 2006 in cs.IT and math.IT

Abstract

Message-passing iterative decoders for low-density parity-check (LDPC) block codes are known to be subject to decoding failures due to so-called pseudo-codewords. These failures can cause the high signal-to-noise ratio performance of message-passing iterative decoding to be worse than that predicted by the maximum-likelihood decoding union bound. In this paper we address the pseudo-codeword problem from the convolutional-code perspective. In particular, we compare the performance of LDPC convolutional codes with that of their "wrapped" quasi-cyclic block versions, and we show that the minimum pseudo-weight of an LDPC convolutional code is at least as large as the minimum pseudo-weight of an underlying quasi-cyclic code. This result, which parallels the well-known relationship between the minimum Hamming weight of a convolutional code and that of its quasi-cyclic counterparts, follows from the fact that every pseudo-codeword of the convolutional code induces a pseudo-codeword of the block code whose pseudo-weight is no larger. This difference in the weight spectra leads to improved performance at low-to-moderate signal-to-noise ratios for the convolutional code, a conclusion supported by simulation results.
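
The abstract uses the notion of pseudo-weight without defining it. As background (not taken from the paper), the minimal Python sketch below computes the standard AWGN-channel pseudo-weight of a nonnegative vector omega, w_p(omega) = (sum_i omega_i)^2 / (sum_i omega_i^2); for a 0/1 codeword this reduces to its Hamming weight, while fractional pseudo-codewords can have smaller pseudo-weight than the size of their support. The example vectors are purely illustrative and do not come from the codes studied in the paper.

```python
def awgn_pseudo_weight(omega):
    """AWGN-channel pseudo-weight: (sum_i omega_i)^2 / (sum_i omega_i^2)."""
    s1 = sum(omega)
    s2 = sum(x * x for x in omega)
    return (s1 * s1) / s2 if s2 > 0 else 0.0


if __name__ == "__main__":
    # A 0/1 codeword: its pseudo-weight equals its Hamming weight (here 4).
    codeword = [1, 1, 1, 1, 0, 0, 0]
    # A hypothetical fractional pseudo-codeword: support size 5, but
    # pseudo-weight 36/8 = 4.5 < 5, so it is "lighter" than any codeword
    # supported on the same positions.
    pseudo = [2, 1, 1, 1, 1, 0, 0]
    print(awgn_pseudo_weight(codeword))  # 4.0
    print(awgn_pseudo_weight(pseudo))    # 4.5
```

The paper's comparison of minimum pseudo-weights for the convolutional code and its wrapped quasi-cyclic version is stated in terms of exactly this kind of weight measure.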
