Enhanced Quasi-Maximum Likelihood Decoding of Short LDPC Codes based on Saturation (1810.13111v2)

Published 31 Oct 2018 in cs.IT and math.IT

Abstract: In this paper, we propose an enhanced quasi-maximum likelihood (EQML) decoder for LDPC codes with short block lengths. After the failure of conventional belief propagation (BP) decoding, the proposed EQML decoder selects unreliable variable nodes (VNs) and saturates their associated channel output values to generate a list of decoder input sequences. Each decoder input sequence in the list is then decoded by the conventional BP decoder to obtain the most likely codeword. To improve the accuracy of selecting unreliable VNs, we propose an edge-wise selection method based on the sign fluctuation of the VNs' extrinsic messages. A partial pruning stopping (PPS) rule is also presented to reduce the decoding latency. Simulation results show that the proposed EQML decoder outperforms the conventional BP decoder and the augmented BP decoder for short LDPC codes. It even approaches ML decoding performance to within 0.3 dB in frame error rate. In addition, the proposed PPS rule achieves lower decoding latency than the list decoding stopping rule.
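
The abstract describes the reprocessing stage concretely enough to sketch. Below is a minimal Python sketch of the saturation-based list decoding loop, assuming a caller-supplied `bp_decode` routine that runs standard BP and reports success. The names `eqml_decode`, `bp_decode`, `num_unreliable`, and `sat_value` are illustrative, the simple |LLR|-magnitude criterion stands in for the paper's edge-wise sign-fluctuation selection, and the PPS rule is omitted; this is a sketch of the idea, not the authors' implementation.

```python
import numpy as np

def eqml_decode(llr, bp_decode, num_unreliable=4, sat_value=20.0):
    """Quasi-ML decoding by saturating unreliable variable nodes.

    llr            : channel LLRs of the received word (numpy array).
    bp_decode      : callable(llr) -> (codeword as 0/1 numpy array, success flag).
    num_unreliable : number of VNs to saturate (the paper selects these via
                     sign fluctuation of extrinsic messages; smallest |LLR|
                     is used here as a simplified proxy).
    sat_value      : saturation magnitude written into the selected LLRs.
    """
    # Stage 1: conventional BP; return immediately on success.
    word, ok = bp_decode(llr)
    if ok:
        return word

    # Stage 2: pick the least reliable VNs (proxy criterion, see above).
    unreliable = np.argsort(np.abs(llr))[:num_unreliable]

    best_word, best_metric = None, -np.inf
    # Enumerate all +/- saturation patterns on the selected positions,
    # giving 2^num_unreliable candidate decoder input sequences.
    for pattern in range(2 ** num_unreliable):
        trial = llr.copy()
        for i, vn in enumerate(unreliable):
            sign = 1.0 if (pattern >> i) & 1 else -1.0
            trial[vn] = sign * sat_value
        word, ok = bp_decode(trial)
        if ok:
            # Rank candidates by correlation between the BPSK-mapped
            # codeword (0 -> +1, 1 -> -1) and the channel LLRs; the
            # maximizer is the most likely codeword in the list.
            metric = np.dot(1.0 - 2.0 * word, llr)
            if metric > best_metric:
                best_word, best_metric = word, metric
    return best_word
```

Ranking successful candidates by the correlation metric selects the most likely codeword among the list outputs, which is what makes the scheme quasi-ML rather than a plain retry loop; the PPS rule in the paper additionally prunes candidates early to cut latency.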

Citations (6)
