
High Rate Communication over One-Bit Quantized Channels via Deep Learning and LDPC Codes (2003.00081v1)

Published 28 Feb 2020 in cs.IT and math.IT

Abstract: This paper proposes a method for designing error correction codes by combining a known coding scheme with an autoencoder. Specifically, we integrate an LDPC code with a trained autoencoder to develop an error correction code for intractable nonlinear channels. The LDPC encoder shrinks the input space of the autoencoder, which enables the autoencoder to learn more easily. The proposed error correction code shows promising results for one-bit quantization, a challenging case of a nonlinear channel. In particular, our design yields a bit error rate curve with a waterfall slope even with high-order modulation formats such as 16-QAM and 64-QAM, despite one-bit quantization. This gain is theoretically grounded by proving that the trained autoencoder provides approximately Gaussian-distributed data to the LDPC decoder even though the received signal has non-Gaussian statistics due to the one-bit quantization.
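
The abstract outlines a concrete pipeline: an LDPC encoder produces a codeword, a learned encoder maps it to channel symbols, the channel applies one-bit quantization, and a learned decoder turns the quantized observations into soft values for the LDPC decoder. Below is a minimal, hypothetical PyTorch sketch of that pipeline, not the authors' implementation: the layer sizes, codeword length, straight-through estimator for the quantizer, and the use of random bits in place of real LDPC codewords are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the pipeline described in the abstract:
# LDPC codeword -> learned encoder -> one-bit quantized channel -> learned decoder
# -> soft values for an LDPC decoder. The LDPC encoder/decoder are stubbed with
# random bits; a real implementation would plug in a standard LDPC code here.
import torch
import torch.nn as nn

CODEWORD_LEN = 128          # assumed LDPC codeword length
N_SYM = CODEWORD_LEN // 4   # assumed 4 coded bits per (I, Q) symbol pair


class OneBitSTE(torch.autograd.Function):
    """One-bit quantizer with a straight-through estimator so gradients reach
    the encoder. A common workaround, not necessarily the paper's exact trick."""

    @staticmethod
    def forward(ctx, x):
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        return grad_out  # pass the gradient through unchanged


class Encoder(nn.Module):
    """Maps LDPC-coded bits to real/imaginary channel symbols."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(CODEWORD_LEN, 256), nn.ReLU(),
            nn.Linear(256, 2 * N_SYM),   # I and Q components
        )

    def forward(self, bits):
        x = self.net(bits)
        # Normalize to unit average power per real dimension.
        return x / x.pow(2).mean(dim=-1, keepdim=True).sqrt()


class Decoder(nn.Module):
    """Maps one-bit observations to soft (LLR-like) bit estimates that would
    be handed to a belief-propagation LDPC decoder."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * N_SYM, 256), nn.ReLU(),
            nn.Linear(256, CODEWORD_LEN),
        )

    def forward(self, y):
        return self.net(y)


def one_bit_channel(x, snr_db):
    """AWGN followed by one-bit quantization of every real dimension."""
    noise_std = 10.0 ** (-snr_db / 20.0)
    return OneBitSTE.apply(x + noise_std * torch.randn_like(x))


enc, dec = Encoder(), Decoder()
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):  # short demo training loop
    # Placeholder: random bits stand in for LDPC codewords here.
    bits = torch.randint(0, 2, (64, CODEWORD_LEN)).float()
    soft = dec(one_bit_channel(enc(bits), snr_db=10.0))
    loss = loss_fn(soft, bits)
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final training loss: {loss.item():.4f}")
# At inference, `soft` would be converted to LLRs and passed to the LDPC
# decoder; the paper argues these soft inputs are approximately Gaussian
# despite the one-bit quantized channel output.
```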

Citations (6)
