
Complexity and Second Moment of the Mathematical Theory of Communication (2107.06420v1)

Published 13 Jul 2021 in cs.IT and math.IT

Abstract: The performance of an error correcting code is evaluated by its error probability, rate, and en/decoding complexity. The performance of a series of codes is evaluated by, as the block lengths approach infinity, whether their error probabilities decay to zero, whether their rates converge to capacity, and whether their growth in complexities stays under control. Over any discrete memoryless channel, I build codes such that: (1) their error probabilities and rates scale like random codes; and (2) their en/decoding complexities scale like polar codes. Quantitatively, for any constants $p,r>0$ s.t. $p+2r<1$, I construct a series of codes with block length $N$ approaching infinity, error probability $\exp(-N^p)$, rate $N^{-r}$ less than the capacity, and en/decoding complexity $O(N\log N)$ per block. Over any discrete memoryless channel, I also build codes such that: (1) they achieve capacity rapidly; and (2) their en/decoding complexities outperform all known codes over non-BEC channels. Quantitatively, for any constants $t,r>0$ s.t. $2r<1$, I construct a series of codes with block length $N$ approaching infinity, error probability $\exp(-(\log N)^t)$, rate $N^{-r}$ less than the capacity, and en/decoding complexity $O(N\log(\log N))$ per block. The two aforementioned results are built upon two pillars: a versatile framework that generates codes on the basis of channel polarization, and a calculus-probability machinery that evaluates the performances of codes. The framework that generates codes and the machinery that evaluates codes can be extended to many other scenarios in network information theory. To name a few: lossless compression, lossy compression, Slepian-Wolf, Wyner-Ziv, multiple access channel, wiretap channel, and broadcast channel. In each scenario, the adapted notions of error probability and rate approach their limits at the same paces as specified above.
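The abstract grounds both constructions in channel polarization. As an illustrative sketch not taken from the paper (the function name and parameters are my own), the basic polarization recursion is easiest to see over a binary erasure channel: one Arıkan transform step turns a BEC with erasure probability $e$ into a degraded BEC($2e-e^2$) and an upgraded BEC($e^2$), and iterating drives almost every synthetic channel toward either perfectly noisy or perfectly noiseless.

```python
def polarize(eps, levels):
    """Track BEC erasure probabilities under `levels` rounds of the
    Arikan transform. One step maps BEC(e) to the pair
    BEC(2e - e^2) (the degraded "minus" channel) and
    BEC(e^2) (the upgraded "plus" channel)."""
    chans = [eps]
    for _ in range(levels):
        nxt = []
        for e in chans:
            nxt.append(2 * e - e * e)  # degraded synthetic channel
            nxt.append(e * e)          # upgraded synthetic channel
        chans = nxt
    return chans

# 10 recursion levels yield 2^10 = 1024 synthetic channels.
channels = polarize(0.5, 10)

# The average erasure probability is preserved at every step
# ((2e - e^2) + e^2 = 2e), so the mean stays at 0.5 = 1 - capacity.
mean_eps = sum(channels) / len(channels)

# Most channels polarize: nearly noiseless (e ~ 0) or nearly useless (e ~ 1).
near_noiseless = sum(1 for e in channels if e < 1e-3)
near_useless = sum(1 for e in channels if e > 1 - 1e-3)
```

Polar codes then transmit information only on the near-noiseless channels; the paper's contribution concerns how fast this polarization can be made to happen and at what en/decoding cost.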

Citations (3)

Authors (1)