Lossy joint source-channel coding in the finite blocklength regime (1209.1317v2)

Published 6 Sep 2012 in cs.IT and math.IT

Abstract: This paper finds new tight finite-blocklength bounds for the best achievable lossy joint source-channel code rate, and demonstrates that joint source-channel code design brings considerable performance advantage over a separate one in the non-asymptotic regime. A joint source-channel code maps a block of $k$ source symbols onto a length-$n$ channel codeword, and the fidelity of reproduction at the receiver end is measured by the probability $\epsilon$ that the distortion exceeds a given threshold $d$. For memoryless sources and channels, it is demonstrated that the parameters of the best joint source-channel code must satisfy $nC - kR(d) \approx \sqrt{nV + k \mathcal V(d)}\, Q^{-1}(\epsilon)$, where $C$ and $V$ are the channel capacity and channel dispersion, respectively; $R(d)$ and $\mathcal V(d)$ are the source rate-distortion and rate-dispersion functions; and $Q$ is the standard Gaussian complementary cdf. Symbol-by-symbol (uncoded) transmission is known to achieve the Shannon limit when the source and channel satisfy a certain probabilistic matching condition. In this paper we show that even when this condition is not satisfied, symbol-by-symbol transmission is, in some cases, the best known strategy in the non-asymptotic regime.
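
As an illustration of the dispersion approximation in the abstract, the following is a minimal numerical sketch (not code from the paper): given assumed values of $C$, $V$, $R(d)$, and $\mathcal V(d)$, it estimates the smallest channel blocklength $n$ for which $k$ source symbols satisfy $nC - kR(d) \approx \sqrt{nV + k \mathcal V(d)}\, Q^{-1}(\epsilon)$. The function name `min_blocklength` and all numerical parameter values are illustrative assumptions, not quantities taken from the paper.

```python
# Minimal numerical sketch (illustrative, not from the paper): estimate the smallest
# channel blocklength n for which k source symbols satisfy the dispersion
# approximation  n*C - k*R(d) ~ sqrt(n*V + k*Vs(d)) * Qinv(eps).
from math import ceil, sqrt
from statistics import NormalDist  # standard-normal quantiles (Python 3.8+)

def min_blocklength(k, C, V, Rd, Vsd, eps):
    """Approximate minimal n; capacities/rates in bits, dispersions in bits^2."""
    qinv = NormalDist().inv_cdf(1.0 - eps)      # Q^{-1}(eps) via Phi^{-1}(1 - eps)
    n = k * Rd / C                              # Shannon-limit blocklength (eps ignored)
    while n * C - k * Rd < qinv * sqrt(n * V + k * Vsd):
        n += 1                                  # grow n until the approximation holds
    return ceil(n)

# Purely illustrative parameter values (assumed): a channel with C = 0.5 bit and
# V = 0.25 bit^2 per channel use, a source with R(d) = 0.25 bit and rate-dispersion
# 0.20 bit^2 per symbol, and target excess-distortion probability eps = 1e-3.
print(min_blocklength(k=1000, C=0.5, V=0.25, Rd=0.25, Vsd=0.20, eps=1e-3))
```

With these assumed numbers, the required $n$ exceeds the Shannon-limit value $kR(d)/C$ by roughly $Q^{-1}(\epsilon)$ standard deviations of the combined source-channel dispersion, which is exactly the finite-blocklength penalty the approximation quantifies.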

Citations (148)
