Zero-Delay Sequential Transmission of Markov Sources Over Burst Erasure Channels (1410.2326v1)

Published 9 Oct 2014 in cs.IT and math.IT

Abstract: A setup involving zero-delay sequential transmission of a vector Markov source over a burst erasure channel is studied. A sequence of source vectors is compressed in a causal fashion at the encoder, and the resulting output is transmitted over a burst erasure channel. The destination is required to reconstruct each source vector with zero delay, but those source vectors that are observed either during the burst erasure or in the interval of length $W$ following the burst erasure need not be reconstructed. The minimum achievable compression rate is called the rate-recovery function. We assume that each source vector is sampled i.i.d. across the spatial dimension and from a stationary, first-order Markov process across the temporal dimension. For discrete sources, the case of lossless recovery is considered, and upper and lower bounds on the rate-recovery function are established. Both bounds can be expressed as the rate for predictive coding plus a term that decreases at least inversely with the recovery window length $W$. For Gauss-Markov sources and a quadratic distortion measure, upper and lower bounds on the minimum rate are established when $W=0$. These bounds are shown to coincide in the high-resolution limit. Finally, another setup involving i.i.d. Gaussian sources is studied, and the rate-recovery function is completely characterized in this case.
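
To make the timing in the abstract concrete, the following minimal Python sketch marks which time indices the decoder must reconstruct under the described setup: source vectors arriving inside a burst erasure, or within the recovery window of length $W$ that follows it, are exempt, while all others must be reproduced with zero delay. The burst length B and its start time t0 are illustrative assumptions for the example; the abstract itself only fixes the window length $W$. This is an illustration of the problem setup, not of the paper's coding schemes or bounds.

# Illustrative sketch (assumptions: a single burst of length B starting at t0).
# True means the source vector at that time must be reconstructed with zero delay;
# False means it falls inside the burst or inside the W steps right after it.

def reconstruction_required(num_steps, t0, B, W):
    required = []
    for t in range(num_steps):
        in_burst = t0 <= t < t0 + B          # erased by the channel
        in_window = t0 + B <= t < t0 + B + W  # recovery window after the burst
        required.append(not (in_burst or in_window))
    return required

if __name__ == "__main__":
    # Example: 15 time steps, burst of length 3 starting at t = 5, window W = 2.
    flags = reconstruction_required(num_steps=15, t0=5, B=3, W=2)
    for t, need in enumerate(flags):
        print(f"t={t:2d}  reconstruct: {need}")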

Citations (14)
