
Variable-Length Source Dispersions Differ under Maximum and Average Error Criteria

(1910.05724)
Published Oct 13, 2019 in cs.IT and math.IT

Abstract

Variable-length compression without prefix-free constraints, and with side-information available at both encoder and decoder, is considered. Instead of requiring the code to be error-free, we allow it to have a non-vanishing error probability. We derive one-shot bounds on the optimal average codeword length by proposing two new information quantities, namely the conditional and unconditional $\varepsilon$-cutoff entropies. Using these one-shot bounds, we obtain the second-order asymptotics of the problem under two different formalisms: the average and maximum probabilities of error over the realization of the side-information. While the first-order terms in the asymptotic expansions for both formalisms are identical, we find that the source dispersion under the average error formalism is, in most cases, strictly smaller than its maximum error counterpart. Applications to a certain class of guessing problems, previously studied by Kuzuoka [IEEE Trans. Inf. Theory, vol. 66, no. 3, pp. 1674-1690, 2020], are also discussed.
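To make the one-shot setting concrete, the following is a minimal sketch (not the paper's exact quantity) of the optimal average codeword length for one-to-one codes without prefix-free constraints when an error probability of at most $\varepsilon$ is tolerated. It uses the standard fact that, without prefix-free constraints, the $i$-th most likely symbol can be assigned a codeword of length $\lfloor \log_2 i \rfloor$, and that allowing error $\varepsilon$ lets the encoder ignore the least likely $\varepsilon$-mass of the source; the function name and the fractional-mass treatment of the tail are illustrative assumptions, not definitions from the paper.

```python
import math

def eps_avg_codeword_length(probs, eps):
    """Sketch of the optimal average length of a one-to-one
    (non-prefix-free) code with error probability at most eps.

    Symbols are sorted in decreasing probability; the i-th most
    likely kept symbol (1-indexed) gets length floor(log2(i)).
    Up to eps of probability mass is dropped from the tail
    (those outcomes are allowed to be decoded incorrectly).
    Fractional tail mass is handled linearly for illustration,
    which corresponds to a randomized code.
    """
    p = sorted(probs, reverse=True)
    mass_left = 1.0 - eps  # probability mass that must be described
    total = 0.0
    for i, pi in enumerate(p, start=1):
        if mass_left <= 0:
            break
        w = min(pi, mass_left)  # portion of this symbol that is kept
        total += w * math.floor(math.log2(i))
        mass_left -= w
    return total

# Uniform source over 4 symbols: lengths 0, 1, 1, 2.
print(eps_avg_codeword_length([0.25] * 4, 0.0))   # 1.0
print(eps_avg_codeword_length([0.25] * 4, 0.25))  # 0.5 (last symbol dropped)
```

Allowing the encoder to ignore the tail is what drives the gap the paper studies: under the average-error formalism the tolerated error can be allocated unevenly across side-information realizations, which is the mechanism behind the smaller dispersion.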

