Tight Bounds for Symmetric Divergence Measures and a Refined Bound for Lossless Source Coding
arXiv:1403.7164 · Published Mar 27, 2014 in cs.IT, math.IT, and math.PR
Abstract
Tight bounds for several symmetric divergence measures are derived in terms of the total variation distance. It is shown that each of these bounds is attained by a pair of 2- or 3-element probability distributions. An application of these bounds to lossless source coding is provided, refining and improving a certain bound by Csiszár. Another application of these bounds has recently been introduced by Yardi et al. for channel-code detection.
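The abstract does not reproduce the bounds themselves, but the quantities involved are standard. As a minimal sketch (not the paper's derivation), the following Python snippet computes the total variation distance alongside two common symmetric divergence measures, the Jensen-Shannon and Jeffreys divergences, for a pair of 2-element distributions of the kind the paper identifies as extremal. The particular distributions chosen here are illustrative, not taken from the paper.

```python
import math

def total_variation(p, q):
    # Total variation distance: half the L1 distance between distributions.
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl(p, q):
    # Kullback-Leibler divergence in nats; assumes q_i > 0 wherever p_i > 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon(p, q):
    # Jensen-Shannon divergence: a bounded, symmetric divergence measure.
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jeffreys(p, q):
    # Jeffreys divergence: the symmetrized KL divergence.
    return kl(p, q) + kl(q, p)

# An illustrative pair of 2-element probability distributions.
p = [0.9, 0.1]
q = [0.6, 0.4]
print(total_variation(p, q))   # 0.3
print(jensen_shannon(p, q))
print(jeffreys(p, q))
```

The Jensen-Shannon divergence is always bounded by log 2 (in nats), which hints at why relations between such symmetric measures and the total variation distance can be made tight.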