Equivocations, Exponents and Second-Order Coding Rates under Various Rényi Information Measures

(1504.02536)
Published Apr 10, 2015 in cs.IT, cs.CR, and math.IT

Abstract

We evaluate the asymptotics of equivocations, their exponents, as well as their second-order coding rates under various Rényi information measures. Specifically, we consider the effect of applying a hash function to a source and quantify the level of non-uniformity and the dependence of the compressed source on another correlated source when the number of copies of the sources is large. Unlike previous works that use Shannon information measures to quantify randomness, information or uniformity, we define our security measures in terms of a more general class of information measures: the Rényi information measures and their Gallager-type counterparts. A special case of these Rényi information measures is the class of Shannon information measures. We prove tight asymptotic results for the security measures and their exponential rates of decay. We also prove bounds on the second-order asymptotics and show that these bounds match when the magnitudes of the second-order coding rates are large. We do so by establishing new classes of non-asymptotic bounds on the equivocation and evaluating these bounds asymptotically using various probabilistic limit theorems.
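For context, the statement that Shannon information measures arise as a special case can be read through the standard (unconditional) Rényi entropy, which recovers the Shannon entropy in the limit of order one. The sketch below records this textbook definition only; the symbols H_α and P_X are generic notation and are not taken from the paper, which additionally works with conditional Rényi measures and their Gallager-type counterparts defined in the text.

```latex
% Standard Rényi entropy of order alpha (illustrative, not verbatim from the paper):
\[
  H_{\alpha}(X) \;=\; \frac{1}{1-\alpha}\,\log \sum_{x} P_X(x)^{\alpha},
  \qquad \alpha \in (0,1)\cup(1,\infty).
\]
% Shannon entropy as the limiting special case:
\[
  \lim_{\alpha \to 1} H_{\alpha}(X) \;=\; H(X) \;=\; -\sum_{x} P_X(x)\,\log P_X(x).
\]
```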
