Modulus Computational Entropy

(1302.2128)
Published Feb 8, 2013 in cs.IT, cs.CR, and math.IT

Abstract

The so-called \emph{leakage-chain rule} is an important tool used in many security proofs. It gives an upper bound on the entropy loss of a random variable $X$ when an adversary, who has already learned some random variables $Z_{1},\ldots,Z_{\ell}$ correlated with $X$, obtains some further information $Z_{\ell+1}$ about $X$. Analogously to the information-theoretic case, one might expect that also for the \emph{computational} variants of entropy the loss depends only on the actual leakage, i.e. on $Z_{\ell+1}$. Surprisingly, Krenn et al.\ have recently shown that for the most commonly used definitions of computational entropy this holds only if the computational quality of the entropy deteriorates exponentially in $|(Z_{1},\ldots,Z_{\ell})|$. This means that the current standard definitions of computational entropy do not allow one to fully capture leakage that occurred "in the past", which severely limits the applicability of this notion. As a remedy for this problem we propose a slightly stronger definition of computational entropy, which we call the \emph{modulus computational entropy}, and use it as a technical tool to prove a chain rule that depends only on the actual leakage and not on its history. Moreover, we show that the modulus computational entropy unifies other, sometimes seemingly unrelated, notions already studied in the literature in the context of information leakage and chain rules. Our results indicate that modulus entropy is, so far, the weakest restriction that guarantees that the chain rule for computational entropy works. As an example application, we demonstrate a few interesting cases where our restricted definition is fulfilled and the chain rule holds.
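For context, a sketch of the information-theoretic analogue the abstract refers to: for average (conditional) min-entropy, a notion assumed here for illustration since the abstract does not single one out, the standard chain rule bounds the loss by the bit-length of the new leakage only:

$$
\widetilde{H}_\infty\bigl(X \mid Z_{1},\ldots,Z_{\ell},Z_{\ell+1}\bigr) \;\ge\; \widetilde{H}_\infty\bigl(X \mid Z_{1},\ldots,Z_{\ell}\bigr) \;-\; |Z_{\ell+1}|,
$$

where $|Z_{\ell+1}|$ is the length of $Z_{\ell+1}$ in bits. The loss is independent of the previously leaked $Z_{1},\ldots,Z_{\ell}$; the paper asks under which definitions a comparable bound holds for \emph{computational} entropy, which is not what this display shows.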
