
On the entropy of a noisy function

(1508.01464)
Published Aug 6, 2015 in cs.IT, math.CO, and math.IT

Abstract

Let $0 < \epsilon < 1/2$ be a noise parameter, and let $T_{\epsilon}$ be the noise operator acting on functions on the boolean cube $\{0,1\}^n$. Let $f$ be a nonnegative function on $\{0,1\}^n$. We upper bound the entropy of $T_{\epsilon} f$ by the average entropy of conditional expectations of $f$, given sets of roughly $(1-2\epsilon)^2 \cdot n$ variables. In information-theoretic terms, we prove the following strengthening of "Mrs. Gerber's lemma": Let $X$ be a random binary vector of length $n$, and let $Z$ be a noise vector, corresponding to a binary symmetric channel with crossover probability $\epsilon$. Then, setting $v = (1-2\epsilon)^2 \cdot n$, we have (up to lower-order terms): $$ H\big(X \oplus Z\big) \ge n \cdot H\left(\epsilon + (1-2\epsilon) \cdot H^{-1}\left(\frac{\mathbb{E}_{|B| = v}\, H\big(\{X_i\}_{i\in B}\big)}{v}\right)\right) $$ As an application, we show that for a boolean function $f$ which is close to a characteristic function $g$ of a subcube of dimension $n-1$, the entropy of $T_{\epsilon} f$ is at most that of $T_{\epsilon} g$. This, combined with a recent result of Ordentlich, Shayevitz, and Weinstein, shows that the "Most informative boolean function" conjecture of Courtade and Kumar holds for high noise $\epsilon \ge 1/2 - \delta$, for some absolute constant $\delta > 0$. Namely, if $X$ is uniformly distributed in $\{0,1\}^n$ and $Y$ is obtained by flipping each coordinate of $X$ independently with probability $\epsilon$, then, provided $\epsilon \ge 1/2 - \delta$, for any boolean function $f$ we have $I\big(f(X);Y\big) \le 1 - H(\epsilon)$.
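The inequality above strengthens the classical Mrs. Gerber's lemma, which lower bounds $H(X \oplus Z)$ using the single quantity $H(X)/n$ in place of the averaged subset entropies: $H(X \oplus Z) \ge n \cdot H\big(\epsilon \star H^{-1}(H(X)/n)\big)$, where $a \star b = a(1-b) + b(1-a)$ is binary convolution. As a rough numerical illustration (not from the paper; the distribution, $n$, and $\epsilon$ below are arbitrary choices), the following Python sketch computes $H(X \oplus Z)$ exactly for a small explicit distribution on $\{0,1\}^3$ and compares it against that classical bound.

```python
# Minimal sketch: check the classical Mrs. Gerber's lemma
#   H(X xor Z) >= n * h( eps * h^{-1}(H(X)/n) ),
# where "*" inside h(...) denotes binary convolution a(1-b)+b(1-a)
# and h is the binary entropy function. The input distribution p_x,
# n, and eps are illustrative choices, not values from the paper.

import itertools
import math


def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def h_inv(y, tol=1e-12):
    """Inverse of h on [0, 1/2], computed by bisection."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2


def entropy(dist):
    """Shannon entropy (bits) of a dict {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)


def noisy_output(p_x, n, eps):
    """Distribution of X xor Z for Z with i.i.d. Bernoulli(eps) coordinates (BSC noise)."""
    p_y = {y: 0.0 for y in itertools.product((0, 1), repeat=n)}
    for x, px in p_x.items():
        for z in itertools.product((0, 1), repeat=n):
            pz = math.prod(eps if b else 1 - eps for b in z)
            y = tuple(a ^ b for a, b in zip(x, z))
            p_y[y] += px * pz
    return p_y


n, eps = 3, 0.3
# An arbitrary non-uniform input distribution on {0,1}^3.
p_x = {x: 0.0 for x in itertools.product((0, 1), repeat=n)}
p_x[(0, 0, 0)] = 0.5
p_x[(1, 1, 1)] = 0.3
p_x[(1, 0, 0)] = 0.2

lhs = entropy(noisy_output(p_x, n, eps))            # H(X xor Z)
alpha = h_inv(entropy(p_x) / n)                     # h^{-1}(H(X)/n)
rhs = n * h(eps * (1 - alpha) + alpha * (1 - eps))  # classical MGL bound
print(f"H(X xor Z) = {lhs:.4f} >= MGL bound = {rhs:.4f}")
```

The paper's refinement replaces $H(X)/n$ by the average entropy of subsets of roughly $(1-2\epsilon)^2 \cdot n$ coordinates, which can only increase the right-hand side, so it yields a tighter lower bound than the sketch above checks.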
