
Abstract

We prove the Courtade-Kumar conjecture, which states that the mutual information between any Boolean function of an $n$-dimensional vector of independent and identically distributed inputs to a memoryless binary symmetric channel and the corresponding vector of outputs is upper-bounded by $1-\operatorname{H}(p)$, where $\operatorname{H}(p)$ denotes the binary entropy function. That is, let $\mathbf{X}=[X_1 \ldots X_n]$ be a vector of independent and identically distributed Bernoulli($1/2$) random variables that form the input to a memoryless binary symmetric channel with error probability $p$, $0 \leq p \leq 1/2$, and let $\mathbf{Y}=[Y_1 \ldots Y_n]$ be the corresponding output. Let $f:\{0,1\}^n \rightarrow \{0,1\}$ be an $n$-dimensional Boolean function. Then $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y}) \leq 1-\operatorname{H}(p)$. We prove the conjecture in its most general form, that is, for any $n$-dimensional Boolean function $f$ and for any error probability $0 \leq p \leq 1/2$ of the binary symmetric channel. Our proof employs only basic concepts from information theory, probability theory, and transformations of random variables and vectors.
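To make the bound concrete: for a dictator function $f(\mathbf{X}) = X_1$, the pair $(f(\mathbf{X}), Y_1)$ is exactly one use of the binary symmetric channel and the remaining outputs are independent of $X_1$, so $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y}) = 1-\operatorname{H}(p)$ and the bound is tight. The following brute-force check is not from the paper; it is a minimal sketch (function names and the choices $n=3$, $p=0.1$ are illustrative) that enumerates all input/output pairs for small $n$ and compares the mutual information of a dictator and a majority function against $1-\operatorname{H}(p)$.

```python
import itertools
from math import log2

def binary_entropy(p):
    # H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def mutual_info(f, n, p):
    # Exact joint distribution of (f(X), Y) for X uniform on {0,1}^n
    # passed through a memoryless BSC with crossover probability p.
    joint = {}
    for x in itertools.product((0, 1), repeat=n):
        px = 2.0 ** (-n)
        for y in itertools.product((0, 1), repeat=n):
            d = sum(xi != yi for xi, yi in zip(x, y))     # Hamming distance
            pyx = (p ** d) * ((1 - p) ** (n - d))          # P(Y=y | X=x)
            key = (f(x), y)
            joint[key] = joint.get(key, 0.0) + px * pyx
    # Marginals of f(X) and Y.
    pf, py = {}, {}
    for (b, y), q in joint.items():
        pf[b] = pf.get(b, 0.0) + q
        py[y] = py.get(y, 0.0) + q
    # MI(f(X), Y) = sum over the joint of q * log2(q / (P(b) P(y))).
    return sum(q * log2(q / (pf[b] * py[y]))
               for (b, y), q in joint.items() if q > 0)

n, p = 3, 0.1                                  # illustrative parameters
dictator = lambda x: x[0]                      # f(X) = X_1
majority = lambda x: int(sum(x) > n / 2)       # majority of the n bits

print(mutual_info(dictator, n, p))   # equals 1 - H(0.1) ~ 0.5310 (bound is tight)
print(mutual_info(majority, n, p))   # strictly below the bound for 0 < p < 1/2
print(1 - binary_entropy(p))         # the conjectured upper bound 1 - H(p)
```

The exhaustive enumeration costs $O(4^n)$ channel transitions, so it only serves as a sanity check of the statement for small $n$; the paper's claim is the proof for all $n$ and all $0 \leq p \leq 1/2$.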
