The maximum mutual information between the output of a discrete symmetric channel and several classes of Boolean functions of its input (1701.05014v2)
Abstract: We prove the Courtade-Kumar conjecture for several classes of n-dimensional Boolean functions, for all $n \geq 2$ and for all values of the error probability of the binary symmetric channel, $0 \leq p \leq 1/2$. This conjecture states that the mutual information between any Boolean function of an n-dimensional vector of independent and identically distributed inputs to a memoryless binary symmetric channel and the corresponding vector of outputs is upper-bounded by $1-\operatorname{H}(p)$, where $\operatorname{H}(p)$ denotes the binary entropy function. That is, let $\mathbf{X}=[X_1 \ldots X_n]$ be a vector of independent and identically distributed Bernoulli(1/2) random variables, which are the input to a memoryless binary symmetric channel with error probability in the interval $0 \leq p \leq 1/2$, and let $\mathbf{Y}=[Y_1 \ldots Y_n]$ be the corresponding output. Let $f:\{0,1\}^n \rightarrow \{0,1\}$ be an n-dimensional Boolean function. Then, $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y}) \leq 1-\operatorname{H}(p)$. Our proof employs Karamata's theorem, concepts from probability theory, transformations of random variables and vectors, and algebraic manipulations.
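As a numerical illustration of the conjectured bound (not part of the paper's proof), the following Python sketch computes $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y})$ by exhaustive enumeration for small n and compares it against $1-\operatorname{H}(p)$. The choice of n, p, and the test functions (the dictator function $f(\mathbf{x})=x_1$, which is known to meet the bound with equality, and majority) are illustrative assumptions.

```python
import itertools
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(f, n, p):
    """MI(f(X), Y) for i.i.d. Bernoulli(1/2) inputs X to a memoryless
    binary symmetric channel with error probability p, computed by
    exhaustively enumerating all 2^n inputs and 2^n outputs."""
    joint = {}  # joint distribution P(f(X) = b, Y = y)
    for x in itertools.product((0, 1), repeat=n):
        b = f(x)
        px = 2.0 ** (-n)  # uniform input distribution
        for y in itertools.product((0, 1), repeat=n):
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            pyx = (p ** d) * ((1 - p) ** (n - d))      # BSC transition prob.
            joint[(b, y)] = joint.get((b, y), 0.0) + px * pyx
    # Marginals of f(X) and Y.
    pb, py = {}, {}
    for (b, y), q in joint.items():
        pb[b] = pb.get(b, 0.0) + q
        py[y] = py.get(y, 0.0) + q
    # MI = sum over (b, y) of q * log2(q / (P(b) * P(y))).
    return sum(q * math.log2(q / (pb[b] * py[y]))
               for (b, y), q in joint.items() if q > 0)

if __name__ == "__main__":
    n, p = 3, 0.1  # illustrative values
    dictator = lambda x: x[0]              # f(x) = x_1, conjectured extremal
    majority = lambda x: int(sum(x) > n / 2)
    print(f"bound 1 - H(p)     = {1 - binary_entropy(p):.6f}")
    print(f"MI(dictator(X), Y) = {mutual_information(dictator, n, p):.6f}")
    print(f"MI(majority(X), Y) = {mutual_information(majority, n, p):.6f}")
```

For the dictator function, $\operatorname{MI}(X_1,\mathbf{Y})=\operatorname{MI}(X_1,Y_1)=1-\operatorname{H}(p)$, since the remaining outputs are independent of $X_1$, so the sketch should report equality with the bound; majority should fall strictly below it.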