
The maximum mutual information between the output of a binary symmetric channel and a Boolean function of its input (1604.05113v2)

Published 18 Apr 2016 in cs.IT and math.IT

Abstract: We prove the Courtade-Kumar conjecture, which states that the mutual information between any Boolean function of an $n$-dimensional vector of independent and identically distributed inputs to a memoryless binary symmetric channel and the corresponding vector of outputs is upper-bounded by $1-\operatorname{H}(p)$, where $\operatorname{H}(p)$ denotes the binary entropy function. That is, let $\mathbf{X}=[X_1,\ldots,X_n]$ be a vector of independent and identically distributed Bernoulli($1/2$) random variables, which are the input to a memoryless binary symmetric channel with error probability $p$, $0 \leq p \leq 1/2$, and let $\mathbf{Y}=[Y_1,\ldots,Y_n]$ be the corresponding output. Let $f:\{0,1\}^n \rightarrow \{0,1\}$ be an $n$-dimensional Boolean function. Then, $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y}) \leq 1-\operatorname{H}(p)$. We prove the most general case of the conjecture, that is, for any $n$-dimensional Boolean function $f$ and for any value of the error probability $0 \leq p \leq 1/2$. Our proof employs only basic concepts from information theory, probability theory, and transformations of random variables and vectors.
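
The bound is easy to sanity-check numerically for small $n$ by exhaustive enumeration. The sketch below is a minimal illustration of the statement, not code from the paper; the helper names (bsc_mi, binary_entropy, dictator, majority) are our own. It computes $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y})$ exactly for the dictator function $f(\mathbf{x})=x_1$, which attains the bound with equality, and for the majority function, which falls strictly below it.

```python
# Brute-force check of MI(f(X), Y) <= 1 - H(p) for small n.
# Illustrative sketch only; names and setup are not from the paper.
from itertools import product
from math import log2

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2(1-p), with H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_mi(f, n, p):
    """Exact I(f(X); Y) for i.i.d. Bernoulli(1/2) inputs X sent over a BSC(p)."""
    joint = {}  # (f(x), y) -> probability
    for x in product((0, 1), repeat=n):
        b = f(x)
        px = 2.0 ** (-n)  # uniform input distribution
        for y in product((0, 1), repeat=n):
            d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
            joint[(b, y)] = joint.get((b, y), 0.0) + px * p**d * (1 - p)**(n - d)
    pb, py = {}, {}  # marginals of f(X) and Y
    for (b, y), q in joint.items():
        pb[b] = pb.get(b, 0.0) + q
        py[y] = py.get(y, 0.0) + q
    return sum(q * log2(q / (pb[b] * py[y])) for (b, y), q in joint.items() if q > 0)

n, p = 3, 0.1
dictator = lambda x: x[0]               # attains the bound with equality
majority = lambda x: int(sum(x) > n / 2)
print(f"bound 1 - H(p)    = {1 - binary_entropy(p):.6f}")
print(f"I(x1; Y)          = {bsc_mi(dictator, n, p):.6f}")
print(f"I(majority(X); Y) = {bsc_mi(majority, n, p):.6f}")
```

The dictator function achieves equality because $Y_2,\ldots,Y_n$ are independent of $X_1$, so $\operatorname{MI}(X_1,\mathbf{Y}) = \operatorname{MI}(X_1,Y_1) = 1-\operatorname{H}(p)$, the capacity of a single use of the binary symmetric channel with uniform input.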
