A near-optimal direct-sum theorem for communication complexity (2008.07188v3)
Abstract: We show a near-optimal direct-sum theorem for two-party randomized communication complexity. Let $f\subseteq X \times Y \times Z$ be a relation, $\varepsilon > 0$ and $k$ an integer. We show $$\mathrm{R}^{\mathrm{pub}}_\varepsilon(f^k) \cdot \log(\mathrm{R}^{\mathrm{pub}}_\varepsilon(f^k)) \ge \Omega(k \cdot \mathrm{R}^{\mathrm{pub}}_\varepsilon(f)) \enspace,$$ where $f^k = f \times \ldots \times f$ ($k$ times) and $\mathrm{R}^{\mathrm{pub}}_\varepsilon(\cdot)$ denotes the public-coin randomized communication complexity with worst-case error $\varepsilon$. Given a protocol $\mathcal{P}$ for $f^k$ with communication cost $c \cdot k$ and worst-case error $\varepsilon$, we exhibit a protocol $\mathcal{Q}$ for $f$ with external information cost $O(c)$ and worst-case error $\varepsilon$. We then use a message-compression protocol due to Barak, Braverman, Chen and Rao [2013] to simulate $\mathcal{Q}$ with communication $O(c \cdot \log(c \cdot k))$, which yields our result. To obtain this reduction we prove new chain rules for capacity, the maximum information that can be transmitted by a communication channel. We use the powerful concept of Nash equilibrium from game theory, and its existence in suitably defined games, to establish these chain rules, which are of independent interest.
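The two steps stated in the abstract combine into the direct-sum bound as follows (a sketch with constants suppressed; the shorthand $C$ is introduced here for illustration):

```latex
% Write C := R^{pub}_eps(f^k) and c := C/k, so the given protocol P for f^k
% has communication c*k = C. The reduction produces Q for f with external
% information cost O(c), and the Barak-Braverman-Chen-Rao compression
% simulates Q with communication O(c * log(c*k)). Hence
\[
  \mathrm{R}^{\mathrm{pub}}_\varepsilon(f)
    \;\le\; O\!\bigl(c \cdot \log(c \cdot k)\bigr)
    \;=\; O\!\Bigl(\tfrac{C}{k} \cdot \log C\Bigr),
\]
% and rearranging gives the stated theorem:
\[
  C \cdot \log C \;\ge\; \Omega\bigl(k \cdot \mathrm{R}^{\mathrm{pub}}_\varepsilon(f)\bigr).
\]
```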