A near-optimal direct-sum theorem for communication complexity
(arXiv:2008.07188)

Abstract
We show a near-optimal direct-sum theorem for two-party randomized communication complexity. Let $f\subseteq X \times Y \times Z$ be a relation, $\varepsilon > 0$ and $k$ an integer. We show, $$\mathrm{R}^{\mathrm{pub}}_\varepsilon(f^k) \cdot \log(\mathrm{R}^{\mathrm{pub}}_\varepsilon(f^k)) \ge \Omega(k \cdot \mathrm{R}^{\mathrm{pub}}_\varepsilon(f)) \enspace,$$ where $f^k = f \times \ldots \times f$ ($k$ times) and $\mathrm{R}^{\mathrm{pub}}_\varepsilon(\cdot)$ denotes the public-coin randomized communication complexity with worst-case error $\varepsilon$. Given a protocol $\mathcal{P}$ for $f^k$ with communication cost $c \cdot k$ and worst-case error $\varepsilon$, we exhibit a protocol $\mathcal{Q}$ for $f$ with external information cost $O(c)$ and worst-case error $\varepsilon$. We then use a message-compression protocol due to Barak, Braverman, Chen and Rao [2013] to simulate $\mathcal{Q}$ with communication $O(c \cdot \log(c \cdot k))$, which yields our result. To establish this reduction we prove some new chain rules for capacity, the maximum information that can be transmitted over a communication channel. We use the concept of Nash equilibrium from game theory, and its existence in suitably defined games, to arrive at these chain rules, which are of independent interest.
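One way to read the main inequality, using only the elementary fact that $x \log x \ge A$ forces $x \ge \Omega(A/\log A)$, is as an explicit near-optimal lower bound on the $k$-fold complexity (a sketch, writing $R(\cdot)$ for $\mathrm{R}^{\mathrm{pub}}_\varepsilon(\cdot)$):

```latex
% Abbreviate R(.) := R^{pub}_eps(.). The theorem states
%   R(f^k) . log R(f^k)  >=  Omega( k . R(f) ).
\[
  R(f^k)\cdot\log R(f^k) \;\ge\; \Omega\bigl(k\cdot R(f)\bigr)
  \;\Longrightarrow\;
  R(f^k) \;\ge\; \Omega\!\left(\frac{k\cdot R(f)}{\log\bigl(k\cdot R(f)\bigr)}\right).
\]
% Reason: set A = k . R(f). If x <= A and x < A / log A, then
% x log x < A, contradicting the hypothesis x log x >= Omega(A);
% and if x > A the desired bound holds trivially. Hence the
% direct-sum theorem is optimal up to the logarithmic factor.
```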