
A discrepancy lower bound for information complexity (1112.2000v3)

Published 9 Dec 2011 in cs.CC

Abstract: This paper provides the first general technique for proving information lower bounds on two-party, unbounded-rounds communication problems. We show that the discrepancy lower bound, which applies to randomized communication complexity, also applies to information complexity. More precisely, if the discrepancy of a two-party function $f$ with respect to a distribution $\mu$ is $\mathrm{Disc}_\mu(f)$, then any two-party randomized protocol computing $f$ must reveal at least $\Omega(\log(1/\mathrm{Disc}_\mu(f)))$ bits of information to the participants. As a corollary, we obtain that any two-party protocol for computing a random function on $\{0,1\}^n \times \{0,1\}^n$ must reveal $\Omega(n)$ bits of information to the participants. In addition, we prove that the discrepancy of the Greater-Than function is $\Omega(1/\sqrt{n})$, which provides an alternative to the recent proof of Viola (2011) of the $\Omega(\log n)$ lower bound on the communication complexity of this well-studied function and, combined with our main result, proves the tight $\Omega(\log n)$ lower bound on its information complexity. The proof of our main result develops a new simulation procedure that may be of independent interest. In the very recent breakthrough work of Kerenidis et al. (2012), this simulation procedure was the main building block for proving that almost all known lower bound techniques for communication complexity (not just discrepancy) apply to information complexity.
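The abstract does not restate the definition it builds on; the standard one is $\mathrm{Disc}_\mu(f) = \max_{R = S \times T} \left| \mu(R \cap f^{-1}(1)) - \mu(R \cap f^{-1}(0)) \right|$, the maximum bias over combinatorial rectangles. The Python sketch below is purely illustrative and not from the paper: the function names and the toy domain size m are arbitrary choices. It brute-forces this quantity under the uniform distribution for a small random function and for Greater-Than, and prints the implied $\log(1/\mathrm{Disc})$ bound from the discrepancy method.

```python
import itertools
import math
import random

def powerset(xs):
    """All subsets of xs (including the empty set)."""
    xs = list(xs)
    return itertools.chain.from_iterable(
        itertools.combinations(xs, r) for r in range(len(xs) + 1))

def discrepancy_uniform(M):
    """Brute-force Disc_mu(f) under the uniform distribution mu:
    the maximum, over all combinatorial rectangles R = S x T, of
    |mu(R intersect f^-1(1)) - mu(R intersect f^-1(0))|.
    Exponential in the matrix dimensions; tiny examples only."""
    rows, cols = len(M), len(M[0])
    N = rows * cols
    best = 0.0
    for S in powerset(range(rows)):
        for T in powerset(range(cols)):
            # +1 for each f=1 entry, -1 for each f=0 entry in S x T
            signed = sum(2 * M[i][j] - 1 for i in S for j in T)
            best = max(best, abs(signed) / N)
    return best

m = 8  # toy domain size; the search visits 2^m * 2^m rectangles
random.seed(0)
rand_f = [[random.randint(0, 1) for _ in range(m)] for _ in range(m)]
gt_f = [[1 if x > y else 0 for y in range(m)] for x in range(m)]

for name, M in [("random f", rand_f), ("Greater-Than", gt_f)]:
    d = discrepancy_uniform(M)
    print(f"{name}: Disc = {d:.3f}, log2(1/Disc) = {math.log2(1 / d):.2f} bits")
```

One caveat the experiment makes visible: under the uniform distribution Greater-Than has constant discrepancy (e.g., the rectangle $\{x \ge m/2\} \times \{y < m/2\}$ is monochromatic with measure $1/4$), so the uniform distribution only certifies an $O(1)$ bound. A sub-constant discrepancy bound for Greater-Than, as in the paper's $\Omega(\log n)$ application, therefore requires a suitably chosen hard distribution $\mu$ rather than the uniform one.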

