Optimal Separation and Strong Direct Sum for Randomized Query Complexity (1908.01020v1)
Abstract: We establish two results regarding the query complexity of bounded-error randomized algorithms.

* Bounded-error separation theorem. There exists a total function $f : \{0,1\}^n \to \{0,1\}$ whose $\epsilon$-error randomized query complexity satisfies $\overline{\mathrm{R}}_\epsilon(f) = \Omega\bigl(\mathrm{R}(f) \cdot \log\tfrac{1}{\epsilon}\bigr)$.
* Strong direct sum theorem. For every function $f$ and every $k \ge 2$, the randomized query complexity of computing $k$ instances of $f$ simultaneously satisfies $\overline{\mathrm{R}}_\epsilon(f^k) = \Theta\bigl(k \cdot \overline{\mathrm{R}}_{\epsilon/k}(f)\bigr)$.

As a consequence of our two main results, we obtain an optimal superlinear direct-sum-type theorem for randomized query complexity: there exists a function $f$ for which $\mathrm{R}(f^k) = \Theta(k \log k \cdot \mathrm{R}(f))$. This answers an open question of Drucker (2012). Combining this result with the query-to-communication-complexity lifting theorem of Göös, Pitassi, and Watson (2017), this also shows that there is a total function whose public-coin randomized communication complexity satisfies $\mathrm{R}^{\mathrm{cc}}(f^k) = \Theta(k \log k \cdot \mathrm{R}^{\mathrm{cc}}(f))$, answering a question of Feder, Kushilevitz, Naor, and Nisan (1995).
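A rough sketch of how the two stated results plausibly combine to give the superlinear consequence is below. The specific error parameter $1/3$ is an illustrative choice, and the sketch assumes (i) that $\overline{\mathrm{R}}_\epsilon$ denotes the expected-query-count analogue of $\mathrm{R}_\epsilon$, so that $\overline{\mathrm{R}}_\epsilon \le \mathrm{R}_\epsilon$, and (ii) the standard error-reduction argument (repeat and take a majority vote) for the matching upper bound; neither assumption is spelled out in the abstract itself.

```latex
% Sketch under the assumptions stated above; not the paper's own derivation.
\begin{align*}
\mathrm{R}(f^k)
  &\ge \overline{\mathrm{R}}_{1/3}(f^k)
   = \Theta\!\bigl(k \cdot \overline{\mathrm{R}}_{1/(3k)}(f)\bigr)
     && \text{(strong direct sum with $\epsilon = 1/3$)} \\
  &= \Omega\!\bigl(k \cdot \mathrm{R}(f)\log(3k)\bigr)
   = \Omega\!\bigl(k \log k \cdot \mathrm{R}(f)\bigr)
     && \text{(separation theorem at error $1/(3k)$)} \\
\mathrm{R}(f^k)
  &= O\!\bigl(k \log k \cdot \mathrm{R}(f)\bigr)
     && \text{(run each instance $O(\log k)$ times, take majorities, union bound over $k$ instances)}
\end{align*}
```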