
A Simple Proof of a New Set Disjointness with Applications to Data Streams (2105.11338v1)

Published 24 May 2021 in cs.DS

Abstract: The multiplayer promise set disjointness is one of the most widely used problems from communication complexity in applications. In this problem there are $k$ players with subsets $S_1, \ldots, S_k$, each drawn from $\{1, 2, \ldots, n\}$, and we are promised that either (1) the sets are pairwise disjoint, or (2) there is a unique element $j$ occurring in all the sets, which are otherwise pairwise disjoint. The total communication of solving this problem with constant probability in the blackboard model is $\Omega(n/k)$. We observe that for most applications it instead suffices to consider what we call the "mostly" set disjointness problem, which changes case (2) to say that there is a unique element $j$ occurring in at least half of the sets, and the sets are otherwise disjoint. This change yields a much simpler proof of an $\Omega(n/k)$ randomized total communication lower bound, avoiding Hellinger distance and Poincaré inequalities. Using this we show several new results for data streams:

- For $\ell_2$-Heavy Hitters, any $O(1)$-pass streaming algorithm in the insertion-only model for detecting whether an $\varepsilon$-$\ell_2$-heavy hitter exists requires $\min\left(\frac{1}{\varepsilon^2}\log \frac{\varepsilon^2 n}{\delta}, \frac{1}{\varepsilon} n^{1/2}\right)$ bits of memory, which is optimal up to a $\log n$ factor. For deterministic algorithms and constant $\varepsilon$, this gives an $\Omega(n^{1/2})$ lower bound, improving the prior $\Omega(\log n)$ lower bound. We also obtain lower bounds for Zipfian distributions.
- For $\ell_p$-Estimation, $p > 2$, we show an $O(1)$-pass $\Omega(n^{1-2/p} \log(1/\delta))$ bit lower bound for outputting an $O(1)$-approximation with probability $1-\delta$, in the insertion-only model. This is optimal, and the best previous lower bound was $\Omega(n^{1-2/p} + \log(1/\delta))$.
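
To make the two promise cases concrete, here is a minimal Python sketch (not from the paper; the function name and I/O conventions are illustrative assumptions) that classifies an input to the "mostly" set disjointness problem as case (1), case (2), or a promise violation.

```python
from collections import Counter
from typing import List, Optional, Set


def classify_instance(sets: List[Set[int]]) -> Optional[str]:
    """Decide which promise case a 'mostly' set disjointness instance falls in.

    Case (1): the k sets are pairwise disjoint -> returns "disjoint".
    Case (2): a unique element j appears in at least half of the k sets and
              the sets are otherwise pairwise disjoint -> returns "intersect at j".
    Returns None if neither case holds (the promise is violated).
    """
    k = len(sets)
    # Since each S_i is a set, counts[x] is the number of players holding x.
    counts = Counter(x for s in sets for x in s)
    repeated = [x for x, c in counts.items() if c > 1]

    if not repeated:
        return "disjoint"                      # case (1)
    if len(repeated) == 1 and counts[repeated[0]] >= k / 2:
        return f"intersect at {repeated[0]}"   # case (2)
    return None                                # promise violated


# Example: 4 players over the universe {1, ..., 10}; element 7 appears in 2 of the 4 sets.
print(classify_instance([{1, 7}, {2, 3}, {4, 7}, {5, 6}]))  # -> "intersect at 7"
```

The original promise problem requires the special element to appear in all $k$ sets; relaxing this to "at least half of the sets" is what the paper exploits to get the simpler lower bound proof.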

Citations (14)
