Round Compression for Parallel Graph Algorithms in Strongly Sublinear Space (1807.08745v1)

Published 23 Jul 2018 in cs.DS and cs.DC

Abstract: The Massive Parallel Computation (MPC) model is a theoretical framework for popular parallel and distributed platforms such as MapReduce, Hadoop, or Spark. We consider the task of computing a large matching or small vertex cover in this model when the space per machine is $n^{\delta}$ for $\delta \in (0,1)$, where $n$ is the number of vertices in the input graph. A direct simulation of classic PRAM and distributed algorithms from the 1980s results in algorithms that require at least a logarithmic number of MPC rounds. We give the first algorithm that breaks this logarithmic barrier and runs in $\tilde O(\sqrt{\log n})$ rounds, as long as the total space is at least slightly superlinear in the number of vertices. The result is obtained by repeatedly compressing several rounds of a natural peeling algorithm to a logarithmically smaller number of MPC rounds. Each time we show that it suffices to consider a low-degree subgraph, in which local neighborhoods can be explored with exponential speedup. Our techniques are relatively simple and can also be used to accelerate the simulation of distributed algorithms for bounded-degree graphs and finding a maximal independent set in bounded-arboricity graphs.
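
To make the "natural peeling algorithm" mentioned in the abstract concrete, below is a minimal sequential sketch of a degree-threshold peeling process that builds a matching (and a vertex cover from its endpoints). This is an illustrative assumption, not the paper's MPC algorithm: the function name, the halving threshold schedule, and the greedy matching rule are hypothetical choices, and the sketch does not model the round compression or the low-degree-subgraph exploration that yield the $\tilde O(\sqrt{\log n})$ MPC bound.

```python
# Minimal sequential sketch of a degree-threshold peeling process
# (illustrative only; not the paper's MPC algorithm or its round compression).
import random
from collections import defaultdict

def peel_matching(edges):
    """Repeatedly peel high-degree vertices: greedily match each one to an
    unmatched neighbor, remove matched vertices, then halve the threshold."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    matched = set()   # endpoints that are already matched
    matching = []     # list of matched edges
    threshold = max((len(nbrs) for nbrs in adj.values()), default=0)

    while threshold >= 1:
        # Vertices whose current degree reaches the threshold in this phase.
        high = [v for v in adj if len(adj[v]) >= threshold]
        random.shuffle(high)
        for u in high:
            if u in matched:
                continue
            for w in adj[u]:
                if w not in matched:
                    matching.append((u, w))
                    matched.update((u, w))
                    break
        # Peel matched vertices out of the residual graph.
        for v in [x for x in adj if x in matched]:
            for w in adj[v]:
                if w in adj:
                    adj[w].discard(v)
            del adj[v]
        threshold //= 2

    # Endpoints of the resulting (maximal) matching form a 2-approximate vertex cover.
    vertex_cover = sorted(matched)
    return matching, vertex_cover

if __name__ == "__main__":
    # Tiny usage example: a path 0-1-2-3-4-5 with the extra edge (5, 3).
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 3)]
    m, vc = peel_matching(edges)
    print("matching:", m)
    print("vertex cover:", vc)
```

Each iteration of the while loop corresponds roughly to one peeling phase on the current high-degree vertices; the paper's contribution is compressing many such phases, restricted to a low-degree subgraph, into a logarithmically smaller number of MPC rounds.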

Citations (29)

Authors (1)