
Fully-Dynamic Graph Sparsifiers Against an Adaptive Adversary (2004.08432v3)

Published 17 Apr 2020 in cs.DS

Abstract: Designing dynamic graph algorithms against an adaptive adversary is a major goal in the field of dynamic graph algorithms. While a few such algorithms are known for spanning trees, matchings, and single-source shortest paths, very little was known for an important primitive like graph sparsifiers. The challenge is how to approximately preserve so much information about the graph (e.g., all-pairs distances and all cuts) without revealing the algorithms' underlying randomness to the adaptive adversary. In this paper we present the first non-trivial efficient adaptive algorithms for maintaining spanners and cut sparsifiers. These algorithms in turn imply improvements over existing algorithms for other problems. Our first algorithm maintains a polylog$(n)$-spanner of size $\tilde O(n)$ in polylog$(n)$ amortized update time. The second algorithm maintains an $O(k)$-approximate cut sparsifier of size $\tilde O(n)$ in $\tilde O(n^{1/k})$ amortized update time, for any $k\ge1$, which is polylog$(n)$ time when $k=\log(n)$. The third algorithm maintains a polylog$(n)$-approximate spectral sparsifier in polylog$(n)$ amortized update time. The amortized update time of both algorithms can be made worst-case by paying some sub-polynomial factors. Prior to our result, there were near-optimal algorithms against oblivious adversaries (e.g., Baswana et al. [TALG'12] and Abraham et al. [FOCS'16]), but the only non-trivial adaptive dynamic algorithm requires $O(n)$ amortized update time to maintain $3$- and $5$-spanners of size $O(n^{1+1/2})$ and $O(n^{1+1/3})$, respectively [Ausiello et al. ESA'05]. Our results are based on two novel techniques. The first technique is a generic black-box reduction that allows us to assume that the graph undergoes only edge deletions and, more importantly, remains an expander with almost-uniform degree. The second technique we call proactive resampling. [...]
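To ground the stretch/size trade-off mentioned in the abstract, here is a minimal sketch of the classic static greedy $(2k-1)$-spanner construction for unweighted graphs. This is not the paper's algorithm (the paper's contribution is maintaining sparsifiers dynamically against an adaptive adversary); it only illustrates what a spanner with stretch $2k-1$ and roughly $O(n^{1+1/k})$ edges means. The function names, the unweighted setting, and the adjacency-dict representation are assumptions made for illustration.

```python
from collections import deque


def greedy_spanner(n, edges, k):
    """Static greedy (2k-1)-spanner of an unweighted graph on vertices 0..n-1.

    Illustrative only: an edge (u, v) is added to the spanner iff u and v
    are currently at distance greater than 2k-1 in the partially built
    spanner, so every omitted edge is stretched by at most 2k-1.
    The resulting subgraph has girth > 2k and hence O(n^{1+1/k}) edges.
    """
    adj = {v: set() for v in range(n)}

    def bounded_dist(src, dst, limit):
        # BFS from src, giving up once the distance would exceed `limit`.
        if src == dst:
            return 0
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            if dist[u] == limit:
                continue
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    if w == dst:
                        return dist[w]
                    queue.append(w)
        return limit + 1  # farther than `limit`, or unreachable

    spanner = []
    for u, v in edges:
        if bounded_dist(u, v, 2 * k - 1) > 2 * k - 1:
            adj[u].add(v)
            adj[v].add(u)
            spanner.append((u, v))
    return spanner


if __name__ == "__main__":
    # Usage example: K_5 with k = 2 (stretch 3) collapses to a sparse subgraph.
    n = 5
    edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
    print(greedy_spanner(n, edges, k=2))
```

Note that this static routine rebuilds nothing under updates; the point of the paper is precisely that maintaining such guarantees under adversarial insertions and deletions, without leaking the algorithm's randomness, is the hard part.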

Citations (47)
