
Approximate Triangle Counting via Sampling and Fast Matrix Multiplication (2104.08501v2)

Published 17 Apr 2021 in cs.DS

Abstract: There is a trivial $O(\frac{n^3}{T})$ time algorithm for approximate triangle counting where $T$ is the number of triangles in the graph and $n$ the number of vertices. At the same time, one may count triangles exactly using fast matrix multiplication in time $\tilde{O}(n^\omega)$. Is it possible to get a negative dependency on the number of triangles $T$ while retaining the $n^\omega$ dependency on $n$? We answer this question positively by providing an algorithm which runs in time $O\big(\frac{n^\omega}{T^{\omega - 2}}\big) \cdot \text{poly}(n^{o(1)}/\epsilon)$. This is optimal in the sense that as long as the exponent of $T$ is independent of $n$ and $T$, it cannot be improved while retaining the dependency on $n$; this follows from the lower bound of Eden and Rosenbaum [APPROX/RANDOM 2018]. Our algorithm improves upon the state of the art when $T = \omega(1)$ and $T = o(n)$. We also consider the problem of approximate triangle counting in sparse graphs, parameterizing by the number of edges $m$. The best known algorithm runs in time $\tilde{O}\big(\frac{m^{3/2}}{T}\big)$ [Eden et al., SIAM Journal on Computing, 2017]. There is also a well-known algorithm for exact triangle counting that runs in time $\tilde{O}(m^{2\omega/(\omega + 1)})$. We again get an algorithm that retains the exponent of $m$ while running faster on graphs with a larger number of triangles. Specifically, our algorithm runs in time $O\Big(\frac{m^{2\omega/(\omega+1)}}{T^{2(\omega-1)/(\omega+1)}}\Big) \cdot \text{poly}(n^{o(1)}/\epsilon)$. This is again optimal in the sense that if the exponent of $T$ is to be constant, it cannot be improved without worsening the dependency on $m$. This algorithm improves upon the state of the art when $T = \omega(1)$ and $T = o(\sqrt{m})$.
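
As a rough illustration of the two baselines the abstract contrasts (the trivial $O(n^3/T)$ sampling estimator and exact counting via matrix multiplication), here is a minimal Python sketch. It is not the paper's algorithm, which combines sampling with fast matrix multiplication; the function names, the use of NumPy, and the plain cubic matrix product standing in for an $\tilde{O}(n^\omega)$ multiplication routine are illustrative assumptions.

```python
import random

import numpy as np


def exact_triangle_count(adj: np.ndarray) -> int:
    """Exact count via trace(A^3) / 6: each triangle yields 6 closed length-3 walks.

    A plain cubic-time matrix product stands in here for the O~(n^omega)
    fast-matrix-multiplication bound mentioned in the abstract.
    """
    a3 = adj @ adj @ adj
    return int(np.trace(a3)) // 6


def sampled_triangle_estimate(adj: np.ndarray, samples: int) -> float:
    """Trivial sampling baseline: test uniformly random vertex triples and rescale.

    With T triangles among C(n, 3) triples, on the order of C(n, 3) / (eps^2 * T)
    samples give a (1 +/- eps)-approximation, which is where the O(n^3 / T)
    running time of the naive baseline comes from.
    """
    n = adj.shape[0]
    hits = 0
    for _ in range(samples):
        u, v, w = random.sample(range(n), 3)
        if adj[u, v] and adj[v, w] and adj[u, w]:
            hits += 1
    total_triples = n * (n - 1) * (n - 2) // 6
    return hits / samples * total_triples


if __name__ == "__main__":
    # Small G(n, p) example (hypothetical test data, not from the paper).
    n, p = 200, 0.1
    rng = np.random.default_rng(0)
    upper = np.triu(rng.random((n, n)) < p, k=1).astype(int)
    adj = upper + upper.T

    print("exact:", exact_triangle_count(adj))
    print("estimate:", sampled_triangle_estimate(adj, samples=20_000))
```

The sampling estimator is unbiased, but its variance blows up when triangles are rare, which is why its cost scales inversely with $T$; the paper's contribution is to retain such a negative dependence on $T$ while keeping the $n^\omega$ (respectively $m^{2\omega/(\omega+1)}$) dependence of the matrix-multiplication approach.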

Citations (8)
