Personalized PageRank to a Target Node, Revisited (2006.11876v2)

Published 21 Jun 2020 in cs.DS

Abstract: Personalized PageRank (PPR) is a widely used node proximity measure in graph mining and network analysis. Given a source node $s$ and a target node $t$, the PPR value $\pi(s,t)$ represents the probability that a random walk from $s$ terminates at $t$, and thus indicates the bidirectional importance between $s$ and $t$. The majority of existing work focuses on single-source queries, which ask for the PPR value $\pi(s,t)$ of a given source node $s$ to every node $t \in V$. However, the single-source query only reflects the importance of each node $t$ with respect to $s$. In this paper, we consider the single-target PPR query, which measures the opposite direction of importance for PPR: given a target node $t$, it asks for the PPR value $\pi(s,t)$ of every node $s \in V$ to $t$. We propose RBS, a novel algorithm that answers approximate single-target queries with optimal computational complexity. We show that RBS improves three concrete applications: the heavy-hitters PPR query, single-source SimRank computation, and scalable graph neural networks. We conduct experiments to demonstrate that RBS outperforms state-of-the-art algorithms in terms of both efficiency and precision on real-world benchmark datasets.
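
To make the single-target setting concrete, below is a minimal sketch of the classical deterministic backward-push approach to approximating $\pi(s,t)$ for all sources $s$. It is offered only as background for the problem the paper studies, not as an implementation of the paper's RBS algorithm, which randomizes the push step to achieve its optimal complexity. The graph representation, function name, and the parameters `alpha` (walk-termination probability) and `eps` (residual threshold) are illustrative assumptions.

```python
from collections import defaultdict, deque

def single_target_ppr(out_neighbors, target, alpha=0.2, eps=1e-6):
    """Approximate pi(s, target) for every source s via deterministic backward push.

    out_neighbors: dict mapping each node to the list of its out-neighbors.
    alpha: probability that the random walk terminates at each step.
    eps: per-node residual threshold controlling the approximation error.
    """
    # Build reverse adjacency and out-degrees once.
    in_neighbors = defaultdict(list)
    out_degree = {u: len(vs) for u, vs in out_neighbors.items()}
    for u, vs in out_neighbors.items():
        for v in vs:
            in_neighbors[v].append(u)

    estimate = defaultdict(float)   # estimate[s] approximates pi(s, target)
    residual = defaultdict(float)   # probability mass not yet converted into estimates
    residual[target] = 1.0

    queue = deque([target])
    queued = {target}
    while queue:
        v = queue.popleft()
        queued.discard(v)
        r = residual[v]
        if r <= eps:
            continue
        residual[v] = 0.0
        # A walk currently at v terminates there with probability alpha.
        estimate[v] += alpha * r
        # Otherwise the walk reached v from an in-neighbor u, which picked v
        # uniformly among its out_degree[u] out-neighbors.
        for u in in_neighbors[v]:
            residual[u] += (1.0 - alpha) * r / out_degree[u]
            if residual[u] > eps and u not in queued:
                queue.append(u)
                queued.add(u)
    return dict(estimate)


# Toy usage on a small directed graph: approximate pi(s, 2) for every source s.
graph = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
print(single_target_ppr(graph, target=2))
```

The push rule mirrors the recurrence $\pi(s,t) = \alpha \cdot [s=t] + (1-\alpha) \sum_{v \in \mathrm{out}(s)} \pi(v,t)/d_{\mathrm{out}}(s)$: each push converts an $\alpha$ fraction of the residual at $v$ into an estimate and spreads the rest to $v$'s in-neighbors, scaled by their out-degrees.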

Citations (48)