Inductive Subgraph Embedding for Link Prediction (2112.01165v2)

Published 2 Dec 2021 in cs.SI

Abstract: Graph representation learning (GRL) has emerged as a powerful technique for solving graph analytics tasks. It can effectively convert discrete graph data into a low-dimensional space where the graph structural information and graph properties are maximally preserved. While there is rich literature on node and whole-graph representation learning, GRL for links is relatively less studied and less understood. One common practice in previous works is to generate link representations by directly aggregating the representations of their incident nodes, which cannot capture effective link features. Moreover, common GRL methods usually rely on full-graph training, suffering from poor scalability and high resource consumption on large-scale graphs. In this paper, we design Subgraph Contrastive Link Representation Learning (SCLRL) -- a self-supervised link embedding framework that exploits the strong correlation between central links and their neighborhood subgraphs to characterize links. We extract "link-centric induced subgraphs" as input and use subgraph-level contrastive discrimination as the pretext task to learn intrinsic, structural link features via subgraph mini-batch training. Extensive experiments conducted on five datasets demonstrate that SCLRL has significant performance advantages in link representation learning on benchmark datasets, and prominent efficiency advantages in training speed and memory consumption on large-scale graphs, when compared with existing link representation learning methods.
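
The abstract's core idea -- characterizing each link by the induced subgraph around its two endpoint nodes and training with a subgraph-level contrastive objective -- can be sketched as follows. This is a minimal illustration under assumed choices (NetworkX for subgraph extraction, a 1-hop radius, and an InfoNCE-style loss in PyTorch); it is not the authors' implementation, and the function names are hypothetical.

import networkx as nx
import torch
import torch.nn.functional as F

def link_centric_subgraph(G, u, v, num_hops=1):
    # Induced subgraph on the union of the k-hop neighborhoods of the two
    # endpoints u and v -- a simple stand-in for the paper's
    # "link-centric induced subgraphs".
    nodes = set()
    for root in (u, v):
        nodes |= set(nx.single_source_shortest_path_length(G, root, cutoff=num_hops))
    return G.subgraph(nodes).copy()

def subgraph_contrastive_loss(z1, z2, temperature=0.5):
    # InfoNCE-style loss between two augmented views of the same batch of
    # subgraphs; z1 and z2 are [batch, dim] embeddings produced by some
    # subgraph encoder (e.g. a GNN followed by a readout layer).
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature      # pairwise cosine similarities
    labels = torch.arange(z1.size(0))       # matching views sit on the diagonal
    return F.cross_entropy(logits, labels)

In a full pipeline, each mini-batch would hold the subgraphs of a sample of links, two stochastic augmentations of each subgraph would be encoded into z1 and z2, and the loss would pull the two views of the same link together while pushing apart views of different links -- which is what allows training on subgraph mini-batches rather than the full graph.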
