OpenGraph: Towards Open Graph Foundation Models (2403.01121v4)
Abstract: Graph learning has become essential in various domains, including recommendation systems and social network analysis. Graph Neural Networks (GNNs) have emerged as promising techniques for encoding structural information and improving performance in tasks like link prediction and node classification. However, a key challenge remains: the difficulty of generalizing to unseen graph data with different properties. In this work, we propose a novel graph foundation model, called OpenGraph, to address this challenge. Our approach tackles several technical obstacles. First, we enhance data augmentation using a large language model (LLM) to overcome data scarcity in real-world scenarios. Second, we introduce a unified graph tokenizer that enables the model to generalize effectively to diverse graph data, even when encountering unseen properties during training. Third, we develop a scalable graph transformer that captures node-wise dependencies within the global topological context. Extensive experiments validate the effectiveness of our framework. By adapting OpenGraph to new graph characteristics and comprehending diverse graphs, our approach achieves remarkable zero-shot graph learning performance across various settings. We release the model implementation at https://github.com/HKUDS/OpenGraph.
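The unified graph tokenizer mentioned above must map graphs of arbitrary size and feature space into one shared token space. The abstract does not spell out its construction, but the general idea can be sketched as follows: smooth the normalized adjacency over a few hops to capture higher-order structure, then project every node to a fixed-dimensional token via truncated SVD. The function name, hop count, and token dimension below are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def tokenize_graph(adj, dim=4, hops=2):
    """Hypothetical sketch of a unified graph tokenizer: smooth the
    adjacency over a few hops, then project each node to a fixed-size
    token via truncated SVD, so graphs with different sizes and feature
    spaces land in one shared token space."""
    n = adj.shape[0]
    # Symmetric normalization: D^{-1/2} A D^{-1/2}, guarding isolated nodes.
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros(n)
    mask = deg > 0
    d_inv_sqrt[mask] = deg[mask] ** -0.5
    norm_adj = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    # Accumulate multi-hop structure (high-order smoothing).
    smooth = np.zeros_like(norm_adj)
    power = np.eye(n)
    for _ in range(hops):
        power = power @ norm_adj
        smooth += power
    # Truncated SVD gives dataset-agnostic, fixed-dimension node tokens.
    u, s, _ = np.linalg.svd(smooth, full_matrices=False)
    return u[:, :dim] * s[:dim]

# Toy star graph: node 0 connected to nodes 1 and 2.
adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
tokens = tokenize_graph(adj, dim=2)
```

Because the projection depends only on topology, structurally identical nodes (here, nodes 1 and 2) receive identical tokens, which is the property that lets a single model consume graphs it never saw during training.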