PyTorch Geometric High Order: A Unified Library for High Order Graph Neural Network (2311.16670v1)
Abstract: We introduce PyTorch Geometric High Order (PyGHO), a library for High Order Graph Neural Networks (HOGNNs) that extends PyTorch Geometric (PyG). Unlike ordinary Message Passing Neural Networks (MPNNs), which exchange messages between individual nodes, HOGNNs, encompassing subgraph GNNs and k-WL GNNs, encode node tuples, an approach that previously lacked a standardized framework and often required complex, ad hoc coding. PyGHO's main objective is to provide a unified and user-friendly interface for various HOGNNs. It accomplishes this through streamlined data structures for node tuples, comprehensive data-processing utilities, and a flexible suite of operators for high-order GNN methodologies. In this work, we present an in-depth overview of PyGHO and compare HOGNNs implemented with PyGHO against their official implementations on real-world tasks. PyGHO achieves up to $50\%$ acceleration and reduces the code needed for implementation by an order of magnitude. Our library is available at \url{https://github.com/GraphPKU/PygHO}.
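To make the "node tuple" idea concrete, the sketch below shows, in plain PyTorch, the kind of 2-tuple representation that HOGNNs operate on: instead of one feature vector per node, the model keeps one feature vector per ordered node pair, and an aggregation step gathers messages along each tuple position. This is an illustrative sketch of the general technique only, not PyGHO's actual API; all tensor names and sizes here are invented for the example.

```python
import torch

torch.manual_seed(0)
N, d = 5, 8  # toy graph: 5 nodes, 8-dim tuple features (illustrative sizes)

# Random symmetric adjacency matrix with no self-loops.
A = (torch.rand(N, N) < 0.4).float()
A = ((A + A.T) > 0).float()
A.fill_diagonal_(0)

# 2-tuple representation: one feature vector per ordered node pair (i, j),
# i.e. a tensor of shape [N, N, d] rather than the MPNN's [N, d].
h = torch.randn(N, N, d)

# One local high-order aggregation step (2-FWL flavored): for each pair
# (i, j), collect messages by replacing each tuple position in turn with
# the neighbors of that node.
m1 = torch.einsum("ik,kjd->ijd", A, h)  # vary the first position i
m2 = torch.einsum("jk,ikd->ijd", A, h)  # vary the second position j
h_new = torch.relu(h + m1 + m2)

print(h_new.shape)  # torch.Size([5, 5, 8])
```

Because the tuple tensor is dense here, memory grows quadratically with the number of nodes; practical HOGNN libraries also provide sparse tuple storage and batching utilities, which is exactly the machinery the abstract refers to.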