PyTorch Geometric High Order: A Unified Library for High Order Graph Neural Network (2311.16670v1)

Published 28 Nov 2023 in cs.LG

Abstract: We introduce PyTorch Geometric High Order (PyGHO), a library for High Order Graph Neural Networks (HOGNNs) that extends PyTorch Geometric (PyG). Unlike ordinary Message Passing Neural Networks (MPNNs) that exchange messages between nodes, HOGNNs, encompassing subgraph GNNs and k-WL GNNs, encode node tuples, a method previously lacking a standardized framework and often requiring complex coding. PyGHO's main objective is to provide a unified and user-friendly interface for various HOGNNs. It accomplishes this through streamlined data structures for node tuples, comprehensive data processing utilities, and a flexible suite of operators for high-order GNN methodologies. In this work, we present a detailed, in-depth overview of PyGHO and compare HOGNNs implemented with PyGHO against their official implementations on real-world tasks. PyGHO achieves up to $50\%$ acceleration and reduces the code needed for implementation by an order of magnitude. Our library is available at \url{https://github.com/GraphPKU/PygHO}.
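To make the contrast with ordinary MPNNs concrete, the sketch below illustrates what "encoding node tuples" means for a 2-tuple (pairwise) HOGNN. This is a minimal NumPy illustration of the general idea, not PyGHO's actual API or data structures; all names and the dense tensor layout are assumptions for exposition.

```python
import numpy as np

# Hypothetical sketch of 2-tuple encoding (NOT PyGHO's actual API).
# A 2-tuple HOGNN keeps an embedding for every ordered node pair (i, j),
# stored here densely as X[i, j] in a tensor of shape (n, n, d).
n, d = 4, 8
rng = np.random.default_rng(0)
X = rng.standard_normal((n, n, d))      # X[i, j] = embedding of tuple (i, j)
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]])          # adjacency matrix of a 4-node cycle

# One high-order "message passing" step: tuple (i, j) aggregates the
# features of tuples (i, k) over all neighbors k of j. An ordinary MPNN
# would instead aggregate over single-node embeddings of shape (n, d).
X_new = np.einsum("ikd,kj->ijd", X, adj)

# Pooling back to the node level: node i's embedding is the sum over j
# of the tuple embeddings X_new[i, j].
node_emb = X_new.sum(axis=1)            # shape (n, d)
```

The dense (n, n, d) layout is the simplest choice but scales quadratically in n; a practical library would also offer a sparse tuple representation, which is one of the engineering problems a framework like PyGHO standardizes.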
