PerCNet: Periodic Complete Representation for Crystal Graphs (2312.14936v1)

Published 3 Dec 2023 in cond-mat.mtrl-sci, cs.AI, and cs.LG

Abstract: Crystal material representation is the foundation of crystal material research. Existing works treat crystal structures as graph data, representing them in various ways and leveraging techniques from graph learning. A sound crystal representation should capture both local and global information. However, existing methods model only the local information of crystal structures, via the bond distances and bond angles of atoms' first-order neighbors, so different crystals can map to the same representation. To resolve this many-to-one issue, we capture global information by additionally modeling dihedral angles, which guarantees a one-to-one correspondence between the representation and the crystal material. We first propose a periodic complete representation, together with a calculation algorithm, for infinitely extended crystal materials, and provide a theoretical proof that the representation satisfies periodic completeness. Building on this representation, we then propose PerCNet, a network for predicting crystal material properties with a specially designed message passing mechanism. Extensive experiments on two real-world material benchmark datasets show that PerCNet achieves the best MAE among baseline methods. Our results further demonstrate the importance of the periodic scheme and of completeness for crystal representation learning.
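The geometric quantities the abstract names (bond distance, bond angle, and the dihedral angle that makes the representation complete) can be sketched from Cartesian atom coordinates. This is a minimal illustrative computation, not the paper's actual representation algorithm; the function names are hypothetical.

```python
import numpy as np

def bond_distance(p1, p2):
    """Euclidean distance between two atom positions."""
    return np.linalg.norm(p2 - p1)

def bond_angle(p1, p2, p3):
    """Angle at p2 formed by the bonds p2-p1 and p2-p3, in radians."""
    v1, v2 = p1 - p2, p3 - p2
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards against rounding

def dihedral_angle(p1, p2, p3, p4):
    """Signed dihedral (torsion) angle about the p2-p3 axis, in radians."""
    b1, b2, b3 = p2 - p1, p3 - p2, p4 - p3
    n1 = np.cross(b1, b2)          # normal of the p1-p2-p3 plane
    n2 = np.cross(b2, b3)          # normal of the p2-p3-p4 plane
    m = np.cross(n1, b2 / np.linalg.norm(b2))
    return np.arctan2(np.dot(m, n2), np.dot(n1, n2))
```

Distances and angles alone are invariant to reflection and leave the many-to-one ambiguity described above; the signed dihedral distinguishes such configurations, which is the intuition behind adding it to the representation.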

