Accelerating Material Property Prediction using Generically Complete Isometry Invariants (2401.15089v2)

Published 22 Jan 2024 in cs.LG, cs.CG, and physics.comp-ph

Abstract: Periodic material or crystal property prediction using machine learning has grown popular in recent years as it provides a computationally efficient replacement for classical simulation methods. A crucial first step for any of these algorithms is the representation used for a periodic crystal. While similar objects like molecules and proteins have a finite number of atoms and their representation can be built based upon a finite point cloud interpretation, periodic crystals are unbounded in size, making their representation more challenging. In the present work, we adapt the Pointwise Distance Distribution (PDD), a continuous and generically complete isometry invariant for periodic point sets, as a representation for our learning algorithm. The PDD distinguished all (more than 660 thousand) periodic crystals in the Cambridge Structural Database as purely periodic sets of points without atomic types. We develop a transformer model with a modified self-attention mechanism that combines PDD with compositional information via a spatial encoding method. This model is tested on the crystals of the Materials Project and Jarvis-DFT databases and shown to produce accuracy on par with state-of-the-art methods while being several times faster in both training and prediction time.
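The abstract's central object, the Pointwise Distance Distribution, records for each motif point of a periodic set its sorted distances to the k nearest points of the infinite periodic cloud, with equal weights per motif point. The following is a minimal illustrative sketch of that construction, not the authors' implementation; the function name, the brute-force tiling by a fixed `shell` of lattice translates, and the omission of duplicate-row merging are all simplifications for exposition.

```python
import numpy as np

def pointwise_distance_distribution(cell, frac_motif, k=6, shell=2):
    """Sketch of the Pointwise Distance Distribution (PDD) of a periodic
    point set: one row of k nearest-neighbour distances per motif point.

    cell       : (3,3) lattice matrix, rows are lattice vectors
    frac_motif : (m,3) fractional coordinates of motif points
    k          : number of nearest-neighbour distances per row
    shell      : copies of the cell tiled in each direction; must be large
                 enough that every k-th neighbour lies inside the tiling
    """
    cell = np.asarray(cell, float)
    frac = np.asarray(frac_motif, float) % 1.0
    motif = frac @ cell                       # Cartesian motif points

    # Enumerate lattice translates in a (2*shell+1)^3 block around the origin
    # and build the finite patch of the infinite periodic point cloud.
    r = range(-shell, shell + 1)
    shifts = np.array([(i, j, l) for i in r for j in r for l in r]) @ cell
    cloud = (motif[None, :, :] + shifts[:, None, :]).reshape(-1, 3)

    rows = []
    for p in motif:
        d = np.sort(np.linalg.norm(cloud - p, axis=1))
        rows.append(d[1:k + 1])               # drop the zero self-distance
    rows = np.array(rows)

    # Every motif point carries equal weight; a full PDD would also merge
    # identical rows and sum their weights, which is omitted here.
    weights = np.full(len(rows), 1.0 / len(rows))
    order = np.lexsort(rows.T[::-1])          # sort rows lexicographically
    return weights[order], rows[order]
```

For a simple cubic lattice with one atom per cell, the single PDD row is six distances of 1.0 (the six face neighbours), with weight 1.0, matching the hand computation.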

