Review Article | Peer-Reviewed

Recent Advances and Applications of Graph Convolution Neural Network Methods in Materials Science

Received: 6 April 2024    Accepted: 22 April 2024    Published: 29 April 2024
Abstract

With the development of artificial intelligence (AI), the combination of AI and science is receiving growing attention and is opening new perspectives for scientific research. Research on using machine learning (including deep learning) to discover patterns in data and predict targeted material properties has attracted widespread interest and is expected to have a profound impact on materials science. In recent years, rising interest in deep learning for materials science has led to significant progress in both fundamental and applied research. One of the most notable advances is the development of graph convolutional neural network models, which combine graph neural networks and convolutional neural networks, achieve outstanding results in materials science, and effectively bridge deep learning models and materials property prediction. The availability of large materials databases, driven by the rise of big data, has further enhanced the relevance of these models to the field. In this article, we present a comprehensive overview of graph convolutional neural network models, explain their fundamental principles, and highlight examples of their applications in materials science as well as current trends. We also discuss the limitations and challenges these models face and the potential for future research in this dynamic area.
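To make the idea behind these models concrete, the following minimal sketch (not taken from the article, with random untrained weights) applies a single Kipf-Welling-style graph-convolution update to a toy crystal graph: each atom's feature vector is mixed with those of its bonded neighbours and the result is pooled into one scalar property prediction. Models used in practice, such as CGCNN, stack several such layers, include bond (edge) features, and train the weights on large materials databases.

    # Minimal sketch of one graph-convolution update on a toy crystal graph.
    # All sizes and weights are arbitrary and untrained; for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy graph: 4 atoms, undirected bonds encoded as an adjacency matrix.
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 1],
                  [1, 1, 0, 1],
                  [0, 1, 1, 0]], dtype=float)
    X = rng.normal(size=(4, 8))            # per-atom feature vectors (e.g. element embeddings)

    # Symmetric normalisation A_hat = D^{-1/2} (A + I) D^{-1/2}, as in Kipf & Welling.
    A_tilde = A + np.eye(4)
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    A_hat = A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    W1 = rng.normal(size=(8, 16))          # weights a real model would learn from data
    W2 = rng.normal(size=(16, 1))

    H = np.maximum(A_hat @ X @ W1, 0.0)    # graph convolution + ReLU: each atom aggregates its neighbours
    y = (H.mean(axis=0) @ W2).item()       # mean-pool over atoms -> scalar property prediction
    print(f"toy (untrained) property prediction: {y:.3f}")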

Published in Advances in Applied Sciences (Volume 9, Issue 2)
DOI 10.11648/j.aas.20240902.11
Page(s) 17-30
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2024. Published by Science Publishing Group

Keywords

Materials Science, Deep Learning, Graph Convolutional Neural Network

Cite This Article
  • APA Style

    Zhao, K., Li, Q. (2024). Recent Advances and Applications of Graph Convolution Neural Network Methods in Materials Science. Advances in Applied Sciences, 9(2), 17-30. https://doi.org/10.11648/j.aas.20240902.11


  • ACS Style

    Zhao, K.; Li, Q. Recent Advances and Applications of Graph Convolution Neural Network Methods in Materials Science. Adv. Appl. Sci. 2024, 9(2), 17-30. doi: 10.11648/j.aas.20240902.11


  • AMA Style

    Zhao K, Li Q. Recent Advances and Applications of Graph Convolution Neural Network Methods in Materials Science. Adv Appl Sci. 2024;9(2):17-30. doi: 10.11648/j.aas.20240902.11


  • @article{10.11648/j.aas.20240902.11,
      author = {Ke-Lin Zhao and Qing-Xu Li},
      title = {Recent Advances and Applications of Graph Convolution Neural Network Methods in Materials Science},
      journal = {Advances in Applied Sciences},
      volume = {9},
      number = {2},
      pages = {17-30},
      doi = {10.11648/j.aas.20240902.11},
      url = {https://doi.org/10.11648/j.aas.20240902.11},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.aas.20240902.11},
     year = {2024}
    }
    


  • TY  - JOUR
    T1  - Recent Advances and Applications of Graph Convolution Neural Network Methods in Materials Science
    
    AU  - Ke-Lin Zhao
    AU  - Qing-Xu Li
    Y1  - 2024/04/29
    PY  - 2024
    N1  - https://doi.org/10.11648/j.aas.20240902.11
    DO  - 10.11648/j.aas.20240902.11
    T2  - Advances in Applied Sciences
    JF  - Advances in Applied Sciences
    JO  - Advances in Applied Sciences
    SP  - 17
    EP  - 30
    PB  - Science Publishing Group
    SN  - 2575-1514
    UR  - https://doi.org/10.11648/j.aas.20240902.11
    VL  - 9
    IS  - 2
    ER  - 


Author Information
  • Ke-Lin Zhao: School of Science, Chongqing University of Posts and Telecommunications, Chongqing, China

  • Qing-Xu Li: School of Science, Chongqing University of Posts and Telecommunications, Chongqing, China; Institute for Advanced Sciences, Chongqing University of Posts and Telecommunications, Chongqing, China
