Conformal Crystal Graph Transformer with Robust Encoding of Periodic Invariance
DOI:
https://doi.org/10.1609/aaai.v38i1.27781
Keywords:
APP: Natural Sciences, ML: Applications, ML: Graph-based Machine Learning, ML: Deep Learning Algorithms
Abstract
Machine learning techniques, especially in the realm of materials design, hold immense promise in predicting the properties of crystal materials and aiding in the discovery of novel crystals with desirable traits. However, crystals possess unique geometric constraints—namely, E(3) invariance for the primitive cell and periodic invariance—which need to be accurately reflected in crystal representations. Though past research has explored various construction techniques to preserve periodic invariance in crystal representations, their robustness remains inadequate. Furthermore, effectively capturing angular information within 3D crystal structures continues to pose a significant challenge for graph-based approaches. This study introduces novel solutions to these challenges. We first present a graph construction method that robustly encodes periodic invariance, along with a strategy for capturing angular information in neural networks without compromising efficiency. We further introduce CrystalFormer, a pioneering graph transformer architecture that emphasizes angle preservation and enhances long-range information. Through comprehensive evaluation, we verify our model's superior performance on 5 crystal prediction tasks, reaffirming the efficiency of our proposed methods.
Published
2024-03-25
How to Cite
Wang, Y., Kong, S., Gregoire, J. M., & Gomes, C. P. (2024). Conformal Crystal Graph Transformer with Robust Encoding of Periodic Invariance. Proceedings of the AAAI Conference on Artificial Intelligence, 38(1), 283-291. https://doi.org/10.1609/aaai.v38i1.27781
Section
AAAI Technical Track on Application Domains