Qiao, Zhuoran and Ding, Feizhi and Welborn, Matthew and Bygrave, Peter J. and Smith, Daniel G. A. and Anandkumar, Animashree and Manby, Frederick R. and Miller, Thomas F., III (2020) Multi-task learning for electronic structure to predict and explore molecular potential energy surfaces. In: 34th Conference on Neural Information Processing Systems. Advances in Neural Information Processing Systems. No. 33. Neural Information Processing Systems Foundation, Red Hook, NY, pp. 1-23. https://resolver.caltech.edu/CaltechAUTHORS:20201203-151028849
Available files:
- PDF (Published Version), 3MB. See Usage Policy.
- PDF (Accepted Version), 3MB. See Usage Policy.
Use this Persistent URL to link to this item: https://resolver.caltech.edu/CaltechAUTHORS:20201203-151028849
Abstract
We refine the OrbNet model to accurately predict energy, forces, and other response properties for molecules using a graph neural-network architecture based on features from low-cost approximated quantum operators in the symmetry-adapted atomic orbital basis. The model is end-to-end differentiable due to the derivation of analytic gradients for all electronic structure terms, and is shown to be transferable across chemical space due to the use of domain-specific features. The learning efficiency is improved by incorporating physically motivated constraints on the electronic structure through multi-task learning. The model outperforms existing methods on energy prediction tasks for the QM9 dataset and for molecular geometry optimizations on conformer datasets, at a computational cost reduced by a factor of a thousand or more relative to conventional quantum-chemistry calculations (such as density functional theory) of similar accuracy.
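To make the multi-task idea in the abstract concrete, the sketch below shows one common way such an objective can be wired up: a differentiable model predicts a molecular energy, forces are recovered as the negative gradient of that energy with respect to atomic coordinates, and both terms enter the training loss. This is a minimal illustration, not the authors' OrbNet implementation; `model`, `features`, and the loss weights are hypothetical placeholders.

```python
# Minimal sketch of an energy/force multi-task loss (assumptions noted above).
import torch

def multitask_loss(model, features, coords, energy_ref, forces_ref,
                   w_energy=1.0, w_forces=0.1):
    """Energy MSE plus force MSE, with forces from autograd of the energy."""
    coords = coords.clone().requires_grad_(True)       # track dE/dR
    energy_pred = model(features, coords)              # shape: (batch,)
    # Differentiable analogue of analytic gradients: F = -dE/dR.
    forces_pred = -torch.autograd.grad(
        energy_pred.sum(), coords, create_graph=True)[0]
    loss_e = torch.nn.functional.mse_loss(energy_pred, energy_ref)
    loss_f = torch.nn.functional.mse_loss(forces_pred, forces_ref)
    return w_energy * loss_e + w_forces * loss_f
```

Because the force term is obtained by differentiating the predicted energy, the same trained model yields consistent energies and gradients for downstream geometry optimization.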
Item Type: | Book Section
---|---
Related URLs: |
ORCID: |
Additional Information: | © 2021 Neural Information Processing Systems Foundation. Z.Q. acknowledges the graduate research funding from Caltech. T.F.M. and A.A. acknowledge partial support from the Caltech DeLogi fund, and A.A. acknowledges support from a Caltech Bren professorship. The authors gratefully acknowledge NVIDIA, including Abe Stern and Tom Gibbs, for helpful discussions regarding GPU implementations of graph neural networks.
Funders: |
Series Name: | Advances in Neural Information Processing Systems
Issue or Number: | 33
DOI: | 10.48550/arXiv.2011.02680
Record Number: | CaltechAUTHORS:20201203-151028849
Persistent URL: | https://resolver.caltech.edu/CaltechAUTHORS:20201203-151028849
Usage Policy: | No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: | 106900
Collection: | CaltechAUTHORS
Deposited By: | George Porter
Deposited On: | 05 Dec 2020 01:51
Last Modified: | 02 Jun 2023 01:12