Published December 2020 | Version: Supplemental Material + Published
Book Section - Chapter | Open Access

Multipole Graph Neural Operator for Parametric Partial Differential Equations

Abstract

One of the main challenges in using deep learning-based methods for simulating physical systems and solving partial differential equations (PDEs) is formulating physics-based data in the desired structure for neural networks. Graph neural networks (GNNs) have gained popularity in this area since graphs offer a natural way of modeling particle interactions and provide a clear way of discretizing the continuum models. However, the graphs constructed for approximating such tasks usually ignore long-range interactions due to unfavorable scaling of the computational complexity with respect to the number of nodes. The errors due to these approximations scale with the discretization of the system, thereby preventing generalization under mesh refinement. Inspired by the classical multipole methods, we propose a novel multi-level graph neural network framework that captures interactions at all ranges with only linear complexity. Our multi-level formulation is equivalent to recursively adding inducing points to the kernel matrix, unifying GNNs with multi-resolution matrix factorization of the kernel. Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
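The decomposition sketched in the abstract — a short-range sparse kernel at the fine level plus long-range interactions routed through a small set of inducing points — can be illustrated with a minimal two-level sketch. This is not the paper's implementation: all names are hypothetical, a fixed Gaussian kernel stands in for the learned kernel network, and a real multi-level operator would recurse over several coarse levels and use a neighbor search rather than a dense distance matrix.

```python
import numpy as np

def two_level_kernel_apply(x, features, radius, n_inducing, rng=None):
    """Approximate a dense kernel matrix-vector product K @ features as a
    short-range sparse term plus a long-range term factored through
    n_inducing coarse points -- a two-level analogue of the multi-level
    decomposition described above (illustrative only)."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = x.shape[0]

    # Gaussian kernel, standing in for the learned kernel network.
    def kernel(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2)

    # Short-range term: keep only pairs within `radius`. (Shown densely
    # for clarity; a real implementation would use a neighbor search so
    # the cost stays linear for bounded neighborhood size.)
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    K_near = np.where(d2 <= radius ** 2, np.exp(-d2), 0.0)

    # Long-range term: restrict to m << n inducing points, apply a dense
    # m x m kernel at the coarse level, prolongate back -- O(n*m + m^2).
    idx = rng.choice(n, size=n_inducing, replace=False)
    z = x[idx]
    R = kernel(z, x)      # restriction:  fine -> coarse (m x n)
    K_far = kernel(z, z)  # dense coarse-level kernel    (m x m)
    P = kernel(x, z)      # prolongation: coarse -> fine (n x m)

    return K_near @ features + P @ (K_far @ (R @ features))
```

Recursing this construction — each coarse level getting its own, even coarser set of inducing points — yields the V-cycle-like hierarchy the abstract describes, with total cost linear in the number of fine-level nodes.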

Additional Information

Z. Li gratefully acknowledges the financial support from the Kortschak Scholars Program. A. Anandkumar is supported in part by the Bren endowed chair, LwLL grants, Beyond Limits, Raytheon, Microsoft, Google, and Adobe faculty fellowships, and a De Logi grant. K. Bhattacharya, N. B. Kovachki, B. Liu and A. M. Stuart gratefully acknowledge the financial support of the Army Research Laboratory through Cooperative Agreement Number W911NF-12-2-0022. Research was sponsored by the Army Research Laboratory and was accomplished under Cooperative Agreement Number W911NF-12-2-0022. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Army Research Laboratory or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation herein.

Attached Files

Published - 4b21cf96d4cf612f239a6c322b10c8fe-Paper.pdf

Supplemental Material - 4b21cf96d4cf612f239a6c322b10c8fe-Supplemental.zip

Files (1.8 MB)

4b21cf96d4cf612f239a6c322b10c8fe-Paper.pdf (1.7 MB, md5:1a0b171d31b3e2f3e3ce0b9babf0bc0f)

4b21cf96d4cf612f239a6c322b10c8fe-Supplemental.zip (135.7 kB, md5:e11a565315cbb436b2839fac0651cadd)

Additional details

Identifiers

Eprint ID
106492
Resolver ID
CaltechAUTHORS:20201106-120222366

Funding

Kortschak Scholars Program
Bren Professor of Computing and Mathematical Sciences
Learning with Less Labels (LwLL)
Beyond Limits
Raytheon Company
Microsoft Faculty Fellowship
Google Faculty Research Award
Adobe
Caltech De Logi Fund
Army Research Laboratory
W911NF-12-2-0022

Dates

Created
2020-11-06
Created from EPrint's datestamp field
Updated
2023-06-02
Created from EPrint's last_modified field

Caltech Custom Metadata

Caltech groups
Center for Autonomous Systems and Technologies (CAST)