Neural Operator: Graph Kernel Network for Partial Differential Equations
Abstract
The classical development of neural networks has been primarily for mappings between a finite-dimensional Euclidean space and a set of classes, or between two finite-dimensional Euclidean spaces. The purpose of this work is to generalize neural networks so that they can learn mappings between infinite-dimensional spaces (operators). The key innovation in our work is that a single set of network parameters, within a carefully designed network architecture, may be used to describe mappings between infinite-dimensional spaces and between different finite-dimensional approximations of those spaces. We formulate approximation of the infinite-dimensional mapping by composing nonlinear activation functions and a class of integral operators. The kernel integration is computed by message passing on graph networks. This approach has substantial practical consequences, which we illustrate in the context of mappings between input data to partial differential equations (PDEs) and their solutions. In this context, such learned networks can generalize among different approximation methods for the PDE (such as finite difference or finite element methods) and among approximations corresponding to different underlying levels of resolution and discretization. Experiments confirm that the proposed graph kernel network does have the desired properties and show competitive performance compared to state-of-the-art solvers.
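To make the abstract's central idea concrete, the following is a minimal sketch (not the authors' code) of one layer of the kind of kernel-integration message passing described above: each node holds features u(x), and the layer computes v(x) = σ(W u(x) + mean over neighbours y of κ(x, y) u(y)). The radius-graph construction, the toy exponential kernel `kappa`, and all variable names here are illustrative assumptions; in the paper the kernel is itself a learned neural network.

```python
import numpy as np

# Hypothetical one-layer sketch of a graph kernel network update:
#   v(x) = ReLU( W u(x) + mean_{y in N(x)} kappa(x, y) u(y) )
# Nodes are 1-D grid points; the neighbourhood N(x) is a radius graph,
# a rule that is independent of the discretization resolution.

rng = np.random.default_rng(0)

n, d = 8, 4                      # number of grid nodes, channel width
xs = np.linspace(0.0, 1.0, n)    # node coordinates (the discretization)
u = rng.standard_normal((n, d))  # input features u(x) at each node
W = rng.standard_normal((d, d))  # pointwise linear transform

def kappa(x, y):
    # Toy stationary kernel kappa(x, y) in R^{d x d}; in the paper this
    # would be a learned network of the coordinates (x, y).
    return np.exp(-abs(x - y)) * np.eye(d)

r = 0.3  # connect nodes within radius r of each other

v = np.empty_like(u)
for i in range(n):
    nbrs = [j for j in range(n) if abs(xs[i] - xs[j]) <= r]
    # message passing: aggregate kernel-weighted neighbour features
    agg = np.mean([kappa(xs[i], xs[j]) @ u[j] for j in nbrs], axis=0)
    v[i] = np.maximum(0.0, u[i] @ W + agg)  # nonlinear activation

print(v.shape)  # (8, 4)
```

Because the update rule is written in terms of coordinates and a radius graph rather than a fixed grid index, the same parameters (W and the kernel) can in principle be reused at a different resolution, which is the discretization-invariance property the abstract emphasizes.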
Additional Information
Z. Li gratefully acknowledges the financial support from the Kortschak Scholars Program. K. Azizzadenesheli is supported in part by Raytheon and Amazon Web Services. A. Anandkumar is supported in part by the Bren endowed chair, DARPA PAIHR00111890035, LwLL grants, Raytheon, Microsoft, Google, and Adobe faculty fellowships, and a De Logi grant. K. Bhattacharya, N. B. Kovachki, B. Liu and A. M. Stuart gratefully acknowledge the financial support of the Army Research Laboratory through Cooperative Agreement Number W911NF-12-2-0022. Research was sponsored by the Army Research Laboratory and was accomplished under Cooperative Agreement Number W911NF-12-2-0022. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Army Research Laboratory or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation herein.
Attached Files
- Submitted - 2003.03485.pdf (6.4 MB, md5:7196ad3431b6fe6ad72073114c091088)
Additional details
- Eprint ID: 102271
- Resolver ID: CaltechAUTHORS:20200402-133318521
- Funders: Kortschak Scholars Program; Raytheon; Amazon Web Services; Bren Professor of Computing and Mathematical Sciences; Defense Advanced Research Projects Agency (DARPA) PAIHR00111890035; Learning with Less Labels (LwLL); Microsoft Faculty Fellowship; Google Faculty Research Award; Adobe; Caltech De Logi Fund; Army Research Laboratory W911NF-12-0022; Army Research Laboratory W911NF-12-2-0022
- Created: 2020-04-02 (from EPrint's datestamp field)
- Updated: 2023-06-02 (from EPrint's last_modified field)