A Caltech Library Service

FiniteNet: A Fully Convolutional LSTM Network Architecture for Time-Dependent Partial Differential Equations

Stevens, Ben and Colonius, Tim (2020) FiniteNet: A Fully Convolutional LSTM Network Architecture for Time-Dependent Partial Differential Equations. (Unpublished)

PDF (Submitted Version)


In this work, we present a machine learning approach for reducing the error when numerically solving time-dependent partial differential equations (PDEs). We use a fully convolutional LSTM network to exploit the spatiotemporal dynamics of PDEs. The neural network serves to enhance the finite-difference and finite-volume methods (FDM/FVM) that are commonly used to solve PDEs, allowing us to maintain guarantees on the order of convergence of our method. We train the network on simulation data and show that it can reduce the error by a factor of 2 to 3 relative to the baseline algorithms. We demonstrate our method on three PDEs that each feature qualitatively different dynamics: the linear advection equation, which propagates its initial conditions at a constant speed; the inviscid Burgers' equation, which develops shockwaves; and the Kuramoto-Sivashinsky (KS) equation, which is chaotic.
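To make the setup concrete, the sketch below shows the kind of baseline solver the abstract says the network enhances: a first-order upwind finite-difference step for the linear advection equation on a periodic domain, with a hook where a learned correction could be added. The `correction` argument and `corrected_step` helper are hypothetical illustrations, not the paper's actual architecture or API; the FiniteNet network itself (a fully convolutional LSTM) is not reproduced here.

```python
import numpy as np

def upwind_advection_step(u, c, dt, dx):
    # First-order upwind update for u_t + c*u_x = 0 (c > 0), periodic BCs.
    return u - c * dt / dx * (u - np.roll(u, 1))

def corrected_step(u, c, dt, dx, correction=None):
    # Hypothetical hook: a FiniteNet-style model would add a small learned
    # term to the discretized update; passing None recovers the pure FDM step.
    u_next = upwind_advection_step(u, c, dt, dx)
    if correction is not None:
        u_next = u_next + correction(u)
    return u_next

# Advect a Gaussian pulse for one full period on [0, 1).
n, c = 200, 1.0
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx = x[1] - x[0]
dt = 0.5 * dx / c                      # CFL number 0.5 for stability
u0 = np.exp(-200.0 * (x - 0.5) ** 2)
u = u0.copy()
for _ in range(int(round(1.0 / dt))):  # 400 steps = one period
    u = corrected_step(u, c, dt, dx)

# Discrete mass is conserved by the periodic upwind scheme (up to round-off),
# while the pulse is smeared by first-order numerical diffusion — exactly the
# kind of error a learned correction aims to reduce.
print(abs(u.sum() - u0.sum()))
```

Because the scheme is monotone at this CFL number, the numerical solution stays bounded by the initial data; the trade-off is the heavy smearing that motivates augmenting such low-order methods with a learned model.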

Item Type: Report or Paper (Discussion Paper)
Stevens, Ben (ORCID: 0000-0002-3410-5922)
Colonius, Tim (ORCID: 0000-0003-0326-3909)
Additional Information: This material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant No. 1745301.
Funding Agency: NSF Graduate Research Fellowship; Grant Number: DGE-1745301
Record Number: CaltechAUTHORS:20211008-205456499
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 111296
Deposited By: George Porter
Deposited On: 08 Oct 2021 21:08
Last Modified: 02 Jun 2023 01:02
