
LyaNet: A Lyapunov Framework for Training Neural ODEs

Jimenez Rodriguez, Ivan Dario and Ames, Aaron D. and Yue, Yisong (2022) LyaNet: A Lyapunov Framework for Training Neural ODEs. Proceedings of Machine Learning Research, pp. 18687-18703. ISSN 2640-3498.

PDF - Published Version. See Usage Policy.
PDF (ArXiv discussion paper) - Submitted Version. See Usage Policy.


We propose a method for training ordinary differential equations by using a control-theoretic Lyapunov condition for stability. Our approach, called LyaNet, is based on a novel Lyapunov loss formulation that encourages the inference dynamics to converge quickly to the correct prediction. Theoretically, we show that minimizing Lyapunov loss guarantees exponential convergence to the correct solution and enables a novel robustness guarantee. We also provide practical algorithms, including one that avoids the cost of backpropagating through a solver or using the adjoint method. Relative to standard Neural ODE training, we empirically find that LyaNet can offer improved prediction performance, faster convergence of inference dynamics, and improved adversarial robustness. Our code is available at
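The abstract's core idea is a loss that enforces a Lyapunov decrease condition on the inference dynamics. A minimal NumPy sketch of that idea follows; all names (`lyapunov_loss`, `kappa`, the choice of V as squared prediction error) are illustrative assumptions, not the paper's actual implementation, which trains a neural vector field and supervises V through the model's output layer:

```python
import numpy as np

def lyapunov_loss(x, y, f, V, grad_V, kappa=1.0):
    """Hedged sketch: penalize states where the Lyapunov candidate V
    fails to decay exponentially along the learned dynamics f.

    The Lyapunov condition is  dV/dt = <grad_V, f(x)>  <=  -kappa * V,
    so any positive excess  <grad_V, f(x)> + kappa * V  is a violation.

    x      : (n, d) batch of states sampled from inference trajectories
    y      : (n, d) targets (here V measures squared error to y)
    f      : callable, f(x) -> (n, d) vector field dx/dt
    V      : callable, V(x, y) -> (n,) Lyapunov candidate values
    grad_V : callable, grad_V(x, y) -> (n, d) gradient of V in x
    """
    vdot = np.sum(grad_V(x, y) * f(x), axis=1)           # dV/dt along f
    violation = np.maximum(0.0, vdot + kappa * V(x, y))  # hinge on condition
    return violation.mean()

# Toy check: for contracting dynamics f(x) = -(x - y) and
# V = 0.5 * ||x - y||^2, we get dV/dt = -2 * V, so the condition
# holds for any kappa <= 2 and the loss is exactly zero.
V = lambda x, y: 0.5 * np.sum((x - y) ** 2, axis=1)
grad_V = lambda x, y: x - y
y = np.zeros((4, 3))
x = np.random.randn(4, 3)
loss = lyapunov_loss(x, y, f=lambda x: -(x - y), V=V, grad_V=grad_V, kappa=1.0)
print(loss)  # 0.0 for these contracting dynamics
```

Driving this hinge loss to zero at sampled states is what underwrites the exponential-convergence claim: if the decrease condition holds everywhere along a trajectory, Grönwall's inequality gives V(t) <= V(0) * exp(-kappa * t).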

Item Type: Article
Related URLs: Paper; Code
ORCID:
Jimenez Rodriguez, Ivan Dario: 0000-0001-9065-5227
Ames, Aaron D.: 0000-0003-0848-3177
Yue, Yisong: 0000-0001-9127-1989
Additional Information: © 2022 by the author(s). We would like to thank Andrew Taylor, Jeremy Bernstein, Yujia Huang and Matt Levine for the insightful discussions. This project was funded in part by AeroVironment.
Record Number: CaltechAUTHORS:20220714-224552040
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 115590
Deposited By: George Porter
Deposited On: 15 Jul 2022 23:24
Last Modified: 15 Jul 2022 23:24
