Jimenez Rodriguez, Ivan Dario and Ames, Aaron D. and Yue, Yisong (2022) LyaNet: A Lyapunov Framework for Training Neural ODEs. . (Unpublished) https://resolver.caltech.edu/CaltechAUTHORS:20220224-200943137
PDF - Submitted Version (1MB). See Usage Policy.
Use this Persistent URL to link to this item: https://resolver.caltech.edu/CaltechAUTHORS:20220224-200943137
Abstract
We propose a method for training ordinary differential equations by using a control-theoretic Lyapunov condition for stability. Our approach, called LyaNet, is based on a novel Lyapunov loss formulation that encourages the inference dynamics to converge quickly to the correct prediction. Theoretically, we show that minimizing Lyapunov loss guarantees exponential convergence to the correct solution and enables a novel robustness guarantee. We also provide practical algorithms, including one that avoids the cost of backpropagating through a solver or using the adjoint method. Relative to standard Neural ODE training, we empirically find that LyaNet can offer improved prediction performance, faster convergence of inference dynamics, and improved adversarial robustness. Our code is available at https://github.com/ivandariojr/LyapunovLearning.
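To make the core idea concrete: a Lyapunov condition for exponential stability asks that a potential V decay along the dynamics, i.e. dV/dt + κ·V ≤ 0 where dV/dt = ⟨∇V(x), f(x)⟩. A loss in this spirit can penalize violations of that inequality over sampled states. The sketch below is an illustrative toy (NumPy, analytic gradients, a quadratic V, and hypothetical function names), not the paper's exact estimator or architecture:

```python
import numpy as np

def lyapunov_loss(xs, f, V, gradV, kappa=1.0):
    """Mean hinge penalty on violations of the exponential-stability
    condition  dV/dt + kappa * V <= 0  along the dynamics f, where
    dV/dt = <grad V(x), f(x)>.  Zero loss means the condition holds
    at every sampled state; positive loss measures the violation."""
    violations = [max(0.0, gradV(x) @ f(x) + kappa * V(x)) for x in xs]
    return float(np.mean(violations))

# Toy potential: V(x) = ||x||^2, so grad V(x) = 2x.
V = lambda x: float(x @ x)
gradV = lambda x: 2.0 * x

stable_f = lambda x: -x    # dV/dt = -2||x||^2: condition holds for kappa=1
unstable_f = lambda x: x   # dV/dt = +2||x||^2: condition violated

rng = np.random.default_rng(0)
xs = rng.standard_normal((32, 3))

loss_stable = lyapunov_loss(xs, stable_f, V, gradV)      # 0.0
loss_unstable = lyapunov_loss(xs, unstable_f, V, gradV)  # > 0
```

In LyaNet the dynamics f would be a learned neural ODE and V a potential tied to the prediction target (e.g. a supervised loss on the readout), so that driving the Lyapunov loss to zero certifies exponential convergence of inference to the correct prediction; the paper's actual formulation and sampling scheme live in the linked repository.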
| Item Type: | Report or Paper (Discussion Paper) |
|---|---|
| Record Number: | CaltechAUTHORS:20220224-200943137 |
| Persistent URL: | https://resolver.caltech.edu/CaltechAUTHORS:20220224-200943137 |
| Usage Policy: | No commercial reproduction, distribution, display or performance rights in this work are provided. |
| ID Code: | 113606 |
| Collection: | CaltechAUTHORS |
| Deposited By: | George Porter |
| Deposited On: | 25 Feb 2022 00:25 |
| Last Modified: | 25 Feb 2022 00:25 |