CaltechAUTHORS

Analysis Of Momentum Methods

Kovachki, Nikola B. and Stuart, Andrew M. (2019) Analysis Of Momentum Methods. (Unpublished) https://resolver.caltech.edu/CaltechAUTHORS:20190722-102107649

PDF - Submitted Version (734 kB). See Usage Policy.


Abstract

Gradient descent-based optimization methods underpin the parameter training that produces the impressive results now seen when testing neural networks. Introducing stochasticity is key to their success in practical problems, and there is some understanding of the role of stochastic gradient descent in this context. Momentum modifications of gradient descent, such as Polyak's Heavy Ball method (HB) and Nesterov's method of accelerated gradients (NAG), are widely adopted. In this work, our focus is on understanding the role of momentum in the training of neural networks, concentrating on the common situation in which the momentum contribution is fixed at each step of the algorithm; to expose the ideas simply, we work in the deterministic setting. We show that, contrary to popular belief, standard implementations of fixed momentum methods do no more than act to rescale the learning rate. We achieve this by showing that the momentum method converges to a gradient flow, with a momentum-dependent time-rescaling, using the method of modified equations from numerical analysis. Further, we show that the momentum method admits an exponentially attractive invariant manifold on which the dynamics reduce to a gradient flow with respect to a modified loss function, equal to the original one plus a small perturbation.
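The rescaling claim can be sketched numerically. The following is a minimal illustration on a quadratic test problem of our own choosing (the matrix, step size, and momentum value are illustrative, not taken from the paper): Heavy Ball with fixed momentum λ and step h should track plain gradient descent run with the rescaled step h/(1 − λ), since both discretize the same time-rescaled gradient flow.

```python
import numpy as np

def heavy_ball(grad, x0, h, lam, n_steps):
    """Polyak's Heavy Ball: x_{k+1} = x_k - h*grad(x_k) + lam*(x_k - x_{k-1})."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(n_steps):
        # Right-hand side uses the old (x_prev, x) pair before reassignment.
        x_prev, x = x, x - h * grad(x) + lam * (x - x_prev)
    return x

def gradient_descent(grad, x0, h, n_steps):
    """Plain gradient descent with fixed step h."""
    x = x0.copy()
    for _ in range(n_steps):
        x = x - h * grad(x)
    return x

# Quadratic loss f(x) = 0.5 * x^T A x, so grad f(x) = A x (illustrative problem).
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x0 = np.array([1.0, 1.0])

h, lam, n = 1e-3, 0.5, 1000
x_hb = heavy_ball(grad, x0, h, lam, n)
# Gradient descent with the momentum-rescaled learning rate h / (1 - lam).
x_gd = gradient_descent(grad, x0, h / (1.0 - lam), n)
print(np.linalg.norm(x_hb - x_gd))  # small: both track the same rescaled flow
```

For small h the two final iterates nearly coincide, consistent with the claim that fixed momentum acts as a rescaling of the learning rate by 1/(1 − λ).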


Item Type: Report or Paper (Discussion Paper)
Related URLs: http://arxiv.org/abs/1906.04285 (arXiv, Discussion Paper)
ORCID: Kovachki, Nikola B.: 0000-0002-3650-2972
Additional Information: Both authors are supported, in part, by the US National Science Foundation (NSF) grant DMS-1818977, the US Office of Naval Research (ONR) grant N00014-17-1-2079, and the US Army Research Office (ARO) grant W911NF-12-2-0022.
Funders:
NSF: DMS-1818977
Office of Naval Research (ONR): N00014-17-1-2079
Army Research Office (ARO): W911NF-12-2-0022
Subject Keywords: Optimization, Machine Learning, Deep Learning, Gradient Flows, Momentum Methods, Modified Equation, Invariant Manifold
Record Number: CaltechAUTHORS:20190722-102107649
Persistent URL: https://resolver.caltech.edu/CaltechAUTHORS:20190722-102107649
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 97317
Collection: CaltechAUTHORS
Deposited By: Tony Diaz
Deposited On: 22 Jul 2019 17:26
Last Modified: 09 Mar 2020 13:18
