Published April 1993 | Published Version
Book Section / Chapter | Open Access

Global descent replaces gradient descent to avoid local minima problem in learning with artificial neural networks

  • Jet Propulsion Lab

Abstract

One of the fundamental limitations of artificial neural network learning by gradient descent is its susceptibility to local minima during training. A new approach to learning is presented in which the gradient descent rule of the backpropagation learning algorithm is replaced with a novel global descent formalism. This methodology is based on a global optimization scheme, known by the acronym TRUST (terminal repeller unconstrained subenergy tunneling), which formulates optimization in terms of the flow of a special deterministic dynamical system. The ability of the new dynamical system to overcome local minima is tested on common benchmark examples and on a pattern recognition example. The results demonstrate that the new method does indeed escape encountered local minima, and thus finds the global minimum solution to the problems considered.
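The paper develops the TRUST formalism in full for the multidimensional weight spaces of backpropagation; the Python sketch below is only a rough one-dimensional illustration of the two mechanisms the abstract names: a subenergy transform that flattens the landscape wherever the objective sits above the best value found so far, and a terminal repeller that drives the state away from the last local minimum so the flow tunnels into the next basin. The particular functional forms, the gains rho and a, the step size, the stall test, and the single left-to-right sweep are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def trust_sweep(f, grad_f, lo, hi, rho=2.0, a=5.0, lr=1e-3, steps=200_000):
    """Illustrative 1-D, left-to-right sweep of a TRUST-style global descent.

    Two regimes, switched by the sign of d = f(x) - f(x_star):
      d <  0: ordinary gradient descent on f (we are below the best
              level found so far, so simply descend).
      d >= 0: descend a "virtual" energy whose subenergy term
              log(1 / (1 + exp(-(d + a)))) flattens the landscape above
              f(x_star), while a terminal-repeller term
              -(3/4) * rho * |x - x_star|**(4/3) pushes the state away
              from the last minimum, tunneling it toward the next basin.
    """
    x = float(lo)
    x_star, f_star = x, f(x)
    for _ in range(steps):
        d = f(x) - f_star
        if d < 0.0:
            g = grad_f(x)
            x -= lr * g
            if abs(g) < 1e-5:                 # stalled at a new, lower minimum
                x_star, f_star = x, f(x)
        else:
            # Flattened ("subenergy") gradient: vanishes as d grows.
            sub = grad_f(x) / (1.0 + np.exp(min(d + a, 50.0)))
            # Terminal repeller (non-Lipschitz cube root, so the state
            # leaves x_star in finite time), oriented rightward here.
            rep = rho * np.cbrt(abs(x - x_star) + 1e-9)
            x += lr * (rep - sub)
        if x >= hi:                           # sweep finished
            break
        x = max(x, lo)
    return x_star, f_star

# Example on a hypothetical test function with many local minima
# (global minimum at x = 0); not one of the paper's benchmarks.
f  = lambda x: x**2 + 10.0 * (1.0 - np.cos(2.0 * np.pi * x))
df = lambda x: 2.0 * x + 20.0 * np.pi * np.sin(2.0 * np.pi * x)
x_star, f_star = trust_sweep(f, df, lo=-6.0, hi=6.0)
```

The sketch sweeps the interval once; the repeller carries the state past every region lying above f(x_star), while plain descent takes over in any lower basin it enters, so f_star can only ratchet downward. The paper's actual scheme is a deterministic dynamical system generalized to many dimensions and repeated sweeps.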

Additional Information

© 1993 IEEE.

Files

Published - 00298667.pdf (588.5 kB)
md5:41833e2ba37a499f42df6b1b51b44f3c

Additional details

Identifiers

Eprint ID: 96317
Resolver ID: CaltechAUTHORS:20190612-101006020

Dates

Created: 2019-06-12 (from EPrints datestamp field)
Updated: 2021-11-16 (from EPrints last_modified field)