CaltechAUTHORS: A Caltech Library Service

H^∞ Optimal Training Algorithms and their Relation to Backpropagation

Hassibi, Babak and Kailath, Thomas (1995) H^∞ Optimal Training Algorithms and their Relation to Backpropagation. In: Advances in Neural Information Processing Systems 7. MIT Press , Cambridge, MA, pp. 191-198. ISBN 0262201046.

PDF - Published Version (see Usage Policy)
PDF - Submitted Version (see Usage Policy)


We derive global H^∞ optimal training algorithms for neural networks. These algorithms guarantee the smallest possible prediction error energy over all possible disturbances of fixed energy, and are therefore robust with respect to model uncertainties and lack of statistical information on the exogenous signals. The ensuing estimators are infinite-dimensional, in the sense that updating the weight vector estimate requires knowledge of all previous weight estimates. A certain finite-dimensional approximation to these estimators is the backpropagation algorithm. This explains the local H^∞ optimality of backpropagation that has been previously demonstrated.
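For a model that is linear in the weights, the finite-dimensional approximation referred to above reduces to the familiar instantaneous-gradient (LMS/backpropagation-style) update. The following sketch is illustrative only and is not taken from the paper; the data, dimensions, and learning rate `mu` are assumptions chosen for a minimal, noise-free example.

```python
import numpy as np

# Illustrative sketch (assumed, not from the paper): the instantaneous-gradient
# update  w_{i+1} = w_i + mu * x_i * (d_i - x_i^T w_i),  i.e. the LMS /
# backpropagation rule for a linear-in-the-weights model. The paper identifies
# this rule as a finite-dimensional approximation to the H-infinity optimal
# estimator, which would otherwise depend on all previous weight estimates.

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])   # unknown weight vector to be learned
w = np.zeros(2)                  # initial weight estimate
mu = 0.1                         # learning rate (step size)

errors = []
for _ in range(200):
    x = rng.standard_normal(2)   # input at this step
    d = x @ w_true               # noise-free target for this sketch
    e = d - x @ w                # prediction error
    w = w + mu * e * x           # instantaneous-gradient (backprop/LMS) update
    errors.append(abs(e))
```

Because the update at each step uses only the current estimate `w` and the current data pair `(x, d)`, it is finite-dimensional in exactly the sense the abstract contrasts with the full H^∞ optimal estimator.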

Item Type: Book Section
Additional Information: © 1995 Massachusetts Institute of Technology. This work was supported in part by the Air Force Office of Scientific Research, Air Force Systems Command under Contract AFOSR91-0060 and by the Army Research Office under contract DAAL03-89-K-0109.
Funding Agency and Grant Number:
Air Force Office of Scientific Research (AFOSR), Air Force Systems Command: AFOSR91-0060
Army Research Office: DAAL03-89-K-0109
Record Number: CaltechAUTHORS:20150218-074822311
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 54919
Deposited By: Shirley Slattery
Deposited On: 04 Mar 2015 19:13
Last Modified: 03 Oct 2019 08:01
