
H∞ Optimality Criteria for LMS and Backpropagation

Hassibi, Babak and Sayed, Ali H. and Kailath, Thomas (1994) H∞ Optimality Criteria for LMS and Backpropagation. In: Advances in Neural Information Processing Systems 6 (NIPS 1993). Morgan Kaufmann, San Francisco, CA, pp. 351-358. ISBN 1558603220.

PDF - Published Version. See Usage Policy.


We have recently shown that the widely known LMS algorithm is an H∞-optimal estimator. The H∞ criterion was introduced, initially in the control theory literature, as a means of ensuring robust performance in the face of model uncertainties and a lack of statistical information about the exogenous signals. Here we extend our analysis to the nonlinear setting often encountered in neural networks, and show that the backpropagation algorithm is locally H∞ optimal. This fact provides a theoretical justification for the widely observed excellent robustness properties of the LMS and backpropagation algorithms. We further discuss some implications of these results.
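For readers unfamiliar with it, the LMS recursion the abstract refers to is the standard stochastic-gradient update w ← w + μ·e·x, where e = d − xᵀw is the a priori estimation error and μ is the step size. The following minimal Python sketch (an illustration, not code from the paper) shows the update; the data-generating setup and parameter values are our own choices:

```python
import numpy as np

def lms(X, d, mu=0.05):
    """Least-mean-squares adaptive filter (stochastic-gradient update).

    X  : (n_samples, n_features) array of regressor vectors x_i
    d  : (n_samples,) desired outputs d_i
    mu : step size (learning rate)
    Returns the final weight estimate w.
    """
    w = np.zeros(X.shape[1])
    for x_i, d_i in zip(X, d):
        e_i = d_i - x_i @ w       # a priori estimation error
        w = w + mu * e_i * x_i    # LMS weight update
    return w

# Noiseless linear data: LMS should converge toward the true weights.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
X = rng.standard_normal((5000, 3))
d = X @ w_true
w_hat = lms(X, d)
```

The paper's point is that this simple recursion is not merely a gradient heuristic: it minimizes the worst-case (H∞) ratio of estimation-error energy to disturbance energy, which accounts for its robustness when no statistical model of the disturbances is available.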

Item Type: Book Section
Additional Information: © 1994 Morgan Kaufmann. This work was supported in part by the Air Force Office of Scientific Research, Air Force Systems Command under Contract AFOSR91-0060 and in part by a grant from Rockwell International Inc.
Funding Agency / Grant Number:
Air Force Office of Scientific Research (AFOSR), Air Force Systems Command: AFOSR91-0060
Rockwell International: UNSPECIFIED
Record Number: CaltechAUTHORS:20150302-170824137
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 55444
Deposited By: Shirley Slattery
Deposited On: 03 Mar 2015 02:03
Last Modified: 03 Oct 2019 08:05
