Published 1994
Book Section - Chapter
Open Access
H∞ Optimality Criteria for LMS and Backpropagation
- Creators
- Hassibi, Babak
- Sayed, Ali H.
- Kailath, Thomas
Abstract
We have recently shown that the widely known LMS algorithm is an H∞ optimal estimator. The H∞ criterion has been introduced, initially in the control theory literature, as a means to ensure robust performance in the face of model uncertainties and lack of statistical information on the exogenous signals. We extend here our analysis to the nonlinear setting often encountered in neural networks, and show that the backpropagation algorithm is locally H∞ optimal. This fact provides a theoretical justification of the widely observed excellent robustness properties of the LMS and backpropagation algorithms. We further discuss some implications of these results.
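As a quick illustration of the estimator the abstract refers to, here is a minimal sketch of the LMS (least-mean-squares) update in Python. The step size `mu`, the synthetic data, and the function name are assumptions for illustration only; they are not taken from the paper.

```python
import numpy as np

def lms(X, d, mu=0.01):
    """Minimal LMS sketch: w <- w + mu * x * (d - x^T w) per sample.

    X : (n_samples, n_features) input vectors
    d : (n_samples,) desired outputs
    mu: step size (illustrative choice, not from the paper)
    """
    _, n_features = X.shape
    w = np.zeros(n_features)
    for x, target in zip(X, d):
        e = target - x @ w   # a priori prediction error
        w = w + mu * e * x   # instantaneous-gradient update
    return w

# Synthetic example: recover a known weight vector from noisy observations.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5, 0.25])
X = rng.standard_normal((5000, 3))
d = X @ w_true + 0.01 * rng.standard_normal(5000)
w_hat = lms(X, d, mu=0.05)
```

The paper's result concerns this same recursion: its worst-case (H∞) gain from disturbances to prediction errors is optimal, which is one way to explain the robustness observed in practice.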
Additional Information
© 1994 Morgan Kaufmann. This work was supported in part by the Air Force Office of Scientific Research, Air Force Systems Command, under Contract AFOSR91-0060, and in part by a grant from Rockwell International Inc.
Attached Files
- Published - Hoo_Optimality_Criteria_for_LMS_and_Backpropagation.pdf (1.7 MB)
Additional details
- Eprint ID
- 55444
- Resolver ID
- CaltechAUTHORS:20150302-170824137
- Funding
- Air Force Office of Scientific Research (AFOSR), Air Force Systems Command: AFOSR91-0060
- Rockwell International
- Created
- 2015-03-03 (from EPrint's datestamp field)
- Updated
- 2019-10-03 (from EPrint's last_modified field)