LMS and backpropagation are minimax filters
 Creators
 Hassibi, Babak
 Sayed, Ali H.
 Kailath, Thomas
Abstract
An important problem that arises in many applications is the following adaptive problem: given a sequence of n × 1 input column vectors {h_i}, and a corresponding sequence of desired scalar responses {d_i}, find an estimate of an n × 1 column vector of weights w such that the sum of squared errors, ∑_{i=0}^{N} |d_i − h_i^T w|², is minimized. The {h_i, d_i} are most often presented sequentially, and one is therefore required to find an adaptive scheme that recursively updates the estimate of w. The least-mean-squares (LMS) algorithm was originally conceived as an approximate solution to the above adaptive problem. It recursively updates the estimate of the weight vector along the direction of the instantaneous gradient of the squared error [1]. The introduction of the LMS adaptive filter in 1960 came as a significant development for a broad range of engineering applications, since the LMS adaptive linear-estimation procedure requires essentially no advance knowledge of the signal statistics. The LMS, however, has long been thought to be an approximate minimizing solution to the above squared-error criterion, and a rigorous minimization criterion has been missing.
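As a sketch of the recursion the abstract describes, the following minimal NumPy implementation updates the weight estimate along the instantaneous gradient of the squared error e_i² = (d_i − h_i^T w)². The step size mu and the zero initialization are illustrative assumptions, not details from the paper.

```python
import numpy as np

def lms(H, d, mu=0.01):
    """Least-mean-squares sketch.

    H  : (N+1) x n array whose rows are the input vectors h_i
    d  : length-(N+1) array of desired scalar responses d_i
    mu : step size (assumed parameter; must be small enough for stability)
    """
    w = np.zeros(H.shape[1])        # illustrative zero initialization
    for h_i, d_i in zip(H, d):
        e_i = d_i - h_i @ w         # instantaneous prediction error
        w = w + mu * e_i * h_i      # step along the instantaneous gradient
                                    # of e_i^2 (factor of 2 folded into mu)
    return w
```

Note that each update uses only the current pair (h_i, d_i), which is why no advance knowledge of the signal statistics is needed.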
Additional Information
© 1994 Kluwer Academic.
Additional details
 Eprint ID
 55084
 DOI
 10.1007/978-1-4615-2696-4_12
 Resolver ID
 CaltechAUTHORS:20150223074318755
 Created

2015-02-28
Created from EPrint's datestamp field
 Updated

2021-11-10
Created from EPrint's last_modified field