Tank, D. W. and Hopfield, J. J. (1987) Neural Computation by Concentrating Information in Time. Proceedings of the National Academy of Sciences of the United States of America, 84 (7). pp. 1896-1900. ISSN 0027-8424 http://resolver.caltech.edu/CaltechAUTHORS:TANpnas87
An analog model neural network that can solve a general problem of recognizing patterns in a time-dependent signal is presented. The network uses a patterned set of delays to collectively focus stimulus-sequence information onto a neural state at a future time. The computational capabilities of the circuit are demonstrated on tasks somewhat similar to those required for recognizing words in a continuous stream of speech. The network architecture can be understood by considering an energy function that is minimized as the circuit computes. Neurobiological mechanisms are known for generating the appropriate delays.
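The core idea of concentrating information in time can be illustrated with a minimal sketch (not the authors' analog circuit): each symbol of a target pattern is routed through a delay matched to its position in the pattern, so that all evidence for the pattern arrives at a recognition unit simultaneously. The function name and threshold rule below are illustrative assumptions.

```python
def recognize(stream, pattern):
    """Return the times at which `pattern` completes within `stream`.

    A symbol seen at time t that matches position i of the pattern is
    delayed by (len(pattern) - 1 - i) steps, so the delayed signals for
    a genuine occurrence all coincide at the time the pattern ends.
    """
    n = len(pattern)
    evidence = {}  # arrival time -> count of coincident delayed signals
    for t, symbol in enumerate(stream):
        for i, target in enumerate(pattern):
            if symbol == target:
                arrival = t + (n - 1 - i)  # patterned delay for position i
                evidence[arrival] = evidence.get(arrival, 0) + 1
    # the recognition unit "fires" only where all n delayed signals coincide
    return [t for t, count in sorted(evidence.items()) if count >= n]

print(recognize("xabcy", "abc"))  # evidence for 'a', 'b', 'c' converges at t=3
```

Out-of-order symbols ("cba") produce delayed signals that arrive at different times, so no single instant accumulates enough coincident evidence to cross threshold — the delay pattern itself enforces sequence order.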
Additional Information: Copyright © 1987 by the National Academy of Sciences. Contributed by J. J. Hopfield, November 25, 1986. We thank S. E. Levinson, O. Ghitza, and F. Jelinek for helpful discussions on automatic speech recognition systems. The work at California Institute of Technology was supported by National Science Foundation Grant PCM-8406049. The publication costs of this article were defrayed in part by page charge payment. This article must therefore be hereby marked "advertisement" in accordance with 18 U.S.C. §1734 solely to indicate this fact.
Subject Keywords: neural network; connectionist; speech recognition; parallel processing
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
Deposited By: Archive Administrator
Deposited On: 01 May 2008
Last Modified: 26 Dec 2012 09:59