CaltechAUTHORS
  A Caltech Library Service

A generalized convergence theorem for neural networks

Bruck, Jehoshua and Goodman, Joseph W. (1988) A generalized convergence theorem for neural networks. IEEE Transactions on Information Theory, 34 (5). pp. 1089-1092. ISSN 0018-9448. http://resolver.caltech.edu/CaltechAUTHORS:BRUieeetit88


Abstract

A neural network model is presented in which each neuron performs a threshold logic function. The model always converges to a stable state when operating in a serial mode and to a cycle of length at most 2 when operating in a fully parallel mode. This property is the basis for the potential applications of the model, such as associative memory devices and combinatorial optimization. The two convergence theorems (for serial and fully parallel modes of operation) are reviewed, and a general convergence theorem is presented that unifies the two known cases. New relations between the neural network model and the problem of finding a minimum cut in a graph are obtained.
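The serial-mode convergence property described in the abstract can be illustrated with a small sketch. This is not the paper's code; it assumes the standard Hopfield-style setting the correspondence builds on (threshold-logic neurons with states in {-1, +1}, a symmetric weight matrix with nonnegative diagonal, one neuron updated at a time), and the function name `serial_converge` is illustrative:

```python
import numpy as np

def serial_converge(W, theta, x, max_sweeps=100):
    """Serial (asynchronous) threshold updates until a stable state.

    W: symmetric weight matrix with nonnegative diagonal (the condition
    under which serial operation is known to converge), theta: threshold
    vector, x: initial state vector with entries in {-1, +1}.
    """
    n = len(x)
    for _ in range(max_sweeps):
        changed = False
        for i in range(n):  # serial mode: neurons updated one at a time
            new = 1 if W[i] @ x >= theta[i] else -1
            if new != x[i]:
                x[i] = new
                changed = True
        if not changed:      # a full sweep with no change: stable state
            return x
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
W = A + A.T                              # symmetric weights
np.fill_diagonal(W, np.abs(np.diag(W)))  # nonnegative diagonal
theta = np.zeros(5)
x0 = rng.choice([-1, 1], size=5)
stable = serial_converge(W, theta, x0.copy())
```

Because each state change strictly decreases a bounded energy function and the state space is finite, the serial sweeps above must terminate at a fixed point; rerunning `serial_converge` on the result leaves it unchanged.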


Item Type: Article
Additional Information: © Copyright 1988 IEEE. Reprinted with permission. Manuscript received June 1987; revised November 1987. This work was supported in part by the Rothschild Foundation and by the U.S. Air Force Office of Scientific Research. This correspondence was presented in part at the IEEE First International Conference on Neural Networks, San Diego, CA, June 1987.
Record Number: CaltechAUTHORS:BRUieeetit88
Persistent URL: http://resolver.caltech.edu/CaltechAUTHORS:BRUieeetit88
Alternative URL: http://dx.doi.org/10.1109/18.21239
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 5705
Collection: CaltechAUTHORS
Deposited By: Archive Administrator
Deposited On: 29 Oct 2006
Last Modified: 26 Dec 2012 09:14
