
Minimizing memory loss in learning a new environment

Al-Mashouq, Khalid and Abu-Mostafa, Yaser and Al-Ghoneim, Khaled (2001) Minimizing memory loss in learning a new environment. Neurocomputing, 38-40, pp. 1051-1057. ISSN 0925-2312. doi:10.1016/s0925-2312(01)00400-3.

Full text is not posted in this repository. Consult Related URLs below.


Humans and other living species can learn new concepts without losing the old ones. Artificial neural networks, on the other hand, tend to "forget" old concepts. In this paper, we present three methods to minimize the loss of old information. These methods are analyzed and compared for the linear model. In particular, a method called network sampling is shown to be optimal under certain conditions on the sampled data distribution. We also show how to apply these methods to nonlinear models.
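Since the full text is not posted here, the abstract's idea can only be illustrated with a toy example. The sketch below uses a pseudo-rehearsal-style stand-in for the network-sampling idea in a linear model: the old network is sampled on fresh inputs to regenerate surrogate old data, which is then mixed with the new data before refitting. The variable names, sample sizes, and the exact procedure are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5  # input dimension

# Old environment: a noiseless linear target y = A x, learned by least squares.
A = rng.normal(size=d)
X_old = rng.normal(size=(200, d))
y_old = X_old @ A
w_old, *_ = np.linalg.lstsq(X_old, y_old, rcond=None)  # recovers A (noiseless)

# New environment: a different linear target.
B = rng.normal(size=d)
X_new = rng.normal(size=(50, d))
y_new = X_new @ B

# Naive retraining on new data only: the old mapping is forgotten entirely.
w_naive, *_ = np.linalg.lstsq(X_new, y_new, rcond=None)

# Sampling-style mitigation (assumed, pseudo-rehearsal flavor): query the OLD
# network on fresh random inputs to create surrogate old data, then fit the
# model on the surrogate-old plus new data combined.
X_samp = rng.normal(size=(200, d))
y_samp = X_samp @ w_old  # labels come from the old network itself
X_mix = np.vstack([X_samp, X_new])
y_mix = np.concatenate([y_samp, y_new])
w_mix, *_ = np.linalg.lstsq(X_mix, y_mix, rcond=None)

def old_task_error(w):
    """Mean squared error of weights w on the original environment's data."""
    return float(np.mean((X_old @ w - y_old) ** 2))

print("old-task error, naive retraining:", old_task_error(w_naive))
print("old-task error, mixed retraining:", old_task_error(w_mix))
```

In this toy setting the mixed fit is a compromise between the two environments, so its error on the old data stays far below that of naive retraining, at the cost of some error on the new environment.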

Item Type: Article
Related URLs:
Additional Information: © 2001 Elsevier Science B.V. Available online 31 May 2001.
Subject Keywords: Memory loss; Catastrophic interference; Merging networks
Record Number: CaltechAUTHORS:20190702-153115049
Persistent URL:
Official Citation: Khalid Al-Mashouq, Yaser Abu-Mostafa, Khaled Al-Ghoneim, Minimizing memory loss in learning a new environment, Neurocomputing, Volumes 38–40, 2001, Pages 1051-1057, ISSN 0925-2312.
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 96897
Deposited By: Tony Diaz
Deposited On: 08 Jul 2019 16:49
Last Modified: 16 Nov 2021 17:24
