A palimpsest memory based on an incremental Bayesian learning rule
2000 (English). In: Neurocomputing, ISSN 0925-2312, Vol. 32, pp. 987-994. Article in journal (Refereed). Published.
Capacity-limited memory systems need to gradually forget old information in order to avoid catastrophic forgetting, in which all stored information is lost. This can be achieved by allowing new information to overwrite old, as in a so-called palimpsest memory. This paper describes a new such learning rule, employed in an attractor neural network. The network does not exhibit catastrophic forgetting, has a capacity that depends on the learning time constant, and exhibits recency effects in retrieval.
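The palimpsest behavior described in the abstract can be illustrated with a minimal sketch. This is not the paper's incremental Bayesian (BCPNN) rule; it is a simpler assumed stand-in, an exponentially decaying Hebbian update in a Hopfield-style attractor network, chosen only to show the qualitative effects the abstract names: no catastrophic forgetting, capacity set by the learning time constant, and recency effects in retrieval. All names and parameter values below are illustrative assumptions.

```python
import numpy as np

# Illustrative palimpsest memory (NOT the paper's Bayesian rule):
# an exponential-decay Hebbian update in a Hopfield-style network.
rng = np.random.default_rng(0)
N = 200          # number of units (assumed)
LAMBDA = 0.1     # learning rate; forgetting time constant ~ 1/LAMBDA patterns

def learn(W, xi, lam=LAMBDA):
    """Incremental update: each new pattern partially overwrites old traces."""
    W = (1.0 - lam) * W + lam * np.outer(xi, xi) / N
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, cue, steps=10):
    """Synchronous attractor dynamics from a noisy cue."""
    x = cue.copy()
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1.0
    return x

def noisy(xi, flips=20):
    """Corrupt a pattern by flipping a few units."""
    x = xi.copy()
    idx = rng.choice(N, size=flips, replace=False)
    x[idx] *= -1
    return x

# Store 50 random +/-1 patterns sequentially.
patterns = [rng.choice([-1.0, 1.0], size=N) for _ in range(50)]
W = np.zeros((N, N))
for xi in patterns:
    W = learn(W, xi)

overlap = lambda a, b: float(a @ b) / N
m_recent = overlap(patterns[-1], recall(W, noisy(patterns[-1])))
m_oldest = overlap(patterns[0], recall(W, noisy(patterns[0])))
print(f"recent overlap: {m_recent:.2f}  oldest overlap: {m_oldest:.2f}")
```

With this rule the most recent patterns are retrieved almost perfectly, while a pattern stored 50 steps earlier has decayed by a factor of roughly (1 - LAMBDA)^49 and is no longer an attractor: gradual, graceful forgetting rather than a catastrophic collapse of all stored memories.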
Place, publisher, year, edition, pages
2000. Vol. 32, pp. 987-994.
Keywords: Bayesian Confidence Propagation, palimpsest memory, associative memory, networks, model
National Category: Bioinformatics (Computational Biology)
Identifiers
URN: urn:nbn:se:kth:diva-19860
DOI: 10.1016/S0925-2312(00)00270-8
ISI: 000087897800131
OAI: oai:DiVA.org:kth-19860
DiVA: diva2:338552
Conference: The 8th Annual Computational Neuroscience Meeting (CNS'99); Pittsburgh, PA, USA; 18 July 1999 through 22 July 1999
Available from: 2010-08-10 Created: 2010-08-10 Last updated: 2011-09-29 Bibliographically approved