A Bayesian attractor network with incremental learning
2002 (English). In: Network, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 13, no. 2, pp. 179-194. Article in journal (Refereed). Published.
A real-time online learning system with capacity limits needs to gradually forget old information in order to avoid catastrophic forgetting. This can be achieved by letting new information overwrite old, as in a so-called palimpsest memory. This paper describes an incremental learning rule based on the Bayesian confidence propagation neural network (BCPNN) that has palimpsest properties when employed in an attractor neural network. The network does not suffer from catastrophic forgetting, has a capacity that depends on the learning time constant, and exhibits faster convergence for more recently stored patterns.
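The abstract does not spell the rule out, but the described behaviour can be illustrated with a minimal BCPNN-style sketch: unit and pairwise activation probabilities are estimated by exponentially decaying running averages (the decay rate `alpha` standing in for the inverse of the learning time constant), weights are log-odds ratios of joint versus independent probabilities, and recall runs simple attractor dynamics. All function names, parameters, and the top-k recall scheme here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np


def train_bcpnn(patterns, alpha=0.3, eps=1e-9):
    """Incremental BCPNN-style learning with palimpsest forgetting.

    Exponentially decaying running averages estimate P(x_i = 1) and
    P(x_i = 1, x_j = 1); old patterns fade as new ones are imprinted.
    alpha plays the role of the inverse learning time constant
    (an assumption about the parameterization, for illustration).
    """
    n = patterns.shape[1]
    p_i = np.full(n, 0.5)           # running estimate of unit probabilities
    p_ij = np.full((n, n), 0.25)    # running estimate of pairwise probabilities
    for x in patterns:
        p_i = (1 - alpha) * p_i + alpha * x
        p_ij = (1 - alpha) * p_ij + alpha * np.outer(x, x)
    # Bayesian weights: log-odds of joint vs. independent activation
    w = np.log((p_ij + eps) / (np.outer(p_i, p_i) + eps))
    np.fill_diagonal(w, 0.0)
    b = np.log(p_i + eps)           # bias from unit probabilities
    return w, b


def recall(w, b, cue, k, steps=10):
    """Synchronous attractor dynamics: at each step, keep the k units
    with the largest summed support active (fixed-activity recall)."""
    x = cue.copy()
    for _ in range(steps):
        s = b + w @ x
        x = np.zeros_like(x)
        x[np.argsort(s)[-k:]] = 1.0
    return x
```

With non-overlapping sparse patterns, the most recently stored pattern is a stable attractor, and because the running averages decay exponentially, the oldest patterns are gradually overwritten rather than destroying the whole memory at once.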
Keywords: associative memory, neural networks, visual cortex, hippocampal slices, potentiation, modulation, neurons, storage, specificity, palimpsests
Bioinformatics (Computational Biology)
Identifiers
URN: urn:nbn:se:kth:diva-21564
DOI: 10.1088/0954-898X/13/2/302
ISI: 000175773700002
OAI: oai:DiVA.org:kth-21564
DiVA: diva2:340262
QC 20100525. Available from: 2010-08-10. Created: 2010-08-10. Last updated: 2011-09-29. Bibliographically approved.