Implementing Plastic Weights in Neural Networks using Low Precision Arithmetic
KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
2009 (English). In: Neurocomputing, ISSN 0925-2312, Vol. 72, no. 4-6, pp. 968-972. Article in journal (Refereed). Published
Abstract [en]

In this letter, we develop a fixed-point arithmetic, low precision, implementation of an exponentially weighted moving average (EWMA) that is used in a neural network with plastic weights. We analyze the proposed design both analytically and experimentally, and we also evaluate its performance in the application of an attractor neural network. The EWMA in the proposed design has a constant relative truncation error, which is important for avoiding round-off errors in applications with slowly decaying processes, e.g. connectionist networks. We conclude that the proposed design offers greatly improved memory and computational efficiency compared to a naive implementation of the EWMA's difference equation, and that it is well suited for implementation in digital hardware.
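The naive implementation of the EWMA difference equation that the abstract compares against can be sketched as follows. This is an illustrative Python sketch, not the design proposed in the letter; the bit width `B`, the fixed-point decay `lam_fp`, and the helper `ewma_step` are assumptions chosen to make the round-off problem visible.

```python
# Naive fixed-point implementation of the EWMA difference equation
#   y[n] = y[n-1] + lam * (x[n] - y[n-1]),
# the baseline the abstract compares against. B fractional bits and
# lam_fp are assumed parameters for illustration.
B = 8                  # fractional bits (assumption)
ONE = 1 << B           # fixed-point representation of 1.0
lam_fp = 13            # lam ~= 13/256 ~= 0.05 (assumption)

def ewma_step(y_fp: int, x_fp: int) -> int:
    """One naive EWMA step on values scaled by 2**B."""
    return y_fp + ((x_fp - y_fp) * lam_fp >> B)

y = 0
for _ in range(200):
    y = ewma_step(y, ONE)  # drive with a constant input of 1.0

# The integrator stalls below 1.0: once (x - y) * lam_fp < 2**B the
# update truncates to zero -- the round-off problem for slowly
# decaying processes that the letter's design avoids.
print(y / ONE)
```

With these parameters the output stalls short of the true fixed point, which illustrates why a design with constant relative truncation error matters for slowly decaying processes.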

Place, publisher, year, edition, pages
2009. Vol. 72, no. 4-6, pp. 968-972
Keyword [en]
Exponentially weighted moving average, Fixed-point arithmetic, Leaky integrator, Low precision variables, Neural networks, Plastic weights
National Category
Computer Science
URN: urn:nbn:se:kth:diva-6240. DOI: 10.1016/j.neucom.2008.04.007. ISI: 000263372000030. Scopus ID: 2-s2.0-58149468833. OAI: diva2:10892
Presented at the 2nd International Work-Conference on the Interplay Between Natural and Artificial Computation, La Manga del Mar Menor, Spain, June 18-21, 2007

QC 20100917. Updated from Submitted to Published (20100917)

QC 20150915

Available from: 2006-10-09. Created: 2006-10-09. Last updated: 2015-09-15. Bibliographically approved
In thesis
1. An Attractor Memory Model of Neocortex
2006 (English). Doctoral thesis, comprehensive summary (Other scientific)
Abstract [en]

This thesis presents an abstract model of the mammalian neocortex. The model was constructed by taking a top-down view on the cortex, where it is assumed that cortex to a first approximation works as a system with attractor dynamics. The model deals with the processing of static inputs from the perspectives of biological mapping, algorithmic formulation, and physical implementation, but it does not consider the temporal aspects of these inputs. The purpose of the model is twofold: Firstly, it is an abstract model of the cortex and as such it can be used to evaluate hypotheses about cortical function and structure. Secondly, it forms the basis of a general information processing system that may be implemented in computers. The characteristics of this model are studied both analytically and by simulation experiments, and we also discuss its parallel implementation on cluster computers as well as in digital hardware.

The basic design of the model is based on a thorough literature study of the mammalian cortex’s anatomy and physiology. We review both the layered and columnar structure of cortex and also the long- and short-range connectivity between neurons. Characteristics of cortex that define its computational complexity, such as the time-scales of cellular processes that transport ions in and out of neurons and give rise to electric signals, are also investigated. In particular we study the size of cortex in terms of neuron and synapse numbers in five mammals: mouse, rat, cat, macaque, and human. The cortical model is implemented with a connectionist type of network where the functional units correspond to cortical minicolumns, and these are in turn grouped into hypercolumn modules. The learning rules used in the model are local in space and time, which makes them biologically plausible and also allows for efficient parallel implementation. We study the implemented model both as a single- and a multi-layered network. Instances of the model with sizes up to that of a rat-cortex equivalent are implemented and run on cluster computers in 23% of real time. We demonstrate on tasks involving image data that the cortical model can be used for meaningful computations such as noise reduction, pattern completion, prototype extraction, hierarchical clustering, classification, and content-addressable memory, and we show that even the largest cortex-equivalent instances of the model can perform these types of computations. Important characteristics of the model are that it is insensitive to limited errors in the computational hardware and to noise in the input data. Furthermore, it can learn from examples and is self-organizing to some extent. The proposed model contributes to the quest of understanding the cortex, and it is also a first step towards a brain-inspired computing system that can be implemented in the molecular-scale computers of tomorrow.
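The pattern-completion behaviour of an attractor network, as described above, can be illustrated with a minimal Hopfield-style autoassociative memory. This is a generic sketch with assumed network size, pattern count, and noise level; it is not the BCPNN hypercolumn model of the thesis.

```python
import numpy as np

# Minimal Hopfield-style autoassociative memory illustrating attractor
# dynamics and pattern completion (generic sketch, assumed parameters).
rng = np.random.default_rng(0)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))    # stored memories

# Hebbian learning rule: local in the sense that each weight depends
# only on the activity of the two units it connects.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

# Pattern completion: start from a corrupted cue and iterate to a
# fixed point (attractor) of the network dynamics.
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1                                 # corrupt 10% of the bits

state = cue
for _ in range(10):
    state = np.sign(W @ state)
    state[state == 0] = 1                       # break ties consistently

overlap = (state @ patterns[0]) / N             # 1.0 means perfect recall
print(overlap)
```

With the low memory load used here (3 patterns in 100 units), the dynamics typically restore the corrupted cue to the stored pattern, which is the content-addressable-memory behaviour the abstract refers to.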

The main contributions of this thesis are: (i) A review of the size, modularization, and computational structure of the mammalian neocortex. (ii) An abstract generic connectionist network model of the mammalian cortex. (iii) A framework for a brain-inspired self-organizing information processing system. (iv) Theoretical work on the properties of the model when used as an autoassociative memory. (v) Theoretical insights on the anatomy and physiology of the cortex. (vi) Efficient implementation techniques and simulations of cortical sized instances. (vii) A fixed-point arithmetic implementation of the model that can be used in digital hardware.

Place, publisher, year, edition, pages
Stockholm: KTH, 2006. ix, 148 p.
Series
Trita-CSC-A, ISSN 1653-5723 ; 2006:14
Keyword [en]
Attractor Neural Networks, Cerebral Cortex, Neocortex, Brain Like Computing, Hypercolumns, Minicolumns, BCPNN, Parallel Computers, Autoassociative Memory
National Category
Computer Science
URN: urn:nbn:se:kth:diva-4136. ISBN: 91-7178-461-6
Public defence
2006-10-26, F2, Lindstedtsvägen 28, Stockholm, 10:15
QC 20100903. Available from: 2006-10-09. Created: 2006-10-09. Last updated: 2010-09-03. Bibliographically approved

Open Access in DiVA

No full text

Other links

Publisher's full text; Scopus

By author/editor
Johansson, Christopher; Lansner, Anders
By organisation
Numerical Analysis and Computer Science, NADA; Computational Biology, CB
