Towards Cortex Isomorphic Attractor Neural Networks
KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
2004 (English) Licentiate thesis, comprehensive summary (Other scientific)
Abstract [en]

In this thesis we model the mammalian cerebral cortex with attractor neural networks and study the parallel implementations of these models.

First, we review the size, structure, and scaling laws of the cerebral cortex of five mammals: mouse, rat, cat, macaque, and human. Characteristics of the cortex such as time scales, activity rates, and connectivity are also investigated. Based on how the cortex is vertically structured and modularized, we propose a generic model of cortex. In this model we make the assumption that the cortex to a first approximation operates as a fixed-point attractor memory. We review the field of attractor neural networks and focus on a special type called Potts neural networks.
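The fixed-point attractor memory idea can be illustrated with a minimal Hopfield-style network (a sketch for illustration, not the thesis's model): patterns are stored with a Hebbian outer-product rule and recalled by iterating updates until the state settles in a fixed point.

```python
import numpy as np

def store(patterns):
    # Hebbian outer-product rule; patterns are rows of +/-1 values.
    n = patterns.shape[1]
    w = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(w, 0.0)          # no self-connections
    return w

def recall(w, state, steps=20):
    # Synchronous updates until the state stops changing (a fixed point).
    for _ in range(steps):
        new = np.sign(w @ state)
        new[new == 0] = 1.0
        if np.array_equal(new, state):
            break
        state = new
    return state

rng = np.random.default_rng(0)
pats = rng.choice([-1.0, 1.0], size=(3, 64))   # 3 random patterns, 64 units
w = store(pats)
noisy = pats[0].copy()
noisy[:8] *= -1                                # corrupt 8 of 64 bits
print(np.array_equal(recall(w, noisy), pats[0]))  # typically restores pattern 0
```

Well below capacity (3 patterns in 64 units), each stored pattern is a stable fixed point and noisy cues fall into its basin of attraction.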

Second, we implement the generic model of cortex with a BCPNN (Bayesian Confidence Propagation Neural Network). The cortical BCPNN model is formulated as an attractor neural network and it is mainly used as an autoassociative memory. Based on the literature review and simulation experiments we analyze the model with regard to storage capacity and scaling characteristics. The analysis of the model provides design principles and constraints for cortex-sized attractor neural networks.
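A minimal sketch of a BCPNN-style learning rule, assuming the common counting-based formulation in which weights are log-ratios of pairwise to marginal activation probabilities; the pattern data and the `eps` smoothing are illustrative, and the within-hypercolumn softmax normalization of the cortical model is omitted:

```python
import numpy as np

def bcpnn_weights(patterns, eps=1e-4):
    # Estimate unit and pairwise activation probabilities from binary patterns,
    # then form weights w_ij = log(p_ij / (p_i * p_j)) and biases log(p_i).
    # eps keeps the logarithms finite for never-coactive units (an assumption).
    p = patterns.mean(axis=0) + eps
    pij = (patterns.T @ patterns) / len(patterns) + eps
    w = np.log(pij / np.outer(p, p))
    bias = np.log(p)
    return w, bias

# Toy 0/1 patterns over 4 units (sizes are illustrative).
pats = np.array([[1, 0, 1, 0],
                 [1, 0, 0, 1],
                 [0, 1, 1, 0]], dtype=float)
w, bias = bcpnn_weights(pats)
support = bias + pats[0] @ w    # support for each unit given pattern 0 as input
print(support)
```

In the full model the support values would be normalized within each hypercolumn before the next update; this sketch only shows the probabilistic weight estimation.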

Finally, we study parallel implementations of the BCPNN. We discuss the computational requirements of the cortical BCPNN model and some related issues. We analyze different levels of parallelism in the BCPNN and the associated communication requirements. An important result is that the communication will not be limiting provided that we use spiking units (neurons). We take a closer look at how to implement the BCPNN on cluster computers. We also provide a brief review of attractor neural networks implemented in hardware and on parallel computers.
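The point about spiking units can be illustrated with a back-of-envelope comparison (all numbers below are assumptions for illustration, not figures from the thesis): dense value exchange scales with the update rate of every unit, whereas event-driven (spiking) communication scales with the mean firing rate, which is far lower in cortex-like regimes.

```python
# Assumed parameters for a rough bandwidth comparison.
n_units = 1_000_000        # network size (assumed)
rate_hz = 5.0              # mean firing rate per unit (assumed)
update_hz = 100.0          # dense update frequency (assumed)
bytes_per_value = 4        # one activity value per unit per update
bytes_per_spike = 4        # one unit index per spike event

dense = n_units * update_hz * bytes_per_value    # bytes/s, dense exchange
spiking = n_units * rate_hz * bytes_per_spike    # bytes/s, event-driven
print(dense / spiking)     # -> 20.0: dense needs 20x the bandwidth here
```

With low firing rates the event-driven traffic stays a small fraction of the dense traffic, which is why spiking communication keeps the interconnect from becoming the bottleneck.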

The main contributions of this thesis are: (i) A review of the size, modularization, and computational structure of the mammalian cerebral cortex; (ii) A generic neural network model of the mammalian cortex; (iii) A thorough review of attractor neural networks and their properties; (iv) The computational requirements and constraints of a cortex-sized BCPNN; (v) Efficient implementation of large-scale BCPNNs on parallel computers; (vi) A fixed-point arithmetic implementation of the BCPNN learning rule.
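A fixed-point arithmetic implementation of a learning-rule building block can be sketched as follows, assuming a Q16.16 format and an exponentially smoothed probability trace of the kind used in incremental BCPNN learning (both the format and the trace update are illustrative assumptions, not the thesis's exact scheme):

```python
FRAC_BITS = 16                  # Q16.16 fixed point (assumed format)
ONE = 1 << FRAC_BITS            # 1.0 in fixed point

def to_fix(x):
    return int(round(x * ONE))

def to_float(q):
    return q / ONE

def fix_mul(a, b):
    # Multiply two Q16.16 numbers; shift out the doubled fraction bits.
    return (a * b) >> FRAC_BITS

def trace_step(p_q, x_q, alpha_q):
    # Running probability estimate p <- p + alpha * (x - p), all in integers.
    return p_q + fix_mul(alpha_q, x_q - p_q)

p = to_fix(0.0)
alpha = to_fix(0.01)
for _ in range(200):
    p = trace_step(p, ONE, alpha)   # the unit is active on every step
print(round(to_float(p), 3))        # approaches 1.0, close to 1 - 0.99**200
```

Keeping the trace in integers avoids floating-point hardware entirely, at the cost of a small truncation error per step from the right shift.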

Keywords: Attractor Neural Networks, Cerebral Cortex, Minicolumns, Hypercolumns, Potts Neural Networks, BCPNN, and Parallel Computers

Place, publisher, year, edition, pages
Stockholm: Numerisk analys och datalogi, 2004. ix, 96 p.
Trita-NA, ISSN 0348-2952 ; 0415
URN: urn:nbn:se:kth:diva-1811
ISBN: 91-7283-810-8
OAI: diva2:7817
Available from: 2004-08-17 Created: 2004-08-17 Last updated: 2012-03-20

Open Access in DiVA: No full text

By author/editor: Johansson, Christopher