1 - 7 of 7
  • 1. Benjaminsson, Simon
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Nexa: A scalable neural simulator with integrated analysis (2012). In: Network, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 23, no 4, p. 254-271. Article in journal (Refereed)
    Abstract [en]

    Large-scale neural simulations encompass challenges in simulator design, data handling and understanding of simulation output. As the computational power of supercomputers and the size of network models increase, these challenges become even more pronounced. Here we introduce the experimental scalable neural simulator Nexa, for parallel simulation of large-scale neural network models at a high level of biological abstraction and for exploration of the simulation methods involved. It includes firing-rate models and capabilities to build networks using machine learning inspired methods for e.g. self-organization of network architecture and for structural plasticity. We show scalability up to the size of the largest machines currently available for a number of model scenarios. We further demonstrate simulator integration with online analysis and real-time visualization as scalable solutions for the data handling challenges.

  • 2. Crook, S. M.
    Bednar, J. A.
    Berger, S.
    Cannon, R.
    Davison, A. P.
    Djurfeldt, Mikael
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for High Performance Computing, PDC.
    Eppler, J.
    Kriener, B.
    Furber, S.
    Graham, B.
    Plesser, H. E.
    Schwabe, L.
    Smith, L.
    Steuber, V.
    Van Albada, S.
    Creating, documenting and sharing network models (2012). In: Network, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 23, no 4, p. 131-149. Article, review/survey (Refereed)
    Abstract [en]

    As computational neuroscience matures, many simulation environments are available that are useful for neuronal network modeling. However, methods for successfully documenting models for publication and for exchanging models and model components among these projects are still under development. Here we briefly review existing software and applications for network model creation, documentation and exchange. Then we discuss a few of the larger issues facing the field of computational neuroscience regarding network modeling and suggest solutions to some of these problems, concentrating in particular on standardized network model terminology, notation, and descriptions and explicit documentation of model scaling. We hope this will enable and encourage computational neuroscientists to share their models more systematically in the future.

  • 3. Fransén, Erik
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Lansner, Anders
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    A model of cortical associative memory based on a horizontal network of connected columns (1998). In: Network, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 9, no 2, p. 235-264. Article in journal (Refereed)
    Abstract [en]

    An attractor network model of cortical associative memory functions has been constructed and simulated. By replacing the single cell as the functional unit by multiple cells in cortical columns connected by long-range fibers, the model is improved in terms of correspondence with cortical connectivity. The connectivity is improved, since the original dense and symmetric connectivity of a standard recurrent network becomes sparse and asymmetric at the cell-to-cell level. Our simulations show that this kind of network, with model neurons of the Hodgkin-Huxley type arranged in columns, can operate as an associative memory in much the same way as previous models having simpler connectivity. The network shows attractor-like behaviour and performs the standard assembly operations despite differences in the dynamics introduced by the more detailed cell model and network structure. Furthermore, the model has become sufficiently detailed to allow evaluation against electrophysiological and anatomical observations. For instance, cell activities comply with experimental findings and reaction times are within biological and psychological ranges. By introducing a scaling model we demonstrate that a network approaching experimentally reported neuron numbers and synaptic distributions also could work like the model studied here.
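
As a concrete (and heavily simplified) illustration of the attractor behaviour and pattern completion this entry describes, the sketch below implements a standard binary Hebbian attractor network. It is not the paper's columnar Hodgkin-Huxley model; the network size, number of stored patterns and the recall rule are illustrative assumptions only.

```python
# Minimal sketch: binary attractor network with Hebbian outer-product weights.
# Not the paper's Hodgkin-Huxley columnar model; sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                                   # units and stored patterns (assumed sizes)
patterns = rng.choice([-1, 1], size=(P, N))      # random +/-1 patterns to store

# Hebbian outer-product learning with zero self-connections
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def recall(cue, steps=20):
    """Iterate the recurrent dynamics from a degraded cue until it settles."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

# Cue: stored pattern 0 with 20% of its units flipped
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
cue[flip] *= -1

overlap = recall(cue) @ patterns[0] / N          # close to 1.0 means full completion
print(f"overlap with stored pattern: {overlap:.2f}")
```

The column-based model in the paper replaces each such unit with a population of biophysically detailed neurons, but the assembly operations it performs (completion, rivalry) are of the same kind as in this toy network.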

  • 4. Lundqvist, Mikael
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Rehn, Martin
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Djurfeldt, Mikael
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Attractor dynamics in a modular network model of neocortex (2006). In: Network: Computation in Neural Systems, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 17, no 3, p. 253-276. Article in journal (Refereed)
    Abstract [en]

    Starting from the hypothesis that the mammalian neocortex to a first approximation functions as an associative memory of the attractor network type, we formulate a quantitative computational model of neocortical layers 2/3. The model employs biophysically detailed multi-compartmental model neurons with conductance based synapses and includes pyramidal cells and two types of inhibitory interneurons, i.e., regular spiking non-pyramidal cells and basket cells. The simulated network has a minicolumnar as well as a hypercolumnar modular structure and we propose that minicolumns rather than single cells are the basic computational units in neocortex. The minicolumns are represented in full scale and synaptic input to the different types of model neurons is carefully matched to reproduce experimentally measured values and to allow a quantitative reproduction of single cell recordings. Several key phenomena seen experimentally in vitro and in vivo appear as emergent features of this model. It exhibits a robust and fast attractor dynamics with pattern completion and pattern rivalry and it suggests an explanation for the so-called attentional blink phenomenon. During assembly dynamics, the model faithfully reproduces several features of local UP states, as they have been experimentally observed in vitro, as well as oscillatory behavior similar to that observed in the neocortex.
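
The model above is biophysically detailed and cannot be reduced to a few lines, but its modular layout can be sketched: minicolumn units grouped into hypercolumns, with the within-hypercolumn competition (mediated in the paper by basket cells) replaced here by a plain softmax. All sizes and the gain value are assumptions for illustration.

```python
# Toy sketch of a minicolumnar/hypercolumnar layout: a softmax within each
# hypercolumn stands in for basket-cell mediated competition. Not the paper's
# multi-compartmental model; sizes and gain are illustrative assumptions.
import numpy as np

H, M = 4, 8                                      # hypercolumns, minicolumns per hypercolumn
rng = np.random.default_rng(1)
activity = rng.random((H, M))                    # one activity value per minicolumn unit

def hypercolumn_softmax(a, gain=5.0):
    """Normalize minicolumn activity within each hypercolumn (soft winner-take-all)."""
    e = np.exp(gain * (a - a.max(axis=1, keepdims=True)))
    return e / e.sum(axis=1, keepdims=True)

normalized = hypercolumn_softmax(activity)
print(normalized.sum(axis=1))                    # each hypercolumn's activity sums to 1
```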

  • 5. Meli, Cristina
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    A modular attractor associative memory with patchy connectivity and weight pruning (2013). In: Network, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 24, no 4, p. 129-150. Article in journal (Refereed)
    Abstract [en]

    An important research topic in neuroscience is the study of mechanisms underlying memory and the estimation of the information capacity of the biological system. In this report we investigate the performance of a modular attractor network with recurrent connections similar to the cortical long-range connections extending in the horizontal direction. We considered a single learning rule, the BCPNN, which implements a kind of Hebbian learning and we trained the network with sparse random patterns. The storage capacity was measured experimentally for networks of size between 500 and 46 K units with a constant activity level, gradually diluting the connectivity. We show that the storage capacity of the modular network with patchy connectivity is comparable with the theoretical values estimated for simple associative memories and furthermore we introduce a new technique to prune the connectivity, which enhances the storage capacity up to the asymptotic value.
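
The abstract does not spell out the pruning technique, so the sketch below only shows one generic way to dilute a Hebbian weight matrix: keep a fixed fraction of the largest-magnitude weights and zero the rest. It is meant to illustrate what "weight pruning" refers to, not to reproduce the paper's method or its BCPNN weights.

```python
# Generic magnitude-based weight pruning on a Hebbian weight matrix.
# Illustration only; the paper's actual pruning technique is not reproduced.
import numpy as np

rng = np.random.default_rng(2)
N, P = 200, 10
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def prune(W, keep_fraction):
    """Keep only the largest-magnitude fraction of the weights, zero the rest."""
    k = int(keep_fraction * W.size)
    keep = np.argpartition(np.abs(W).ravel(), -k)[-k:]   # indices of the k largest magnitudes
    mask = np.zeros(W.size, dtype=bool)
    mask[keep] = True
    return np.where(mask.reshape(W.shape), W, 0.0)

W_sparse = prune(W, keep_fraction=0.25)          # keep 25% of the connections
print("density after pruning:", np.count_nonzero(W_sparse) / W_sparse.size)
```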

  • 6. Sandberg, A.
    Tegner, J.
    Lansner, Anders
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    A working memory model based on fast Hebbian learning (2003). In: Network, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 14, no 4, p. 789-802. Article in journal (Refereed)
    Abstract [en]

    Recent models of the oculomotor delayed response task have been based on the assumption that working memory is stored as a persistent activity state (a 'bump' state). The delay activity is maintained by a finely tuned synaptic weight matrix producing a line attractor. Here we present an alternative hypothesis, that fast Hebbian synaptic plasticity is the mechanism underlying working memory. A computational model demonstrates a working memory function that is more resistant to distractors and network inhomogeneity compared to previous models, and that is also capable of storing multiple memories.
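
As a rough illustration of the alternative hypothesis in this entry, the sketch below holds memory in a rapidly imprinted, rapidly decaying Hebbian weight component rather than in persistent activity, so more recent items leave a stronger trace. The learning rate and decay factor are invented for illustration and are not parameters from the paper.

```python
# Toy sketch of working memory via fast, decaying Hebbian plasticity.
# Parameter values are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(3)
N = 100                                          # units (assumed size)

def encode(W, x, eta=0.5, decay=0.8):
    """Decay the fast Hebbian weights, then imprint the current +/-1 pattern."""
    W = decay * W + eta * np.outer(x, x) / N
    np.fill_diagonal(W, 0)
    return W

# Present a short stream of items; the trace of older items fades.
items = [rng.choice([-1, 1], size=N) for _ in range(3)]
W_fast = np.zeros((N, N))
for x in items:
    W_fast = encode(W_fast, x)

# Alignment of each item with the recurrent field it evokes: more recent items
# leave a stronger trace and therefore dominate recall.
for t, x in enumerate(items):
    print(f"item {t}: field alignment {x @ W_fast @ x / N:.2f}")
```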

  • 7. Sandberg, Anders
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Lansner, Anders
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Petersson, K. M.
    Ekeberg, Örjan
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    A Bayesian attractor network with incremental learning (2002). In: Network, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 13, no 2, p. 179-194. Article in journal (Refereed)
    Abstract [en]

    A realtime online learning system with capacity limits needs to gradually forget old information in order to avoid catastrophic forgetting. This can be achieved by allowing new information to overwrite old, as in a so-called palimpsest memory. This paper describes an incremental learning rule based on the Bayesian confidence propagation neural network that has palimpsest properties when employed in an attractor neural network. The network does not suffer from catastrophic forgetting, has a capacity dependent on the learning time constant and exhibits faster convergence for newer patterns.
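
One schematic reading of the incremental rule described above (an assumption, not the paper's exact formulation): track unit and pairwise activation probabilities as exponentially decaying running averages with a learning time constant, and derive log-odds weights from them, so that new patterns gradually overwrite old ones.

```python
# Schematic BCPNN-style incremental learning with exponentially decaying
# probability estimates (palimpsest behaviour). The exact update form and all
# constants are assumptions made for illustration, not taken from the paper.
import numpy as np

rng = np.random.default_rng(4)
N, tau, eps = 100, 50.0, 1e-4                    # units, learning time constant, probability floor

p_i = np.full(N, 0.5)                            # running estimate of P(unit active)
p_ij = np.full((N, N), 0.25)                     # running estimate of P(pair co-active)

def train_step(p_i, p_ij, x, tau):
    """One incremental update with a binary (0/1) activity pattern x (updates in place)."""
    k = 1.0 / tau
    p_i += k * (x - p_i)
    p_ij += k * (np.outer(x, x) - p_ij)

def weights(p_i, p_ij, eps):
    """Log-odds weights and log-probability biases from the running estimates."""
    w = np.log((p_ij + eps) / (np.outer(p_i, p_i) + eps))
    np.fill_diagonal(w, 0.0)
    b = np.log(p_i + eps)
    return w, b

# A stream of sparse random binary patterns: the estimates, and hence the
# weights, are dominated by roughly the last tau patterns seen.
for _ in range(500):
    train_step(p_i, p_ij, (rng.random(N) < 0.1).astype(float), tau)
w, b = weights(p_i, p_ij, eps)
print("weight range:", round(float(w.min()), 2), round(float(w.max()), 2))
```

In this reading, the capacity-versus-forgetting trade-off mentioned in the abstract corresponds to the choice of the time constant: larger values retain more patterns but adapt to new ones more slowly.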
