Spike-Based Bayesian-Hebbian Learning of Temporal Sequences
KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Karolinska Inst, Sweden. ORCID iD: 0000-0001-8796-3237
KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Karolinska Inst, Sweden.
KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Karolinska Inst, Sweden. ORCID iD: 0000-0002-2358-7815
2016 (English). In: PLoS Computational Biology, ISSN 1553-734X, E-ISSN 1553-7358, Vol. 12, no. 5, e1004954. Article in journal (Refereed). Published.
Resource type
Text
Abstract [en]

Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open issue how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods of firing in pools of excitatory neurons, together with asymmetrical associations between these distinct network states, can be acquired through plasticity. The model's feasibility is demonstrated using simulations of adaptive exponential integrate-and-fire (AdEx) model neurons. We show that the learning and speed of sequence replay depend on a confluence of biophysically relevant parameters, including stimulus duration, level of background noise, ratio of synaptic currents, and strengths of short-term depression and adaptation. Moreover, sequence elements are shown to flexibly participate multiple times in the sequence, suggesting that spiking attractor networks of this type can support an efficient combinatorial code. The model provides a principled approach towards understanding how multiple interacting plasticity mechanisms can coordinate hetero-associative learning in unison.
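The spike-based BCPNN rule summarized in the abstract can be sketched as a cascade of exponentially filtered spike traces whose long-term averages estimate pre-, post- and co-activation probabilities; the synaptic weight and intrinsic bias then read out log-odds of those estimates. The following is a minimal single-synapse illustration under assumed time constants and trace names, not the paper's actual implementation or parameter values:

```python
import math

# Illustrative parameters; the paper's constants differ.
DT = 1.0                                   # time step (ms)
TAU_Z, TAU_E, TAU_P = 5.0, 100.0, 1000.0   # trace time constants (ms)
EPS = 1e-4                                 # lowest estimable probability

def bcpnn_step(si, sj, tr):
    """One Euler step. si, sj are 0/1 spike indicators for the
    pre- and postsynaptic unit; tr holds the Z, E and P traces."""
    # Fast Z traces low-pass filter the spike trains.
    tr["zi"] += DT * (si - tr["zi"]) / TAU_Z
    tr["zj"] += DT * (sj - tr["zj"]) / TAU_Z
    # Slower E traces filter the Z traces and their coincidence.
    tr["ei"] += DT * (tr["zi"] - tr["ei"]) / TAU_E
    tr["ej"] += DT * (tr["zj"] - tr["ej"]) / TAU_E
    tr["eij"] += DT * (tr["zi"] * tr["zj"] - tr["eij"]) / TAU_E
    # P traces estimate activation probabilities on the memory time scale.
    tr["pi"] += DT * (tr["ei"] - tr["pi"]) / TAU_P
    tr["pj"] += DT * (tr["ej"] - tr["pj"]) / TAU_P
    tr["pij"] += DT * (tr["eij"] - tr["pij"]) / TAU_P
    # Weight and intrinsic bias as log-probability estimates.
    w = math.log((tr["pij"] + EPS**2) / ((tr["pi"] + EPS) * (tr["pj"] + EPS)))
    bias = math.log(tr["pj"] + EPS)
    return w, bias
```

Driving the pre- and postsynaptic units with correlated spike trains pushes the weight positive (potentiation), while uncorrelated activity keeps it near zero, and the bias tracks the postsynaptic unit's own activity level.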

Place, publisher, year, edition, pages
2016. Vol. 12, no. 5, e1004954
National Category
Computer and Information Science
Identifiers
URN: urn:nbn:se:kth:diva-190519
DOI: 10.1371/journal.pcbi.1004954
ISI: 000379348100041
Scopus ID: 2-s2.0-84975865045
OAI: oai:DiVA.org:kth-190519
DiVA: diva2:953280
Funder
Swedish Research Council, VR-621-2012-3502
VINNOVA
Swedish e-Science Research Center
EU, FP7, Seventh Framework Programme, DFF - 1330-00226
EU, FP7, Seventh Framework Programme, EU-FP7-FET-269921
Note

QC 20160817

Available from: 2016-08-17. Created: 2016-08-12. Last updated: 2017-04-19. Bibliographically approved.
In thesis
1. Spike-Based Bayesian-Hebbian Learning in Cortical and Subcortical Microcircuits
2017 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Cortical and subcortical microcircuits are continuously modified throughout life. Despite these ongoing changes, the networks stubbornly maintain their functions, which persist even though destabilizing synaptic and nonsynaptic mechanisms should ostensibly propel them towards runaway excitation or quiescence. What dynamical phenomena act together to balance such learning with information processing? What types of activity patterns do they underpin, and how do these patterns relate to our perceptual experiences? What enables learning and memory operations to occur despite such massive and constant neural reorganization? Progress towards answering many of these questions can be pursued through large-scale neuronal simulations.

 

In this thesis, a Hebbian learning rule for spiking neurons inspired by statistical inference is introduced. The spike-based version of the Bayesian Confidence Propagation Neural Network (BCPNN) learning rule involves changes in both synaptic strengths and intrinsic neuronal currents. The model is motivated by molecular cascades whose functional outcomes are mapped onto biological mechanisms such as Hebbian and homeostatic plasticity, neuromodulation, and intrinsic excitability. Temporally interacting memory traces enable spike-timing dependence, a stable yet competitive learning regime, regulation of postsynaptic activity, spike-based reinforcement learning, and graded levels of intrinsic persistent firing.

 

The thesis seeks to demonstrate how multiple interacting plasticity mechanisms can coordinate reinforcement, auto- and hetero-associative learning within large-scale, spiking, plastic neuronal networks. Spiking neural networks can represent information in the form of probability distributions, and a biophysical realization of Bayesian computation can help reconcile disparate experimental observations.
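The claim that spiking networks can represent probability distributions can be illustrated with the BCPNN weight and bias convention: a unit's support, the bias plus the sum of incoming weights from active inputs, equals the log of a naive Bayes posterior up to normalization. The toy sketch below uses assumed probability tables purely for illustration, not data from the thesis:

```python
import math

# Toy probability tables (assumed values, purely illustrative).
p_class = {"A": 0.6, "B": 0.4}               # prior P(class j)
p_feat = {"A": [0.9, 0.2], "B": [0.3, 0.7]}  # P(feature i active | class j)

def support(active, cls):
    """BCPNN support s_j = beta_j + sum_i w_ij over active inputs,
    with beta_j = log P(j) and w_ij = log(P(i,j) / (P(i) P(j)))."""
    s = math.log(p_class[cls])                                 # bias beta_j
    for i in active:
        p_i = sum(p_class[c] * p_feat[c][i] for c in p_class)  # marginal P(i)
        p_ij = p_class[cls] * p_feat[cls][i]                   # joint P(i, j)
        s += math.log(p_ij / (p_i * p_class[cls]))             # weight w_ij
    return s

def posterior(active):
    """Exponentiate and normalize the supports: for conditionally
    independent inputs this recovers the naive Bayes posterior."""
    s = {c: support(active, c) for c in p_class}
    z = sum(math.exp(v) for v in s.values())
    return {c: math.exp(v) / z for c, v in s.items()}
```

With feature 0 active, the posterior matches Bayes' rule computed directly, P(A | f0) = 0.54 / 0.66, showing that log-probability weights and a softmax readout implement exact inference in this simple case.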

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2017. 89 p.
Series
TRITA-CSC-A, ISSN 1653-5723 ; 2017:11
Keyword
Bayes' rule, synaptic plasticity and memory modeling, intrinsic excitability, naïve Bayes classifier, spiking neural networks, Hebbian learning, neuromorphic engineering, reinforcement learning, temporal sequence learning, attractor network
National Category
Computer Systems
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-205568
ISBN: 978-91-7729-351-4
Public defence
2017-05-09, F3, Lindstedtsvägen 26, Stockholm, 13:00 (English)
Opponent
Supervisors
Note

QC 20170421

Available from: 2017-04-21. Created: 2017-04-19. Last updated: 2017-04-21. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text
Scopus

Search in DiVA

By author/editor
Tully, Philip J.; Lindén, Henrik; Lansner, Anders
By organisation
Computational Science and Technology (CST)
In the same journal
PloS Computational Biology
Computer and Information Science
