Probabilistic associative learning suffices for learning the temporal structure of multiple sequences
KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST).
KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST). ORCID iD: 0000-0002-2358-7815
KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST). ORCID iD: 0000-0001-6553-823X
2019 (English). In: PLoS ONE, ISSN 1932-6203, E-ISSN 1932-6203, Vol. 14, no. 8, article id e0220161. Article in journal (Refereed). Published.
Abstract [en]

From memorizing a musical tune to navigating a well-known route, many of our behaviors have a strong temporal component. While the mechanisms behind the sequential nature of the underlying brain activity are likely multifarious and multi-scale, in this work we attempt to characterize to what degree some of these properties can be explained as a consequence of simple associative learning. To this end, we employ a parsimonious firing-rate attractor network equipped with the Hebbian-like Bayesian Confidence Propagating Neural Network (BCPNN) learning rule, which relies on synaptic traces with asymmetric temporal characteristics. The proposed network model is able to encode and reproduce temporal aspects of the input, and offers internal control of the recall dynamics by gain modulation. We provide an analytical characterization of the relationship between the structure of the weight matrix, the dynamical network parameters, and the temporal aspects of sequence recall. We also present a computational study of the system's performance under the effects of noise across an extensive region of the parameter space. Finally, we show how the inclusion of modularity in our network structure facilitates the learning and recall of multiple overlapping sequences, even in a noisy regime.
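The trace-based mechanism described above can be illustrated with a toy sketch. The snippet below is not the paper's implementation; it is a minimal illustration, assuming standard BCPNN-style log-odds weights computed from exponentially filtered pre- and postsynaptic traces. The asymmetry between the trace time constants (slow presynaptic, fast postsynaptic) makes the learned weights directional, which is how temporal order can be encoded. The function name and all parameter values are illustrative assumptions.

```python
import numpy as np

def bcpnn_weights(x, dt=0.001, tau_z_pre=0.050, tau_z_post=0.005,
                  tau_p=1.0, eps=1e-4):
    """Estimate BCPNN-style log-odds weights from an activity matrix.

    x: (T, N) array of unit activations in [0, 1], one row per time step.
    Returns (w, b): an (N, N) weight matrix and an (N,) bias vector.
    """
    T, N = x.shape
    z_pre = np.zeros(N)
    z_post = np.zeros(N)
    # Probability traces are floored at eps so the logs stay finite.
    p_pre = np.full(N, eps)
    p_post = np.full(N, eps)
    p_co = np.full((N, N), eps ** 2)
    for t in range(T):
        # Fast synaptic z-traces; the asymmetry tau_z_pre != tau_z_post
        # lets a presynaptic trace linger into the next pattern's window.
        z_pre += dt * (x[t] - z_pre) / tau_z_pre
        z_post += dt * (x[t] - z_post) / tau_z_post
        # Slow probability traces built from the z-traces.
        p_pre += dt * (z_pre - p_pre) / tau_p
        p_post += dt * (z_post - p_post) / tau_p
        p_co += dt * (np.outer(z_pre, z_post) - p_co) / tau_p
    w = np.log(p_co / np.outer(p_pre, p_post))  # log-odds of coactivation
    b = np.log(p_post)                          # intrinsic bias
    return w, b
```

For a two-unit input where unit 0 is active before unit 1, the lingering slow presynaptic trace of unit 0 overlaps unit 1's fast postsynaptic trace, so `w[0, 1]` comes out larger than `w[1, 0]`: the weight matrix itself stores the direction of the sequence.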

Place, publisher, year, edition, pages
Public Library of Science, 2019. Vol. 14, no. 8, article id e0220161
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:kth:diva-261334
DOI: 10.1371/journal.pone.0220161
ISI: 000484987900031
PubMedID: 31369571
Scopus ID: 2-s2.0-85070235217
OAI: oai:DiVA.org:kth-261334
DiVA id: diva2:1358248
Note

QC 20191007

Available from: 2019-10-07. Created: 2019-10-07. Last updated: 2019-10-07. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
PubMed
Scopus

Authority records BETA

Martinez Mayorquin, Ramon Heberto; Lansner, Anders; Herman, Pawel
