1 - 6 of 6
  • 1.
    Chrysanthidis, Nikolaos
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST). Aristotle University of Thessaloniki, Faculty of Engineering, School of Electrical and Computer Engineering, 54124, Thessaloniki, Greece.
    Fiebig, Florian
    KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST). Institute for Adaptive and Neural Computation, Edinburgh University, EH8 9AB Edinburgh, Scotland.
    Lansner, Anders
    KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST). Department of Numerical Analysis and Computer Science, Stockholm University, 10691 Stockholm, Sweden.
    Introducing double bouquet cells into a modular cortical associative memory model. Manuscript (preprint) (Other academic)
    Abstract [en]

    We present an electrophysiological model of double bouquet cells and integrate them into an established cortical columnar microcircuit model that has previously been used as a spiking attractor model for memory. Learning in that model relies on a Bayesian-Hebbian learning rule to condition recurrent connectivity between pyramidal cells. We here demonstrate that the inclusion of a biophysically plausible double bouquet cell model can solve earlier concerns about learning rules that simultaneously learn excitation and inhibition and might thus violate Dale's Principle. We show that the learning ability and resulting effective connectivity between functional columns of previous network models is preserved when pyramidal synapses onto double bouquet cells are plastic under the same Hebbian-Bayesian learning rule. The proposed architecture draws on experimental evidence on double bouquet cells and effectively solves the problem of duplexed learning of inhibition and excitation by replacing recurrent inhibition between pyramidal cells in functional columns of different stimulus selectivity with a plastic disynaptic pathway. We thus show that the resulting change to the microcircuit architecture improves the model's biological plausibility without otherwise impacting the model's spiking activity, basic operation, and learning abilities.
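    The Bayesian-Hebbian (BCPNN-style) rule referenced above can be sketched in its classical rate-based form, where a weight is the log-odds of pre- and postsynaptic co-activation. This is a minimal illustration, not the paper's spiking implementation; the function name, the smoothing constant `eps`, and the batch estimation of probabilities are assumptions for the sketch.

    ```python
    import numpy as np

    def bcpnn_weights(pre, post, eps=1e-4):
        """Estimate Hebbian-Bayesian (BCPNN-style) weights from activity.

        pre, post: (T, N) arrays of unit activations in [0, 1] over T samples.
        Returns an (N_pre, N_post) matrix w_ij = log(P_ij / (P_i * P_j)):
        positive for units active together more often than chance,
        negative for units that are anti-correlated.
        """
        p_i = pre.mean(axis=0) + eps                    # marginal pre rates
        p_j = post.mean(axis=0) + eps                   # marginal post rates
        p_ij = (pre.T @ post) / pre.shape[0] + eps**2   # joint co-activation
        return np.log(p_ij / np.outer(p_i, p_j))
    ```

    In this form, inhibition between columns of different stimulus selectivity appears as negative log-odds weights; the paper's contribution is routing that negative component through a disynaptic pathway via double bouquet cells rather than direct pyramidal-to-pyramidal inhibition.
    
    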

  • 2.
    Fiebig, Florian
    KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST).
    Active Memory Processing on Multiple Time-scales in Simulated Cortical Networks with Hebbian Plasticity. 2018. Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis examines declarative memory function and its underlying neural activity and mechanisms in simulated cortical networks. The included simulation models utilize and synthesize proposed universal computational principles of the brain, such as the modularity of cortical circuit organization, attractor network theory, and Hebbian synaptic plasticity, along with selected biophysical detail from the involved brain areas, to implement functional models of known cortical memory systems. The models hypothesize relations between neural activity, brain area interactions, and cognitive memory functions such as sleep-dependent memory consolidation or specific working memory tasks. In particular, this work addresses the acutely relevant research question of whether recently described fast forms of Hebbian synaptic plasticity are a possible mechanism behind working memory. The proposed models specifically challenge the "persistent activity hypothesis of working memory", an established but increasingly questioned paradigm in working memory theory. The proposed alternative is a novel synaptic working memory model that is arguably more defensible than the existing paradigm, as it can better explain memory function and important aspects of working memory-linked activity (such as the role of long-term memory in working memory tasks), while simultaneously matching experimental data from behavioral memory testing and important evidence from electrode recordings.

  • 3.
    Fiebig, Florian
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST).
    Herman, Pawel
    KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST).
    Lansner, Anders
    KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST). Department of Mathematics, Stockholm University, 10691 Stockholm, Sweden.
    An Indexing Theory for Working Memory based on Fast Hebbian Plasticity. Manuscript (preprint) (Other academic)
    Abstract [en]

    Working memory (WM) is a key component of human memory and cognitive function. Computational models have been used to uncover the underlying neural mechanisms. However, these studies have mostly focused on the short-term memory aspects of WM and neglected the equally important role of interactions between short- and long-term memory (STM, LTM). Here, we concentrate on these interactions within the framework of our new computational model of WM, which accounts for three cortical patches in macaque brain, corresponding to networks in prefrontal cortex (PFC) together with parieto-temporal cortical areas. In particular, we propose a cortical indexing theory that explains how PFC could associate, maintain and update multi-modal LTM representations. Our simulation results demonstrate how simultaneous, brief multi-modal memory cues could build a temporary joint memory representation linked via an "index" in the prefrontal cortex by means of fast Hebbian synaptic plasticity. The latter can then activate spontaneously and thereby reactivate the associated long-term representations. Cueing one long-term memory item rapidly pattern-completes the associated un-cued item via prefrontal cortex. The STM network updates flexibly as new stimuli arrive thereby gradually over-writing older representations. In a wider context, this WM model suggests a novel explanation for "variable binding", a long-standing and fundamental phenomenon in cognitive neuroscience, which is still poorly understood in terms of detailed neural mechanisms.

  • 4.
    Fiebig, Florian
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Edinburgh University, UK.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Stockholm University, Sweden.
    A Spiking Working Memory Model Based on Hebbian Short-Term Potentiation. 2017. In: Journal of Neuroscience, ISSN 0270-6474, Vol. 37, no 1, p. 83-96. Article in journal (Refereed)
    Abstract [en]

    A dominant theory of working memory (WM), referred to as the persistent activity hypothesis, holds that recurrently connected neural networks, presumably located in the prefrontal cortex, encode and maintain WM memory items through sustained elevated activity. Reexamination of experimental data has shown that prefrontal cortex activity in single units during delay periods is much more variable than predicted by such a theory and associated computational models. Alternative models of WM maintenance based on synaptic plasticity, such as short-term nonassociative (non-Hebbian) synaptic facilitation, have been suggested but cannot account for encoding of novel associations. Here we test the hypothesis that a recently identified fast-expressing form of Hebbian synaptic plasticity (associative short-term potentiation) is a possible mechanism for WM encoding and maintenance. Our simulations using a spiking neural network model of cortex reproduce a range of cognitive memory effects in the classical multi-item WM task of encoding and immediate free recall of word lists. Memory reactivation in the model occurs in discrete oscillatory bursts rather than as sustained activity. We relate dynamic network activity as well as key synaptic characteristics to electrophysiological measurements. Our findings support the hypothesis that fast Hebbian short-term potentiation is a key WM mechanism.
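    The core mechanism the abstract contrasts with persistent activity — associative short-term potentiation that outlasts the encoding activity but decays passively — can be illustrated with a toy trace. This is only a conceptual sketch; the function name, time constant, and gain are illustrative assumptions, not parameters from the paper's spiking model.

    ```python
    import numpy as np

    def stp_trace(coactive, dt=1.0, tau=5000.0, gain=0.1):
        """Toy fast Hebbian short-term potentiation trace.

        coactive: (T,) array, 1.0 where pre- and postsynaptic units fire
        together, 0.0 otherwise. The synaptic weight is driven up by
        coactivity and decays passively with time constant tau (ms),
        so a memory encoded by a brief burst persists silently for
        seconds without requiring sustained spiking.
        """
        w = np.zeros(len(coactive))
        for t in range(1, len(coactive)):
            dw = gain * coactive[t] - w[t - 1] / tau
            w[t] = w[t - 1] + dt * dw
        return w
    ```

    The key property for the model's argument is that after encoding ends, the weight remains elevated (supporting reactivation in discrete oscillatory bursts) while the network itself can fall silent, unlike the sustained firing demanded by the persistent activity hypothesis.
    
    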

  • 5.
    Fiebig, Florian
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Memory consolidation from seconds to weeks: a three-stage neural network model with autonomous reinstatement dynamics. 2014. In: Frontiers in Computational Neuroscience, ISSN 1662-5188, E-ISSN 1662-5188, Vol. 8, p. 64. Article in journal (Refereed)
    Abstract [en]

    Declarative long-term memories are not created in an instant. Gradual stabilization and temporally shifting dependence of acquired declarative memories on different brain regions - called systems consolidation - can be tracked in time by lesion experiments. The observation of temporally graded retrograde amnesia (RA) following hippocampal lesions points to a gradual transfer of memory from hippocampus to neocortical long-term memory. Spontaneous reactivations of hippocampal memories, as observed in place cell reactivations during slow-wave sleep, are supposed to drive neocortical reinstatements and facilitate this process. We propose a functional neural network implementation of these ideas and furthermore suggest an extended three-stage framework that includes the prefrontal cortex (PFC). It bridges the temporal chasm between working memory percepts on the scale of seconds and consolidated long-term memory on the scale of weeks or months. We show that our three-stage model can autonomously produce the necessary stochastic reactivation dynamics for successful episodic memory consolidation. The resulting learning system is shown to exhibit classical memory effects seen in experimental studies, such as retrograde and anterograde amnesia (AA) after simulated hippocampal lesioning; furthermore, the model reproduces peculiar biological findings on memory modulation, such as retrograde facilitation of memory after suppressed acquisition of new long-term memories - similar to the effects of benzodiazepines on memory.

  • 6.
    Fiebig, Florian
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Stockholm Univ, Sweden.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Stockholm University, Sweden.
    Memory Consolidation from Seconds to Weeks Through Autonomous Reinstatement Dynamics in a Three-Stage Neural Network Model. 2015. In: ADVANCES IN COGNITIVE NEURODYNAMICS (IV), Springer, 2015, p. 47-53. Conference paper (Refereed)
    Abstract [en]

    Long-term memories for facts and events are not created in an instant. Rather, memories stabilize gradually over time and involve various brain regions. The shifting dependence of acquired declarative memories on different brain regions - called systems consolidation - can be tracked in time by lesion experiments and has led to the development of the Complementary Learning Systems framework, which focuses on hippocampal-cortical interaction. Observations of temporally graded retrograde amnesia following hippocampal lesions point to a gradual transfer from hippocampus to cortical long-term memory. Spontaneous reactivations of hippocampal memories, as observed in place cell reactivations during slow-wave sleep, are supposed to drive cortical reinstatements and facilitate this process. We propose a functional neural network implementation of these ideas and furthermore suggest an extended three-stage framework that also includes the prefrontal cortex and bridges the temporal chasm between working memory percepts on the scale of seconds and consolidated long-term memory on the scale of weeks or months. We show that our three-stage model can autonomously produce the necessary stochastic reactivation dynamics for successful episodic memory consolidation. The resulting learning system is shown to exhibit classical memory effects seen in experimental studies, such as retrograde and anterograde amnesia after simulated hippocampal lesioning.
