Search results 1-50 of 203
  • 1. Aberg, A. C.
    et al.
    Thorstensson, A.
    Tarassova, O.
    Halvorsen, Kjartan
    KTH, School of Technology and Health (STH).
    Calculations of mechanisms for balance control during narrow and single-leg standing in fit older adults: A reliability study. 2011. In: Gait & Posture, ISSN 0966-6362, E-ISSN 1879-2219, Vol. 34, no 3, p. 352-357. Article in journal (Refereed)
    Abstract [en]

    For older people, balance control in standing is critical for performance of activities of daily living without falling. The aims were to investigate reliability of quantification of the usage of the two balance mechanisms M(1) 'moving the centre of pressure' and M(2) 'segment acceleration', and to compare calculation methods based on a combination of kinetic (K) and kinematic (Km) data (K-Km), or Km data only, concerning M(2). For this purpose nine physically fit persons aged 70-78 years were tested in narrow and single-leg standing. Data were collected by a 7-camera motion capture system and two force plates. Repeated measures ANOVA and Tukey's post hoc tests were used to detect differences between the standing tasks. Reliability was estimated by ICCs, standard error of measurement including its 95% CI, and minimal detectable change, whereas Pearson's correlation coefficient was used to investigate agreement between the two calculation methods. The results indicated that for the tasks investigated, M(1) and M(2) can be measured with acceptable inter- and intrasession reliability, and that both Km and K-Km based calculations may be useful for M(2), although Km data may give slightly lower values. The proportional M(1):M(2) usage was approximately 9:1 in both anterior-posterior (AP) and medio-lateral (ML) directions for narrow standing, and about 2:1 in the AP and 1:2 in the ML direction in single-leg standing. In conclusion, the tested measurements and calculations appear to constitute a reliable way of quantifying one important aspect of balance capacity in fit older people.

  • 2. Adori, Csaba
    et al.
    Barde, Swapnali
    Bogdanovic, Nenad
    Uhlén, Mathias
    KTH, School of Biotechnology (BIO), Proteomics and Nanobiotechnology. KTH, Centres, Science for Life Laboratory, SciLifeLab. Karolinska Institutet, Sweden.
    Reinscheid, Rainer R.
    Kovacs, Gabor G.
    Hokfelt, Tomas
    Neuropeptide S- and Neuropeptide S receptor-expressing neuron populations in the human pons. 2015. In: Frontiers in Neuroanatomy, ISSN 1662-5129, E-ISSN 1662-5129, Vol. 9. Article in journal (Refereed)
    Abstract [en]

    Neuropeptide S (NPS) is a regulatory peptide with potent pharmacological effects. In rodents, NPS is expressed in a few pontine cell clusters. Its receptor (NPSR1) is, however, widely distributed in the brain. The anxiolytic and arousal promoting effects of NPS make the NPS NPSR1 system an interesting potential drug target in mood-related disorders. However, so far possible disease-related mechanisms involving NPS have only been studied in rodents. To validate the relevance of these animal studies for i.a. drug development, we have explored the distribution of NPS-expressing neurons in the human pons using in situ hybridization and stereological methods and we compared the distribution of NPS mRNA expressing neurons in the human and rat brain. The calculation revealed a total number of 22,317 +/- 2411 NPS mRNA-positive neurons in human, bilaterally. The majority of cells (84%) were located in the parabrachial area in human: in the extension of the medial and lateral parabrachial nuclei, in the Kolliker-Fuse nucleus and around the adjacent lateral lemniscus. In human, in sharp contrast to the rodents, only very few NPS-positive cells (5%) were found close to the locus coeruleus. In addition, we identified a smaller cell cluster (11% of all NPS cells) in the pontine central gray matter both in human and rat, which has not been described previously even in rodents. We also examined the distribution of NPSR1 mRNA-expressing neurons in the human pons. These cells were mainly located in the rostral laterodorsal tegmental nucleus, the cuneiform nucleus, the microcellular tegmental nucleus region and in the periaqueductal gray. Our results show that both NPS and NPSR1 in the human pons are preferentially localized in regions of importance for integration of visceral autonomic information and emotional behavior. The reported interspecies differences must, however, be considered when looking for targets for new pharmacotherapeutical interventions.

  • 3. Adori, Csaba
    et al.
    Glueck, Laura
    Barde, Swapnali
    Yoshitake, Takashi
    Kovacs, Gabor G.
    Mulder, Jan
    Magloczky, Zsofia
    Havas, Laszlo
    Boelcskei, Kata
    Mitsios, Nicholas
    Uhlén, Mathias
    KTH, School of Biotechnology (BIO), Proteomics and Nanobiotechnology. KTH, Centres, Science for Life Laboratory, SciLifeLab.
    Szolcsanyi, Janos
    Kehr, Jan
    Ronnback, Annica
    Schwartz, Thue
    Rehfeld, Jens F.
    Harkany, Tibor
    Palkovits, Miklos
    Schulz, Stefan
    Hokfelt, Tomas
    Critical role of somatostatin receptor 2 in the vulnerability of the central noradrenergic system: new aspects on Alzheimer's disease. 2015. In: Acta Neuropathologica, ISSN 0001-6322, E-ISSN 1432-0533, Vol. 129, no 4, p. 541-563. Article in journal (Refereed)
    Abstract [en]

    Alzheimer's disease and other age-related neurodegenerative disorders are associated with deterioration of the noradrenergic locus coeruleus (LC), a probable trigger for mood and memory dysfunction. LC noradrenergic neurons exhibit particularly high levels of somatostatin binding sites. This is noteworthy since cortical and hypothalamic somatostatin content is reduced in neurodegenerative pathologies. Yet a possible role of a somatostatin signal deficit in the maintenance of noradrenergic projections remains unknown. Here, we deployed tissue microarrays, immunohistochemistry, quantitative morphometry and mRNA profiling in a cohort of Alzheimer's and age-matched control brains in combination with genetic models of somatostatin receptor deficiency to establish causality between defunct somatostatin signalling and noradrenergic neurodegeneration. In Alzheimer's disease, we found significantly reduced somatostatin protein expression in the temporal cortex, with aberrant clustering and bulging of tyrosine hydroxylase-immunoreactive afferents. As such, somatostatin receptor 2 (SSTR2) mRNA was highly expressed in the human LC, with its levels significantly decreasing from Braak stages III/IV and onwards, i.e., a process preceding advanced Alzheimer's pathology. The loss of SSTR2 transcripts in the LC neurons appeared selective, since tyrosine hydroxylase, dopamine beta-hydroxylase, galanin or galanin receptor 3 mRNAs remained unchanged. We modeled these pathogenic changes in Sstr2 (-/-) mice and, unlike in Sstr1 (-/-) or Sstr4 (-/-) genotypes, they showed selective, global and progressive degeneration of their central noradrenergic projections. However, neuronal perikarya in the LC were found intact until late adulthood (< 8 months) in Sstr2 (-/-) mice. In contrast, the noradrenergic neurons in the superior cervical ganglion lacked SSTR2 and, as expected, the sympathetic innervation of the head region did not show any signs of degeneration. Our results indicate that SSTR2-mediated signaling is integral to the maintenance of central noradrenergic projections at the system level, and that early loss of somatostatin receptor 2 function may be associated with the selective vulnerability of the noradrenergic system in Alzheimer's disease.

  • 4.
    Ahmed, Laeeq
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Edlund, Åke
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Laure, Erwin
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Whitmarsh, S.
    Parallel real time seizure detection in large EEG data. 2016. In: IoTBD 2016 - Proceedings of the International Conference on Internet of Things and Big Data, SciTePress, 2016, p. 214-222. Conference paper (Refereed)
    Abstract [en]

    Electroencephalography (EEG) is one of the main techniques for detecting and diagnosing epileptic seizures. Due to the large size of EEG data in long term clinical monitoring and the complex nature of epileptic seizures, seizure detection is both data-intensive and compute-intensive. Analysing EEG data for detecting seizures in real time has many applications, e.g., in automatic seizure detection or in allowing a timely alarm signal to be presented to the patient. In real time seizure detection, seizures have to be detected with negligible delay, thus requiring lightweight algorithms. MapReduce and its variations have been effectively used for data analysis in large dataset problems on general-purpose machines. In this study, we propose a parallel lightweight algorithm for epileptic seizure detection using Spark Streaming. Our algorithm not only classifies seizures in real time, it also learns an epileptic threshold in real time. We furthermore present "top-k amplitude measure" as a feature for classifying seizures in the EEG, that additionally assists in reducing data size. In a benchmark experiment we show that our algorithm can detect seizures in real time with low latency, while maintaining a good seizure detection rate. In short, our algorithm provides new possibilities in using private cloud infrastructures for real time epileptic seizure detection in EEG data.
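
The abstract above names a "top-k amplitude measure" and an epileptic threshold learned in real time, but does not spell out either computation. The sketch below is a minimal, illustrative interpretation in plain Python rather than the authors' Spark Streaming implementation: the feature is taken to be the mean of the k largest absolute amplitudes in a window, and a fixed threshold stands in for the threshold the paper learns online. All names and parameter values are assumptions for illustration only.

```python
import numpy as np

def top_k_amplitude(window, k=10):
    """Mean of the k largest absolute sample amplitudes in one EEG window
    (assumed interpretation of the paper's 'top-k amplitude measure')."""
    return np.sort(np.abs(window))[-k:].mean()

def detect_seizures(signal, fs=256, window_s=1.0, k=10, threshold=150.0):
    """Flag non-overlapping windows whose top-k amplitude exceeds a fixed
    threshold (the paper instead learns this threshold online)."""
    n = int(window_s * fs)
    flags = []
    for start in range(0, len(signal) - n + 1, n):
        flags.append(top_k_amplitude(signal[start:start + n], k) > threshold)
    return np.array(flags)

# Synthetic test: 60 s of background EEG (in microvolts) with a 5 s
# high-amplitude 3 Hz burst starting at t = 30 s.
rng = np.random.default_rng(0)
fs = 256
eeg = rng.normal(0.0, 20.0, fs * 60)
t_burst = np.arange(fs * 5) / fs
eeg[fs * 30:fs * 35] += 300.0 * np.sin(2 * np.pi * 3.0 * t_burst)
print("windows flagged as seizure:", np.where(detect_seizures(eeg, fs))[0])
```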

  • 5. Ahmed, Omar Jamil
    et al.
    McFarland, James
    Kumar, Arvind
    Brown University, United States.
    Reactivation in ventral striatum during hippocampal ripples: evidence for the binding of reward and spatial memories? 2008. In: Journal of Neuroscience, ISSN 0270-6474, E-ISSN 1529-2401, Vol. 28, no 40, p. 9895-9897. Article in journal (Refereed)
  • 6.
    Andersson, Sten
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Petersson, Marcus E.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Fransén, Erik
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Ionic mechanisms of action potential propagation velocity changes in peripheral C-fibers. Implications for pain. 2012. In: BMC neuroscience (Online), ISSN 1471-2202, E-ISSN 1471-2202, Vol. 13, no Suppl 1, p. P138. Article in journal (Refereed)
  • 7.
    Asplund, Maria
    KTH, School of Technology and Health (STH), Neuronic Engineering.
    Conjugated Polymers for Neural Interfaces: Prospects, possibilities and future challenges. 2009. Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Within the field of neuroprosthetics the possibility to use implanted electrodes for communication with the nervous system is explored. Much effort is put into the material aspects of the electrode implant to increase charge injection capacity, suppress the foreign body response and build micro-sized electrode arrays allowing close contact with neurons. Conducting polymers, in particular poly(3,4-ethylene dioxythiophene) (PEDOT), have been suggested as highly interesting materials for such neural communication electrodes. The possibility to tailor the material both mechanically and biochemically to suit specific applications is a substantial benefit of polymers compared with metals. PEDOT also has hybrid charge transfer properties, including both electronic and ionic conduction, which allow for highly efficient charge injection.

     

    Part of this thesis describes a method of tailoring PEDOT through exchanging the counter ion used in the electropolymerisation process. Commonly used surfactants can thereby be excluded and, instead, different biomolecules can be incorporated into the polymer. The electrochemical characteristics of the polymer film depend on the ion. PEDOT electropolymerised with heparin was here determined to have the most advantageous properties. In vitro methods were applied to confirm non-cytotoxicity of the formed PEDOT:biomolecule composites. In addition, biocompatibility was affirmed for PEDOT:heparin by evaluation of inflammatory response and neuron density when implanted in rodent cortex.

     

    One often-stated advantage of PEDOT is its high stability compared to other conducting polymers. A battery of tests simulating the biological environment was therefore applied to investigate this stability, and especially the influence of the incorporated heparin. These tests showed that there was a decline in the electroactivity of PEDOT over time. This applied also in phosphate buffered saline at body temperature and in the absence of other stressors. The time course of degradation also differed depending on whether the counter ion was the surfactant polystyrene sulphonate or heparin, with slightly better stability for the former.

     

    One possibility with PEDOT, often overlooked for biological applications, is the use of its semiconducting properties to include logic functions in the implant. This thesis presents the concept of using PEDOT electrochemical transistors to construct textile electrode arrays with built-in multiplexing. By using the electrolyte-mediated interaction between adjacent PEDOT-coated fibres to switch the polymer coat between conducting and non-conducting states, transistor function can be included in the conducting textile. Analogue circuit simulations based on experimentally found transistor characteristics proved the feasibility of these textile arrays. Development of better polymer coatings, electrolytes and encapsulation techniques for this technology was also identified as an essential step in order to make these devices truly useful.

     

    In summary, this work shows the potential of PEDOT to improve neural interfaces in several ways. Some weaknesses of the polymer and the polymer electronics are presented and this, together with the epidemiological data, should point in the direction for future studies within this field.

  • 8.
    Auffarth, Benjamin
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Understanding smell: the olfactory stimulus problem. 2013. In: Neuroscience and Biobehavioral Reviews, ISSN 0149-7634, E-ISSN 1873-7528, Vol. 37, no 8, p. 1667-1679. Article, review/survey (Refereed)
    Abstract [en]

    The main problem with sensory processing is the difficulty in relating sensory input to physiological responses and perception. This is especially problematic at higher levels of processing, where complex cues elicit highly specific responses. In olfaction, this relationship is particularly obfuscated by the difficulty of characterizing stimulus statistics and perception. The core questions in olfaction are hence the so-called stimulus problem, which refers to the understanding of the stimulus, and the structure–activity and structure–odor relationships, which refer to the molecular basis of smell. It is widely accepted that the recognition of odorants by receptors is governed by the detection of physico-chemical properties and that the physical space is highly complex. Not surprisingly, ideas differ about how odor stimuli should be classified and about the very nature of information that the brain extracts from odors. Even though there are many measures for smell, there is none that accurately describes all aspects of it. Here, we summarize recent developments in the understanding of olfaction. We argue that an approach to olfactory function where information processing is emphasized could contribute to a high degree to our understanding of smell as a perceptual phenomenon emerging from neural computations. Further, we argue that combined analysis of the stimulus, biology, physiology, and behavior and perception can provide new insights into olfactory function. We hope that the reader can use this review as a competent guide and overview of research activities in olfactory physiology, psychophysics, computation, and psychology. We propose avenues for research, particularly in the systematic characterization of receptive fields and of perception.

  • 9.
    Auffarth, Benjamin
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Kaplan, Bernhard
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Anders, Lansner
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Map formation in the olfactory bulb by axon guidance of olfactory neurons. 2011. In: Frontiers in Systems Neuroscience, ISSN 1662-5137, E-ISSN 1662-5137, Vol. 5, no 0. Article in journal (Refereed)
    Abstract [en]

    The organization of representations in the brain has been observed to locally reflect subspaces of inputs that are relevant to behavioral or perceptual feature combinations, such as in areas receptive to lower and higher-order features in the visual system. The early olfactory system developed highly plastic mechanisms and convergent evidence indicates that projections from primary neurons converge onto the glomerular level of the olfactory bulb (OB) to form a code composed of continuous spatial zones that are differentially active for particular physico-chemical feature combinations, some of which are known to trigger behavioral responses. In a model study of the early human olfactory system, we derive a glomerular organization based on a set of real-world, biologically relevant stimuli, a distribution of receptors that each respond to a set of odorants of similar ranges of molecular properties, and a mechanism of axon guidance based on activity. Apart from demonstrating activity-dependent glomeruli formation and reproducing the relationship of glomerular recruitment with concentration, it is shown that glomerular responses reflect similarities of human odor category perceptions and that, further, a spatial code provides a better correlation than a distributed population code. These results are consistent with evidence of functional compartmentalization in the OB and could suggest a function for the bulb in encoding of perceptual dimensions.

  • 10.
    Ayoglu, Burcu
    et al.
    KTH, School of Biotechnology (BIO), Proteomics and Nanobiotechnology. KTH, Centres, Science for Life Laboratory, SciLifeLab.
    Mitsios, N.
    Khademi, M.
    Alfredsson, L.
    Uhlén, Mathias
    KTH, School of Biotechnology (BIO), Proteomics and Nanobiotechnology. KTH, Centres, Science for Life Laboratory, SciLifeLab.
    Mulder, J.
    Olsson, T.
    Schwenk, Jochen
    KTH, School of Biotechnology (BIO), Proteomics and Nanobiotechnology. KTH, Centres, Science for Life Laboratory, SciLifeLab.
    Nilsson, Peter
    KTH, School of Biotechnology (BIO), Proteomics and Nanobiotechnology. KTH, Centres, Science for Life Laboratory, SciLifeLab.
    Anoctamin 2, a novel autoimmune target candidate in multiple sclerosis. 2014. In: Multiple Sclerosis, ISSN 1352-4585, E-ISSN 1477-0970, Vol. 20, p. 49-50. Article in journal (Other academic)
  • 11.
    Bahuguna, Jyotika
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Structure-Dynamics relationship in basal ganglia: Implications for brain function. 2016. Doctoral thesis, monograph (Other academic)
    Abstract [en]

    In this thesis, I have used a combination of computational models such as mean field and spiking network simulations to study various sub-circuits of basal ganglia. I first studied the striatum (chapter 2), which is the input nucleus of basal ganglia. The two types of Medium Spiny Neurons (MSNs), D1 and D2-MSNs, together constitute 98% of the neurons in striatum. The computational models so far have treated striatum as a homogenous unit and D1 and D2 MSNs as interchangeable subpopulations. This implied that a bias in a Go/No-Go decision is enforced via external agents to the striatum (e.g. cortico-striatal weights), thereby assigning it a passive role. New data show that there is an inherent asymmetry in striatal circuits. In this work, I showed that striatum, due to its asymmetric connectivity, acts as a decision transition threshold device for the incoming cortical input. This has significant implications for the function of striatum as an active participant in influencing the bias towards a Go/No-Go decision. The striatal decision transition threshold also gives mechanistic explanations for phenomena such as L-Dopa Induced Dyskinesia (LID), DBS-induced impulsivity, etc. In chapter 3, I extend the mean field model to include all the nuclei of basal ganglia to specifically study the role of two new subpopulations found in GPe (Globus Pallidus Externa). Recent work shows that GPe, also earlier considered to be a homogenous nucleus, has at least two subpopulations which are dichotomous in their activity with respect to the cortical Slow Wave (SWA) and beta activity. Since the data for these subpopulations are missing, a parameter search was performed for effective connectivities using Genetic Algorithms (GA) to fit the available experimental data. One major result of this study is that there are various parameter combinations that meet the criteria and hence the presence of functional homologs of the basal ganglia network for both pathological (PD) and healthy networks is a possibility. Classifying all these homologous networks into clusters using some high-level features of PD shows a large variance, hinting at the variance observed among the PD patients as well as their response to the therapeutic measures. In chapter 4, I collaborated on a project to model the role of STN and GPe burstiness for pathological beta oscillations as seen during PD. During PD, the burstiness in the firing patterns of GPe and STN neurons is shown to increase. We found that in the baseline state, without any bursty neurons in GPe and STN, the GPe-STN network can transition to an oscillatory state through modulating the firing rates of STN and GPe neurons, whereas when GPe neurons are systematically replaced by bursty neurons, we found that an increase in GPe burstiness enforces oscillations. An optimal percentage of bursty neurons in STN destroys oscillations in the GPe-STN network. Hence burstiness in STN may serve as a compensatory mechanism to destroy oscillations. We also propose that bursting in GPe-STN could serve as a mechanism to initiate and kill oscillations on short time scales, as seen in the healthy state. The GPe-STN network however loses the ability to kill oscillations in the pathological state.

  • 12.
    Bahuguna, Jyotika
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. University of Freiburg, Germany.
    Aertsen, Ad
    University of Freiburg, Germany.
    Kumar, Arvind
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. University of Freiburg, Germany.
    Existence and control of Go/No-Go decision transition threshold in the striatum. 2015. In: PLoS Computational Biology, ISSN 1553-734X, E-ISSN 1553-7358, Vol. 11, no 4, article id e1004233. Article in journal (Refereed)
    Abstract [en]

    A typical Go/No-Go decision is suggested to be implemented in the brain via the activation of the direct or indirect pathway in the basal ganglia. Medium spiny neurons (MSNs) in the striatum, receiving input from cortex and projecting to the direct and indirect pathways express D1 and D2 type dopamine receptors, respectively. Recently, it has become clear that the two types of MSNs markedly differ in their mutual and recurrent connectivities as well as feedforward inhibition from FSIs. Therefore, to understand striatal function in action selection, it is of key importance to identify the role of the distinct connectivities within and between the two types of MSNs on the balance of their activity. Here, we used both a reduced firing rate model and numerical simulations of a spiking network model of the striatum to analyze the dynamic balance of spiking activities in D1 and D2 MSNs. We show that the asymmetric connectivity of the two types of MSNs renders the striatum into a threshold device, indicating the state of cortical input rates and correlations by the relative activity rates of D1 and D2 MSNs. Next, we describe how this striatal threshold can be effectively modulated by the activity of fast spiking interneurons, by the dopamine level, and by the activity of the GPe via pallidostriatal backprojections. We show that multiple mechanisms exist in the basal ganglia for biasing striatal output in favour of either the `Go' or the `No-Go' pathway. This new understanding of striatal network dynamics provides novel insights into the putative role of the striatum in various behavioral deficits in patients with Parkinson's disease, including increased reaction times, L-Dopa-induced dyskinesia, and deep brain stimulation-induced impulsivity.
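
The reduced firing-rate model referred to in the abstract above is not reproduced here; the sketch below is only a minimal illustration of the claimed mechanism, in which asymmetric mutual inhibition between D1- and D2-MSN populations turns the striatum into a threshold device on the cortical input rate. The weights, gains, offset and the form of the rate equations are illustrative guesses, not the paper's parameters.

```python
import numpy as np

def steady_rates(ctx, w_d2_to_d1=1.5, w_d1_to_d2=0.6, gain_d2=1.2,
                 offset_d2=3.0, tau=0.01, dt=0.001, t_end=0.5):
    """Two mutually inhibiting rate units standing in for the D1 and D2 MSN
    populations, driven by the same cortical rate `ctx`. The asymmetry
    (D2->D1 inhibition stronger than D1->D2, and D2 needing more drive to
    activate, an illustrative stand-in) flips the dominant population at a
    cortical input threshold."""
    relu = lambda x: max(x, 0.0)
    r_d1 = r_d2 = 0.0
    for _ in range(int(t_end / dt)):
        r_d1 += dt / tau * (-r_d1 + relu(ctx - w_d2_to_d1 * r_d2))
        r_d2 += dt / tau * (-r_d2 + relu(gain_d2 * ctx - offset_d2 - w_d1_to_d2 * r_d1))
    return r_d1, r_d2

for ctx in (2.0, 4.0, 6.0, 10.0):
    r1, r2 = steady_rates(ctx)
    print(f"cortical drive {ctx:5.1f} -> D1 = {r1:6.2f}, D2 = {r2:6.2f}, "
          f"dominant pathway: {'Go (D1)' if r1 > r2 else 'No-Go (D2)'}")
```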

  • 13. Bartonek, Asa
    et al.
    Lidbeck, Cecilia M.
    Pettersson, Robert
    KTH, School of Engineering Sciences (SCI), Mechanics, Structural Mechanics.
    Weidenhielm, Eva Brostrom
    Eriksson, Marie
    Gutierrez-Farewik, Elena
    KTH, School of Engineering Sciences (SCI), Mechanics, Structural Mechanics.
    Influence of heel lifts during standing in children with motor disorders. 2011. In: Gait & Posture, ISSN 0966-6362, E-ISSN 1879-2219, Vol. 34, no 3, p. 426-431. Article in journal (Refereed)
    Abstract [en]

    Heel wedges may influence standing posture but how and to what extent are unknown. Thirty-two children with motor disorders - 16 with arthrogryposis multiplex congenita (AMC) and 16 with cerebral palsy (CP) - and 19 control children underwent a three-dimensional motion analysis. Unassisted standing during 20 s with shoes only and with heel lifts of 10, 20 and 30 mm heights was recorded in a randomized order. The more weight-bearing limb or the right limb was chosen for analysis. In both the AMC and CP groups, significant changes were seen between various heel lifts in ankle, knee and pelvis, and in the control group in the ankle only. Between orthosis and non-orthosis users significant differences were seen between different heel lift conditions in ankle, knee and trunk in the AMC group and in the ankle in the CP group. Pelvis position changed toward less anterior tilt with increasing heel height, but led to increasing knee flexion in most of the children, except for the AMC Non-Ort group. Children with AMC and CP represent different motor disorders, but the heel wedges had a similar influence on pelvis, hip and knee positions in all children with CP and in the AMC orthosis users. A challenge is to apply heel heights adequate to each individual's orthopaedic and neurologic conditions to improve biomechanical alignment with respect to all body segments.

  • 14.
    Bekkouche, Bo
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). KTH.
    Classification of Neuronal Subtypes in the Striatum and the Effect of Neuronal Heterogeneity on the Activity Dynamics. 2016. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Clustering of single-cell RNA sequencing data is often used to show what states and subtypes cells have. Using this technique, striatal cells were clustered into subtypes using different clustering algorithms. Previously known subtypes were confirmed and new subtypes were found. One of them is a third medium spiny neuron subtype. Using the observed heterogeneity, as a second task, this project asks whether differences between individual neurons have an impact on the network dynamics. Clustering spiking activity from a neural network model gave inconclusive results: both algorithms indicated low heterogeneity. However, by altering the quantity of a subtype between a low and a high number and clustering the network activity in each case, the results indicate an increase in heterogeneity. This project presents a list of potential striatal subtypes and gives reasons to keep paying attention to biologically observed heterogeneity.
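
The thesis abstract above does not state which clustering algorithms were used. As a generic illustration of the kind of pipeline it describes, the sketch below log-transforms synthetic single-cell count data, reduces dimensionality with PCA and clusters with k-means; the synthetic data, gene counts and cluster number are all assumptions, not the thesis' data or methods.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic counts for 300 cells x 2000 genes drawn from three artificial
# "subtypes" that differ in the mean expression of a block of marker genes.
n_cells, n_genes, n_types = 300, 2000, 3
labels_true = rng.integers(0, n_types, n_cells)
counts = rng.poisson(2.0, (n_cells, n_genes)).astype(float)
for t in range(n_types):
    markers = slice(t * 50, (t + 1) * 50)          # 50 marker genes per subtype
    n_t = int(np.sum(labels_true == t))
    counts[labels_true == t, markers] += rng.poisson(8.0, (n_t, 50))

X = np.log1p(counts)                                # variance-stabilising log transform
X_pca = PCA(n_components=20).fit_transform(X)       # reduce noise before clustering
labels_pred = KMeans(n_clusters=n_types, n_init=10, random_state=0).fit_predict(X_pca)

# Contingency table of true vs. recovered subtype labels.
table = np.zeros((n_types, n_types), dtype=int)
for t, p in zip(labels_true, labels_pred):
    table[t, p] += 1
print(table)
```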

  • 15.
    Belic, Jovana
    KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST).
    Untangling Cortico-Striatal Circuitry and its Role in Health and Disease - A computational investigation. 2018. Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    The basal ganglia (BG) play a critical role in a variety of regular motor and cognitive functions. Many brain diseases, such as Parkinson's disease, Huntington's disease and dyskinesia, are directly related to malfunctions of the BG nuclei. One of those nuclei, the input nucleus called the striatum, is heavily connected to the cortex and receives afferents from nearly all cortical areas. The striatum is a recurrent inhibitory network that contains several distinct cell types. About 95% of neurons in the striatum are medium spiny neurons (MSNs) that form the only output from the striatum. Two of the most examined sources of GABAergic inhibition onto MSNs are the feedback inhibition (FB) from the axon collaterals of the MSNs themselves, and the feedforward inhibition (FF) via the small population (1-2% of striatal neurons) of fast spiking interneurons (FSIs). The cortex sends direct projections to the striatum, while the striatum can affect the cortex only indirectly through other BG nuclei and the thalamus. Understanding how different components of the striatal network interact with each other and influence the striatal response to cortical inputs is of crucial importance for clarifying the overall functions and dysfunctions of the BG.

        In this thesis I have employed advanced experimental data analysis techniques as well as computational modelling, to study the complex nature of cortico-striatal interactions. I found that for pathological states, such as Parkinson’s disease and L-DOPA-induced dyskinesia, effective connectivity is bidirectional with an accent on the striatal influence on the cortex. Interestingly, in the case of L-DOPA-induced dyskinesia, there was a high increase in effective connectivity at ~80 Hz and the results also showed a large relative decrease in the modulation of the local field potential amplitude (recorded in the primary motor cortex and sensorimotor striatum in awake, freely behaving, 6-OHDA lesioned hemi-parkinsonian rats) at ~80 Hz by the phase of low frequency oscillations. These results suggest a lack of coupling between the low frequency activity of a presumably larger neuronal population and the synchronized activity of a presumably smaller group of neurons active at 80 Hz.

        Next, I used a spiking neuron network model of the striatum to isolate the mechanisms underlying the transmission of cortical oscillations to the MSN population. I showed that FSIs play a crucial role in efficient propagation of cortical oscillations to the MSNs that did not receive direct cortical oscillations. Further, I have identified multiple factors such as the number of activated neurons, ongoing activity, connectivity, and synchronicity of inputs that influenced the transfer of oscillations by modifying the levels of FB and FF inhibitions. Overall, these findings reveal a new role of FSIs in modulating the transfer of information from the cortex to striatum. By modulating the activity and properties of the FSIs, striatal oscillations can be controlled very efficiently. Finally, I explored the interactions in the striatal network with different oscillation frequencies and showed that the features of those oscillations, such as amplitude and frequency fluctuations, can be influenced by a change in the input intensities into MSNs and FSIs and that these fluctuations are also highly dependent on the selected frequencies in addition to the phase offset between different cortical inputs.

        Lastly, I investigated how the striatum responds to cortical neuronal avalanches. Recordings in the striatum revealed that striatal activity was also characterized by spatiotemporal clusters that followed a power law distribution albeit, with significantly steeper slope. In this study, an abstract computational model was developed to elucidate the influence of intrastriatal inhibition and cortico-striatal interplay as important factors to understand the experimental findings. I showed that one particularly high activation threshold of striatal nodes can reproduce a power law-like distribution with a coefficient similar to the one found experimentally. By changing the ratio of excitation and inhibition in the cortical model, I saw that increased activity in the cortex strongly influenced striatal dynamics, which was reflected in a less negative slope of cluster size distributions in the striatum.  Finally, when inhibition was added to the model, cluster size distributions had a prominently earlier deviation from the power law distribution compared to the case when inhibition was not present. 
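
One of the measurements described above is the modulation of ~80 Hz local field potential amplitude by the phase of low-frequency oscillations. The sketch below computes a standard Tort-style modulation index on a synthetic signal as a generic illustration of that kind of phase-amplitude coupling analysis; the frequency bands, filter settings and test signal are assumptions and need not match the thesis' actual analysis.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def modulation_index(lfp, fs, phase_band=(4, 8), amp_band=(70, 90), n_bins=18):
    """Tort-style modulation index: how strongly the high-frequency amplitude
    envelope is modulated by the low-frequency phase (0 = no coupling)."""
    phase = np.angle(hilbert(bandpass(lfp, *phase_band, fs)))
    amp = np.abs(hilbert(bandpass(lfp, *amp_band, fs)))
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    mean_amp = np.array([amp[(phase >= lo) & (phase < hi)].mean()
                         for lo, hi in zip(edges[:-1], edges[1:])])
    p = mean_amp / mean_amp.sum()
    return (np.log(n_bins) + np.sum(p * np.log(p))) / np.log(n_bins)

# Synthetic LFP: a 6 Hz oscillation whose phase modulates an 80 Hz component.
rng = np.random.default_rng(5)
fs, t = 1000, np.arange(0, 20, 1 / 1000)
theta = np.sin(2 * np.pi * 6 * t)
lfp = theta + (1 + 0.8 * theta) * 0.3 * np.sin(2 * np.pi * 80 * t) \
      + 0.2 * rng.standard_normal(len(t))
print(f"modulation index: {modulation_index(lfp, fs):.3f}")
```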

  • 16.
    Benjaminsson, Simon
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Nexa: A scalable neural simulator with integrated analysis. 2012. In: Network, ISSN 0954-898X, E-ISSN 1361-6536, Vol. 23, no 4, p. 254-271. Article in journal (Refereed)
    Abstract [en]

    Large-scale neural simulations encompass challenges in simulator design, data handling and understanding of simulation output. As the computational power of supercomputers and the size of network models increase, these challenges become even more pronounced. Here we introduce the experimental scalable neural simulator Nexa, for parallel simulation of large-scale neural network models at a high level of biological abstraction and for exploration of the simulation methods involved. It includes firing-rate models and capabilities to build networks using machine learning inspired methods for, e.g., self-organization of network architecture and for structural plasticity. We show scalability up to the size of the largest machines currently available for a number of model scenarios. We further demonstrate simulator integration with online analysis and real-time visualization as scalable solutions for the data handling challenges.

  • 17.
    Berthet, Pierre
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Karolinska Institute, Stockholm, Sweden.
    Lindahl, Mikael
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Karolinska Institute, Stockholm, Sweden.
    Tully, Philip
    Hällgren Kotaleski, Jeanette
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Karolinska Institute, Stockholm, Sweden.
    Lansner, Anders
    Functional Relevance of Different Basal Ganglia Pathways Investigated in a Spiking Model with Reward Dependent Plasticity. Manuscript (preprint) (Other academic)
  • 18.
    Berthet, Pierre
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Stockholm Univ, Sweden; Karolinska Inst, Sweden.
    Lindahl, Mikael
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Karolinska Inst, Sweden.
    Tully, Philip J.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Karolinska Inst, Sweden; Univ Edinburgh, Scotland.
    Hellgren-Kotaleski, Jeanette
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Karolinska Inst, Sweden.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Stockholm Univ, Sweden; Karolinska Inst, Sweden.
    Functional Relevance of Different Basal Ganglia Pathways Investigated in a Spiking Model with Reward Dependent Plasticity. 2016. In: Frontiers in Neural Circuits, ISSN 1662-5110, E-ISSN 1662-5110, Vol. 10, article id 53. Article in journal (Refereed)
    Abstract [en]

    The brain enables animals to behaviorally adapt in order to survive in a complex and dynamic environment, but how reward-oriented behaviors are achieved and computed by its underlying neural circuitry is an open question. To address this concern, we have developed a spiking model of the basal ganglia (BG) that learns to dis-inhibit the action leading to a reward despite ongoing changes in the reward schedule. The architecture of the network features the two pathways commonly described in BG, the direct (denoted D1) and the indirect (denoted D2) pathway, as well as a loop involving striatum and the dopaminergic system. The activity of these dopaminergic neurons conveys the reward prediction error (RPE), which determines the magnitude of synaptic plasticity within the different pathways. All plastic connections implement a versatile four-factor learning rule derived from Bayesian inference that depends upon pre- and post-synaptic activity, receptor type, and dopamine level. Synaptic weight updates occur in the D1 or D2 pathways depending on the sign of the RPE, and an efference copy informs upstream nuclei about the action selected. We demonstrate successful performance of the system in a multiple-choice learning task with a transiently changing reward schedule. We simulate lesioning of the various pathways and show that a condition without the D2 pathway fares worse than one without D1. Additionally, we simulate the degeneration observed in Parkinson's disease (PD) by decreasing the number of dopaminergic neurons during learning. The results suggest that the D1 pathway impairment in PD might have been overlooked. Furthermore, an analysis of the alterations in the synaptic weights shows that using the absolute reward value instead of the RPE leads to a larger change in D1.
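
The published model above is a spiking network with a Bayesian four-factor learning rule; none of that is reproduced here. The sketch below is only a rate-level caricature of one idea stated in the abstract, namely that the sign of the reward prediction error routes plasticity to the D1 ("Go") or D2 ("No-Go") pathway, demonstrated on a toy multiple-choice task with a changing reward schedule. Learning rates, softmax temperature and reward probabilities are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_actions = 3
w_d1 = np.zeros(n_actions)            # "Go" weights (direct pathway)
w_d2 = np.zeros(n_actions)            # "No-Go" weights (indirect pathway)
value = np.zeros(n_actions)           # running reward estimate, used for the RPE
lr, beta = 0.1, 3.0

reward_prob = np.array([0.8, 0.2, 0.2])          # action 0 is best at first

for trial in range(2000):
    if trial == 1000:                            # transiently changing reward schedule
        reward_prob = np.array([0.2, 0.2, 0.8])
    net = w_d1 - w_d2                            # net Go minus No-Go drive
    ex = np.exp(beta * net)
    a = rng.choice(n_actions, p=ex / ex.sum())   # softmax action selection
    r = float(rng.random() < reward_prob[a])
    rpe = r - value[a]                           # reward prediction error
    value[a] += lr * rpe
    if rpe > 0:                                  # sign of the RPE routes plasticity
        w_d1[a] += lr * rpe                      # ... to the D1 pathway
    else:
        w_d2[a] += lr * (-rpe)                   # ... or to the D2 pathway

print("final Go minus No-Go drive per action:", np.round(w_d1 - w_d2, 2))
```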

  • 19.
    Bicanski, Andrej
    et al.
    School of Engineering, École Polytechnique Fédérale de Lausanne.
    Ryczko, Dimitri
    Département de Physiologie, Université de Montréa.
    Knuesel, Jérémie
    School of Engineering, École Polytechnique Fédérale de Lausanne.
    Harischandra, Nalin
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Charrier, Vanessa
    INSERM U862, Neurocentre Magendie, Université Bordeaux.
    Ekeberg, Örjan
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Cabelguen, Jean-Marie
    Neurocentre Magendie, Bordeaux University, Bordeaux Cedex, France.
    Ijspeert, Auke Jan
    School of Engineering, École Polytechnique Fédérale de Lausanne.
    Decoding the mechanisms of gait generation in salamanders by combining neurobiology, modeling and robotics. 2013. In: Biological Cybernetics, ISSN 0340-1200, E-ISSN 1432-0770, Vol. 107, no 5, p. 545-564. Article, review/survey (Refereed)
    Abstract [en]

    Vertebrate animals exhibit impressive locomotor skills. These locomotor skills are due to the complex interactions between the environment, the musculo-skeletal system and the central nervous system, in particular the spinal locomotor circuits. We are interested in decoding these interactions in the salamander, a key animal from an evolutionary point of view. It exhibits both swimming and stepping gaits and is faced with the problem of producing efficient propulsive forces using the same musculo-skeletal system in two environments with significant physical differences in density, viscosity and gravitational load. Yet its nervous system remains comparatively simple. Our approach is based on a combination of neurophysiological experiments, numerical modeling at different levels of abstraction, and robotic validation using an amphibious salamander-like robot. This article reviews the current state of our knowledge on salamander locomotion control, and presents how our approach has allowed us to obtain a first conceptual model of the salamander spinal locomotor networks. The model suggests that the salamander locomotor circuit can be seen as a lamprey-like circuit controlling axial movements of the trunk and tail, extended by specialized oscillatory centers controlling limb movements. The interplay between the two types of circuits determines the mode of locomotion under the influence of sensory feedback and descending drive, with stepping gaits at low drive, and swimming at high drive.

  • 20. Björkman, Eva
    et al.
    Zagal, Juan Cristobal
    Lindeberg, Tony
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Roland, Per E.
    Evaluation of design options for the scale-space primal sketch analysis of brain activation images. 2000. In: HBM'00, published in NeuroImage, Vol. 11, no 5, 2000, p. 656. Conference paper (Refereed)
    Abstract [en]

    A key issue in brain imaging concerns how to detect the functionally activated regions from PET and fMRI images. In earlier work, it has been shown that the scale-space primal sketch provides a useful tool for such analysis [1]. The method includes presmoothing with different filter widths and automatic estimation of the spatial extent of the activated regions (blobs).

    The purpose is to present two modifications of the scale-space primal sketch, as well as a quantitative evaluation which shows that these modifications improve the performance, measured as the separation between blob descriptors extracted from PET images and from noise images. This separation is essential for future work of associating a statistical p-value with the scale-space blob descriptors.
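
The scale-space primal sketch itself (grey-level blob detection and linking across scales) is not reproduced here. The sketch below only illustrates the multi-scale smoothing idea mentioned in the abstract, using scale-normalised Laplacian-of-Gaussian filtering at several widths on a synthetic "activation image"; the scales, threshold and test image are assumptions for illustration, not the paper's method or data.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, maximum_filter

def multiscale_blobs(image, sigmas=(2, 4, 8, 16), threshold=0.1):
    """Scale-normalised Laplacian-of-Gaussian responses at several filter
    widths; local maxima across space and scale are reported as blobs
    (centre_y, centre_x, sigma). Bright blobs give positive responses."""
    stack = np.stack([-(s ** 2) * gaussian_laplace(image, s) for s in sigmas])
    peaks = (stack == maximum_filter(stack, size=3)) & (stack > threshold)
    return [(y, x, sigmas[k]) for k, y, x in zip(*np.nonzero(peaks))]

# Synthetic "activation image": two Gaussian blobs of different spatial
# extent on a noisy background.
rng = np.random.default_rng(3)
yy, xx = np.mgrid[0:128, 0:128]
img = (np.exp(-((yy - 40) ** 2 + (xx - 40) ** 2) / (2 * 5 ** 2))
       + np.exp(-((yy - 90) ** 2 + (xx - 90) ** 2) / (2 * 12 ** 2))
       + 0.05 * rng.standard_normal((128, 128)))
for y, x, s in multiscale_blobs(img):
    print(f"blob near ({y:3d},{x:3d}) detected at scale sigma={s}")
```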

  • 21.
    Blom, Hans
    et al.
    KTH, School of Engineering Sciences (SCI), Applied Physics. KTH, Centres, Science for Life Laboratory, SciLifeLab.
    Bernhem, Kristoffer
    KTH, School of Engineering Sciences (SCI), Applied Physics.
    Brismar, Hjalmar
    KTH, School of Engineering Sciences (SCI), Applied Physics. KTH, Centres, Science for Life Laboratory, SciLifeLab. Karolinska Institutet, Sweden.
    Sodium pump organization in dendritic spines. 2016. In: Neurophotonics, ISSN 2329-423X, Vol. 3, no 4, article id 041803. Article in journal (Refereed)
    Abstract [en]

    Advancement in fluorescence imaging with the invention of several super-resolution microscopy modalities (e.g., PALM/STORM and STED) has opened up the possibility of deciphering molecular distributions on the nanoscale. In our quest to better elucidate postsynaptic protein distribution in dendritic spines, we have applied these nanoscopy methods, where generated results could help improve our understanding of neuronal functions. In particular, we have investigated the principal energy transformer in the brain, i.e., the Na+,K+-ATPase (or sodium pump), an essential protein responsible for maintaining resting membrane potential and a major controller of intracellular ion homeostasis. In these investigations, we have focused on estimates of protein amount, giving assessments of how variations may depend on labeling strategies, sample analysis, and choice of nanoscopic imaging method, concluding that all can be critical factors for quantification. We present a comparison of these results and discuss the influences this may have for homeostatic sodium regulation in neurons and energy consumption.

  • 22.
    Blom, Hans
    et al.
    KTH, School of Engineering Sciences (SCI), Applied Physics, Experimental Biomolecular Physics.
    Ronnlund, Daniel
    KTH, School of Engineering Sciences (SCI), Applied Physics, Experimental Biomolecular Physics.
    Scott, Lena
    Spicarova, Zuzana
    Widengren, Jerker
    KTH, School of Engineering Sciences (SCI), Applied Physics, Experimental Biomolecular Physics.
    Bondar, Alexander
    Aperia, Anita
    Brismar, Hjalmar
    KTH, School of Engineering Sciences (SCI), Applied Physics, Cell Physics.
    Spatial distribution of Na+-K+-ATPase in dendritic spines dissected by nanoscale superresolution STED microscopy. 2011. In: BMC neuroscience (Online), ISSN 1471-2202, E-ISSN 1471-2202, Vol. 12, p. 16. Article in journal (Refereed)
    Abstract [en]

    Background: The Na+,K+-ATPase plays an important role for ion homeostasis in virtually all mammalian cells, including neurons. Despite this, there is as yet little known about the isoform-specific distribution in neurons. Results: With the help of superresolving stimulated emission depletion microscopy, the spatial distribution of Na+,K+-ATPase in dendritic spines of cultured striatum neurons has been dissected. The compartmentalized distribution found provides strong evidence for the confinement of neuronal Na+,K+-ATPase (alpha 3 isoform) in the postsynaptic region of the spine. Conclusions: A compartmentalized distribution may have implications for the generation of local sodium gradients within the spine and for the structural and functional interaction between the sodium pump and other synaptic proteins. Superresolution microscopy has thus opened up a new perspective to elucidate the nature of the physiological function, regulation and signaling role of Na+,K+-ATPase from its topological distribution in dendritic spines.

  • 23.
    Brandi, Maya
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. KTH, School of Computer Science and Communication (CSC), Centres, Centre for High Performance Computing, PDC.
    Brocke, Ekaterina
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Talukdar, Husain A.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Hanke, Michael
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis, NA.
    Bhalla, Upinder S.
    National Centre for Biological Sciences, Bangalore, India.
    Hällgren-Kotaleski, Jeanette
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Djurfeldt, Mikael
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for High Performance Computing, PDC.
    Connecting MOOSE and NeuroRD through MUSIC: towards a communication framework for multi-scale modeling. 2011. In: Twentieth Annual Computational Neuroscience Meeting: CNS*2011, Springer Science+Business Media B.V., 2011. Conference paper (Refereed)
  • 24. Brette, Romain
    et al.
    Rudolph, Michelle
    Carnevale, Ted
    Hines, Michael
    Beeman, David
    Bower, James M.
    Diesmann, Markus
    Morrison, Abigail
    Goodman, Philip H.
    Harris, Frederick C., Jr.
    Zirpe, Milind
    Natschlaeger, Thomas
    Pecevski, Dejan
    Ermentrout, Bard
    Djurfeldt, Mikael
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Rochel, Olivier
    Vieville, Thierry
    Muller, Eilif
    Davison, Andrew P.
    El Boustani, Sami
    Destexhe, Alain
    Simulation of networks of spiking neurons: A review of tools and strategies. 2007. In: Journal of Computational Neuroscience, ISSN 0929-5313, E-ISSN 1573-6873, Vol. 23, no 3, p. 349-398. Article, review/survey (Refereed)
    Abstract [en]

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with an aim to allow the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley type, integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models are implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks.
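
As a concrete companion to the benchmarks discussed in the review above, the sketch below is a minimal clock-driven simulation of a sparsely connected network of current-based leaky integrate-and-fire neurons in plain Python. It is not one of the review's benchmark scripts, and all parameter values (network size, connection probability, synaptic jumps, external drive) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n_exc, n_inh = 800, 200                       # 80/20 excitatory/inhibitory split
n = n_exc + n_inh
dt, t_sim = 1e-4, 0.5                         # 0.1 ms time step, 0.5 s simulated
tau_m, v_rest, v_thresh, v_reset = 20e-3, -70e-3, -50e-3, -60e-3   # s, V
j_exc, j_inh = 0.5e-3, -2.5e-3                # voltage jump per presynaptic spike (V)
drive = 21e-3                                 # constant suprathreshold external drive (V)

conn = rng.random((n, n)) < 0.1               # 10 % connection probability
np.fill_diagonal(conn, False)                 # no self-connections
w = conn * np.where(np.arange(n) < n_exc, j_exc, j_inh)   # column = presynaptic neuron

v = np.full(n, v_rest)
n_spikes = 0
for step in range(int(t_sim / dt)):
    fired = v >= v_thresh
    n_spikes += int(fired.sum())
    v[fired] = v_reset                        # reset (no refractory period, for brevity)
    v += w[:, fired].sum(axis=1)              # delta synapses: instantaneous voltage kicks
    v += dt / tau_m * (v_rest + drive - v)    # leaky integration toward rest + drive
print(f"mean firing rate: {n_spikes / (n * t_sim):.1f} spikes/s")
```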

  • 25.
    Brocke, Ekaterina
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Djurfeldt, Mikael
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for High Performance Computing, PDC.
    Efficient spike communication in the MUSIC multi-simulation framework. 2011. In: Twentieth Annual Computational Neuroscience Meeting: CNS*2011, Springer Science+Business Media B.V., 2011. Conference paper (Refereed)
  • 26.
    Broome, M.
    et al.
    Karolinska Institutet.
    Hokfelt, T.
    Terenius, L.
    Peptide YY (PYY)-immunoreactive neurons in the lower brain stem and spinal cord of rat. 1985. Other (Refereed)
  • 27.
    Brusini, Irene
    et al.
    KTH, School of Engineering Sciences in Chemistry, Biotechnology and Health (CBH), Biomedical Engineering and Health Systems.
    Carneiro, Miguel
    Univ Porto, Ctr Invest Biodiversidade & Recursos Genet CIBIO, InBIO, P-4485661 Vairao, Portugal.;Univ Porto, Dept Biol, Fac Ciencias, P-4169007 Porto, Portugal..
    Wang, Chunliang
    KTH, School of Engineering Sciences in Chemistry, Biotechnology and Health (CBH), Biomedical Engineering and Health Systems.
    Rubin, Carl-Johan
    Uppsala Univ, Sci Life Lab Uppsala, Dept Med Biochem & Microbiol, S-75236 Uppsala, Sweden..
    Ring, Henrik
    Uppsala Univ, Dept Neurosci, S-75236 Uppsala, Sweden..
    Afonso, Sandra
    Univ Porto, Ctr Invest Biodiversidade & Recursos Genet CIBIO, InBIO, P-4485661 Vairao, Portugal..
    Blanco-Aguiar, Jose A.
    Univ Porto, Ctr Invest Biodiversidade & Recursos Genet CIBIO, InBIO, P-4485661 Vairao, Portugal.;CSIC, Inst Invest Recursos Cineget IREC, Ciudad Real 13005, Spain.;UCLM, CSIC, Ciudad Real 13005, Spain..
    Ferrand, Nuno
    Univ Porto, Ctr Invest Biodiversidade & Recursos Genet CIBIO, InBIO, P-4485661 Vairao, Portugal.;Univ Porto, Dept Biol, Fac Ciencias, P-4169007 Porto, Portugal.;Univ Johannesburg, Dept Zool, ZA-2006 Auckland Pk, South Africa..
    Rafati, Nima
    Uppsala Univ, Sci Life Lab Uppsala, Dept Med Biochem & Microbiol, S-75236 Uppsala, Sweden..
    Villafuerte, Rafael
    CSIC, IESA, Cordoba 14004, Spain..
    Smedby, Örjan
    KTH, School of Engineering Sciences in Chemistry, Biotechnology and Health (CBH), Biomedical Engineering and Health Systems.
    Damberg, Peter
    Karolinska Univ Hosp, Karolinska Expt Res & Imaging Ctr, S-17176 Solna, Sweden..
    Hallbook, Finn
    Uppsala Univ, Dept Neurosci, S-75236 Uppsala, Sweden..
    Fredrikson, Mats
    Uppsala Univ, Dept Psychol, S-75236 Uppsala, Sweden.;Karolinska Inst, Dept Clin Neurosci, S-17177 Stockholm, Sweden..
    Andersson, Leif
    Uppsala Univ, Sci Life Lab Uppsala, Dept Med Biochem & Microbiol, S-75236 Uppsala, Sweden.;Texas A&M Univ, Coll Vet Med & Biomed Sci, Dept Vet Integrat Biosci, College Stn, TX 77843 USA.;Swedish Univ Agr Sci, Dept Anim Breeding & Genet, S-75007 Uppsala, Sweden..
    Changes in brain architecture are consistent with altered fear processing in domestic rabbits. 2018. In: Proceedings of the National Academy of Sciences of the United States of America, ISSN 0027-8424, E-ISSN 1091-6490, Vol. 115, no 28, p. 7380-7385. Article in journal (Refereed)
    Abstract [en]

    The most characteristic feature of domestic animals is their change in behavior associated with selection for tameness. Here we show, using high-resolution brain magnetic resonance imaging in wild and domestic rabbits, that domestication reduced amygdala volume and enlarged medial prefrontal cortex volume, supporting that areas driving fear have lost volume while areas modulating negative affect have gained volume during domestication. In contrast to the localized gray matter alterations, white matter anisotropy was reduced in the corona radiata, corpus callosum, and the subcortical white matter. This suggests a compromised white matter structural integrity in projection and association fibers affecting both afferent and efferent neural flow, consistent with reduced neural processing. We propose that compared with their wild ancestors, domestic rabbits are less fearful and have an attenuated flight response because of these changes in brain architecture.

  • 28. Bujan, Alejandro
    et al.
    Aertsen, Ad
    Kumar, Arvind
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. University of Freiburg, Freiburg, Germany .
    Role of Input Correlations in Shaping the Variability and Noise Correlations of Evoked Activity in the Neocortex. 2015. In: Journal of Neuroscience, ISSN 0270-6474, E-ISSN 1529-2401, Vol. 35, no 22, p. 8611-8625. Article in journal (Refereed)
    Abstract [en]

    Recent analysis of evoked activity recorded across different brain regions and tasks revealed a marked decrease in noise correlations and trial-by-trial variability. Given the importance of correlations and variability for information processing within the rate coding paradigm, several mechanisms have been proposed to explain the reduction in these quantities despite an increase in firing rates. These models suggest that anatomical clusters and/or tightly balanced excitation-inhibition can generate intrinsic network dynamics that may exhibit a reduction in noise correlations and trial-by-trial variability when perturbed by an external input. Such mechanisms based on the recurrent feedback crucially ignore the contribution of feedforward input to the statistics of the evoked activity. Therefore, we investigated how statistical properties of the feedforward input shape the statistics of the evoked activity. Specifically, we focused on the effect of input correlation structure on the noise correlations and trial-by-trial variability. We show that the ability of neurons to transfer the input firing rate, correlation, and variability to the output depends on the correlations within the presynaptic pool of a neuron, and that an input with even weak within-correlations can be sufficient to reduce noise correlations and trial-by-trial variability, without requiring any specific recurrent connectivity structure. In general, depending on the ongoing activity state, feedforward input could either increase or decrease noise correlation and trial-by-trial variability. Thus, we propose that evoked activity statistics are jointly determined by the feedforward and feedback inputs.
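
The paper above studies how correlations within the presynaptic pool shape evoked activity. The sketch below generates pools of Poisson spike trains with a prescribed pairwise correlation using the standard multiple-interaction-process (thinning) construction and checks the achieved count correlation; it is a generic illustration of correlated-input generation, not the authors' code, and the rates, window sizes and correlation values are assumptions.

```python
import numpy as np

def correlated_poisson(n_trains, rate, c, t_sim, dt=1e-3, rng=None):
    """Pool of Poisson spike trains (boolean time bins) with pairwise
    spike-count correlation close to c, built by thinning a common
    'mother' train (multiple-interaction-process construction)."""
    rng = rng or np.random.default_rng()
    n_steps = int(t_sim / dt)
    mother = rng.random(n_steps) < rate * dt / c      # mother train at rate / c
    copies = rng.random((n_trains, n_steps)) < c      # each mother spike kept with prob. c
    return mother & copies

rng = np.random.default_rng(5)
for c in (0.05, 0.1, 0.3):
    spikes = correlated_poisson(100, rate=10.0, c=c, t_sim=100.0, rng=rng)
    counts = spikes.reshape(100, -1, 100).sum(axis=2)     # spike counts in 100 ms windows
    corr = np.corrcoef(counts)
    measured = corr[np.triu_indices_from(corr, k=1)].mean()
    # The finite time bins make the measured value slightly below the target.
    print(f"target pairwise correlation {c:.2f} -> measured {measured:.3f}")
```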

  • 29.
    Dahlgren, Anna
    et al.
    KTH, School of Technology and Health (STH), Patient Safety (Patientsäkerhet).
    Van Leeuwen, W.
    Kircher, A.
    Lutzhoft, M.
    Barnett, M.
    Kecklund, G.
    Akerstedt, T.
    Sleep and fatigue in bridge officers working 6 h on and 6 h off - a simulator study. 2012. In: Journal of Sleep Research, ISSN 0962-1105, E-ISSN 1365-2869, Vol. 21, p. 331. Article in journal (Other academic)
  • 30. Dickson, C. T.
    et al.
    Magistretti, J.
    Shalinsky, M. H.
    Fransén, Erik
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Hasselmo, M. E.
    Alonso, A.
    Properties and role of I-h in the pacing of subthreshold oscillations in entorhinal cortex layer II neurons2000In: Journal of Neurophysiology, ISSN 0022-3077, E-ISSN 1522-1598, Vol. 83, no 5, p. 2562-2579Article, review/survey (Refereed)
    Abstract [en]

    Various subsets of brain neurons express a hyperpolarization-activated inward current (I-h) that has been shown to be instrumental in pacing oscillatory activity at both a single-cell and a network level. A characteristic feature of the stellate cells (SCs) of entorhinal cortex (EC) layer II, those neurons giving rise to the main component of the perforant path input to the hippocampal formation, is their ability to generate persistent, Na+-dependent rhythmic subthreshold membrane potential oscillations, which are thought to be instrumental in implementing theta rhythmicity in the entorhinal-hippocampal network. The SCs also display a robust time-dependent inward rectification in the hyperpolarizing direction that may contribute to the generation of these oscillations. We performed whole cell recordings of SCs in in vitro slices to investigate the specific biophysical and pharmacological properties of the current underlying this inward rectification and to clarify its potential role in the genesis of the subthreshold oscillations. In voltage-clamp conditions, hyperpolarizing voltage steps evoked a slow, noninactivating inward current, which also deactivated slowly on depolarization. This current was identified as I-h because it was resistant to extracellular Ba2+, sensitive to Cs+, completely and selectively abolished by ZD7288, and carried by both Na+ and K+ ions. I-h in the SCs had an activation threshold and reversal potential at approximately -45 and -20 mV, respectively. Its half-activation voltage was -77 mV. Importantly, bath perfusion with ZD7288, but not Ba2+ gradually and completely abolished the subthreshold oscillations, thus directly implicating I-h in their generation. Using experimentally derived biophysical parameters for I-h and the low-threshold persistent Na+ current (I-NaP) present in the SCs, a simplified model of these neurons was constructed and their subthreshold electroresponsiveness simulated. This indicated that the interplay between I-NaP and I-h can sustain persistent subthreshold oscillations in SCs. I-NaP and I-h operate in a push-pull fashion where the delay in the activation/deactivation of I-h gives rise to the oscillatory process.
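
    To make the reported voltage dependence concrete, the sketch below evaluates a Boltzmann steady-state activation curve for an h-type current using the half-activation (about -77 mV) and reversal potential (about -20 mV) quoted in the abstract. The slope factor and conductance are assumed placeholders, and this is only an illustration, not the simplified stellate-cell model described in the paper (which couples I-h with a persistent Na+ current).

        # Illustrative sketch (not the authors' model): steady-state activation and
        # current of an h-type conductance, using the half-activation (-77 mV) and
        # reversal potential (-20 mV) quoted in the abstract. Slope factor and
        # maximal conductance are assumed placeholders.
        import numpy as np

        E_H = -20.0      # reversal potential (mV), from the abstract
        V_HALF = -77.0   # half-activation voltage (mV), from the abstract
        SLOPE = 8.0      # slope factor (mV), assumed
        G_MAX = 1.0      # maximal conductance (arbitrary units), assumed

        def h_inf(v_mV):
            """Boltzmann steady-state activation; grows with hyperpolarization."""
            return 1.0 / (1.0 + np.exp((v_mV - V_HALF) / SLOPE))

        def i_h(v_mV):
            """Steady-state h-current (arbitrary units); inward (negative) below E_H."""
            return G_MAX * h_inf(v_mV) * (v_mV - E_H)

        for v in (-90, -77, -60, -45):
            print(f"V = {v:4d} mV   activation = {h_inf(v):.2f}   I_h = {i_h(v):7.2f}")

    Consistent with the activation threshold of about -45 mV reported above, the curve gives only a few percent activation at that voltage and substantial activation below the half-activation voltage.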

  • 31.
    Djurfeldt, Mikael
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for High Performance Computing, PDC.
    The Connection-set Algebra: a formalism for the representation of connectivity structure in neuronal network models, implementations in Python and C++, and their use in simulators2011In: Twentieth Annual Computational Neuroscience Meeting: CNS*2011, 2011Conference paper (Refereed)
  • 32.
    Djurfeldt, Mikael
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for High Performance Computing, PDC.
    The Connection-set Algebra-A Novel Formalism for the Representation of Connectivity Structure in Neuronal Network Models2012In: Neuroinformatics, ISSN 1539-2791, E-ISSN 1559-0089, Vol. 10, no 3, p. 287-304Article in journal (Refereed)
    Abstract [en]

    The connection-set algebra (CSA) is a novel and general formalism for the description of connectivity in neuronal network models, from small-scale to large-scale structure. The algebra provides operators to form more complex sets of connections from simpler ones and also provides parameterization of such sets. CSA is expressive enough to describe a wide range of connection patterns, including multiple types of random and/or geometrically dependent connectivity, and can serve as a concise notation for network structure in scientific writing. CSA implementations allow for scalable and efficient representation of connectivity in parallel neuronal network simulators and could even allow for avoiding explicit representation of connections in computer memory. The expressiveness of CSA makes prototyping of network structure easy. A C++ version of the algebra has been implemented and used in a large-scale neuronal network simulation (Djurfeldt et al., IBM J Res Dev 52(1/2):31-42, 2008b) and an implementation in Python has been publicly released.
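
    The flavour of such an algebra can be conveyed with a toy Python sketch: connection sets as sets of (source, target) index pairs combined with ordinary set operators. This is deliberately not the API of the released csa package (which, among other things, supports infinite index sets and parameterized value sets); all function names here are made up for illustration.

        # Toy illustration of a connection-set-algebra-like idea (NOT the API of the
        # released csa package): connection sets as sets of (source, target) pairs,
        # combined with ordinary set operators.
        import random

        def one_to_one(n):
            """Diagonal connection set."""
            return {(i, i) for i in range(n)}

        def random_set(n_pre, n_post, p, seed=0):
            """Each possible connection is included independently with probability p."""
            rng = random.Random(seed)
            return {(i, j) for i in range(n_pre) for j in range(n_post) if rng.random() < p}

        # Set union, intersection and difference compose simple sets into more complex
        # connectivity, e.g. sparse random connectivity with self-connections removed.
        n = 5
        connections = random_set(n, n, p=0.3) - one_to_one(n)
        print(sorted(connections))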

  • 33.
    Djurfeldt, Mikael
    et al.
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for High Performance Computing, PDC. International Neuroinformatics Coordinating Facility, Stockholm, Sweden.
    Davison, Andrew P.
    Eppler, Jochen M.
    Efficient generation of connectivity in neuronal networks from simulator-independent descriptions2014In: Frontiers in Neuroinformatics, ISSN 1662-5196, E-ISSN 1662-5196, Vol. 8, p. 43-Article in journal (Refereed)
    Abstract [en]

    Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another, according to the modeler's needs. We have used the connection generator interface to connect C++ and Python implementations of the previously described connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modeling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface.
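
    A hypothetical sketch of such an interface boundary is given below: the "library" side exposes an iterator over (source, target) pairs, and the "simulator" side consumes any object implementing that interface. The class and function names are invented for illustration and are not the published interface specification.

        # Hypothetical sketch of a connection-generator-style interface boundary
        # (illustrative names only; not the actual interface specification).
        import random
        from typing import Iterator, Tuple

        class ConnectionGenerator:
            """Interface: anything that can enumerate (source, target) index pairs."""
            def connections(self) -> Iterator[Tuple[int, int]]:
                raise NotImplementedError

        class RandomConnectivity(ConnectionGenerator):
            """'Library' side: generates random pairwise connectivity."""
            def __init__(self, n_pre: int, n_post: int, p: float, seed: int = 0) -> None:
                self._rng = random.Random(seed)
                self._n_pre, self._n_post, self._p = n_pre, n_post, p

            def connections(self) -> Iterator[Tuple[int, int]]:
                for i in range(self._n_pre):
                    for j in range(self._n_post):
                        if self._rng.random() < self._p:
                            yield (i, j)

        def build_network(generator: ConnectionGenerator) -> int:
            """'Simulator' side: consume any generator through the common interface."""
            return sum(1 for _ in generator.connections())

        print(build_network(RandomConnectivity(100, 100, p=0.1)))

    Because the simulator only depends on the abstract interface, one connectivity-generating library can be swapped for another without changes on the simulator side, which is the point made in the abstract.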

  • 34.
    Djurfeldt, Mikael
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Ekeberg, Örjan
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Large-scale modeling - a tool for conquering the complexity of the brain2008In: Frontiers in Neuroinformatics, ISSN 1662-5196, E-ISSN 1662-5196, Vol. 2, p. 1-4Article in journal (Refereed)
    Abstract [en]

    Is there any hope of achieving a thorough understanding of higher functions such as perception, memory, thought and emotion or is the stunning complexity of the brain a barrier which will limit such efforts for the foreseeable future? In this perspective we discuss methods to handle complexity, approaches to model building, and point to detailed large-scale models as a new contribution to the toolbox of the computational neuroscientist. We elucidate some aspects which distinguish large-scale models and some of the technological challenges which they entail.

  • 35.
    Djurfeldt, Mikael
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    1st INCF Workshop on Large-scale Modeling of the Nervous System2007Report (Other academic)
  • 36.
    Djurfeldt, Mikael
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Memory capacity in a model of cortical layers II/III2008Conference paper (Refereed)
  • 37. Donofrio, P.
    et al.
    Dahlgren, Anna
    KTH, School of Technology and Health (STH), Patientsäkerhet.
    Barnett, M.
    Lutzhoft, M.
    Kircher, A.
    Gillberg, M.
    Kecklund, G.
    Akerstedt, T.
    The effects of a 6 h on/6 h off maritime watch system on sleep2012In: Journal of Sleep Research, ISSN 0962-1105, E-ISSN 1365-2869, Vol. 21, p. 268-269Article in journal (Other academic)
  • 38. Du, K.
    et al.
    Wu, Y. -W
    Lindroos, R.
    Liu, Y.
    Rózsa, B.
    Katona, G.
    Ding, J. B.
    Hellgren Kotaleski, Jeanette
    KTH, Centres, Science for Life Laboratory, SciLifeLab. KTH, School of Computer Science and Communication (CSC). Stockholm Brain Institute, Karolinska Institute, 171 77 Solna, Sweden; Department of Neuroscience, Karolinska Institute, 171 77 Solna.
    Cell-type–specific inhibition of the dendritic plateau potential in striatal spiny projection neurons2017In: Proceedings of the National Academy of Sciences of the United States of America, ISSN 0027-8424, E-ISSN 1091-6490, Vol. 114, no 36, p. E7612-E7621Article in journal (Refereed)
    Abstract [en]

    Striatal spiny projection neurons (SPNs) receive convergent excitatory synaptic inputs from the cortex and thalamus. Activation of spatially clustered and temporally synchronized excitatory inputs at the distal dendrites could trigger plateau potentials in SPNs. Such supralinear synaptic integration is crucial for dendritic computation. However, how plateau potentials interact with subsequent excitatory and inhibitory synaptic inputs remains unknown. By combining computational simulation, two-photon imaging, optogenetics, and dual-color uncaging of glutamate and GABA, we demonstrate that plateau potentials can broaden the spatiotemporal window for integrating excitatory inputs and promote spiking. The temporal window of spiking can be delicately controlled by GABAergic inhibition in a cell-type–specific manner. This subtle inhibitory control of plateau potential depends on the location and kinetics of the GABAergic inputs and is achieved by the balance between relief and reestablishment of NMDA receptor Mg2+ block. These findings represent a mechanism for controlling spatiotemporal synaptic integration in SPNs.
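
    The voltage dependence of the NMDA receptor Mg2+ block mentioned above is commonly described with the Jahr and Stevens (1990) expression B(V) = 1 / (1 + ([Mg2+]/3.57 mM) * exp(-0.062 V)), with V in mV. The sketch below evaluates this standard parameterization; the values are the widely used textbook ones and are not necessarily those used in the paper's simulations.

        # Voltage dependence of the NMDA receptor Mg2+ block in the commonly used
        # Jahr & Stevens (1990) parameterization. Parameter values are the standard
        # textbook ones, not taken from the paper's simulations.
        import math

        def mg_block(v_mV: float, mg_mM: float = 1.0) -> float:
            """Fraction of NMDA conductance not blocked by Mg2+ at voltage v_mV."""
            return 1.0 / (1.0 + (mg_mM / 3.57) * math.exp(-0.062 * v_mV))

        for v in (-80, -60, -40, -20, 0):
            print(f"V = {v:4d} mV   unblocked fraction = {mg_block(v):.2f}")

    Depolarization (for example during a plateau potential) relieves the block, while hyperpolarizing GABAergic input pushes the membrane back toward voltages where the block is reestablished, which is the balance the abstract refers to.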

  • 39. Egorov, A. V.
    et al.
    Hamam, B. N.
    Fransén, Erik
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Hasselmo, M. E.
    Alonso, A. A.
    Graded persistent activity in entorhinal cortex neurons2002In: Nature, ISSN 0028-0836, E-ISSN 1476-4687, Vol. 420, no 6912, p. 173-178Article in journal (Refereed)
    Abstract [en]

    Working memory represents the ability of the brain to hold externally or internally driven information for relatively short periods of time(1,2). Persistent neuronal activity is the elementary process underlying working memory but its cellular basis remains unknown. The most widely accepted hypothesis is that persistent activity is based on synaptic reverberations in recurrent circuits. The entorhinal cortex in the parahippocampal region is crucially involved in the acquisition, consolidation and retrieval of long-term memory traces for which working memory operations are essential(2). Here we show that individual neurons from layer V of the entorhinal cortex-which link the hippocampus to extensive cortical regions(3)-respond to consecutive stimuli with graded changes in firing frequency that remain stable after each stimulus presentation. In addition, the sustained levels of firing frequency can be either increased or decreased in an input-specific manner. This firing behaviour displays robustness to distractors; it is linked to cholinergic muscarinic receptor activation, and relies on activity-dependent changes of a Ca2+-sensitive cationic current. Such an intrinsic neuronal ability to generate graded persistent activity constitutes an elementary mechanism for working memory.

  • 40.
    Ekeberg, Örjan
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Fransén, Erik
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Hellgren Kotaleski, Jeanette
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Herman, Pawel
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Kumar, Arvind
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Lindeberg, Tony
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST).
    Computational Brain Science at CST, CSC, KTH2016Other (Other academic)
    Abstract [en]

    Mission and Vision - Computational Brain Science Lab at CST, CSC, KTH

    The scientific mission of the Computational Brain Science Lab at CSC is to be at the forefront of mathematical modelling, quantitative analysis and mechanistic understanding of brain function. We perform research on (i) computational modelling of biological brain function and on (ii) developing theory, algorithms and software for building computer systems that can perform brain-like functions. Our research answers scientific questions and develops methods in these fields. We integrate results from our science-driven brain research into our work on brain-like algorithms and likewise use theoretical results about artificial brain-like functions as hypotheses for biological brain research.

    Our research on biological brain function includes sensory perception (vision, hearing, olfaction, pain), cognition (action selection, memory, learning) and motor control at different levels of biological detail (molecular, cellular, network) and mathematical/functional description. Methods development for investigating biological brain function and its dynamics as well as dysfunction comprises biomechanical simulation engines for locomotion and voice, machine learning methods for analysing functional brain images, craniofacial morphology and neuronal multi-scale simulations. Projects are conducted in close collaborations with Karolinska Institutet and Karolinska Hospital in Sweden as well as other laboratories in Europe, U.S., Japan and India.

    Our research on brain-like computing concerns methods development for perceptual systems that extract information from sensory signals (images, video and audio), analysis of functional brain images and EEG data, learning for autonomous agents, as well as development of computational architectures (both software and hardware) for neural information processing. Our brain-inspired approach to computing also applies more generically to other computer science problems such as pattern recognition, data analysis and intelligent systems. Recent industrial collaborations include analysis of patient brain data with MentisCura and the startup company 13 Lab, which was bought by Facebook.

    Our long term vision is to contribute to (i) deeper understanding of the computational mechanisms underlying biological brain function and (ii) better theories, methods and algorithms for perceptual and intelligent systems that perform artificial brain-like functions by (iii) performing interdisciplinary and cross-fertilizing research on both biological and artificial brain-like functions. 

    On one hand, biological brains provide existence proofs for guiding our research on artificial perceptual and intelligent systems. On the other hand, applying Richard Feynman’s famous statement “What I cannot create I do not understand” to brain science implies that we can only claim to fully understand the computational mechanisms underlying biological brain function if we can build and implement corresponding computational mechanisms on a computerized system that performs similar brain-like functions.

  • 41.
    Eriksson, David
    et al.
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Fransén, Erik
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Zilberter, Y.
    Lansner, Anders
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Effects of short-term synaptic plasticity in a local microcircuit on cell firing2003In: Neurocomputing, ISSN 0925-2312, E-ISSN 1872-8286, Vol. 52-54, p. 7-12Article in journal (Refereed)
    Abstract [en]

    Effects of short-term synaptic plasticity on cell firing properties in a microcircuit formed by a reciprocally connected pyramidal cell and FSN interneuron in layer 2/3 of neocortex were analyzed in a biophysical model. Induction of synaptic depression by backpropagating dendritic action potentials was replicated, as well as the resulting time dependent depression of IPSP amplitudes. Results indicate that the effect of the depression becomes significant above 30 Hz input frequency. The magnitude of the effect depends on the time constant of the dendritic calcium regulating the depression. The frequency range depends on the time constant of the IPSP.
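
    For readers unfamiliar with frequency-dependent depression, the sketch below uses a standard phenomenological model (Tsodyks-Markram style resource depletion) to show how steady-state synaptic amplitude falls off with input frequency. It is only a generic illustration with placeholder parameters, not the calcium-dependent conditioning mechanism modelled in the paper.

        # Standard phenomenological depression model (Tsodyks-Markram style), shown
        # only to illustrate frequency-dependent depression; it is NOT the
        # calcium-dependent mechanism modelled in the paper. U and tau_rec are
        # placeholder values.
        import math

        U = 0.5          # fraction of resources used per spike (assumed)
        TAU_REC = 0.3    # recovery time constant in seconds (assumed)

        def steady_state_amplitude(rate_hz: float) -> float:
            """Relative steady-state PSP amplitude under periodic stimulation."""
            dt = 1.0 / rate_hz
            decay = math.exp(-dt / TAU_REC)
            x_ss = (1.0 - decay) / (1.0 - (1.0 - U) * decay)
            return U * x_ss

        for f in (1, 10, 30, 50, 100):
            print(f"{f:3d} Hz   relative amplitude = {steady_state_amplitude(f):.2f}")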

  • 42. Eriksson, Johan
    et al.
    Vogel, Edward K.
    Lansner, Anders B.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Department of Numerical Analysis and Computer Science, Stockholm University, Sweden.
    Bergstrom, Fredrik
    Nyberg, Lars
    Neurocognitive Architecture of Working Memory2015In: Neuron, ISSN 0896-6273, E-ISSN 1097-4199, Vol. 88, no 1, p. 33-46Article, review/survey (Refereed)
    Abstract [en]

    A crucial role for working memory in temporary information processing and guidance of complex behavior has been recognized for many decades. There is emerging consensus that working-memory maintenance results from the interactions among long-term memory representations and basic processes, including attention, that are instantiated as reentrant loops between frontal and posterior cortical areas, as well as sub-cortical structures. The nature of such interactions can account for capacity limitations, lifespan changes, and restricted transfer after working-memory training. Recent data and models indicate that working memory may also be based on synaptic plasticity and that working memory can operate on non-consciously perceived information.

  • 43.
    Fiebig, Florian
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Edinburgh University, UK.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST). Stockholm University, Sweden.
    A Spiking Working Memory Model Based on Hebbian Short-Term Potentiation2017In: Journal of Neuroscience, ISSN 0270-6474, Vol. 37, no 1, p. 83-96Article in journal (Refereed)
    Abstract [en]

    A dominant theory of working memory (WM), referred to as the persistent activity hypothesis, holds that recurrently connected neural networks, presumably located in the prefrontal cortex, encode and maintain WM memory items through sustained elevated activity. Reexamination of experimental data has shown that prefrontal cortex activity in single units during delay periods is much more variable than predicted by such a theory and associated computational models. Alternative models of WM maintenance based on synaptic plasticity, such as short-term nonassociative (non-Hebbian) synaptic facilitation, have been suggested but cannot account for encoding of novel associations. Here we test the hypothesis that a recently identified fast-expressing form of Hebbian synaptic plasticity (associative short-term potentiation) is a possible mechanism for WM encoding and maintenance. Our simulations using a spiking neural network model of cortex reproduce a range of cognitive memory effects in the classical multi-item WM task of encoding and immediate free recall of word lists. Memory reactivation in the model occurs in discrete oscillatory bursts rather than as sustained activity. We relate dynamic network activity as well as key synaptic characteristics to electrophysiological measurements. Our findings support the hypothesis that fast Hebbian short-term potentiation is a key WM mechanism.

  • 44.
    Fiebig, Florian
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Stockholm Univ, Sweden.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB. Stockholm University, Sweden.
    Memory Consolidation from Seconds to Weeks Through Autonomous Reinstatement Dynamics in a Three-Stage Neural Network Model2015In: ADVANCES IN COGNITIVE NEURODYNAMICS (IV), SPRINGER, 2015, p. 47-53Conference paper (Refereed)
    Abstract [en]

    Long-term memories for facts and events are not created at an instant. Rather, memories stabilize gradually over time and involve various brain regions. The shifting dependence of acquired declarative memories on different brain regions - called systems consolidation - can be tracked in time by lesion experiments and has led to the development of the Complementary Learning Systems framework, which focuses on hippocampal-cortical interaction. Observations of temporally graded retrograde amnesia following hippocampal lesions point to a gradual transfer from hippocampus to cortical long-term memory. Spontaneous reactivations of hippocampal memories, as observed in place cell reactivations during slow-wave sleep, are thought to drive cortical reinstatements and facilitate this process. We propose a functional neural network implementation of these ideas and furthermore suggest an extended three-stage framework that also includes the prefrontal cortex and bridges the temporal chasm between working memory percepts on the scale of seconds and consolidated long-term memory on the scale of weeks or months. We show that our three-stage model can autonomously produce the necessary stochastic reactivation dynamics for successful episodic memory consolidation. The resulting learning system is shown to exhibit classical memory effects seen in experimental studies, such as retrograde and anterograde amnesia after simulated hippocampal lesioning.

  • 45.
    Flisberg, Anders
    et al.
    Sahlgrenska Akademin, Göteborg universitet.
    Kjellmer, Ingemar
    Sahlgrenska Akademin, Göteborg universitet.
    Bågenholm, Ralph
    Sahlgrenska Akademin, Göteborg universitet.
    Lindecrantz, Kaj
    Högskolan i Borås.
    Löfgren, Nils
    Högskolan i Borås.
    Thordstein, Magnus
    Sahlgrenska Akademin, Göteborg universitet.
    EEG and spectral edge frequency: analysis in posthypoxic newborn piglets2010In: Neuro - endocrinology letters, ISSN 0172-780X, Vol. 31, no 2, p. 181-6Article in journal (Refereed)
    Abstract [en]

    OBJECTIVE: To evaluate the frequency content of the electroencephalogram (EEG) during recovery after a severe hypoxic insult in newborn piglets.

    METHODS: EEG was continuously monitored in nine newborn piglets exposed to a severe hypoxic period. Power spectra in five frequency bands were calculated using Fourier transformation. Spectral edge frequency 90 (SEF90) was defined as the frequency below which 90% of the power in the EEG was located. The piglets were divided into two groups; Group 1 represented piglets with some EEG recovery and Group 2 represented piglets without any EEG recovery.

    RESULTS: The recovery of the EEG in Group 1 had the same time course in all frequency bands. SEF90 indicated recovery earlier than total power did, but SEF90 also signaled activity in EEGs that were almost completely suppressed. When SEF90 was calculated during periods of periodic EEG activity in the very early phase of recovery, the values fell within the same range as during the control period.

    CONCLUSION: Spectral analysis of continuous EEG in newborn piglets exposed to very severe hypoxia showed that no specific frequency band of the EEG preceded the others during recovery. The results of the SEF90 measure demonstrate the need for critical analysis of the raw EEG before any reliable estimation of cerebral function can be made.
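
    The spectral edge frequency used in this study is straightforward to compute: estimate the power spectral density and find the frequency below which 90% of the power lies. The sketch below does this with SciPy's Welch estimator on a surrogate signal; the sampling rate, signal and helper name are assumptions, and this is not the authors' analysis code.

        # Minimal sketch (not the authors' analysis code): spectral edge frequency,
        # i.e. the frequency below which a given fraction of total power is located.
        import numpy as np
        from scipy.signal import welch

        def spectral_edge_frequency(x, fs, edge=0.90):
            """Return the frequency below which `edge` of the signal power lies."""
            freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 2 * int(fs)))
            cumulative = np.cumsum(psd) / np.sum(psd)
            return freqs[np.searchsorted(cumulative, edge)]

        # Illustrative surrogate signal: 10 s of noise with a dominant 4 Hz component.
        fs = 256.0                                   # assumed sampling rate (Hz)
        t = np.arange(0, 10, 1 / fs)
        eeg = np.sin(2 * np.pi * 4 * t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)

        print(f"SEF90 = {spectral_edge_frequency(eeg, fs):.1f} Hz")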

  • 46.
    Fransén, Erik
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    A synapse which can switch from inhibitory to excitatory and back2005In: Neurocomputing, ISSN 0925-2312, E-ISSN 1872-8286, Vol. 65, p. 39-45Article in journal (Refereed)
    Abstract [en]

    Co-release of transmitters has recently been observed at synapse terminals and can even be a combination such as glutamate and GABA. A second recent experimental finding is a short-term synaptic plasticity, which depends on postsynaptic depolarization releasing a dendritic transmitter, which affects presynaptic release probability. In this work we are investigating the functional consequences for a synapse if it had both co-release and conditioning depression. If initially the GABA component is larger than the glutamate component, the synapse has an inhibitory net effect. However, if the postsynaptic cell is conditioned, the GABA component will be suppressed yielding an excitatory synapse.
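
    The sign switch described above can be caricatured in a few lines: if conditioning selectively depresses the GABA component of a co-releasing synapse, the net effect flips from inhibitory to excitatory once the remaining GABA contribution drops below the glutamate contribution. The weights and depression factor below are made up for illustration and do not come from the paper's biophysical model.

        # Toy sketch (not the paper's biophysical model): net effect of a synapse that
        # co-releases GABA and glutamate, where postsynaptic conditioning selectively
        # depresses the GABA component. Weights and the depression factor are made up.
        def net_psp(conditioning: float, w_glu: float = 0.4, w_gaba: float = 0.6) -> float:
            """Return the net postsynaptic effect (>0 excitatory, <0 inhibitory).

            `conditioning` in [0, 1] scales down only the GABA component,
            mimicking conditioning-induced depression of that component.
            """
            gaba = w_gaba * (1.0 - conditioning)
            return w_glu - gaba

        for c in (0.0, 0.3, 0.5, 1.0):
            effect = net_psp(c)
            label = "excitatory" if effect > 0 else "inhibitory"
            print(f"conditioning = {c:.1f}   net = {effect:+.2f}   ({label})")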

  • 47.
    Fransén, Erik
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Coexistence of synchronized oscillatory and desynchronized rate activity in cortical networks2003In: Neurocomputing, ISSN 0925-2312, E-ISSN 1872-8286, Vol. 52-54, p. 763-769Article in journal (Refereed)
    Abstract [en]

    The basis of MRI and PET experiments is the finding that neuronal cell firing levels are modulated in a task dependent manner. Results from EEG and MEG experiments on the other hand point to the importance of synchrony, e.g. the peak frequency may depend on the difficulty of the task. In most models only one of these activity modes of firing is desirable or possible to produce. In this work we show how a cortical microcircuit can produce either synchronized or desynchronized firing, and how this solves problems of present day rate and synchronization models.

  • 48.
    Fransén, Erik
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Functional role of Entorhinal cortex in working memory and information processing of the medial temporal lobe2004In: 2004 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2004, p. 621-624Conference paper (Refereed)
    Abstract [en]

    Our learning and memory system has the challenge to work in a world where items to learn are dispersed in space and time. From the information extracted by the perceptual systems, the learning system must select and combine information. Both these operations may require a temporary storage where significance and correlations may be assessed. This work builds on the common hypothesis that hippocampus and subicular, entorhinal and parahippocampal/postrhinal areas are essential for these functions. We bring up two examples of models, one modeling in vivo and in vitro data from entorhinal cortex layer II of delay match-to-sample working memory experiments, and one modeling slice data from layer V showing cellular "integrator-like" intrinsically generated stable graded levels of spiking activity. In both cases we discuss how cationic currents might be involved and relate their kinetics and pharmacology to behavioral and cellular experimental results.

  • 49.
    Fransén, Erik
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Functional role of entorhinal cortex in working memory processing2005In: Neural Networks, ISSN 0893-6080, E-ISSN 1879-2782, Vol. 18, no 9, p. 1141-1149Article in journal (Refereed)
    Abstract [en]

    Our learning and memory system has the challenge to work in a world where items to learn are dispersed in space and time. From the information extracted by the perceptual systems, the learning system must select and combine information. Both these operations may require a temporary storage where significance and correlations could be assessed. This work builds on the common hypothesis that hippocampus and subicular, entorhinal and parahippocampal/postrhinal areas are essential for the above-mentioned functions. We bring up two examples of models: the first one models in vivo and in vitro data from entorhinal cortex layer II of delayed match-to-sample working memory experiments, the second one studies mechanisms of theta rhythmicity in EC. In both cases, we discuss how cationic currents might be involved and relate their kinetics and pharmacology to behavioral and cellular experimental results.

  • 50.
    Fransén, Erik
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Neural response profile design: Reducing epileptogenic activity by modifying neuron responses to synchronized input using novel potassium channels obtained by parameter search optimization2007In: Neurocomputing, ISSN 0925-2312, E-ISSN 1872-8286, Vol. 70, no 10-12, p. 1630-1634Article in journal (Refereed)
    Abstract [en]

    Neurons obtain their dynamical electrical characteristics from a set of ion channels. These properties may not only affect the function of the neuron and the local network it forms part of, but may also eventually affect behavior. We were interested in studying whether epileptogenic activity could be reduced by adding an ion channel. In this work, we used computational search techniques to optimize ion channel properties for the goal of modifying neural response characteristics. Our results show that this type of parameter search is possible and reasonably efficient. Successful searches were generated using the direct method PRAXIS, and by systematic searches in low-dimensional sub-spaces. We also report on unsuccessful searches using a simplex-type method, a gradient-based method, and attempts to reduce goal function evaluation time. Importantly, using this search strategy, our study has shown that it is possible to change a neuron's characteristics selectively with regard to its response to the degree of synchronicity in synaptic input.
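
    The workflow described above, optimizing channel parameters against a goal function that scores neural responses, can be sketched generically with a derivative-free search. The example below uses SciPy's Powell method (a direct-search method in the same family as PRAXIS) on a toy goal function that fits a Boltzmann activation curve to a target; the paper's actual goal function evaluated full neuron simulations.

        # Generic sketch of direct-search optimization of channel parameters
        # (illustrative only; the paper's goal function scored full neuron simulations).
        import numpy as np
        from scipy.optimize import minimize

        v = np.linspace(-100.0, 0.0, 50)                  # test voltages (mV)
        target = 1.0 / (1.0 + np.exp((v + 45.0) / 6.0))   # "desired" activation curve

        def boltzmann(params, v):
            v_half, k = params
            return 1.0 / (1.0 + np.exp((v - v_half) / k))

        def goal(params):
            """Score a candidate channel: squared error against the desired response."""
            return float(np.sum((boltzmann(params, v) - target) ** 2))

        # Powell's method is a derivative-free direct search, related in spirit
        # to the PRAXIS method mentioned in the abstract.
        result = minimize(goal, x0=np.array([-60.0, 10.0]), method="Powell")
        print(result.x, result.fun)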
