6 research outputs found

    Calcium control of triphasic hippocampal STDP

    Bush D, Jin Y. Calcium control of triphasic hippocampal STDP. Journal of Computational Neuroscience. 2012;33(3):495-514.
    Synaptic plasticity is believed to represent the neural correlate of mammalian learning and memory function. It has been demonstrated that changes in synaptic conductance can be induced by approximately synchronous pairings of pre- and post-synaptic action potentials delivered at low frequencies. It has also been established that NMDAr-dependent calcium influx into dendritic spines represents a critical signal for plasticity induction, and can account for this spike-timing dependent plasticity (STDP) as well as experimental data obtained using other stimulation protocols. However, subsequent empirical studies have delineated a more complex relationship between spike timing, firing rate, stimulus duration and post-synaptic bursting in dictating changes in the conductance of hippocampal excitatory synapses. Here, we present a detailed biophysical model of single dendritic spines on a CA1 pyramidal neuron, describe the NMDAr-dependent calcium influx generated by different stimulation protocols, and construct a parsimonious model of calcium-driven kinase and phosphatase dynamics that dictate the probability of stochastic transitions between binary synaptic weight states in a Markov model. We subsequently demonstrate that this approach can account for a range of empirical observations regarding the dynamics of synaptic plasticity induced by different stimulation protocols, under regimes of pharmacological blockade and metaplasticity. Finally, we highlight the strengths and weaknesses of this parsimonious, unified computational synaptic plasticity model, discuss differences between the properties of cortical and hippocampal plasticity highlighted by the experimental literature, and consider the manner in which further empirical and theoretical research might elucidate the cellular basis of mammalian learning and memory function.
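
    The core mechanism can be caricatured in a few lines. The sketch below is a hypothetical reading of the approach, not code from the paper: a pairing protocol is summarised by its peak calcium level, Hill-type kinase and phosphatase sensors convert that level into per-event LTP/LTD probabilities, and each synapse performs a random walk between two binary weight states. All parameter values (Hill coefficients, half-activation points, event counts) are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        def hill(ca, k_half, n):
            """Hill-type activation of a calcium-sensing enzyme."""
            return ca**n / (ca**n + k_half**n)

        def transition_probs(ca_peak):
            """Map the peak calcium of a protocol to per-event (P_LTP, P_LTD)."""
            kinase = hill(ca_peak, k_half=4.0, n=8)       # high-threshold sensor
            phosphatase = hill(ca_peak, k_half=2.5, n=6)  # lower-threshold sensor
            p_ltp = 0.5 * kinase
            p_ltd = 0.5 * phosphatase * (1.0 - kinase)    # kinase wins at high calcium
            return p_ltp, p_ltd

        def run_protocol(ca_peak, w0, n_pairings=60):
            """Binary weight (0 = weak, 1 = strong) after repeated pairing events."""
            w = w0
            p_ltp, p_ltd = transition_probs(ca_peak)
            for _ in range(n_pairings):
                if w == 0 and rng.random() < p_ltp:
                    w = 1
                elif w == 1 and rng.random() < p_ltd:
                    w = 0
            return w

        # Population average (half weak, half strong synapses) for three calcium
        # regimes: low calcium -> little change, intermediate -> LTD, high -> LTP.
        for ca in (1.0, 2.5, 6.0):
            finals = [run_protocol(ca, w0) for w0 in (0, 1) for _ in range(500)]
            print(f"peak Ca = {ca:.1f} (a.u.) -> mean final weight = {np.mean(finals):.2f}")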

    Dendritic spine geometry and spine apparatus organization govern the spatiotemporal dynamics of calcium.

    Dendritic spines are small subcompartments that protrude from the dendrites of neurons and are important for signaling activity and synaptic communication. These subcompartments have been characterized as having different shapes, and while it is known that these shapes are associated with spine function, the specific nature of these shape-function relationships is not well understood. In this work, we systematically investigated how the shape and size of both the spine head and the spine apparatus, a specialized endoplasmic reticulum compartment within the spine head, modulate rapid calcium dynamics, using mathematical modeling. We developed a spatial multicompartment reaction-diffusion model of calcium dynamics in three dimensions with various flux sources, including N-methyl-D-aspartate receptors (NMDARs), voltage-sensitive calcium channels (VSCCs), and different ion pumps on the plasma membrane. Using this model, we make several important predictions. First, the volume-to-surface-area ratio of the spine regulates calcium dynamics. Second, membrane fluxes impact calcium dynamics temporally and spatially in a nonlinear fashion. Finally, the spine apparatus can act as a physical buffer for calcium by acting as a sink and rescaling the calcium concentration. These predictions set the stage for future experimental investigations of calcium dynamics in dendritic spines.
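
    As a rough intuition for the first prediction, the spatial model can be collapsed to a well-mixed compartment in which fluxes enter across the membrane (proportional to surface area A) but are diluted over the spine volume V, so the transient amplitude scales with A/V. The sketch below is such a reduction under hypothetical parameters (flux density, clearance rate, spine geometries); the actual study solves a 3-D reaction-diffusion system with explicit NMDAR, VSCC, pump, and spine-apparatus fluxes.

        import numpy as np

        def peak_calcium(v_over_a, t_end=0.05, dt=1e-5):
            """Peak calcium above rest (uM) for a given V/A ratio (um)."""
            j_in = 50.0      # membrane flux density scale (uM * um / s)
            tau_open = 2e-3  # receptor/channel open-time constant (s)
            k_clear = 400.0  # volume-based clearance: neck diffusion, stores (1/s)
            ca, peak = 0.0, 0.0
            for step in range(int(t_end / dt)):
                t = step * dt
                influx = (j_in / v_over_a) * np.exp(-t / tau_open)  # area flux / volume
                ca += dt * (influx - k_clear * ca)
                peak = max(peak, ca)
            return peak

        for voa in (0.05, 0.1, 0.2):  # smaller spines have a smaller V/A (um)
            print(f"V/A = {voa:.2f} um -> peak [Ca] above rest = {peak_calcium(voa):.2f} uM")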

    Supervised Learning in Spiking Neural Networks for Precise Temporal Encoding

    Precise spike timing as a means to encode information in neural networks is biologically supported, and is advantageous over frequency-based codes in that it processes input features on a much shorter time-scale. For these reasons, much recent attention has been focused on the development of supervised learning rules for spiking neural networks that utilise a temporal coding scheme. However, despite significant progress in this area, there is still a lack of rules that have a theoretical basis and yet can be considered biologically relevant. Here we examine the general conditions under which synaptic plasticity most effectively takes place to support the supervised learning of a precise temporal code. As part of our analysis we examine two spike-based learning methods: one of which relies on an instantaneous error signal to modify synaptic weights in a network (INST rule), and the other on a filtered error signal for smoother synaptic weight modifications (FILT rule). We test the accuracy of the solutions provided by each rule with respect to their temporal encoding precision, and then measure the maximum number of input patterns they can learn to memorise using the precise timings of individual spikes, as an indication of their storage capacity. Our results demonstrate the high performance of FILT in most cases, underpinned by the rule's error-filtering mechanism, which is predicted to provide smooth convergence towards a desired solution during learning. We also find FILT to be the most efficient at input pattern memorisation, most noticeably when patterns are identified using spikes with sub-millisecond temporal precision. In comparison with existing work, we determine the performance of FILT to be consistent with that of the highly efficient E-learning Chronotron, but with the distinct advantage that FILT is also implementable as an online method for increased biological realism.
    Comment: 26 pages, 10 figures; this version is published in PLoS ONE and incorporates reviewer comments.
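
    The essential difference between the two rules can be written compactly: both correlate an output error with each input's postsynaptic-potential trace, but INST uses the raw difference between target and actual spike indicators, while FILT first passes both spike trains through an exponential low-pass filter. The sketch below is a schematic rendering under assumed time constants and a toy spike pattern, not the authors' reference implementation.

        import numpy as np

        dt = 1e-4            # simulation step (s)
        tau_psp = 5e-3       # PSP trace time constant (s)
        tau_filt = 10e-3     # error-filter time constant, FILT only (s)
        eta = 1e-3           # learning rate

        def weight_updates(input_spikes, target_spikes, output_spikes):
            """Total INST and FILT updates for one afferent over one trial.

            All arguments are 0/1 arrays of shape (n_steps,)."""
            psp = 0.0                 # exponentially filtered input trace
            filt_t = 0.0              # low-pass filtered target train (FILT)
            filt_o = 0.0              # low-pass filtered output train (FILT)
            dw_inst = dw_filt = 0.0
            for k in range(len(input_spikes)):
                psp += -dt / tau_psp * psp + input_spikes[k]
                # INST: raw instantaneous error between spike indicators
                dw_inst += eta * (target_spikes[k] - output_spikes[k]) * psp
                # FILT: error between exponentially filtered spike trains; the
                # dt/tau_filt factor makes one unmatched spike contribute
                # roughly the same total drive as in INST
                filt_t += -dt / tau_filt * filt_t + target_spikes[k]
                filt_o += -dt / tau_filt * filt_o + output_spikes[k]
                dw_filt += eta * (filt_t - filt_o) * psp * (dt / tau_filt)
            return dw_inst, dw_filt

        # Toy trial: input spike at 15 ms, target output at 20 ms, actual at 25 ms.
        n_steps = 500
        x = np.zeros(n_steps); x[150] = 1
        y_target = np.zeros(n_steps); y_target[200] = 1
        y_actual = np.zeros(n_steps); y_actual[250] = 1
        dw_i, dw_f = weight_updates(x, y_target, y_actual)
        print(f"INST update: {dw_i:+.2e}, FILT update: {dw_f:+.2e}")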

    Modulation of Spike-Timing Dependent Plasticity: Towards the Inclusion of a Third Factor in Computational Models

    In spike-timing dependent plasticity (STDP), the change in synaptic strength depends on the timing of pre- vs. postsynaptic spiking activity. Since STDP is in compliance with Hebb’s postulate, it is considered one of the major mechanisms of memory storage and recall. STDP comprises a system of two coincidence detectors, with N-methyl-D-aspartate receptor (NMDAR) activation often posited as one of the main components. Numerous studies have unveiled a third component of this coincidence detection system, namely neuromodulation and glia activity shaping STDP. Even though dopaminergic control of STDP has most often been reported, acetylcholine, noradrenaline, nitric oxide (NO), brain-derived neurotrophic factor (BDNF) and gamma-aminobutyric acid (GABA) have also been shown to effectively modulate STDP. Furthermore, it has been demonstrated that astrocytes, via the release or uptake of glutamate, gate STDP expression. At the most fundamental level, the timing properties of STDP are expected to depend on the spatiotemporal dynamics of the underlying signaling pathways; however, in most cases, technical limitations mean that experiments grant only indirect access to these pathways. Computational models carefully constrained by experiments allow for a better qualitative understanding of the molecular basis of STDP and its regulation by neuromodulators. Recently, computational models of calcium dynamics and signaling pathway molecules have started to explore the emergence of STDP in ex vivo- and in vivo-like conditions. These models are expected to better reproduce at least part of the complex modulation of STDP as an emergent property of the underlying molecular pathways. Elucidating the mechanisms underlying STDP modulation and its consequences for network dynamics is of critical importance, and will allow a better understanding of the major mechanisms of memory storage and recall, both in health and disease.
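
    A common computational abstraction of this third factor, sketched below, is a three-factor rule in which pair-based STDP does not change the weight directly but charges a decaying eligibility trace, which is converted into a weight change only when a neuromodulatory signal (here labelled dopamine) arrives. Time constants, amplitudes, and the gating scheme are illustrative rather than drawn from any one of the models reviewed.

        import numpy as np

        dt = 1e-3                          # simulation step (s)
        tau_pre = tau_post = 20e-3         # STDP trace time constants (s)
        tau_elig = 0.5                     # eligibility trace time constant (s)
        a_plus, a_minus = 0.01, 0.012      # pair-based STDP amplitudes

        def run(pre, post, modulator, w0=0.5):
            """pre, post: 0/1 spike arrays; modulator: neuromodulator pulses."""
            x_pre = x_post = elig = 0.0
            w = w0
            for k in range(len(pre)):
                x_pre += -dt / tau_pre * x_pre + pre[k]
                x_post += -dt / tau_post * x_post + post[k]
                # factors 1 and 2: pre/post coincidences charge a fading
                # eligibility trace instead of changing the weight directly
                elig += -dt / tau_elig * elig
                elig += a_plus * x_pre * post[k] - a_minus * x_post * pre[k]
                # factor 3: the neuromodulator converts eligibility into change
                w += modulator[k] * elig
            return w

        # Pre spike 10 ms before post (LTP-eligible pairing); a dopamine-like
        # pulse arrives 200 ms later and consolidates the pending change.
        n = 1000
        pre = np.zeros(n); pre[100] = 1
        post = np.zeros(n); post[110] = 1
        da = np.zeros(n); da[310] = 1
        print(f"weight: {0.5:.3f} -> {run(pre, post, da):.3f}")
        print(f"without modulator: {run(pre, post, np.zeros(n)):.3f}")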

    Energy Efficient Spiking Neuromorphic Architectures for Pattern Recognition

    There is a growing concern over the reliability, power consumption, and performance of traditional Von Neumann machines, especially when dealing with complex tasks like pattern recognition. In contrast, the human brain can address such problems with great ease. Brain-inspired neuromorphic computing has attracted much research interest, as it provides an appealing architectural solution to difficult tasks due to its energy efficiency, built-in parallelism, and potential scalability. Meanwhile, the inherent error resilience of neuro-computing allows promising opportunities for leveraging approximate computing for additional energy and silicon area benefits. This thesis focuses on energy-efficient neuromorphic architectures which exploit parallel processing and approximate computing for pattern recognition. Firstly, two parallel spiking neural architectures are presented. The first architecture is based on a spiking neural network with global inhibition (SNNGI), which integrates digital leaky integrate-and-fire spiking neurons to mimic their biological counterparts, along with the corresponding on-chip learning circuits for implementing spike-timing dependent plasticity rules. In order to achieve efficient parallelization, this work addresses a number of critical issues pertaining to memory organization, parallel processing, hardware reuse for different operating modes, as well as the tradeoffs between throughput, area, and power overheads for different configurations. For the application of handwritten digit recognition, a promising training speedup of 13.5x and a recognition speedup of 25.8x over the serial SNNGI architecture are achieved. Despite operating at only 120 MHz, the 32-way parallel hardware design demonstrates a 59.4x training speedup over a 2.2 GHz general-purpose CPU. Besides the SNNGI, we also propose another architecture based on the liquid state machine (LSM), a recurrent spiking neural network. The LSM architecture is fully parallelized and consists of randomly connected digital neurons in a reservoir and a readout stage, the latter of which is tuned by a bio-inspired learning rule. When evaluated using the TI46 speech benchmark, the FPGA LSM system demonstrates a runtime speedup of 88x over a 2.3 GHz AMD CPU. In addition, approximate computing contributes significantly to the overall energy reduction of the proposed architectures. In particular, addition computations occupy a considerable portion of power and area in the neuromorphic systems, especially in the LSM. By exploiting the built-in resilience of neuro-computing, we propose a real-time reconfigurable approximate adder for FPGA implementation to reduce energy consumption substantially. Although many mature approximate adders exist, these designs lose their advantages in terms of area, power, and delay on the FPGA platform; a novel approximate adder dedicated to the FPGA is therefore necessary. The proposed adder is based on a carry-skip model which reduces carry propagation delay and power, and the resulting errors are controlled by a proposed error analysis method. Also, a real-time adjustable precision mechanism is integrated to further reduce dynamic power consumption. Implemented on a Virtex-6 FPGA, the proposed adder is shown to consume 18.7% and 32.6% less power than the built-in Xilinx adder in its two precision modes, respectively, and in both modes it is 1.32x faster and requires fewer FPGA resources.
    Besides the adders, firing-activity-based power gating for silent neurons and Booth approximate multipliers are also introduced. These three proposed schemes have been applied to our neuromorphic systems. The approximate errors incurred by these schemes have been shown to be negligible, while energy reductions of up to 20% and 30.1% over exact training computation are achieved for the SNNGI and LSM systems, respectively.
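
    As a behavioural illustration of the carry-skip idea, the sketch below splits an addition into short segments, computes each segment exactly, and speculates each segment's carry-in from the neighbouring input bits instead of waiting for the full carry chain; long propagate chains are where the (rare, small) errors come from. The segment width, speculation rule, and error statistics here are illustrative, not the thesis' actual FPGA circuit.

        import random

        def approx_add(a, b, width=16, seg=4):
            """Unsigned add with exact ripple inside each seg-bit segment, but
            the carry into each segment is speculated from the neighbouring
            input bits instead of being propagated through the full chain."""
            mask = (1 << seg) - 1
            result = 0
            for lo in range(0, width, seg):
                sa = (a >> lo) & mask
                sb = (b >> lo) & mask
                if lo == 0:
                    cin = 0
                else:
                    # speculate: carry-in is 1 only if the previous bit pair
                    # generates a carry; long propagate chains are ignored,
                    # which is exactly where the approximation errors arise
                    cin = ((a >> (lo - 1)) & 1) & ((b >> (lo - 1)) & 1)
                result |= ((sa + sb + cin) & mask) << lo
            return result

        # Quick empirical error check against exact 16-bit addition.
        random.seed(0)
        pairs = [(random.getrandbits(16), random.getrandbits(16)) for _ in range(10000)]
        errs = [abs(approx_add(a, b) - ((a + b) & 0xFFFF)) for a, b in pairs]
        print(f"mean abs error: {sum(errs) / len(errs):.1f}, max: {max(errs)}")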

    Modeling Brain-State-Dependent Memory Consolidation

    Our brains enable us to perform complex actions and respond quickly to the external world, thanks to transitions between different brain states that reflect the activity of interconnected neuronal populations. An intriguing example is the ever-present switch in brain activity that occurs when transitioning between periods of active and quiet waking. It involves transitions from small-amplitude, high-frequency brain oscillations to large-amplitude, low-frequency oscillations, accompanied by switches in neuronal activity from tonic firing to bursting. The switch between these firing modes is regulated by neuromodulators and the intrinsic properties of neurons. Simultaneously, our brains have the ability to learn and form memories through persistent changes in the strength of the connections between neurons. This process is known as synaptic plasticity, whereby neurons strengthen or weaken connections based on their respective firing activity. While it is commonly believed that putting in more effort and time leads to better performance when memorizing new information, this thesis explores the hypothesis that taking occasional breaks and allowing the brain to rest during quiet waking periods may actually be beneficial. Using a computational approach, the thesis investigates how the transition in brain state from active to quiet waking, marked by the neuronal switch from tonic firing to bursting, interacts with synaptic plasticity to shape memory consolidation. To investigate this research question, we constructed neurons and circuits with the ability to switch between tonic firing and bursting using a conductance-based approach. In our first contribution, we focused on identifying the key neuronal property that enables robust switches, even in the presence of neuron and circuit heterogeneity. Through computational experiments and phase plane analysis, we demonstrated the significance of a distinct timescale separation between sodium and T-type calcium channel activation by comparing various models from the existing literature. The second contribution provides a taxonomy of synaptic plasticity rules, investigating their compatibility with switches in neuronal activity, small neuronal variabilities, and neuromodulators. The third contribution reveals the evolution of synaptic weights during the transition from tonic firing in active waking to bursting in quiet waking. Combining bursting neurons with traditional soft-bound synaptic plasticity rules leads to a homeostatic reset, where synaptic weights converge to a fixed point regardless of the weights acquired during tonic firing. Strong weights depress, while weak weights potentiate, until reaching a set point. This homeostatic mechanism is robust to neuron and circuit heterogeneity and to the choice of synaptic plasticity rule. The reset is further exploited by neuromodulator-induced changes in synaptic rules, potentially supporting the synaptic tagging and capture hypothesis, where strong weights are tagged and converge to a high reset value during bursting. While the burst-induced reset may cause forgetting of previous learning, it also restores synaptic weights and facilitates the formation of new memories. To exploit this homeostatic property, an innovative burst-dependent structural plasticity rule is developed to encode previous learning through long-lasting morphological changes.
    The proposed mechanism explains the late stage of long-term potentiation, complementing traditional synaptic plasticity rules that govern its early stage. Switches to bursting enable neurons to consolidate synapses by creating new proteins and promoting synapse growth, while simultaneously restoring the efficacy of postsynaptic receptors for new learning. The novel plasticity rule is validated by comparing it with traditional synaptic rules in various memory tasks. The results demonstrate that switches from tonic firing to bursting, together with the novel structural plasticity rule, enhance learning and memory consolidation. In conclusion, this thesis uses computational models of biophysical neurons to provide evidence that the switch from tonic firing to bursting, reflecting the shift from active to quiet waking, plays a crucial role in enhancing memory consolidation through structural plasticity. In essence, it offers computational support for the significance of taking breaks and allowing our brains to rest in order to solidify our memories. These findings serve as motivation for collaborative experiments between computational and experimental neuroscience, fostering a deeper understanding of the biological mechanisms underlying brain-state-dependent memory consolidation. Furthermore, these insights have the potential to inspire advancements in machine learning algorithms by incorporating principles of neuronal activity switches.
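
    The homeostatic reset itself is easy to reproduce with a toy soft-bound rule, as sketched below: when burst-driven pairing events arrive densely, multiplicative potentiation and depression balance at a single fixed point, so weights acquired during tonic firing all converge to the same set point. Rates, amplitudes, and the LTP/LTD event mix are illustrative, not taken from the thesis.

        import numpy as np

        a_plus, a_minus = 0.02, 0.03   # soft-bound LTP/LTD amplitudes
        w_min, w_max = 0.0, 1.0

        def burst_epoch(w, n_events=2000, p_ltp=0.5, seed=1):
            """Dense burst-driven pairing events with soft-bound updates."""
            rng = np.random.default_rng(seed)
            for _ in range(n_events):
                if rng.random() < p_ltp:
                    w += a_plus * (w_max - w)    # potentiation shrinks near w_max
                else:
                    w -= a_minus * (w - w_min)   # depression shrinks near w_min
            return w

        # Weights acquired during tonic firing, from weak to strong, all
        # converge to the same set point during the bursting epoch.
        for w_tonic in (0.1, 0.4, 0.7, 0.95):
            print(f"w_tonic = {w_tonic:.2f} -> after bursting: {burst_epoch(w_tonic):.3f}")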