
    Calmodulin as a major calcium buffer shaping vesicular release and short-term synaptic plasticity: facilitation through buffer dislocation

    Action potential-dependent release of synaptic vesicles and short-term synaptic plasticity are dynamically regulated by the endogenous Ca2+ buffers that shape [Ca2+] profiles within a presynaptic bouton. Calmodulin is one of the most abundant presynaptic proteins, and it binds Ca2+ faster than any other characterized endogenous neuronal Ca2+ buffer. Direct effects of calmodulin on fast presynaptic Ca2+ dynamics and vesicular release, however, have not been studied in detail. Using experimentally constrained three-dimensional diffusion modeling of Ca2+ influx–exocytosis coupling at small excitatory synapses, we show that, at physiologically relevant concentrations, Ca2+ buffering by calmodulin plays a dominant role in inhibiting vesicular release and in modulating short-term synaptic plasticity. We also propose a novel and potentially powerful mechanism for short-term facilitation based on Ca2+-dependent dynamic dislocation of calmodulin molecules from the plasma membrane within the active zone.
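    As a rough illustration of the buffering argument, the sketch below integrates a single well-mixed Ca2+/buffer binding reaction during an action-potential-like influx; all rate constants, concentrations, and the influx amplitude are illustrative assumptions, and the sketch deliberately omits the paper's three-dimensional diffusion geometry.

    ```python
    # Well-mixed Ca2+ buffering sketch: d[CaB]/dt = kon*[Ca]*[B_free] - koff*[CaB].
    # All rates and concentrations are illustrative assumptions, not fitted values.
    kon, koff = 5e8, 1e3          # binding/unbinding rates: 1/(M*s), 1/s
    B_total = 50e-6               # total buffer concentration, M
    ca, cab = 50e-9, 0.0          # free Ca2+ and Ca2+-bound buffer, M
    influx_rate = 2e-2            # AP-like influx: 20 uM delivered over 1 ms, M/s

    dt, T = 1e-6, 5e-3            # 1 us steps over a 5 ms window
    trace = []
    for i in range(int(T / dt)):
        j_in = influx_rate if i * dt < 1e-3 else 0.0
        bind = kon * ca * (B_total - cab) - koff * cab
        ca += (j_in - bind) * dt
        cab += bind * dt
        trace.append(ca)

    print(f"peak free [Ca2+]: {max(trace) * 1e6:.2f} uM (vs ~20 uM unbuffered)")
    ```

    A buffer with calmodulin-like fast binding intercepts most of the influx before it reaches the release sensor, which is the inhibitory effect on vesicular release the abstract describes.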

    Memory and information processing in neuromorphic systems

    A striking difference between brain-inspired neuromorphic processors and current von Neumann processor architectures is the way in which memory and processing are organized. As Information and Communication Technologies continue to address the need for increased computational power through the increase of cores within a digital processor, neuromorphic engineers and scientists can complement this approach by building processor architectures where memory is distributed with the processing. In this paper we present a survey of brain-inspired processor architectures that support models of cortical networks and deep neural networks. These architectures range from serial, clocked implementations of multi-neuron systems to massively parallel asynchronous ones, and from purely digital systems to mixed analog/digital systems that implement more biologically realistic models of neurons and synapses, together with a suite of adaptation and learning mechanisms analogous to the ones found in biological nervous systems. We describe the advantages of the different approaches being pursued and present the challenges that need to be addressed for building artificial neural processing systems that can display the richness of behaviors seen in biological systems.
    Comment: Submitted to Proceedings of the IEEE; review of recently proposed neuromorphic computing platforms and systems.
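    To make the distributed-memory idea concrete, here is a toy sketch of the general principle only (not of any specific chip surveyed in the paper): each core holds its own neuron state and synaptic weights locally and exchanges nothing but spike events with other cores.

    ```python
    import numpy as np

    # Toy model of co-located memory and processing: every core owns its
    # neuron state and weight memory; only spike events cross core boundaries.
    class Core:
        def __init__(self, n_neurons, n_inputs, rng):
            self.v = np.zeros(n_neurons)                           # local membrane state
            self.w = rng.normal(0.0, 0.5, (n_inputs, n_neurons))   # local weight memory
        def step(self, in_events, leak=0.9, thresh=1.0):
            self.v *= leak                                         # passive decay
            for pre in in_events:                                  # integrate routed spikes
                self.v += self.w[pre]
            out = np.flatnonzero(self.v >= thresh)
            self.v[out] = 0.0                                      # reset fired neurons
            return out                                             # only events leave the core

    rng = np.random.default_rng(0)
    cores = [Core(32, 32, rng) for _ in range(2)]
    events = [0, 3, 7]                                             # external input spikes
    for t in range(10):
        events = list(cores[t % 2].step(events))                   # ping-pong event routing
    print("spikes at final step:", events)
    ```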

    Filamentary Switching: Synaptic Plasticity through Device Volatility

    Replicating the computational functionalities and performance of the brain remains one of the biggest challenges for the future of information and communication technologies. Such an ambitious goal requires research efforts from the architecture level down to the basic device level (i.e., investigating the opportunities offered by emerging nanotechnologies to build such systems). Nanodevices, or, more precisely, memory or memristive devices, have been proposed for the implementation of synaptic functions, offering the required features and integration in a single component. In this paper, we demonstrate that the basic physics involved in the filamentary switching of electrochemical metallization cells can reproduce important biological synaptic functions that are key mechanisms for information processing and storage. The transition from short- to long-term plasticity has been reported as a direct consequence of filament growth (i.e., increased conductance) in filamentary memory devices. Here, we show that a more complex filament shape, such as dendritic paths of variable density and width, allows the short- and long-term processes to be controlled independently. Our solid-state device is strongly analogous to biological synapses, as indicated by interpreting the results within the framework of a phenomenological model developed for biological synapses. We describe a single memristive element with a rich set of features that will benefit future neuromorphic hardware systems.
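    The short-/long-term distinction can be captured in a few lines with a hedged phenomenological sketch (illustrative decay constants and increments, not the paper's fitted model): a volatile conductance component decays quickly after each pulse, while a small consolidated component accumulates, so only sustained stimulation leaves a lasting trace.

    ```python
    # Two-component conductance sketch of volatility-driven plasticity.
    # Time constants and increments are assumptions chosen for illustration.
    tau_short, tau_long = 20e-3, 5.0      # decay constants, s
    dg_short, dg_long = 0.3, 0.02         # per-pulse conductance increments

    def run(pulse_times, dt=1e-3, T=2.0):
        g_s, g_l, trace = 0.0, 0.0, []
        pulses = {round(t / dt) for t in pulse_times}
        for i in range(int(T / dt)):
            if i in pulses:
                g_s += dg_short           # volatile part: short-term plasticity
                g_l += dg_long            # consolidated part: filament thickening
            g_s -= (g_s / tau_short) * dt
            g_l -= (g_l / tau_long) * dt
            trace.append(g_s + g_l)
        return trace

    sparse = run([0.1, 0.6, 1.1])                      # low-frequency stimulation
    dense = run([0.1 + 0.02 * k for k in range(25)])   # high-frequency burst
    print(f"conductance left at 2 s: sparse={sparse[-1]:.3f}, dense={dense[-1]:.3f}")
    ```

    Only the high-frequency burst leaves appreciable conductance at the end of the run, mirroring the short- to long-term transition.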

    Simulation of networks of spiking neurons: A review of tools and strategies

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We then examine the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We give an overview of the different simulators and simulation environments presently available (restricted to those that are freely available, open source, and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley-type and integrate-and-fire models, interacting through current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource that facilitates identifying the appropriate integration strategy and simulation tool for a given modeling problem involving spiking neural networks.
    Comment: 49 pages, 24 figures, 1 table; review article, Journal of Computational Neuroscience, in press (2007).
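    The clock-driven, current-based case is the simplest of the strategies compared; the sketch below is a minimal benchmark in that spirit (assumed parameters throughout, not the reference implementations distributed with the paper).

    ```python
    import numpy as np

    # Clock-driven, current-based leaky integrate-and-fire network: all state
    # variables advance on a fixed time grid, and spikes are detected per step.
    rng = np.random.default_rng(1)
    N, p, dt, T = 400, 0.05, 1e-4, 0.5            # neurons, connectivity, step (s), duration (s)
    tau_m, tau_syn = 20e-3, 5e-3                  # membrane and synaptic time constants, s
    v_th, v_reset, drive = 1.0, 0.0, 1.2          # threshold, reset, constant input
    W = (rng.random((N, N)) < p) * rng.normal(0.02, 0.01, (N, N))

    v = rng.random(N) * v_th                      # randomized initial potentials
    I = np.zeros(N)
    n_spikes = 0
    for _ in range(int(T / dt)):
        I += dt * (-I / tau_syn)                  # exponential synaptic decay
        v += dt * (drive + I - v) / tau_m         # membrane integration
        fired = v >= v_th
        v[fired] = v_reset                        # reset after spike
        I += W[:, fired].sum(axis=1)              # deliver spikes to targets
        n_spikes += int(fired.sum())

    print(f"mean firing rate: {n_spikes / (N * T):.1f} Hz")
    ```

    An event-driven strategy would instead advance each neuron analytically between spikes, trading per-step cost for event-queue management.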

    A Comprehensive Workflow for General-Purpose Neural Modeling with Highly Configurable Neuromorphic Hardware Systems

    In this paper we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim at establishing this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware-experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; and an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations, and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter are demonstrated with a variety of experimental results.
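    For orientation, a minimal PyNN script of the kind such a workflow takes as input is sketched below; it uses the standard PyNN API with the NEST backend standing in for the hardware backend, whose actual module name is an assumption here.

    ```python
    import pyNN.nest as sim   # swap for the neuromorphic backend's PyNN module

    sim.setup(timestep=0.1)   # simulation step, ms

    # Poisson input driving a small conductance-based integrate-and-fire population.
    stim = sim.Population(20, sim.SpikeSourcePoisson(rate=30.0))
    cells = sim.Population(50, sim.IF_cond_exp(tau_m=15.0, v_thresh=-55.0))
    proj = sim.Projection(stim, cells,
                          sim.FixedProbabilityConnector(0.2),
                          synapse_type=sim.StaticSynapse(weight=0.004))  # uS

    cells.record("spikes")
    sim.run(1000.0)           # ms

    spikes = cells.get_data().segments[0].spiketrains
    print("total spikes:", sum(len(st) for st in spikes))
    sim.end()
    ```

    Because the model description is backend-independent, the same script can be replayed against reference software simulations and hardware configurations, which is what the evaluation scheme above exploits.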

    Network Plasticity as Bayesian Inference

    General results from statistical learning theory suggest understanding not only brain computations, but also brain plasticity, as probabilistic inference; a concrete model for this, however, has been missing. We propose that inherently stochastic features of synaptic plasticity and spine motility enable cortical networks of neurons to carry out probabilistic inference by sampling from a posterior distribution of network configurations. This model provides a viable alternative to existing models that propose convergence of parameters to maximum likelihood values. It explains how priors on weight distributions and connection probabilities can be merged optimally with learned experience, how cortical networks can generalize learned information so well to novel experiences, and how they can compensate continuously for unforeseen disturbances of the network. The resulting new theory of network plasticity explains, from a functional perspective, a number of experimental data on stochastic aspects of synaptic plasticity that previously appeared quite puzzling.
    Comment: 33 pages, 5 figures; the supplement is available on the author's web page http://www.igi.tugraz.at/kappe
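    The sampling idea can be illustrated with a one-parameter Langevin sketch (a stand-in for the paper's network model, with assumed prior, likelihood, and temperature): noisy parameter dynamics whose stationary distribution is the posterior, rather than a point estimate.

    ```python
    import numpy as np

    # Langevin sampling over a single synaptic parameter theta: drift up the
    # log-posterior gradient plus injected noise, so theta samples the posterior
    # instead of converging to its maximum. All settings are illustrative.
    rng = np.random.default_rng(2)
    data = rng.normal(1.5, 0.5, size=50)        # observations the parameter must explain

    def grad_log_posterior(theta, prior_var=1.0, noise_var=0.25):
        g_prior = -theta / prior_var            # Gaussian prior centered at 0
        g_like = np.sum(data - theta) / noise_var
        return g_prior + g_like

    b, steps = 1e-3, 20000                      # step size / temperature (assumed)
    theta, samples = 0.0, []
    for _ in range(steps):
        noise = np.sqrt(2 * b) * rng.normal()   # diffusion term
        theta += b * grad_log_posterior(theta) + noise
        samples.append(theta)

    burn = samples[steps // 2:]                 # discard transient, keep samples
    print(f"posterior mean ~ {np.mean(burn):.3f}, sd ~ {np.std(burn):.3f}")
    ```

    The spread of the retained samples reflects residual uncertainty, which is how, in this view, ongoing synaptic fluctuations carry information rather than being mere noise.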