908 research outputs found

    Emulating long-term synaptic dynamics with memristive devices

    The potential of memristive devices is often seen in implementing neuromorphic architectures for achieving brain-like computation. However, unlike CMOS technology, the design procedures do not allow for extensive manipulation of the material; instead, the properties of the memristive material should be harnessed in the context of such computation, under the view that biological synapses are memristors. Here we demonstrate that single solid-state TiO2 memristors can exhibit associative plasticity phenomena observed in biological cortical synapses, which are captured by a phenomenological plasticity model called the triplet rule. This rule comprises a spike-timing-dependent plasticity regime and a classical Hebbian associative regime, and is compatible with a large body of electrophysiology data. Via a set of experiments with our artificial memristive synapses, we show that, contrary to conventional uses of solid-state memory, the co-existence of field- and thermally-driven switching mechanisms that can render bipolar and/or unipolar programming modes is a salient feature for capturing long-term potentiation and depression synaptic dynamics. We further demonstrate that the non-linear accumulating nature of memristors promotes long-term potentiating or depressing memory transitions.
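
    For orientation, the sketch below is a minimal implementation of the triplet rule mentioned in this abstract (in the form commonly attributed to Pfister and Gerstner): weight updates depend on exponentially decaying pre- and post-synaptic spike traces, with pair and triplet terms. The function name run_triplet_stdp and all parameter values are illustrative assumptions, not the fitted values or code from the paper.

```python
import numpy as np

# Illustrative (not fitted) time constants (ms) and amplitudes.
tau_plus, tau_x = 16.8, 101.0      # presynaptic trace time constants
tau_minus, tau_y = 33.7, 125.0     # postsynaptic trace time constants
A2_plus, A3_plus = 5e-3, 6e-3      # pair and triplet potentiation amplitudes
A2_minus, A3_minus = 7e-3, 2e-3    # pair and triplet depression amplitudes

def run_triplet_stdp(pre_spikes, post_spikes, w=0.5, dt=0.1, t_end=1000.0):
    """Evolve one synaptic weight under a triplet STDP rule (sketch)."""
    pre = set(np.rint(np.asarray(pre_spikes) / dt).astype(int).tolist())
    post = set(np.rint(np.asarray(post_spikes) / dt).astype(int).tolist())
    r1 = r2 = o1 = o2 = 0.0          # presynaptic (r) and postsynaptic (o) traces
    for step in range(int(round(t_end / dt))):
        # Exponential decay of all traces.
        r1 -= dt * r1 / tau_plus
        r2 -= dt * r2 / tau_x
        o1 -= dt * o1 / tau_minus
        o2 -= dt * o2 / tau_y
        if step in pre:              # presynaptic spike -> depression (LTD)
            w -= o1 * (A2_minus + A3_minus * r2)
            r1 += 1.0
            r2 += 1.0
        if step in post:             # postsynaptic spike -> potentiation (LTP)
            w += r1 * (A2_plus + A3_plus * o2)
            o1 += 1.0
            o2 += 1.0
    return w

# Example: post-before-pre pairings repeated at 10 Hz mostly depress the weight.
print(run_triplet_stdp(pre_spikes=np.arange(10.0, 1000.0, 100.0),
                       post_spikes=np.arange(0.0, 1000.0, 100.0)))
```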

    Independent Component Analysis in Spiking Neurons

    Although models based on independent component analysis (ICA) have been successful in explaining various properties of sensory coding in the cortex, it remains unclear how networks of spiking neurons using realistic plasticity rules can realize such computation. Here, we propose a biologically plausible mechanism for ICA-like learning with spiking neurons. Our model combines spike-timing-dependent plasticity and synaptic scaling with an intrinsic plasticity rule that regulates neuronal excitability to maximize information transmission. We show that a stochastically spiking neuron learns one independent component for inputs encoded either as rates or using spike-spike correlations. Furthermore, different independent components can be recovered when the activity of different neurons is decorrelated by adaptive lateral inhibition.
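
    The paper's full spiking implementation is not reproduced here; the sketch below is a deliberately simplified analogue showing how the three ingredients named in the abstract fit together for a single stochastic neuron: a Hebbian update stands in for STDP, multiplicative weight normalization stands in for synaptic scaling, and a threshold-adaptation step stands in for intrinsic plasticity driving activity toward a target level. All names and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_steps = 4, 20000
eta_w, eta_ip = 5e-3, 1e-3
target_rate = 0.1                                   # desired mean firing probability

# One sparse (super-Gaussian) source mixed into several inputs plus noise;
# a projection that concentrates on the heavy-tailed events recovers its direction.
mix = rng.normal(size=n_inputs)
w = rng.normal(scale=0.1, size=n_inputs)
bias = 0.0

for _ in range(n_steps):
    s = rng.laplace()                               # independent sparse source
    x = mix * s + 0.1 * rng.normal(size=n_inputs)   # observed mixed input
    u = float(w @ x)
    p = 1.0 / (1.0 + np.exp(-(u + bias)))           # firing probability
    spike = float(rng.random() < p)                 # stochastic output spike

    w += eta_w * spike * x                          # Hebbian stand-in for STDP
    w /= np.linalg.norm(w)                          # stand-in for synaptic scaling
    bias += eta_ip * (target_rate - spike)          # intrinsic plasticity step

# After training, w tends to align (up to sign) with the mixing direction.
print(np.dot(w, mix / np.linalg.norm(mix)))
```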

    Spike-based local synaptic plasticity: A survey of computational models and neuromorphic circuits

    Understanding how biological neural networks carry out learning using spike-based local plasticity mechanisms can lead to the development of powerful, energy-efficient, and adaptive neuromorphic processing systems. A large number of spike-based learning models have recently been proposed following different approaches. However, it is difficult to assess whether and how they could be mapped onto neuromorphic hardware, and to compare their features and ease of implementation. To this end, in this survey, we provide a comprehensive overview of representative brain-inspired synaptic plasticity models and mixed-signal CMOS neuromorphic circuits within a unified framework. We review historical, bottom-up, and top-down approaches to modeling synaptic plasticity, and we identify computational primitives that can support low-latency and low-power hardware implementations of spike-based learning rules. We provide a common definition of a locality principle based on pre- and post-synaptic neuron information, which we propose as a fundamental requirement for physical implementations of synaptic plasticity. Based on this principle, we compare the properties of these models within the same framework, and describe the mixed-signal electronic circuits that implement their computing primitives, pointing out how these building blocks enable efficient on-chip and online learning in neuromorphic processing systems.
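
    As a concrete illustration of the locality principle described in this abstract, the sketch below implements a pair-based STDP update whose only inputs are signals available at the synapse itself: pre- and post-synaptic spikes and their local traces. The class name and parameter values are illustrative assumptions, not circuits or code from the survey.

```python
from dataclasses import dataclass

@dataclass
class LocalSynapse:
    w: float = 0.5
    pre_trace: float = 0.0        # low-pass filter of presynaptic spikes
    post_trace: float = 0.0       # low-pass filter of postsynaptic spikes
    tau: float = 20.0             # trace time constant (ms)
    lr: float = 1e-2              # learning rate

    def step(self, dt: float, pre_spike: bool, post_spike: bool) -> None:
        # Decay the local traces.
        self.pre_trace -= dt * self.pre_trace / self.tau
        self.post_trace -= dt * self.post_trace / self.tau
        # Pair-based STDP written purely in terms of local quantities:
        # potentiate on a post spike (scaled by the pre trace), depress on a
        # pre spike (scaled by the post trace). No non-local signal is used.
        if post_spike:
            self.w += self.lr * self.pre_trace
            self.post_trace += 1.0
        if pre_spike:
            self.w -= self.lr * self.post_trace
            self.pre_trace += 1.0
        self.w = min(max(self.w, 0.0), 1.0)   # hard bounds, as on a physical device
```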