
    Memory and information processing in neuromorphic systems

    A striking difference between brain-inspired neuromorphic processors and current von Neumann processor architectures is the way in which memory and processing are organized. As Information and Communication Technologies continue to address the need for increased computational power by increasing the number of cores within a digital processor, neuromorphic engineers and scientists can complement this approach by building processor architectures in which memory is distributed with the processing. In this paper we present a survey of brain-inspired processor architectures that support models of cortical networks and deep neural networks. These architectures range from serial, clocked implementations of multi-neuron systems to massively parallel asynchronous ones, and from purely digital systems to mixed analog/digital systems that implement more biologically realistic models of neurons and synapses, together with a suite of adaptation and learning mechanisms analogous to those found in biological nervous systems. We describe the advantages of the different approaches being pursued and present the challenges that need to be addressed for building artificial neural processing systems that can display the richness of behaviors seen in biological systems.
    Comment: Submitted to Proceedings of the IEEE; a review of recently proposed neuromorphic computing platforms and systems.
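
    To make the memory-with-processing idea above concrete, the sketch below (not taken from the survey itself; all names and parameters are illustrative) models each neuron core in Python as holding its own synaptic weights locally and updating on incoming spike events, rather than fetching weights from a shared memory as a von Neumann machine would.

        import numpy as np

        class NeuronCore:
            """Toy neuromorphic core: synaptic memory lives next to the neuron that uses it."""
            def __init__(self, n_inputs, rng):
                self.weights = rng.normal(0.0, 0.1, n_inputs)  # local synaptic memory
                self.v = 0.0                                   # local membrane state

            def on_spikes(self, spike_vector, leak=0.9, threshold=1.0):
                # Event-driven update: integrate the locally stored weights,
                # no fetch from an external, shared memory.
                self.v = leak * self.v + float(self.weights @ spike_vector)
                if self.v >= threshold:
                    self.v = 0.0
                    return True   # emit an output spike
                return False

        rng = np.random.default_rng(1)
        cores = [NeuronCore(64, rng) for _ in range(8)]   # weights distributed across cores
        spikes = rng.integers(0, 2, 64)                   # one vector of input spike events
        outputs = [core.on_spikes(spikes) for core in cores]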

    A CMOS Spiking Neuron for Brain-Inspired Neural Networks with Resistive Synapses and In-Situ Learning

    Nanoscale resistive memories are expected to fuel dense integration of electronic synapses for large-scale neuromorphic systems. To realize such a brain-inspired computing chip, a compact CMOS spiking neuron that performs in-situ learning and computing while driving a large number of resistive synapses is desired. This work presents a novel leaky integrate-and-fire neuron design which implements the dual-mode operation of current integration and synaptic drive with a single op-amp, and enables in-situ learning with crossbar resistive synapses. The proposed design was implemented in a 0.18 μm CMOS technology. Measurements show the neuron's ability to drive a thousand resistive synapses and demonstrate in-situ associative learning. The neuron circuit occupies a small area of 0.01 mm² and has an energy efficiency of 9.3 pJ/spike/synapse.
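
    As a rough illustration of the leaky integrate-and-fire behavior described above (a generic textbook model, not the paper's circuit; every parameter and scaling factor is illustrative), the following Python sketch integrates the current delivered by a row of resistive synapses and emits a spike when the membrane potential crosses a threshold.

        import numpy as np

        def lif_step(v, i_in, dt=1e-3, tau=20e-3, v_th=1.0, v_reset=0.0):
            # Euler update of a leaky integrate-and-fire membrane equation
            v = v + dt * (-v / tau + i_in)
            if v >= v_th:
                return v_reset, True   # fire and reset
            return v, False

        # Synaptic current from a row of resistive synapses via Ohm's law: i = sum_j g_j * v_j
        rng = np.random.default_rng(0)
        g = rng.uniform(1e-6, 1e-5, 1000)     # conductances of 1000 resistive synapses (S), illustrative
        v_pre = rng.choice([0.0, 0.2], 1000)  # presynaptic drive voltages (V), illustrative
        v_mem = 0.0
        for t in range(200):
            i_in = float(g @ v_pre) * 1e5     # scale the physical current into the unit-less toy model
            v_mem, fired = lif_step(v_mem, i_in)
            if fired:
                print("output spike at step", t)
                break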

    Highly Scalable Neuromorphic Hardware with 1-bit Stochastic nano-Synapses

    Thermodynamically driven filament formation in redox-based resistive memory and the impact of thermal fluctuations on the switching probability of emerging magnetic switches are probabilistic phenomena by nature. Binary switching in these nonvolatile memories is therefore stochastic, varying from switching cycle to switching cycle in the same device and from device to device, which provides a rich in-situ spatiotemporal stochastic characteristic. This work presents a highly scalable neuromorphic hardware architecture based on a crossbar array of 1-bit resistive crosspoints serving as distributed stochastic synapses. The network shows robust performance in emulating the selectivity of synaptic potentials in neurons of primary visual cortex to the orientation of a visual image. The proposed model can be configured to accept a wide range of nanodevices.
    Comment: 9 pages, 6 figures.
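
    The following Python sketch (an assumption-laden toy model, not the paper's hardware; the probabilities and array sizes are illustrative) shows how 1-bit synapses that switch only with a small probability per programming pulse can implement a graded, analog-like weight change when averaged over many devices or many cycles.

        import numpy as np

        rng = np.random.default_rng(0)

        def stochastic_binary_update(w, delta, p_set=0.05, p_reset=0.05):
            # Each 1-bit synapse switches only with a small probability, so the population
            # average over many devices (or many cycles) approximates a graded weight change.
            set_mask = (delta > 0) & (rng.random(w.shape) < p_set)
            reset_mask = (delta < 0) & (rng.random(w.shape) < p_reset)
            return np.where(set_mask, 1, np.where(reset_mask, 0, w))

        w = np.zeros(1000, dtype=int)   # a column of 1-bit crosspoints, all initially OFF
        for cycle in range(50):
            w = stochastic_binary_update(w, delta=np.ones_like(w))  # repeated potentiation pulses
        print("fraction of SET devices:", w.mean())  # approaches 1 - (1 - p_set)**50 ≈ 0.92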