30 research outputs found

    A CMOS Spiking Neuron for Brain-Inspired Neural Networks with Resistive Synapses and In-Situ Learning

    Get PDF
    Nanoscale resistive memories are expected to fuel dense integration of electronic synapses for large-scale neuromorphic systems. To realize such a brain-inspired computing chip, a compact CMOS spiking neuron that performs in-situ learning and computing while driving a large number of resistive synapses is desired. This work presents a novel leaky integrate-and-fire neuron design which implements the dual-mode operation of current integration and synaptic drive with a single opamp, and enables in-situ learning with crossbar resistive synapses. The proposed design was implemented in a 0.18 µm CMOS technology. Measurements show the neuron's ability to drive a thousand resistive synapses and demonstrate in-situ associative learning. The neuron circuit occupies a small area of 0.01 mm² and has an energy efficiency of 9.3 pJ/spike/synapse.
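    For context, the leaky integrate-and-fire behavior referenced in this abstract can be summarized by a simple discrete-time model. The sketch below is illustrative only; the time step, time constant, and threshold values are assumptions and do not come from the paper's CMOS circuit.

```python
# Minimal discrete-time leaky integrate-and-fire (LIF) sketch.
# Illustrative only: dt, tau_m, v_thresh, and v_reset are assumed values,
# not parameters of the CMOS neuron described in the abstract.

def lif_step(v, i_syn, dt=1e-3, tau_m=20e-3, v_thresh=1.0, v_reset=0.0):
    """Advance the membrane potential v by one time step given input i_syn.

    dv/dt = (-v + i_syn) / tau_m; a spike is emitted when v crosses v_thresh.
    """
    v = v + dt * (-v + i_syn) / tau_m   # leaky integration of synaptic input
    if v >= v_thresh:                   # threshold crossing -> fire
        return v_reset, True            # reset membrane potential, emit spike
    return v, False
```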

    A CMOS Synapse Design Implementing Tunable Asymmetric Spike Timing-Dependent Plasticity

    Get PDF
    A CMOS synapse design is presented which can perform tunable asymmetric spike timing-dependent learning in asynchronous spiking neural networks. The overall design consists of three primary subcircuit blocks, and the operation of each is described. Pair-based Spike Timing-Dependent Plasticity (STDP) of the entire synapse is then demonstrated through simulation using the Cadence Virtuoso platform. Tuning of the STDP learning window and the rate of synaptic weight change is possible using various control parameters. With appropriate settings, it is shown that the resulting learning rule closely matches that observed in biological systems.
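    As a point of reference, pair-based STDP is commonly modeled with exponentially decaying learning windows; the sketch below shows that textbook rule, with amplitudes and time constants chosen as arbitrary examples rather than taken from the tunable parameters of the presented CMOS synapse.

```python
import math

# Textbook pair-based STDP rule (illustrative; a_plus, a_minus, tau_plus, and
# tau_minus are arbitrary example values, not the control parameters of the
# CMOS synapse described above). Asymmetry comes from a_plus != a_minus and
# from the two time constants.

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012,
            tau_plus=20e-3, tau_minus=20e-3):
    """Weight change for a single pre/post spike pair (times in seconds)."""
    dt = t_post - t_pre
    if dt >= 0:   # pre before post -> potentiation
        return a_plus * math.exp(-dt / tau_plus)
    else:         # post before pre -> depression
        return -a_minus * math.exp(dt / tau_minus)
```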

    An On-chip Trainable and Clock-less Spiking Neural Network with 1R Memristive Synapses

    Full text link
    Spiking neural networks (SNNs) are being explored in an attempt to mimic the brain's capability to learn and recognize at low power. A crossbar architecture with a highly scalable Resistive RAM (RRAM) array serving as synaptic weights, and neuronal drivers in the periphery, is an attractive option for SNNs. Recognition (akin to reading the synaptic weight) requires a small-amplitude bias applied across the RRAM to minimize conductance change. Learning (akin to writing or updating the synaptic weight) requires large-amplitude bias pulses to produce a conductance change. The contradictory bias-amplitude requirements for performing reading and writing simultaneously and asynchronously, as in biology, are a major challenge. Solutions suggested in the literature rely on clock-based time-division multiplexing of read and write operations, or on approximations that ignore reads coinciding with writes. In this work, we overcome this challenge and present a clock-less approach wherein reading and writing are performed in different frequency domains. This enables learning and recognition to proceed simultaneously on an SNN. We validate our scheme in a SPICE circuit simulator by translating a two-layered feed-forward Iris-classifying SNN to demonstrate software-equivalent performance. System performance is not adversely affected by the voltage dependence of conductance in realistic RRAMs, despite its departure from linearity. Overall, our approach enables direct implementation of biological SNN algorithms in hardware.
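    To make the "recognition as reading" step concrete, the sketch below shows the standard crossbar read operation, where each column current is the conductance-weighted sum of the applied row voltages. All numbers are made-up examples, and the paper's frequency-domain separation of read and write signals is not modeled here.

```python
import numpy as np

# Illustrative crossbar "read" (recognition) step: with small-amplitude read
# voltages on the rows, each column current is the conductance-weighted sum
# of the inputs. G (siemens) and v_read are example values only.

G = np.array([[1e-6, 5e-6],
              [2e-6, 3e-6],
              [4e-6, 1e-6]])         # 3 input rows x 2 output columns
v_read = 0.1                         # small-amplitude read bias (V)
x = np.array([1.0, 0.0, 1.0])        # binary input spike pattern

i_col = (x * v_read) @ G             # column currents: I_j = sum_i V_i * G_ij
print(i_col)                         # currents integrated by the output neurons
```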

    Analog Spiking Neural Network Implementing Spike Timing-Dependent Plasticity on 65 nm CMOS

    Get PDF
    Machine learning is a rapidly advancing technology used in countless modern applications. Many digital algorithms exist for deploying a machine learning program, but the most advanced and well-known approach is the artificial neural network (ANN). While ANNs demonstrate impressive learning behaviors, they consume substantial power. Therefore, an analog spiking neural network (SNN) implementing spike timing-dependent plasticity is proposed, developed, and tested to demonstrate equivalent learning abilities at a fraction of the power consumption of its digital counterpart.

    The Concept of Metal-Insulator-Metal Nanostructures as Adaptive Neural Networks

    Get PDF
    Present computer processing capabilities are becoming a limitation in meeting modern technological needs. Therefore, approaches beyond the von Neumann computational architecture are imperative, and the operation and structure of the brain are attractive models. Memristors are characterized by a nonlinear relationship between current history and voltage, and have been shown to exhibit properties resembling those of biological synapses. Here, the use of metal-insulator-metal-based memristive devices in neural networks capable of simulating the learning and adaptation features present in mammalian brains is discussed.
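    For context on the history-dependent behavior mentioned above, one common idealization is the linear ion-drift memristor model. The sketch below is a generic illustration with assumed parameter values; it is not a model of the specific metal-insulator-metal devices discussed in the abstract.

```python
# Generic linear ion-drift memristor sketch (illustrative parameters; not the
# specific metal-insulator-metal devices discussed in the abstract).

def memristor_step(w, i, dt=1e-6, r_on=100.0, r_off=16e3,
                   mu_v=1e-14, d=10e-9):
    """Advance the normalized state w (0..1) under current i; return (w, v)."""
    w = w + dt * mu_v * r_on / d**2 * i      # state drifts with charge flow
    w = min(max(w, 0.0), 1.0)                # hard bounds on the doped region
    r = r_on * w + r_off * (1.0 - w)         # resistance depends on history
    return w, r * i                          # memristive voltage v = R(w) * i
```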