A memristive nanoparticle/organic hybrid synapstor for neuro-inspired computing
A large research effort is devoted to new computing paradigms
associated with innovative nanotechnologies that should complement and/or offer
alternatives to the classical von Neumann/CMOS association. Among the
various propositions, the Spiking Neural Network (SNN) seems to be a valid candidate. (i)
In terms of function, SNNs using relative spike timing for information coding
are deemed the most effective at taking inspiration from the brain to
allow fast and efficient processing of information for complex recognition or
classification tasks. (ii) In terms of technology, SNNs may
benefit the most from nanodevices, because SNN architectures are intrinsically
tolerant of defective devices and performance variability. Here we demonstrate
Spike-Timing-Dependent Plasticity (STDP), a basic and fundamental learning
function in the brain, with a new class of synapstor (synapse-transistor),
called Nanoparticle Organic Memory Field Effect Transistor (NOMFET). We show
that this learning function is obtained with a simple hybrid material made of
the self-assembly of gold nanoparticles and organic semiconductor thin films.
Beyond mimicking biological synapses, we also demonstrate how the shape of the
applied spikes can tailor the STDP learning function. Moreover, the experiments
and modeling show that this synapstor is a memristive device. Finally, these
synapstors are successfully coupled with a CMOS platform emulating the pre- and
post-synaptic neurons, and a behavioral macro-model is developed for a standard
device simulator. Comment: A single PDF file containing the full paper and the
supplementary information; Adv. Funct. Mater., online Dec. 13 (2011).
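The pair-based STDP function demonstrated above can be summarized by the standard exponential learning window, where the sign and magnitude of the weight change depend on the relative timing of pre- and post-synaptic spikes. The sketch below is a generic illustration of that window; the parameter names and values (`a_plus`, `a_minus`, `tau`) are illustrative assumptions, not taken from the NOMFET paper.

```python
import math

def stdp_dw(delta_t, a_plus=1.0, a_minus=1.0, tau=20.0):
    """Weight change for one pre/post spike pair.

    delta_t = t_post - t_pre in milliseconds; tau is the
    (assumed) exponential time constant of the window.
    """
    if delta_t > 0:
        # Pre-synaptic spike precedes the post-synaptic one: potentiation.
        return a_plus * math.exp(-delta_t / tau)
    elif delta_t < 0:
        # Post-synaptic spike precedes the pre-synaptic one: depression.
        return -a_minus * math.exp(delta_t / tau)
    return 0.0

print(stdp_dw(10.0))   # positive (potentiation)
print(stdp_dw(-10.0))  # negative (depression)
```

The paper's point that the spike shape can tailor the learning function would correspond here to replacing the fixed exponential window with one derived from the overlap of the applied pre- and post-synaptic waveforms.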
Design and Implementation of BCM Rule Based on Spike-Timing Dependent Plasticity
The Bienenstock-Cooper-Munro (BCM) and Spike Timing-Dependent Plasticity
(STDP) rules are two experimentally verified forms of synaptic plasticity in which
the alteration of synaptic weight depends on the rate and on the timing of pre-
and post-synaptic action potentials, respectively. Previous studies
have shown that, under specific conditions, i.e. when random trains of
Poisson-distributed spikes are used as inputs and weight changes occur
according to STDP, the BCM rule emerges as a property of STDP. The
applied STDP rule can be either the classical pair-based rule or the more
powerful triplet-based rule. In this paper, we demonstrate the
use of two distinct VLSI circuit implementations of STDP to examine whether BCM
learning is an emergent property of STDP. These circuits are stimulated with
random Poissonian spike trains. The first circuit implements the classical
pair-based STDP, while the second circuit realizes a previously described
triplet-based STDP rule. These two circuits are simulated using a 0.35 um CMOS
standard model in the HSpice simulator. Simulation results demonstrate that the
proposed triplet-based STDP circuit clearly reproduces the threshold-based
behaviour of the BCM rule, and that the pair-based STDP circuit generates
similar BCM-like behaviour.
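The link between triplet STDP and BCM can be made concrete analytically. For uncorrelated Poisson pre- and post-synaptic trains, the minimal triplet rule of Pfister and Gerstner (2006) yields an expected weight drift with an LTD term linear in the post-synaptic rate and an LTP term quadratic in it, which produces the BCM threshold. The sketch below uses illustrative parameter values (assumed, not those of the circuits in this paper):

```python
def triplet_drift(rho_pre, rho_post,
                  a2_minus=7e-3, tau_minus=33.7e-3,
                  a3_plus=6.5e-3, tau_plus=16.8e-3, tau_y=114e-3):
    """Expected weight drift (arbitrary units per second) for
    uncorrelated Poisson pre/post trains under the minimal
    triplet STDP rule. Rates in Hz, time constants in seconds."""
    ltd = a2_minus * tau_minus * rho_pre * rho_post          # linear in rho_post
    ltp = a3_plus * tau_plus * tau_y * rho_pre * rho_post**2  # quadratic in rho_post
    return ltp - ltd

# BCM-like modification threshold: the post-synaptic rate where
# drift changes sign, independent of the pre-synaptic rate.
theta = (7e-3 * 33.7e-3) / (6.5e-3 * 16.8e-3 * 114e-3)

print(triplet_drift(10.0, 0.5 * theta) < 0)  # below threshold: depression
print(triplet_drift(10.0, 2.0 * theta) > 0)  # above threshold: potentiation
```

A plain pair-based rule gives a drift that is only linear in the post-synaptic rate, which is why the triplet circuit reproduces the BCM threshold behaviour more cleanly.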
Memory and information processing in neuromorphic systems
A striking difference between brain-inspired neuromorphic processors and
current von Neumann processor architectures is the way in which memory and
processing are organized. As Information and Communication Technologies continue
to address the need for increased computational power through the increase of
cores within a digital processor, neuromorphic engineers and scientists can
complement this need by building processor architectures where memory is
distributed with the processing. In this paper we present a survey of
brain-inspired processor architectures that support models of cortical networks
and deep neural networks. These architectures range from serial clocked
implementations of multi-neuron systems to massively parallel asynchronous ones
and from purely digital systems to mixed analog/digital systems which implement
more biological-like models of neurons and synapses together with a suite of
adaptation and learning mechanisms analogous to the ones found in biological
nervous systems. We describe the advantages of the different approaches being
pursued and present the challenges that need to be addressed for building
artificial neural processing systems that can display the richness of behaviors
seen in biological systems. Comment: Submitted to Proceedings of the IEEE; a
review of recently proposed neuromorphic computing platforms and systems.
Filamentary Switching: Synaptic Plasticity through Device Volatility
Replicating the computational functionalities and performances of the brain
remains one of the biggest challenges for the future of information and
communication technologies. Such an ambitious goal requires research efforts
from the architecture level to the basic device level (i.e., investigating the
opportunities offered by emerging nanotechnologies to build such systems).
Nanodevices, or, more precisely, memory or memristive devices, have been
proposed for the implementation of synaptic functions, offering the required
features and integration in a single component. In this paper, we demonstrate
that the basic physics involved in the filamentary switching of electrochemical
metallization cells can reproduce important biological synaptic functions that
are key mechanisms for information processing and storage. The transition from
short- to long-term plasticity has been reported as a direct consequence of
filament growth (i.e., increased conductance) in filamentary memory devices. In
this paper, we show that a more complex filament shape, such as dendritic paths
of variable density and width, can permit the short- and long-term processes to
be controlled independently. Our solid-state device is strongly analogous to
biological synapses, as indicated by interpreting the results within the
framework of a phenomenological model developed for biological synapses. We
describe a single memristive element exhibiting a rich set of features that
will benefit future neuromorphic hardware systems.
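The independent control of short- and long-term processes described above can be caricatured with a two-variable phenomenological model: a volatile conductance component that decays between pulses (short-term plasticity, analogous to a thin, unstable filament) and a persistent component that accumulates (long-term plasticity, analogous to a consolidated filament). This is a toy sketch under assumed dynamics, not the device model of the paper:

```python
import math

class FilamentarySynapse:
    """Toy two-component synapse: g_s is volatile (short-term),
    g_l is persistent (long-term). All parameters are assumptions."""

    def __init__(self, tau_s=1.0):
        self.g_s = 0.0      # volatile conductance, decays with tau_s
        self.g_l = 0.0      # persistent conductance, no decay here
        self.tau_s = tau_s  # short-term decay time constant (a.u.)

    def pulse(self, amp=1.0, p_consolidate=0.1):
        # Each programming pulse facilitates the volatile part fully
        # and consolidates only a fraction into the persistent part.
        self.g_s += amp
        self.g_l += p_consolidate * amp

    def idle(self, dt):
        # During idle time only the volatile component relaxes.
        self.g_s *= math.exp(-dt / self.tau_s)

    @property
    def conductance(self):
        return self.g_s + self.g_l

syn = FilamentarySynapse()
for _ in range(5):
    syn.pulse()
print(syn.conductance)  # facilitated level after the pulse train
syn.idle(10.0)
print(syn.conductance)  # relaxes toward the consolidated (long-term) level
```

Tuning `tau_s` and `p_consolidate` independently mirrors the paper's claim that a richer filament morphology lets the short- and long-term processes be controlled separately.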
Pavlov's dog associative learning demonstrated on synaptic-like organic transistors
In this letter, we present an original demonstration of an associative
learning neural network inspired by the famous Pavlov's dogs experiment. A
single nanoparticle organic memory field effect transistor (NOMFET) is used to
implement each synapse. We show how the physical properties of this dynamic
memristive device can be used to perform low-power write operations for
learning and to implement short-term association using temporal coding and
spike-timing-dependent plasticity. An electronic circuit was built to validate
the proposed learning scheme with packaged devices, showing good
reproducibility despite the complex synaptic-like dynamics of the NOMFET in
the pulse regime.
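The associative-learning scheme can be sketched abstractly: two synapses converge on one output neuron ("salivation"); the unconditioned input ("food") drives the output alone, and Hebbian co-activity during paired presentations potentiates the conditioned input ("bell") until it triggers the output by itself. Thresholds, the learning rate, and the update rule below are illustrative assumptions, not the NOMFET circuit's actual parameters:

```python
W_FOOD = 1.0   # fixed unconditioned weight (assumption)
THETA = 0.5    # firing threshold of the output "neuron" (assumption)
LR = 0.3       # learning rate for the conditioned weight (assumption)

def trial(w_bell, food, bell):
    """One presentation; returns the updated bell weight and
    whether the output 'salivation' neuron fired."""
    drive = food * W_FOOD + bell * w_bell
    fired = drive >= THETA
    if fired and bell:
        # Hebbian potentiation: pre (bell) and post co-active.
        w_bell = min(1.0, w_bell + LR)
    return w_bell, fired

w = 0.0
_, fired = trial(w, food=0, bell=1)
print(fired)              # before pairing: bell alone does not trigger output
for _ in range(3):        # pairing phase: food and bell together
    w, _ = trial(w, food=1, bell=1)
_, fired = trial(w, food=0, bell=1)
print(fired)              # after pairing: bell alone triggers the output
```

In the paper the role of `w_bell` is played by the NOMFET conductance, whose volatile dynamics additionally make the learned association short-term, fading without reinforcement.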