Event-Driven Contrastive Divergence for Spiking Neuromorphic Systems
Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been
demonstrated to perform efficiently in a variety of applications, such as
dimensionality reduction, feature learning, and classification. Their
implementation on neuromorphic hardware platforms emulating large-scale
networks of spiking neurons can have significant advantages from the
perspectives of scalability, power dissipation and real-time interfacing with
the environment. However, the traditional RBM architecture and the commonly used
training algorithm, known as Contrastive Divergence (CD), are based on discrete
updates and exact arithmetic, which do not map directly onto a dynamical neural
substrate. Here, we present an event-driven variation of CD to train an RBM
constructed with Integrate-and-Fire (I&F) neurons, one that respects the
limitations of existing and near-future neuromorphic hardware platforms. Our
strategy is based on neural sampling, which allows us to synthesize a spiking
neural network that samples from a target Boltzmann distribution. The recurrent
activity of the network replaces the discrete steps of the CD algorithm, while
Spike Time Dependent Plasticity (STDP) carries out the weight updates in an
online, asynchronous fashion. We demonstrate our approach by training an RBM
composed of leaky I&F neurons with STDP synapses to learn a generative model of
the MNIST hand-written digit dataset, and by testing it in recognition,
generation and cue integration tasks. Our results contribute to a machine
learning-driven approach for synthesizing networks of spiking neurons capable
of carrying out practical, high-level functionality.
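The event-driven flavor of CD described above can be sketched in a few lines of Python. This is not the paper's hardware pipeline: the layer sizes, learning rate, and logistic firing probabilities below are illustrative assumptions. Stochastic sampling stands in for the network's recurrent spiking activity, and each pairing of a "data" phase and a "reconstruction" phase triggers an STDP-like weight update.

```python
import numpy as np

rng = np.random.default_rng(0)

n_vis, n_hid = 6, 4                      # toy layer sizes (hypothetical)
W = rng.normal(0.0, 0.1, (n_vis, n_hid)) # RBM weight matrix
eta = 0.05                               # learning rate (hypothetical)

def sample_hidden(v):
    """Sample hidden spikes given visible spikes (neural-sampling view:
    firing probability is the logistic function of the membrane input)."""
    p = 1.0 / (1.0 + np.exp(-(v @ W)))
    return (rng.random(n_hid) < p).astype(float)

def sample_visible(h):
    p = 1.0 / (1.0 + np.exp(-(W @ h)))
    return (rng.random(n_vis) < p).astype(float)

def event_driven_cd_step(v_data):
    """One CD-1-like update driven by sampled spike events: coincident
    pre/post spikes in the data phase potentiate the weight (STDP-like),
    coincidences in the reconstruction phase depress it."""
    global W
    h_data = sample_hidden(v_data)        # positive (data-driven) phase
    v_recon = sample_visible(h_data)      # network's own recurrent activity
    h_recon = sample_hidden(v_recon)      # negative (model-driven) phase
    W += eta * (np.outer(v_data, h_data) - np.outer(v_recon, h_recon))

v = (rng.random(n_vis) < 0.5).astype(float)   # one binary training pattern
for _ in range(100):
    event_driven_cd_step(v)
```

In the actual spiking system the two phases are not discrete steps but emerge from the network's ongoing activity, with STDP applying the updates online and asynchronously.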
Organic Log-Domain Integrator Synapse
Synapses play a critical role in memory, learning, and cognition. Their main functions include converting presynaptic voltage spikes to postsynaptic currents, as well as scaling the input signal. Several brain-inspired architectures have been proposed to emulate the behavior of biological synapses. While these are useful for exploring the properties of nervous systems, the challenge of making biocompatible and flexible circuits with biologically plausible time constants and tunable gain remains. Here, a physically flexible organic log-domain integrator synaptic circuit is shown to address this challenge. In particular, the circuit is fabricated using organic-based materials that are electrically active, flexible, and biocompatible, and that exhibit biologically plausible time constants (critical for learning neural codes and encoding spatiotemporal patterns). Using a 10 nF synaptic capacitor, the time constant reached 126 and 221 ms before and during bending, respectively. The flexible synaptic circuit is characterized before and during bending, followed by studies on the effects of weighting voltage, synaptic capacitance, and disparity in presynaptic signals on the time constant.
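As a software analogue (not the organic circuit itself), a log-domain integrator behaves to first order like a low-pass filter of the presynaptic spike train: tau * dI/dt = -I + w * s(t). The sketch below uses the 126 ms time constant reported before bending; the gain, time step, and single-spike input are hypothetical.

```python
import numpy as np

dt = 1e-3            # 1 ms simulation step (hypothetical)
tau = 0.126          # 126 ms time constant, as reported before bending
w = 1.0              # synaptic gain (hypothetical units)

t = np.arange(0.0, 0.5, dt)
pre = np.zeros_like(t)
pre[1] = 1.0         # a single presynaptic spike near t = 0

# first-order (log-domain) integrator: tau * dI/dt = -I + w * spike
I = np.zeros_like(t)
for k in range(1, len(t)):
    I[k] = I[k - 1] * (1.0 - dt / tau) + (w / tau) * pre[k]

# after one time constant the current decays to ~1/e of its peak
peak = I.max()
ratio = I[I.argmax() + int(round(tau / dt))] / peak
```

The ratio comes out near exp(-1) ≈ 0.37, which is the defining property of the time constant being characterized in the bending experiments.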
2022 roadmap on neuromorphic computing and engineering
Modern computation based on von Neumann architecture is now a mature cutting-edge science. In the von Neumann architecture, processing and memory units are implemented as separate blocks interchanging data intensively and continuously. This data transfer is responsible for a large part of the power consumption. The next generation of computer technology is expected to solve problems at the exascale, with 10^18 calculations per second. Even though these future computers will be incredibly powerful, if they are based on von Neumann type architectures they will consume between 20 and 30 megawatts of power and will not have intrinsic, physically built-in capabilities to learn or deal with complex data as our brain does. These needs can be addressed by neuromorphic computing systems, which are inspired by the biological concepts of the human brain. This new generation of computers has the potential to be used for the storage and processing of large amounts of digital information with much lower power consumption than conventional processors. Among their potential future applications, an important niche is moving the control from data centers to edge devices. The aim of this roadmap is to present a snapshot of the present state of neuromorphic technology and provide an opinion on the challenges and opportunities that the future holds in the major areas of neuromorphic technology, namely materials, devices, neuromorphic circuits, neuromorphic algorithms, applications, and ethics. The roadmap is a collection of perspectives where leading researchers in the neuromorphic community provide their own view about the current state and the future challenges for each research area. We hope that this roadmap will be a useful resource by providing a concise yet comprehensive introduction to readers outside this field, for those who are just entering the field, as well as providing future perspectives for those who are well established in the neuromorphic computing community.
Networks of spiking neurons and plastic synapses: implementation and control
The brain is an incredible system with a computational power that goes far beyond that
of our standard computers. It consists of a network of 10^11 neurons connected by about 10^14
synapses: a massively parallel architecture that suggests the brain performs computation
according to completely new strategies which we are far from understanding.
To study the nervous system a reasonable starting point is to model its basic units,
neurons and synapses, extract the key features, and try to put them together in simple
controllable networks. The research group I have been working in focuses its attention on
the network dynamics and chooses to model neurons and synapses at a functional level: in
this work I consider networks of integrate-and-fire neurons connected through synapses that
are plastic and bistable. A synapse is said to be plastic when, according to some kind of
internal dynamics, it is able to change the “strength”, the efficacy, of the connection between
the pre- and post-synaptic neuron. The adjective bistable refers to the number of stable
states of efficacy that a synapse can have; we consider synapses with two stable states:
potentiated (high efficacy) or depressed (low efficacy). The considered synaptic model is
also endowed with a new stop-learning mechanism particularly relevant when dealing with
highly correlated patterns.
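A minimal sketch of such a bistable, stop-learning synapse follows; the thresholds, rates, and step sizes are invented for illustration. An internal variable drifts toward one of two stable states, the efficacy read out by the network is binary, and updates are gated off when postsynaptic activity is too low or saturated (the stop-learning mechanism).

```python
theta = 0.5                 # bistability threshold on the internal variable
J_LOW, J_HIGH = 0.1, 1.0    # depressed / potentiated efficacies (hypothetical)

class BistableSynapse:
    """Internal variable x in [0, 1]; the efficacy seen by the network is
    binary, determined by which side of the threshold x sits on."""
    def __init__(self):
        self.x = 0.0

    def efficacy(self):
        return J_HIGH if self.x > theta else J_LOW

    def update(self, pre_spike, post_rate, up=0.1, down=0.1,
               stop_lo=2.0, stop_hi=20.0):
        # stop-learning: no update when postsynaptic activity is very low
        # or saturated -- this protects memories of correlated patterns
        if not pre_spike or not (stop_lo < post_rate < stop_hi):
            self._drift()
            return
        if post_rate > 10.0:             # Hebbian: high post activity -> up
            self.x = min(1.0, self.x + up)
        else:                            # low post activity -> down
            self.x = max(0.0, self.x - down)
        self._drift()

    def _drift(self):
        # absent strong activity, x relaxes toward the nearest stable
        # state, which makes the two efficacy levels self-sustaining
        self.x += 0.01 if self.x > theta else -0.01
        self.x = min(1.0, max(0.0, self.x))

syn = BistableSynapse()
for _ in range(20):
    syn.update(pre_spike=True, post_rate=15.0)   # sustained Hebbian drive
```

After the sustained drive the synapse sits in the potentiated state, while a synapse whose postsynaptic neuron fires outside the stop-learning window stays put regardless of presynaptic activity.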
The ability of this kind of system to reproduce in simulation behaviors observed in
biological networks justifies an attempt to implement the studied network in hardware.
This thesis is situated at this point: the goal of this work is to design, control and
test hybrid analog-digital, biologically inspired, hardware systems that behave in agreement
with the theoretical and simulations predictions. This class of devices typically goes under
the name of neuromorphic VLSI (Very-Large-Scale Integration). Neuromorphic engineering
was born from the idea of designing bio-mimetic devices and represents a useful research
strategy that helps inspire new models, stimulates theoretical research, and offers
an effective way of implementing stand-alone, power-efficient devices.
In this work I present two chips, a prototype and a larger device, that are a step towards
endowing neuromorphic VLSI systems with autonomous learning capabilities adequate for
stimuli whose statistics are not too simple. The main novel features of these
chips are the implemented type of synaptic plasticity and the configurability of the synaptic
connectivity. The reported experimental results demonstrate that the circuits behave in
agreement with theoretical predictions and the advantages of the stop-learning synaptic
plasticity when highly correlated patterns have to be learnt. The high degree of flexibility
of these chips in defining the synaptic connectivity is relevant from the perspective of
using such devices as building blocks of parallel, distributed multi-chip architectures
that will allow the network to be scaled up to systems with interesting computational
abilities, capable of interacting with real-world stimuli.