24,542 research outputs found
A Reconfigurable Mixed-signal Implementation of a Neuromorphic ADC
We present a neuromorphic Analogue-to-Digital Converter (ADC) that uses
integrate-and-fire (I&F) neurons as the encoders of the analogue signal, with
modulated inhibition to decohere the neuronal spike trains. The architecture
consists of an analogue chip and a control module. The analogue chip comprises
two scan chains and a two-dimensional integrate-and-fire neuronal array.
Individual neurons are accessed via the chains one by one, without any encoder,
decoder, or arbiter. The control module is implemented on an FPGA (Field
Programmable Gate Array), which sends scan-enable signals to the scan chains
and controls the inhibition for individual neurons. Since the control module is
implemented on an FPGA, it can be easily reconfigured. Additionally, we propose
a pulse-width-modulation methodology for the lateral inhibition, which uses
different pulse widths to indicate different strengths of inhibition for each
individual neuron, thereby decohering the neuronal spikes. Software simulations
in this paper test the robustness of the proposed ADC architecture to fixed
random noise. A circuit simulation using ten neurons demonstrates the
performance and feasibility of the architecture.
Comment: BioCAS-201
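The encoding step described above, an I&F neuron converting analogue amplitude into spike rate, can be sketched as follows. This is a minimal software illustration; the threshold, time step, and reset-by-subtraction scheme are assumptions for the sketch, not the chip's circuit parameters.

```python
import numpy as np

def encode_if(signal, threshold=1.0, dt=1e-3, leak=0.0):
    """Encode an analogue signal into a spike train with a (leaky)
    integrate-and-fire neuron: integrate the input and emit a spike,
    resetting by subtraction, whenever the membrane crosses threshold."""
    v = 0.0
    spikes = np.zeros(len(signal), dtype=bool)
    for i, x in enumerate(signal):
        v += dt * (x - leak * v)   # forward-Euler step of the membrane equation
        if v >= threshold:
            spikes[i] = True
            v -= threshold          # reset by subtraction keeps the residual charge
    return spikes

# A constant input yields a spike rate proportional to its amplitude,
# which is the rate-encoding principle behind an I&F-based ADC:
t = np.arange(0.0, 1.0, 1e-3)
low  = encode_if(np.full_like(t, 2.0)).sum()   # fewer spikes for the smaller input
high = encode_if(np.full_like(t, 4.0)).sum()   # roughly twice as many spikes
```

Doubling the input amplitude roughly doubles the spike count, so the downstream logic can recover the analogue value from the rate.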
Asynchronous spiking neurons, the natural key to exploit temporal sparsity
Inference of deep neural networks for stream-signal (video/audio) processing on edge devices is still challenging. Unlike most state-of-the-art inference engines, which are efficient for static signals, our brain is optimized for real-time dynamic signal processing. We believe one important feature of the brain, asynchronous stateful processing, is the key to its excellence in this domain. In this work, we show how asynchronous processing with stateful neurons allows exploitation of the sparsity inherent in natural signals. This paper explains three different types of sparsity and proposes an inference algorithm that exploits all of them in the execution of already-trained networks. Our experiments in three different applications (handwritten-digit recognition, autonomous steering, and hand-gesture recognition) show that this model of inference reduces the number of required operations for sparse input data by one to two orders of magnitude. Additionally, due to its fully asynchronous processing, this type of inference can run on fully distributed and scalable neuromorphic hardware platforms.
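One of the sparsity types involved, temporal sparsity between successive frames of a stream, can be sketched with a hypothetical delta-update rule: a stateful layer stores its previous input and output and recomputes only the contributions of inputs that changed. The function names and the plain linear layer below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def dense_layer(x, W):
    """Reference dense computation: every input contributes every frame."""
    return W @ x

def delta_layer(x_prev, x_new, y_prev, W):
    """Stateful update: add only the contributions of inputs that changed,
    skipping all multiply-accumulates for the unchanged (sparse) part."""
    changed = np.flatnonzero(x_new != x_prev)
    y_new = y_prev + W[:, changed] @ (x_new[changed] - x_prev[changed])
    return y_new, changed.size

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 100))
x0 = rng.standard_normal(100)
x1 = x0.copy()
x1[:5] += 1.0                                # only 5 of 100 inputs change

y0 = dense_layer(x0, W)                      # full computation for the first frame
y1, n_changed = delta_layer(x0, x1, y0, W)   # sparse update for the next frame
```

The delta update touches 5 of 100 input columns yet produces exactly the dense result, which is how operation counts shrink by orders of magnitude on temporally sparse streams.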
Neuromorphic Hardware In The Loop: Training a Deep Spiking Network on the BrainScaleS Wafer-Scale System
Emulating spiking neural networks on analog neuromorphic hardware offers
several advantages over simulating them on conventional computers, particularly
in terms of speed and energy consumption. However, this usually comes at the
cost of reduced control over the dynamics of the emulated networks. In this
paper, we demonstrate how iterative training of a hardware-emulated network can
compensate for anomalies induced by the analog substrate. We first convert a
deep neural network trained in software to a spiking network on the BrainScaleS
wafer-scale neuromorphic system, thereby enabling an acceleration factor of 10
000 compared to the biological time domain. This mapping is followed by
in-the-loop training where, in each training step, the network activity is
first recorded in hardware and then used to compute the parameter updates in
software via backpropagation. An essential finding is that the parameter
updates do not have to be precise, but only need to approximately follow the
correct gradient, which simplifies the computation of updates. Using this
approach, after only several tens of iterations, the spiking network shows an
accuracy close to the ideal software-emulated prototype. The presented
techniques show that deep spiking networks emulated on analog neuromorphic
devices can attain good computational performance despite the inherent
variations of the analog substrate.
Comment: 8 pages, 10 figures, submitted to IJCNN 201
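The in-the-loop procedure can be sketched in a few lines: a stand-in "hardware" forward pass with fixed parameter distortions and readout noise plays the role of the analog substrate, while the software side computes approximate gradient updates from the recorded activity. The linear model, distortion profile, and learning rate below are illustrative assumptions, not the BrainScaleS setup.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))          # input batch
w_true = np.array([1.0, -2.0, 0.5, 3.0])
y_target = X @ w_true                      # targets from the software prototype

def hardware_forward(w, X):
    """Stand-in for the analog substrate: the emulated network applies
    fixed parameter distortions (device mismatch) plus readout noise."""
    distortion = 1.0 + 0.1 * np.sin(np.arange(w.size))
    return X @ (w * distortion) + 0.01 * rng.standard_normal(X.shape[0])

w = np.zeros(4)
lr = 0.05
for _ in range(200):
    y_hw = hardware_forward(w, X)                  # record activity on "hardware"
    grad = X.T @ (y_hw - y_target) / X.shape[0]    # approximate gradient in software
    w -= lr * grad                                 # the update only needs to roughly
                                                   # follow the correct gradient
loss = np.mean((hardware_forward(w, X) - y_target) ** 2)
```

Although the software gradient ignores the substrate's distortions, it still points roughly downhill, so the loop converges to parameters that compensate for the mismatch, mirroring the paper's finding that imprecise updates suffice.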
SIMPEL: Circuit model for photonic spike processing laser neurons
We propose an equivalent circuit model for photonic spike-processing laser
neurons with an embedded saturable absorber: a simulation model for photonic
excitable lasers (SIMPEL). We show that by mapping the laser neuron rate
equations onto a circuit model, SPICE analysis can be used as an efficient and
accurate engine for numerical calculations, capable of generalization to a
variety of laser neuron types found in the literature. The development of
this model parallels the Hodgkin-Huxley model of neuron biophysics, a circuit
framework that brought efficiency, modularity, and generalizability to the
study of neural dynamics. We employ the model to study various
signal-processing effects, such as excitability with excitatory and inhibitory
pulses, binary all-or-nothing response, and bistable dynamics.
Comment: 16 pages, 7 figure
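As an illustration of the dynamics such a circuit model must reproduce, the sketch below integrates Yamada-type rate equations for a laser with a saturable absorber using forward Euler. The parameter values (A, B, a, and the slow gain rate) are illustrative assumptions chosen to give excitable behaviour; they are not taken from the paper.

```python
import numpy as np

def yamada(I0, t_max=100.0, dt=0.01, A=4.3, B=3.52, a=1.8, g_slow=0.05):
    """Forward-Euler integration of Yamada-type rate equations for a
    laser with a saturable absorber:
        dG/dt = g_slow * (A - G - G*I)      slow gain medium
        dQ/dt = g_slow * (B - Q - a*Q*I)    saturable absorber
        dI/dt = (G - Q - 1) * I             cavity intensity
    Returns the peak intensity after an injected kick I0."""
    G, Q, I = A, B, I0          # start at the rest state, kicked by I0
    peak = I
    for _ in range(int(t_max / dt)):
        dG = g_slow * (A - G - G * I)
        dQ = g_slow * (B - Q - a * Q * I)
        dI = (G - Q - 1.0) * I
        G, Q, I = G + dt * dG, Q + dt * dQ, I + dt * dI
        peak = max(peak, I)
    return peak

weak = yamada(0.1)    # sub-threshold kick: the intensity simply decays
strong = yamada(1.0)  # supra-threshold kick: a full excitable pulse fires
```

The all-or-nothing response is visible in the two runs: the weak kick never exceeds its injected amplitude, while the strong kick saturates the absorber and grows into a pulse well beyond it.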
A biophysical observation model for field potentials of networks of leaky integrate-and-fire neurons
We present a biophysical approach for coupling neural network activity, as
resulting from proper dipole currents of cortical pyramidal neurons, to the
electric field in the extracellular fluid. Starting from a reduced
three-compartment model of a single pyramidal neuron, we derive an observation
model for dendritic dipole currents in extracellular space, and thereby for the
dendritic field potential that contributes to the local field potential of a
neural population. This work aligns with and substantiates the widespread
dipole assumption motivated by the "open-field" configuration of the dendritic
field potential around cortical pyramidal cells. Our reduced three-compartment
scheme allows us to derive networks of leaky integrate-and-fire models, which
facilitates comparison with existing neural network and observation models. In
particular, by means of numerical simulations we compare our approach with the
ad hoc model by Mazzoni et al. [Mazzoni, A., S. Panzeri, N. K. Logothetis, and
N. Brunel (2008). Encoding of naturalistic stimuli by local field potential
spectra in networks of excitatory and inhibitory neurons. PLoS Computational
Biology 4(12), e1000239], and conclude that our biophysically motivated
approach yields a substantial improvement.
Comment: 31 pages, 4 figure
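The final observation step of such a model evaluates the extracellular potential of a current dipole. A minimal sketch using the standard point-dipole formula in a homogeneous volume conductor is shown below; the dipole moment, conductivity value, and geometry are illustrative assumptions, not the paper's derivation.

```python
import numpy as np

def dipole_potential(p, r_src, r_obs, sigma=0.3):
    """Quasi-static potential of a current dipole in a homogeneous medium:
        phi(r) = p . (r - r_src) / (4 * pi * sigma * |r - r_src|^3)
    sigma = 0.3 S/m is a typical extracellular conductivity."""
    d = np.asarray(r_obs, float) - np.asarray(r_src, float)
    dist = np.linalg.norm(d)
    return float(np.dot(p, d) / (4.0 * np.pi * sigma * dist**3))

# Dipole aligned with the apical-dendrite axis (z), as in the "open-field"
# configuration around cortical pyramidal cells.
p = np.array([0.0, 0.0, 1e-9])                    # dipole moment in A*m (illustrative)
above = dipole_potential(p, [0, 0, 0], [0, 0, 1e-3])    # 1 mm above the soma
below = dipole_potential(p, [0, 0, 0], [0, 0, -1e-3])   # 1 mm below: sign flips
far   = dipole_potential(p, [0, 0, 0], [0, 0, 2e-3])    # 2 mm: 1/r^2 falloff
```

The sign reversal across the dipole and the 1/r^2 falloff are the two signatures that distinguish a dipolar field model from an ad hoc proxy built directly from synaptic currents.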