Neuromorphic Hardware In The Loop: Training a Deep Spiking Network on the BrainScaleS Wafer-Scale System
Emulating spiking neural networks on analog neuromorphic hardware offers
several advantages over simulating them on conventional computers, particularly
in terms of speed and energy consumption. However, this usually comes at the
cost of reduced control over the dynamics of the emulated networks. In this
paper, we demonstrate how iterative training of a hardware-emulated network can
compensate for anomalies induced by the analog substrate. We first convert a
deep neural network trained in software to a spiking network on the BrainScaleS
wafer-scale neuromorphic system, thereby enabling an acceleration factor of
10,000 compared to the biological time domain. This mapping is followed by
in-the-loop training, where in each training step, the network activity is
first recorded in hardware and then used to compute the parameter updates in
software via backpropagation. An essential finding is that the parameter
updates do not have to be precise, but only need to approximately follow the
correct gradient, which simplifies the computation of updates. Using this
approach, after only several tens of iterations, the spiking network shows an
accuracy close to the ideal software-emulated prototype. The presented
techniques show that deep spiking networks emulated on analog neuromorphic
devices can attain good computational performance despite the inherent
variations of the analog substrate.
Comment: 8 pages, 10 figures, submitted to IJCNN 201
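The in-the-loop scheme described above can be sketched in a few lines: the forward pass is delegated to the (distorted) hardware, the recorded activity is fed into a software backpropagation step, and the updated parameters are written back. A minimal NumPy sketch follows, assuming a toy two-layer rate network in place of the actual spiking network; the multiplicative-noise "hardware" model, network sizes, and learning rate are all illustrative stand-ins, not values from the paper. Note that the gradient is computed from the noisy recorded activity, so the updates only approximately follow the true gradient, as the abstract points out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification task
X = rng.normal(size=(200, 10))
w_true = rng.normal(size=10)
y = (X @ w_true > 0).astype(float)

# Software-trained parameters, assumed already mapped onto the substrate
W1 = rng.normal(scale=0.5, size=(10, 16))
W2 = rng.normal(scale=0.5, size=(16, 1))

def hardware_forward(X, W1, W2, rng):
    """Stand-in for the analog emulation: each unit's recorded activity
    carries trial-to-trial distortions from the physical substrate."""
    h = np.maximum(X @ W1, 0.0)
    h = h * rng.normal(1.0, 0.1, size=h.shape)  # illustrative analog noise
    out = 1.0 / (1.0 + np.exp(-(h @ W2)))
    return h, out

lr = 1.0
losses = []
for step in range(100):
    # 1. record the network activity on "hardware"
    h, p = hardware_forward(X, W1, W2, rng)
    p_c = np.clip(p, 1e-7, 1 - 1e-7)
    losses.append(-np.mean(y[:, None] * np.log(p_c)
                           + (1 - y[:, None]) * np.log(1 - p_c)))
    # 2. compute approximate updates in software via backpropagation,
    #    using the noisy recorded activity h in place of the ideal one
    err = (p - y[:, None]) / len(X)       # dL/d(logits), cross-entropy
    gW2 = h.T @ err
    gh = err @ W2.T * (h > 0)             # ReLU gradient through recorded h
    gW1 = X.T @ gh
    # 3. write the updated parameters back to the substrate
    W2 -= lr * gW2
    W1 -= lr * gW1
```

Despite the 10% activity distortion, the loss drops over a few tens of iterations, mirroring the paper's finding that the updates need only point roughly along the correct gradient.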
Robustness from structure: Inference with hierarchical spiking networks on analog neuromorphic hardware
How spiking networks are able to perform probabilistic inference is an
intriguing question, not only for understanding information processing in the
brain, but also for transferring these computational principles to neuromorphic
silicon circuits. A number of computationally powerful spiking network models
have been proposed, but most of them have only been tested, under ideal
conditions, in software simulations. Any implementation in an analog, physical
system, be it in vivo or in silico, will generally lead to distorted dynamics
due to the physical properties of the underlying substrate. In this paper, we
discuss several such distortive effects that are difficult or impossible to
remove by classical calibration routines or parameter training. We then argue
that hierarchical networks of leaky integrate-and-fire neurons can offer the
required robustness for physical implementation and demonstrate this with both
software simulations and emulation on an accelerated analog neuromorphic
device.
Comment: accepted at IJCNN 201
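The distinction the abstract draws — distortions that calibration can remove versus those it cannot — can be illustrated with a small sketch. Assuming a simplified model (all parameters hypothetical) in which each neuron's effective threshold carries a fixed-pattern offset plus trial-to-trial noise, repeated measurements let a classical calibration routine average out and correct the fixed-pattern component, while the trial-to-trial component survives untouched:

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_trials = 100, 50
target_threshold = 1.0

# Per-neuron fixed-pattern mismatch (removable by calibration)
fixed_pattern = rng.normal(0.0, 0.2, size=n_neurons)

def measure(offsets):
    """One measurement of each neuron's effective threshold; every trial
    adds noise that no static calibration can compensate."""
    return target_threshold + offsets + rng.normal(0.0, 0.05, size=offsets.shape)

# Classical calibration: estimate each offset from repeated measurements
est = np.mean([measure(fixed_pattern) for _ in range(n_trials)], axis=0) \
      - target_threshold
calibrated = fixed_pattern - est  # residual fixed-pattern error, near zero

# A fresh measurement after calibration still fluctuates trial to trial
residual = measure(calibrated) - target_threshold
```

The fixed-pattern spread shrinks by roughly an order of magnitude, but the residual spread stays at the trial-to-trial noise level — the kind of distortion that, per the paper's argument, must be absorbed by the network structure itself rather than by calibration.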