Gibbs Sampling with Low-Power Spiking Digital Neurons
Restricted Boltzmann Machines and Deep Belief Networks have been successfully
used in a wide variety of applications including image classification and
speech recognition. Inference and learning in these algorithms use a Markov
Chain Monte Carlo procedure called Gibbs sampling. A sigmoidal function forms
the kernel of this sampler which can be realized from the firing statistics of
noisy integrate-and-fire neurons on a neuromorphic VLSI substrate. This paper
demonstrates such an implementation on an array of digital spiking neurons with
stochastic leak and threshold properties for inference tasks and presents some
key performance metrics for such a hardware-based sampler in both the
generative and discriminative contexts.
Comment: Accepted at ISCAS 201
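The sigmoidal Gibbs kernel described in the abstract can be sketched in software. A minimal NumPy example of alternating Gibbs sampling in a small RBM, where the sigmoid gives the firing probability that the hardware realizes through noisy integrate-and-fire neurons; all sizes and weights here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small RBM: 6 visible and 4 hidden binary units
# (illustrative sizes, not from the paper).
n_v, n_h = 6, 4
W = rng.normal(0.0, 0.1, size=(n_v, n_h))  # visible-to-hidden weights
b_v = np.zeros(n_v)                        # visible biases
b_h = np.zeros(n_h)                        # hidden biases

def sigmoid(x):
    # The sigmoidal kernel of the Gibbs sampler; in the hardware
    # implementation this probability comes from the firing statistics
    # of noisy integrate-and-fire neurons.
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    # One block Gibbs update: sample hidden units given visible units,
    # then resample visible units given the new hidden states.
    h = (rng.random(n_h) < sigmoid(v @ W + b_h)).astype(float)
    v = (rng.random(n_v) < sigmoid(h @ W.T + b_v)).astype(float)
    return v, h

# Run the chain from a random binary visible state.
v = rng.integers(0, 2, n_v).astype(float)
for _ in range(100):
    v, h = gibbs_step(v)
```

After many steps, `v` is approximately a sample from the model's marginal over visible units; this is the sampling loop that the paper maps onto digital spiking neurons.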
Improving classification accuracy of feedforward neural networks for spiking neuromorphic chips
Deep Neural Networks (DNNs) achieve human-level performance in many image
analytics tasks, but DNNs are mostly deployed to GPU platforms that consume a
considerable amount of power. New hardware platforms using lower precision
arithmetic achieve drastic reductions in power consumption. More recently,
brain-inspired spiking neuromorphic chips have achieved even lower power
consumption, on the order of milliwatts, while still offering real-time
processing.
However, to deploy DNNs to energy-efficient neuromorphic chips, the
incompatibility between the continuous neurons and synaptic weights of
traditional DNNs and the discrete spiking neurons and synapses of neuromorphic
chips needs to be overcome. Previous work has achieved this by training a
network to learn continuous probabilities before deploying it to a neuromorphic
architecture, such as the IBM TrueNorth Neurosynaptic System, by randomly
sampling these probabilities.
The main contribution of this paper is a new learning algorithm that learns a
TrueNorth configuration ready for deployment. We achieve this by directly
training a binary hardware crossbar that accommodates the TrueNorth axon
configuration constraints, and we propose a different neuron model.
Results of our approach trained on electroencephalogram (EEG) data show a
significant improvement over previous work (86% vs. 76% accuracy) while
maintaining state-of-the-art performance on the MNIST handwritten digit data set.
Comment: IJCAI-2017. arXiv admin note: text overlap with arXiv:1605.0774
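The previous-work deployment step described above (drawing a discrete chip configuration from learned continuous probabilities) can be sketched in a few lines. This is a minimal NumPy illustration under assumed names and sizes; the actual TrueNorth crossbar layout and axon constraints are more involved:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical learned connection probabilities in [0, 1] for an
# 8x8 crossbar (illustrative size, not from the paper).
probs = rng.random((8, 8))

# Previous-work style deployment: realize a binary crossbar by
# sampling each synapse independently with its learned probability.
# The paper's contribution avoids this step by training the binary
# crossbar directly.
binary_crossbar = (rng.random(probs.shape) < probs).astype(np.int8)
```

Because each sampled crossbar is only one random realization of the learned probabilities, accuracy can vary between draws; training the binary crossbar directly, as the paper proposes, removes that source of variability.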