Supervised Learning in Spiking Neural Networks with Phase-Change Memory Synapses
Spiking neural networks (SNNs) are artificial computational models inspired by
the brain's ability to naturally encode and process information in the time
domain. The added temporal dimension is believed to render them more
computationally efficient than conventional artificial neural networks, though
their full computational capabilities are yet to be explored. Recently,
computational memory architectures based on non-volatile memory crossbar arrays
have shown great promise for implementing parallel computations in artificial
and spiking neural networks. In this work, we experimentally demonstrate, for
the first time, the feasibility of realizing high-performance event-driven
in-situ supervised learning systems using nanoscale and stochastic phase-change
synapses. Our SNN is trained to recognize audio signals of letters of the
alphabet, encoded using spikes in the time domain, and to generate spike trains
at precise time instances representing the pixel intensities of the
corresponding images. Moreover, using a statistical model that captures the
experimental behavior of the devices, we investigate architectural and
system-level solutions for improving the training and inference performance of
our computational-memory-based system. By combining the computational potential
of supervised SNNs with the parallel compute power of computational memory,
this work paves the way for the next generation of efficient brain-inspired
systems.
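To make the event-driven learning scheme concrete, the following is a minimal, assumption-laden sketch, not the authors' implementation: a leaky integrate-and-fire layer is trained to emit spikes at target times, with weight updates applied as quantized, probabilistic conductance increments to mimic stochastic phase-change programming. All names, values, and the delta-rule-style update are illustrative assumptions.

```python
# Sketch only: event-driven supervised learning with noisy, quantized synapses
# standing in for stochastic phase-change memory (PCM) devices.
import numpy as np

rng = np.random.default_rng(0)
T, N_IN, N_OUT = 100, 50, 10             # time steps, input/output neurons
W = rng.normal(0, 0.1, (N_OUT, N_IN))    # synaptic weights (device conductances)

def lif_forward(spikes_in, W, tau=20.0, v_th=1.0):
    """Leaky integrate-and-fire layer; returns the output spike raster."""
    v = np.zeros(W.shape[0])
    out = np.zeros((T, W.shape[0]))
    for t in range(T):
        v = v * np.exp(-1.0 / tau) + W @ spikes_in[t]
        fired = v >= v_th
        out[t] = fired
        v[fired] = 0.0                   # reset membrane after a spike
    return out

# Toy data: random input spike trains and desired output spikes at precise times
x = (rng.random((T, N_IN)) < 0.05).astype(float)
y_target = (rng.random((T, N_OUT)) < 0.05).astype(float)

eta, g_step = 0.05, 0.02                 # learning rate, conductance granularity
for epoch in range(200):
    y = lif_forward(x, W)
    err = y_target - y                   # +1: missing spike, -1: spurious spike
    dW = eta * err.T @ x                 # event-driven, delta-rule-style update
    # Emulate stochastic PCM programming: quantized steps applied probabilistically
    p = np.clip(np.abs(dW) / g_step, 0, 1)
    W += np.sign(dW) * g_step * (rng.random(W.shape) < p)

print("remaining spike errors:", int(np.abs(y_target - lif_forward(x, W)).sum()))
```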
Inherent Weight Normalization in Stochastic Neural Networks
Multiplicative stochasticity, such as Dropout, improves the robustness and
generalizability of deep neural networks. Here, we further demonstrate that
always-on multiplicative stochasticity combined with simple threshold neurons
is sufficient to build deep neural networks. We call such models Neural
Sampling Machines (NSMs). We find that the probability of activation of the NSM
exhibits a self-normalizing property that mirrors Weight Normalization, a
previously studied mechanism that fulfills many of the features of Batch
Normalization in an online fashion. The normalization of activities during
training speeds up convergence by preventing the internal covariate shift
caused by changes in the input distribution. The always-on stochasticity of the
NSM confers the following advantages: the network is identical in the inference
and learning phases, making the NSM suitable for online learning; it can
exploit stochasticity inherent to a physical substrate, such as analog
non-volatile memories, for in-memory computing; and it is suitable for Monte
Carlo sampling, while requiring almost exclusively addition and comparison
operations. We demonstrate NSMs on standard classification benchmarks (MNIST
and CIFAR) and event-based classification benchmarks (N-MNIST and DVS Gestures).
Our results show that NSMs perform comparably to or better than conventional
artificial neural networks with the same architecture.
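The self-normalizing property can be illustrated with a single NSM-style unit: a threshold neuron whose weighted inputs are multiplied by always-on Bernoulli noise. In the sketch below (an illustrative assumption, not the paper's code), a Gaussian approximation of the firing probability shows the weights entering only through a normalized drive, which is the weight-normalization effect the abstract refers to; all parameter names are made up here.

```python
# Sketch: Monte Carlo firing probability of a noisy threshold neuron versus a
# Gaussian (CLT) approximation in which weights appear only in normalized form.
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
n, p_keep = 256, 0.5                     # fan-in, Bernoulli keep probability
w = rng.normal(0, 1.0, n)                # weights
x = rng.random(n)                        # input activities

def nsm_sample(w, x, n_samples=20000):
    """Monte Carlo estimate of P(neuron fires) under multiplicative noise."""
    xi = rng.random((n_samples, w.size)) < p_keep  # always-on Bernoulli masks
    z = (xi * w * x).sum(axis=1)                   # noisy pre-activation
    return (z > 0).mean()                          # simple threshold neuron

# With z = sum_i xi_i w_i x_i, E[z] = p*u and Var[z] = p(1-p) * ||w*x||^2,
# so P(z > 0) ~ Phi(p*u / sigma): the drive is divided by the norm of the
# effective weights, mirroring Weight Normalization.
u = (w * x).sum()
sigma = sqrt(p_keep * (1 - p_keep) * ((w * x) ** 2).sum())
p_analytic = 0.5 * (1 + erf(p_keep * u / (sigma * sqrt(2))))

print(f"Monte Carlo P(fire) = {nsm_sample(w, x):.3f}, "
      f"normalized-form approximation = {p_analytic:.3f}")
```

Note that the same stochastic forward pass is used whether the unit is learning or inferring, which is why the NSM needs no separate inference-time network.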
Neural Sampling Machine with Stochastic Synapse allows Brain-like Learning and Inference
Many real-world mission-critical applications require continual online
learning from noisy data and real-time decision making with a defined
confidence level. Probabilistic models and stochastic neural networks can
explicitly handle uncertainty in data and allow adaptive learning on the fly,
but their implementation in a low-power substrate remains a challenge. Here, we
introduce a novel hardware fabric that implements a new class of stochastic
neural network, the Neural Sampling Machine (NSM), which exploits stochasticity
in synaptic connections for approximate Bayesian inference. Harnessing the
inherent non-linearities and stochasticity occurring at the atomic level in
emerging materials and devices allows us to capture the synaptic stochasticity
found at the molecular level in biological synapses. We experimentally
demonstrate a hybrid stochastic synapse in silico by pairing a ferroelectric
field-effect transistor (FeFET)-based analog weight cell with a two-terminal
stochastic selector element. Such a stochastic synapse can be integrated within
the well-established crossbar array architecture for compute-in-memory. We
experimentally show that the inherent stochastic switching of the selector
element between the insulating and metallic states introduces a multiplicative
stochastic noise within the synapses of the NSM that samples the conductance
states of the FeFET, both during learning and inference. We perform
network-level simulations to highlight the salient automatic weight
normalization feature introduced by the stochastic synapses of the NSM, which
paves the way for continual online learning without any offline Batch
Normalization. We also showcase the Bayesian inference capability introduced by
the stochastic synapse during inference mode, thus accounting for uncertainty
in data. We report 98.25% accuracy on a standard image classification task, as
well as estimation of data uncertainty on rotated samples.
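A rough behavioral model of the hybrid synapse, under loudly stated assumptions and not the paper's device model, is an analog FeFET conductance gated by a selector that stochastically switches between an insulating (off) and metallic (on) state, i.e. multiplicative Bernoulli noise on a crossbar weight. Repeating the stochastic forward pass then yields a Monte Carlo spread over predictions, in the spirit of the approximate Bayesian inference described above; the on-probability and all dimensions below are invented for illustration.

```python
# Sketch: crossbar column reads with stochastic selectors; the spread of the
# per-pass softmax outputs serves as a predictive-uncertainty estimate.
import numpy as np

rng = np.random.default_rng(2)
N_IN, N_OUT, P_ON = 64, 4, 0.6           # fan-in, classes, selector on-probability

G = np.abs(rng.normal(0.5, 0.2, (N_OUT, N_IN)))  # FeFET conductances (weights)
x = rng.random(N_IN)                             # input activities

def stochastic_forward(G, x, n_passes=500):
    """Repeated crossbar reads; each pass draws fresh selector states."""
    scores = np.empty((n_passes, G.shape[0]))
    for k in range(n_passes):
        sel = rng.random(G.shape) < P_ON         # selector: metallic or insulating
        scores[k] = (G * sel) @ x                # column currents (dot products)
    return scores

scores = stochastic_forward(G, x)
probs = np.exp(scores - scores.max(axis=1, keepdims=True))  # stable softmax
probs /= probs.sum(axis=1, keepdims=True)
print("mean prediction:        ", probs.mean(axis=0).round(3))
print("predictive uncertainty: ", probs.std(axis=0).round(3))
```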
Accelerated physical emulation of Bayesian inference in spiking neural networks
The massively parallel nature of biological information processing plays an
important role in its superiority over human-engineered computing devices. In
particular, it may hold the key to overcoming the von Neumann bottleneck that
limits contemporary computer architectures. Physical-model neuromorphic devices
seek to replicate not only this inherent parallelism, but also aspects of its
microscopic dynamics in analog circuits emulating neurons and synapses.
However, these machines require network models that are not only adept at
solving particular tasks, but that can also cope with the inherent
imperfections of analog substrates. We present a spiking network model that
performs Bayesian inference through sampling on the BrainScaleS neuromorphic
platform, where we use it for generative and discriminative computations on
visual data. By illustrating its functionality on this platform, we implicitly
demonstrate its robustness to various substrate-specific distortive effects, as
well as its accelerated capability for computation. These results showcase the
advantages of brain-inspired physical computation and provide important
building blocks for large-scale neuromorphic applications.
Comment: This preprint was published on 14 November 2019. Please cite as:
Kungl A. F. et al. (2019) Accelerated Physical Emulation of Bayesian
Inference in Spiking Neural Networks. Front. Neurosci. 13:1201. doi:
10.3389/fnins.2019.01201
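As a software stand-in for the sampling computation described in this abstract (an assumption-laden sketch, not the BrainScaleS implementation), spike-based Bayesian inference of this kind is commonly framed as drawing samples from a Boltzmann distribution p(z) ∝ exp(½ zᵀWz + bᵀz) over binary neuron states. The Gibbs sampler below plays the role that the accelerated analog spiking dynamics play on the hardware; the network size and parameters are arbitrary.

```python
# Sketch: Gibbs sampling from a Boltzmann distribution as a stand-in for
# spike-based neural sampling on a physical-model neuromorphic substrate.
import numpy as np

rng = np.random.default_rng(3)
N = 8
W = rng.normal(0, 0.8, (N, N))
W = (W + W.T) / 2                         # symmetric couplings
np.fill_diagonal(W, 0.0)                  # no self-connections
b = rng.normal(0, 0.5, N)                 # biases

def gibbs_sample(W, b, n_steps=50000, burn_in=5000):
    """Sample binary states z; each neuron flips with logistic probability."""
    z = rng.integers(0, 2, W.shape[0]).astype(float)
    samples = []
    for t in range(n_steps):
        i = rng.integers(W.shape[0])
        u = W[i] @ z + b[i]               # local "membrane potential"
        z[i] = float(rng.random() < 1 / (1 + np.exp(-u)))
        if t >= burn_in:
            samples.append(z.copy())
    return np.array(samples)

samples = gibbs_sample(W, b)
print("sampled marginals P(z_i = 1):", samples.mean(axis=0).round(3))
```

Clamping a subset of the units to observed data and reading out the rest corresponds to the discriminative mode, while letting all units run free corresponds to the generative mode mentioned in the abstract.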