Supervised Learning in Spiking Neural Networks with Phase-Change Memory Synapses
Spiking neural networks (SNN) are artificial computational models that have
been inspired by the brain's ability to naturally encode and process
information in the time domain. The added temporal dimension is believed to
render them more computationally efficient than conventional artificial
neural networks, though their full computational capabilities are yet to be
explored. Recently, computational memory architectures based on non-volatile
memory crossbar arrays have shown great promise to implement parallel
computations in artificial and spiking neural networks. In this work, we
experimentally demonstrate, for the first time, the feasibility of realizing
high-performance, event-driven, in-situ supervised learning systems using
nanoscale, stochastic phase-change synapses. Our SNN is trained to recognize
audio signals of alphabets, encoded as spikes in the time domain, and to
generate spike trains at precise time instances to represent the pixel
intensities of their corresponding images. Moreover, with a statistical model
capturing the experimental behavior of the devices, we investigate
architectural and systems-level solutions for improving the training and
inference performance of our computational memory-based system. Combining the
computational potential of supervised SNNs with the parallel compute power of
computational memory, this work paves the way for the next generation of
efficient brain-inspired systems.
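As a rough illustration of the kind of event-driven, error-driven learning described above, the sketch below trains a small layer of leaky integrate-and-fire neurons whose weights live on a crossbar of bounded conductances programmed with noisy pulses. All sizes, constants, and the update rule itself are illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and device parameters (assumptions, not the paper's).
N_IN, N_OUT, T = 16, 4, 100      # inputs, outputs, time steps
G_MIN, G_MAX = 0.0, 1.0          # PCM conductance bounds
DG_MEAN, DG_STD = 0.02, 0.01     # stochastic per-pulse conductance change
TAU, V_TH = 10.0, 1.0            # LIF leak time constant and firing threshold

G = rng.uniform(G_MIN, G_MAX, (N_OUT, N_IN))  # crossbar conductances

def lif_forward(spikes_in):
    """Leaky integrate-and-fire dynamics over T steps; returns spike raster."""
    v = np.zeros(N_OUT)
    out = np.zeros((T, N_OUT), dtype=bool)
    for t in range(T):
        v += -v / TAU + G @ spikes_in[t]
        fired = v >= V_TH
        out[t] = fired
        v[fired] = 0.0                        # reset membrane after a spike
    return out

def train_step(spikes_in, target):
    """Error-driven update (hypothetical rule): potentiate synapses of
    neurons that missed a target spike, depress those that fired
    spuriously. Each programming pulse has a noisy magnitude, mimicking
    PCM write stochasticity."""
    global G
    err = target.astype(float) - lif_forward(spikes_in).astype(float)
    for t in range(T):
        dG = np.abs(rng.normal(DG_MEAN, DG_STD, G.shape))  # device noise
        G += np.outer(err[t], spikes_in[t].astype(float)) * dG
    np.clip(G, G_MIN, G_MAX, out=G)           # bounded conductance range

# Toy task: random input spikes; each output neuron must fire at one
# precise time instant (cf. representing pixel intensities as spike times).
spikes_in = rng.random((T, N_IN)) < 0.2
target = np.zeros((T, N_OUT), dtype=bool)
target[np.arange(10, 90, 20), np.arange(N_OUT)] = True
for _ in range(50):
    train_step(spikes_in, target)
```

The bounded, noisy increments are what distinguish this from an ideal-weight update: on a real phase-change device, each pulse moves the conductance by an amount that cannot be chosen exactly, which is the behavior the paper's statistical device model captures.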
Multi-scale Evolutionary Neural Architecture Search for Deep Spiking Neural Networks
Spiking Neural Networks (SNNs) have received considerable attention not only
for their superior energy efficiency with discrete signal processing, but
also for their natural suitability to integrate multi-scale biological
plasticity. However, most SNNs directly adopt the structure of
well-established DNNs, and Neural Architecture Search (NAS) is rarely applied
to automatically design architectures for SNNs. The neural motif topology,
modular regional structure, and global cross-brain-region connectivity of the
human brain are the product of
natural evolution and can serve as a perfect reference for designing
brain-inspired SNN architecture. In this paper, we propose a Multi-Scale
Evolutionary Neural Architecture Search (MSE-NAS) for SNN, simultaneously
considering micro-, meso- and macro-scale brain topologies as the evolutionary
search space. MSE-NAS evolves individual neuron operation, self-organized
integration of multiple circuit motifs, and global connectivity across motifs
through a brain-inspired indirect evaluation function, Representational
Dissimilarity Matrices (RDMs). This training-free fitness function could
greatly reduce the computational cost and search time of NAS, and its
task-independent property enables the searched SNNs to exhibit excellent
transferability and scalability. Extensive experiments demonstrate that the
proposed algorithm achieves state-of-the-art (SOTA) performance with shorter
simulation steps on static datasets (CIFAR10, CIFAR100) and neuromorphic
datasets (CIFAR10-DVS and DVS128-Gesture). The thorough analysis also
illustrates the significant performance improvement and consistent
bio-interpretability derived from the topological evolution at different
scales and from the RDM fitness function.
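To make the RDM-based fitness concrete, here is a minimal sketch of how such a training-free score could be computed: build each network's Representational Dissimilarity Matrix as pairwise (1 - Pearson correlation) between stimulus-evoked activation patterns, then score a candidate by the Spearman correlation between the upper triangles of its RDM and a reference RDM. This follows the standard representational similarity analysis recipe; the function names, toy data, and choice of reference are assumptions, and the paper's exact fitness may differ.

```python
import numpy as np
from scipy.stats import spearmanr

def rdm(activations):
    """RDM: pairwise (1 - Pearson r) between the activation patterns
    evoked by each stimulus. activations: (n_stimuli, n_features)."""
    return 1.0 - np.corrcoef(activations)

def rdm_fitness(candidate_acts, reference_rdm):
    """Training-free fitness (a sketch): Spearman correlation between
    the upper triangles of the candidate RDM and a reference RDM."""
    cand = rdm(candidate_acts)
    iu = np.triu_indices_from(cand, k=1)
    rho, _ = spearmanr(cand[iu], reference_rdm[iu])
    return rho

# Toy usage: in MSE-NAS the reference would come from brain data or a
# trusted model; here both sides are random stand-ins (an assumption).
rng = np.random.default_rng(0)
reference = rdm(rng.standard_normal((20, 64)))    # responses to 20 stimuli
candidate = rng.standard_normal((20, 128))        # untrained SNN responses
print(rdm_fitness(candidate, reference))
```

Because the score needs only forward-pass activations on a fixed stimulus set, candidates can be ranked without training them, which is where the reduction in search cost comes from.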
Adaptive Sparse Structure Development with Pruning and Regeneration for Spiking Neural Networks
Spiking Neural Networks (SNNs) are more biologically plausible and
computationally efficient than conventional artificial neural networks. SNNs
therefore have a natural advantage in drawing on the sparse structural
plasticity of brain development to alleviate the energy problems that deep
neural networks incur from their complex, fixed structures. However, previous
SNN compression works lack in-depth inspiration from the brain's
developmental plasticity mechanisms. This paper proposes a novel method for
the adaptive structural development of SNNs (SD-SNN), introducing dendritic
spine plasticity-based synaptic constraint,
neuronal pruning, and synaptic regeneration. We find that synaptic constraint
and neuronal pruning detect and remove a large amount of redundancy in SNNs,
while synaptic regeneration effectively prevents and repairs over-pruning.
Moreover, inspired by the neurotrophic hypothesis, the neuronal pruning rate
and synaptic regeneration rate are adaptively adjusted during the
learning-while-pruning process, which eventually leads to the structural
stability of SNNs. Experimental results on spatial (MNIST, CIFAR-10) and
temporal neuromorphic (N-MNIST, DVS-Gesture) datasets demonstrate that our
method can flexibly learn an appropriate compression rate for various tasks and
effectively achieve superior performance while massively reducing the network
energy consumption. Specifically, for the spatial MNIST dataset, our SD-SNN
achieves 99.51% accuracy at a pruning rate of 49.83%, a 0.05% accuracy
improvement over the uncompressed baseline. For the neuromorphic DVS-Gesture
dataset, our method achieves 98.20% accuracy, a 1.09% improvement, at a
compression rate of 55.50%.
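A minimal sketch of one learning-while-pruning step, under assumed mechanics: magnitude-based pruning stands in for the synaptic constraint and neuronal pruning, random re-initialization of pruned positions stands in for synaptic regeneration, and the two rates adapt with a simple accuracy-feedback rule loosely echoing the neurotrophic hypothesis. All function names and constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_and_regenerate(W, prune_rate, regen_rate):
    """One step of the sketch: zero out the smallest-magnitude weights,
    then re-initialize a fraction of the pruned positions (regeneration)."""
    flat = np.abs(W).ravel()
    k = int(prune_rate * flat.size)
    if k > 0:
        thresh = np.partition(flat, k - 1)[k - 1]
        W[np.abs(W) <= thresh] = 0.0
    pruned = np.flatnonzero(W.ravel() == 0.0)
    n_regen = int(regen_rate * pruned.size)
    revived = rng.choice(pruned, size=n_regen, replace=False)
    W.ravel()[revived] = rng.normal(0.0, 0.01, n_regen)  # new weak synapses
    return W

def adapt_rates(prune_rate, regen_rate, acc_delta):
    """Neurotrophic-style adaptation (an assumed rule): if accuracy drops,
    prune less and regenerate more; if accuracy holds, prune more."""
    if acc_delta < 0:
        prune_rate *= 0.9
        regen_rate *= 1.1
    else:
        prune_rate *= 1.05
        regen_rate *= 0.95
    return min(prune_rate, 0.9), min(regen_rate, 0.9)

# Toy loop over one weight matrix; acc_delta would come from evaluation.
W = rng.standard_normal((128, 64))
prune_rate, regen_rate = 0.2, 0.1
for epoch in range(5):
    W = prune_and_regenerate(W, prune_rate, regen_rate)
    acc_delta = rng.normal(0.0, 0.01)   # stand-in for the measured change
    prune_rate, regen_rate = adapt_rates(prune_rate, regen_rate, acc_delta)
```

The feedback coupling is the point: pruning pressure rises while the network tolerates it and relaxes when accuracy degrades, so the sparsity level settles to a task-appropriate value rather than being fixed in advance.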
- …