Supervised Learning in Multilayer Spiking Neural Networks
The current article introduces a supervised learning algorithm for multilayer
spiking neural networks. The algorithm overcomes some limitations of existing
learning algorithms: it can be applied to neurons firing multiple spikes and,
in principle, to any linearisable neuron model. The algorithm is applied
successfully to various benchmarks, such as the XOR problem and the Iris data
set, as well as complex classification problems. The simulations also show the
flexibility of this supervised learning algorithm, which permits different
encodings of the spike timing patterns, including precise spike-train encoding.
Comment: 38 pages, 4 figures
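As an illustration of the kind of spike-timing encoding such algorithms rely on, here is a minimal latency-encoding sketch. This is not from the paper; the `latency_encode` helper and its linear time-to-first-spike scheme are illustrative assumptions:

```python
import numpy as np

def latency_encode(x, t_max=100.0):
    """Encode real-valued features as first-spike times:
    larger values fire earlier (hypothetical linear scheme)."""
    x = np.asarray(x, dtype=float)
    # Normalize features to [0, 1], guarding against a constant input.
    rng = x.max() - x.min()
    x_norm = (x - x.min()) / rng if rng > 0 else np.zeros_like(x)
    # Strong inputs -> early spikes, weak inputs -> late spikes.
    return t_max * (1.0 - x_norm)

# Example: a 4-feature sample (e.g. one Iris flower) -> four spike times.
times = latency_encode([5.1, 3.5, 1.4, 0.2])
```

Under this scheme the largest feature fires at time 0 and the smallest at `t_max`, giving one precisely timed input spike per feature.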
SuperSpike: Supervised learning in multi-layer spiking neural networks
The vast majority of computation in the brain is performed by spiking neural
networks. Despite the ubiquity of such spiking, we currently lack an
understanding of how biological spiking neural circuits learn and compute
in vivo, as well as how we can instantiate such capabilities in artificial
spiking circuits in silico. Here we revisit the problem of supervised learning
in temporally coding multi-layer spiking neural networks. First, by using a
surrogate gradient approach, we derive SuperSpike, a nonlinear voltage-based
three factor learning rule capable of training multi-layer networks of
deterministic integrate-and-fire neurons to perform nonlinear computations on
spatiotemporal spike patterns. Second, inspired by recent results on feedback
alignment, we compare the performance of our learning rule under different
credit assignment strategies for propagating output errors to hidden units.
Specifically, we test uniform, symmetric and random feedback, finding that
simpler tasks can be solved with any type of feedback, while more complex tasks
require symmetric feedback. In summary, our results open the door to obtaining
a better scientific understanding of learning and computation in spiking neural
networks by advancing our ability to train them to solve nonlinear problems
involving transformations between different spatiotemporal spike-time patterns.
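To make the surrogate-gradient idea concrete, here is a minimal sketch using a fast-sigmoid surrogate derivative and a toy three-factor-style update. The function names, constants, and single-layer setup are illustrative assumptions, not the paper's actual SuperSpike rule:

```python
import numpy as np

def spike(v, theta=1.0):
    """Hard threshold: emit a spike where membrane potential crosses theta."""
    return (v >= theta).astype(float)

def surrogate_grad(v, theta=1.0, beta=10.0):
    """Smooth stand-in for the (zero-almost-everywhere) derivative of the
    step function; fast-sigmoid form, as common in surrogate-gradient work."""
    return 1.0 / (1.0 + beta * np.abs(v - theta)) ** 2

# Toy three-factor-style weight update for one layer:
# dw ~ (output error) x (surrogate derivative) x (presynaptic trace).
v = np.array([0.2, 0.9, 1.3])          # membrane potentials
pre_trace = np.array([0.5, 0.1, 0.8])  # filtered presynaptic spikes
error = spike(v) - np.array([1.0, 1.0, 0.0])  # actual minus target spikes
dw = -0.1 * error * surrogate_grad(v) * pre_trace
```

The surrogate derivative peaks at the firing threshold, so weights feeding neurons near threshold receive the largest updates, which is what makes hidden-layer credit assignment possible despite the non-differentiable spike.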
CDNA-SNN: A New Spiking Neural Network for Pattern Classification using Neuronal Assemblies
Spiking neural networks (SNNs) mimic their biological counterparts more
closely than their predecessors and are considered the third generation of
artificial neural networks. It has been proven that networks of spiking
neurons have a higher computational capacity and lower power requirements than
sigmoidal neural networks. This paper introduces a new type of spiking neural
network that draws inspiration from, and incorporates concepts of, neuronal
assemblies in the human brain. The proposed network, termed CDNA-SNN, assigns
each neuron learnable values known as Class-Dependent Neuronal Activations
(CDNAs), which indicate the neuron's average relative spiking activity in
response to samples from different classes. A new learning algorithm that
categorizes the neurons into different class assemblies based on their CDNAs
is also presented. These neuronal assemblies are trained via a novel training
method based on Spike-Timing Dependent Plasticity (STDP) to have high activity
for their associated class and a low firing rate for other classes. Also,
using CDNAs, a new type of STDP that controls the amount of plasticity based
on the assemblies of pre- and post-synaptic neurons is proposed. The
performance of CDNA-SNN is evaluated on five datasets from the UCI machine
learning repository, as well as MNIST and Fashion MNIST, using nested
cross-validation for hyperparameter optimization. Our results show that
CDNA-SNN significantly outperforms SWAT (p<0.0005) and SpikeProp (p<0.05) on
3/5, and SRESN (p<0.05) on 2/5, UCI datasets while using significantly fewer
trainable parameters. Furthermore, compared to other supervised, fully
connected SNNs, the proposed SNN reaches the best performance for Fashion
MNIST and comparable performance for MNIST and N-MNIST, while also using far
fewer parameters (1-35%).
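The pairwise STDP building block the abstract refers to can be sketched as follows, with a hypothetical class-dependent gate standing in for the CDNA-based modulation. All names and constants here are illustrative assumptions, not the paper's definitions:

```python
import numpy as np

def stdp_update(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pairwise exponential STDP: potentiate when the presynaptic spike
    precedes the postsynaptic one (dt = t_post - t_pre > 0), else depress."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))

# Hypothetical class-dependent gating in the spirit of the paper:
# scale plasticity by how strongly the pre- and post-synaptic neurons
# belong to the same class assembly (cdna_pre/cdna_post are made up here).
cdna_pre, cdna_post = 0.9, 0.8
dw = cdna_pre * cdna_post * stdp_update(np.array([5.0, -5.0]))
```

Gating the STDP magnitude by assembly membership would strengthen within-assembly synapses while leaving cross-class connections comparatively plastic-inert, matching the high-for-own-class / low-for-other-classes activity the abstract describes.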
Supervised learning in Spiking Neural Networks with Limited Precision: SNN/LP
A new supervised learning algorithm, SNN/LP, is proposed for Spiking Neural
Networks. This novel algorithm uses limited precision for both synaptic
weights and synaptic delays: 3 bits in each case. A genetic algorithm is used
for the supervised training. The results are comparable to or better than
previously published work, and are applicable to the realization of
large-scale hardware neural networks. One of the trained networks is
implemented in programmable hardware.
Comment: 7 pages, originally submitted to IJCNN 201
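To illustrate what 3-bit limited precision means in practice, here is a minimal uniform-quantization sketch. The range [-1, 1] and the `quantize` helper are assumptions for illustration, not the paper's actual scheme:

```python
import numpy as np

def quantize(x, bits=3, lo=-1.0, hi=1.0):
    """Round values onto a uniform grid of 2**bits levels in [lo, hi],
    mimicking limited-precision synaptic weights or delays."""
    levels = 2 ** bits - 1                 # number of steps between levels
    x = np.clip(np.asarray(x, dtype=float), lo, hi)
    step = (hi - lo) / levels
    return lo + np.round((x - lo) / step) * step

w = np.array([-0.93, 0.10, 0.71])
w_q = quantize(w)   # each weight snaps to one of 2**3 = 8 levels
```

With only 8 representable values per weight and per delay, each synapse needs just 6 bits of storage in total, which is what makes large-scale hardware realizations attractive.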