Spiking neural networks (SNNs) have demonstrated excellent capabilities in a
variety of intelligent tasks. Most existing methods for training SNNs are
based on the concept of synaptic plasticity; however, learning in real
brains also utilizes intrinsic non-synaptic mechanisms of neurons. The spike
threshold of biological neurons is a critical intrinsic neuronal feature that
exhibits rich dynamics on a millisecond timescale and has been proposed as an
underlying mechanism that facilitates neural information processing. In this
study, we develop a novel synergistic learning approach that simultaneously
trains synaptic weights and spike thresholds in SNNs. SNNs trained with
synapse-threshold synergistic learning (STL-SNNs) achieve significantly higher
accuracies on various static and neuromorphic datasets than SNNs trained with
either of the two single-mechanism models, synaptic learning (SL) or threshold
learning (TL). During training, the synergistic learning approach optimizes
neural thresholds, providing the network with stable signal transmission via
appropriate firing rates. Further analysis indicates that STL-SNNs are robust
to noisy data and exhibit low energy consumption for deep network structures.
Additionally, the performance of STL-SNNs can be further improved by introducing
a generalized joint decision framework (JDF). Overall, our findings indicate
that biologically plausible synergies between synaptic and intrinsic
non-synaptic mechanisms may provide a promising approach for developing highly
efficient SNN learning methods.
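
As an illustration only (the abstract does not specify the implementation): below is a minimal PyTorch-style sketch of what synapse-threshold synergistic learning could look like, assuming a leaky integrate-and-fire (LIF) neuron with a learnable per-neuron spike threshold and a rectangular surrogate gradient. All class names, hyperparameters, and the reset rule are illustrative assumptions, not the paper's actual method.

```python
# Sketch (assumed, not from the paper): a LIF layer whose synaptic weights AND
# firing thresholds are both trainable, so a single optimizer updates them jointly.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient (an assumed choice)."""

    @staticmethod
    def forward(ctx, v_minus_theta):
        ctx.save_for_backward(v_minus_theta)
        return (v_minus_theta > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_theta,) = ctx.saved_tensors
        # Pass gradients only near the threshold crossing.
        surrogate = (v_minus_theta.abs() < 0.5).float()
        return grad_output * surrogate


class LIFWithLearnableThreshold(nn.Module):
    """LIF layer where both weights (SL part) and thresholds (TL part) are optimized."""

    def __init__(self, in_features, out_features, tau=2.0, init_threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)  # synaptic weights
        self.threshold = nn.Parameter(
            torch.full((out_features,), init_threshold)  # learnable spike thresholds
        )
        self.tau = tau

    def forward(self, x_seq):
        # x_seq: (time_steps, batch, in_features)
        v = torch.zeros(x_seq.shape[1], self.fc.out_features, device=x_seq.device)
        spikes = []
        for x_t in x_seq:
            v = v + (self.fc(x_t) - v) / self.tau        # leaky integration
            s = SurrogateSpike.apply(v - self.threshold)  # spike if v exceeds threshold
            v = v * (1.0 - s)                             # hard reset after a spike
            spikes.append(s)
        return torch.stack(spikes)


# Usage: one optimizer updates synaptic weights and thresholds together.
layer = LIFWithLearnableThreshold(100, 10)
optimizer = torch.optim.Adam(layer.parameters(), lr=1e-3)
```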