Efficient Computation in Adaptive Artificial Spiking Neural Networks
Artificial Neural Networks (ANNs) are bio-inspired models of neural computation that have proven highly effective. Still, ANNs lack a natural notion of time, and neural units in ANNs exchange analog values in a frame-based manner, a computationally and energetically inefficient form of communication. This contrasts sharply with biological neurons that communicate sparingly and efficiently using binary spikes. While artificial Spiking Neural Networks (SNNs) can be constructed by replacing the units of an ANN with spiking neurons, their current performance is far from that of deep ANNs on hard benchmarks, and these SNNs use much higher firing rates than their biological counterparts, limiting their efficiency. Here we show how spiking neurons that employ an efficient form of neural coding can be used to construct SNNs that match high-performance ANNs and exceed the state-of-the-art in SNNs on important benchmarks, while requiring much lower average firing rates. For this, we use spike-time coding based on the firing-rate-limiting adaptation phenomenon observed in biological spiking neurons. This phenomenon can be captured in adapting spiking neuron models, for which we derive the effective transfer function. Neural units in ANNs trained with this transfer function can be substituted directly with adaptive spiking neurons, and the resulting Adaptive SNNs (AdSNNs) can carry out inference in deep neural networks using up to an order of magnitude fewer spikes compared to previous SNNs. Adaptive spike-time coding additionally allows for the dynamic control of neural coding precision: we show how a simple model of arousal in AdSNNs further halves the average required firing rate, and this notion naturally extends to other forms of attention. AdSNNs thus hold promise as a novel and efficient model for neural computation that naturally fits temporally continuous and asynchronous applications.
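To make the adaptation mechanism concrete, here is a minimal Python sketch of a generic adaptive-threshold spiking neuron: each spike raises the firing threshold, which then decays back, so sustained input yields a bounded, adapting firing rate. All parameter names and values (tau_v, tau_theta, theta0, beta) are illustrative assumptions, not the paper's Adaptive SNN model or its derived transfer function.

```python
import numpy as np

def simulate_adaptive_neuron(inputs, dt=1.0, tau_v=10.0, tau_theta=50.0,
                             theta0=1.0, beta=0.5):
    """Generic adaptive-threshold spiking neuron (illustrative parameters)."""
    v = 0.0         # membrane potential
    theta = theta0  # dynamic firing threshold
    spikes = []
    for x in inputs:
        v += dt / tau_v * (x - v)                   # leaky integration of input
        theta += dt / tau_theta * (theta0 - theta)  # threshold relaxes back
        if v >= theta:
            spikes.append(1)
            v -= theta     # soft reset: subtract the threshold just crossed
            theta += beta  # spike-triggered adaptation limits the firing rate
        else:
            spikes.append(0)
    return np.array(spikes)

# Sweep constant inputs to approximate an effective transfer function
# (mean firing rate per input level), which comes out ReLU-like.
levels = np.linspace(0.0, 3.0, 7)
rates = [simulate_adaptive_neuron(np.full(2000, x)).mean() for x in levels]
```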
Gating sensory noise in a spiking subtractive LSTM
Spiking neural networks are being investigated both as biologically plausible models of neural computation and as a potentially more efficient type of neural network. Recurrent neural networks in the form of networks of gating memory cells have been central to state-of-the-art solutions in problem domains that involve sequence recognition or generation. Here, we design an analog Long Short-Term Memory (LSTM) cell whose neurons can be substituted with efficient spiking neurons, using subtractive gating (following the subLSTM in [1]) instead of multiplicative gating. Subtractive gating allows for a less sensitive gating mechanism, which is critical when using spiking neurons. By using fast-adapting spiking neurons with a smoothed Rectified Linear Unit (ReLU)-like effective activation function, we show that an accurate conversion from an analog subLSTM to a continuous-time spiking subLSTM is possible. This architecture results in memory networks that compute very efficiently, with low average firing rates comparable to those of biological neurons, while operating in continuous time.
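As a rough illustration of subtractive gating, below is a minimal NumPy sketch of an analog subLSTM-style cell in which the input and output gates subtract from, rather than multiply, the gated signals, while the forget gate remains multiplicative, following the subLSTM formulation of [1] as summarized above. The weight names (Wx, Wh, b) and initialization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SubLSTMCell:
    """Analog subLSTM-style cell: gating by subtraction, not multiplication."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        # Stacked weights for the input transform z and the gates i, f, o.
        self.Wx = rng.normal(0.0, 0.1, (4 * n_hidden, n_in))
        self.Wh = rng.normal(0.0, 0.1, (4 * n_hidden, n_hidden))
        self.b = np.zeros(4 * n_hidden)

    def step(self, x, h, c):
        pre = self.Wx @ x + self.Wh @ h + self.b
        z, i, f, o = np.split(sigmoid(pre), 4)  # everything passes a sigmoid
        c = f * c + z - i   # input gate subtracts instead of multiplying
        h = sigmoid(c) - o  # output gate is subtractive as well
        return h, c

# Usage: step the cell over a short random sequence.
cell = SubLSTMCell(n_in=3, n_hidden=5)
h, c = np.zeros(5), np.zeros(5)
for x in np.random.default_rng(1).normal(size=(10, 3)):
    h, c = cell.step(x, h, c)
```

Because every signal stays in the sigmoid's bounded range and gating only shifts it additively, small spike-driven fluctuations in the gate values perturb the cell state far less than they would under multiplicative gating, which is what makes the spiking substitution workable.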
Sparse computation in adaptive spiking neural networks
Artificial Neural Networks (ANNs) are bio-inspired models of neural computation that have proven highly effective. Still, ANNs lack a natural notion of time, and neural units in ANNs exchange analog values in a frame-based manner, a computationally and energetically inefficient form of communication. This contrasts sharply with biological neurons that communicate sparingly and efficiently using isomorphic binary spikes. While Spiking Neural Networks (SNNs) can be constructed by replacing the units of an ANN with spiking neurons (Cao et al., 2015; Diehl et al., 2015) to obtain reasonable performance, these SNNs use Poisson spiking mechanisms with exceedingly high firing rates compared to their biological counterparts. Here we show how spiking neurons that employ a form of neural coding can be used to construct SNNs that match high-performance ANNs and match or exceed the state-of-the-art in SNNs on important benchmarks, while requiring firing rates compatible with biological findings. For this, we use spike-based coding based on the firing-rate-limiting adaptation phenomenon observed in biological spiking neurons. This phenomenon can be captured in fast-adapting spiking neuron models, for which we derive the effective transfer function. Neural units in ANNs trained with this transfer function can be substituted directly with adaptive spiking neurons, and the resulting Adaptive SNNs (AdSNNs) can carry out competitive classification in deep neural networks without further modifications. Adaptive spike-based coding additionally allows for the dynamic control of neural coding precision: we show empirically how a simple model of arousal in AdSNNs further halves the average required firing rate, and this notion naturally extends to other forms of attention as studied in neuroscience. AdSNNs thus hold promise as a novel and sparsely active model for neural computation that naturally fits temporally continuous and asynchronous applications.
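The arousal mechanism is described here only at a high level; the following hypothetical Python sketch captures the general idea under stated assumptions: run inference at coarse coding precision (and thus low firing rates) by default, and re-run at full precision only when the decision is uncertain. The function forward_pass, the precision parameter, and the margin-based uncertainty test are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def classify_with_arousal(forward_pass, x, low=0.5, high=1.0, margin=0.2):
    """Hypothetical arousal loop: spend spikes only on uncertain inputs.

    forward_pass(x, precision) stands in for an AdSNN inference run that
    returns class scores; higher precision means finer spike-time coding
    and therefore higher firing rates. All names and values are assumed.
    """
    scores = forward_pass(x, low)       # cheap, low-precision first pass
    top2 = np.sort(scores)[-2:]
    if top2[1] - top2[0] < margin:      # small decision margin: 'arouse'
        scores = forward_pass(x, high)  # re-run at full coding precision
    return int(np.argmax(scores))
```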