
    Ten thousand times faster: Classifying multidimensional data on a spiking neuromorphic hardware system.

    Discrimination of sensory inputs is a computational task that biological neuronal systems perform very efficiently. Assessing the principles underlying those systems is a promising approach to developing technical solutions for many problems, such as data classification. A particular problem here is to train a classifier in a supervised fashion to discriminate classes in multidimensional data. We implemented a network of spiking neurons that solves this task on a neuromorphic hardware system, that is, analog neuronal circuits on a silicon substrate. This system enables high-performance computation in a biologically inspired way, with spiking neurons as computational units. In this contribution, we illustrate solutions to technical challenges that occur when implementing a classifier on neuromorphic hardware.

The network topology of the insect olfactory system provides a well-suited template for a neuronal architecture processing multidimensional data. In our classifier network, the value of each dimension of a data vector determines the rate of a stochastically generated spike train. The spike trains are fed into non-overlapping populations of neurons. Those populations project onto an association layer with winner-take-all properties representing the output of the classifier. During classifier training, the weights in this projection are adjusted according to a firing-rate-based learning rule.
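The rate coding and winner-take-all readout described above can be pictured with a short Python sketch. This is a simplified, rate-based abstraction for illustration only: the function names, the perceptron-like update and all parameter values are assumptions and do not reproduce the authors' spiking implementation or the learning rule used on the hardware.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_rates(x, r_max=80.0):
    # Map a data vector (assumed rescaled to [0, 1]) to input firing rates in Hz.
    return r_max * np.clip(x, 0.0, 1.0)

def train_classifier(X, labels, n_classes, epochs=20, eta=0.01):
    # X: (n_samples, n_dims) data in [0, 1]; labels: integer class indices.
    n_dims = X.shape[1]
    W = rng.uniform(0.0, 0.1, size=(n_dims, n_classes))  # input-to-association weights
    for _ in range(epochs):
        for x, y in zip(X, labels):
            rates = encode_rates(x)          # input-layer firing rates
            drive = rates @ W                # drive to the association layer
            winner = int(np.argmax(drive))   # winner-take-all readout
            if winner != y:
                # firing-rate-based update: strengthen the correct class,
                # weaken the wrongly winning class
                W[:, y] += eta * rates
                W[:, winner] -= eta * rates
                np.clip(W, 0.0, 1.0, out=W)  # keep weights in a bounded range
    return W

def classify(W, x):
    return int(np.argmax(encode_rates(x) @ W))
```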

The values in multidimensional data sets are typically real numbers, but neuronal firing rates are restricted to values between zero and some maximal value. Hence, the data must be transformed into a positive, bounded representation. We achieved this by representing each data point as a vector of distances to a number of points in data space (“virtual receptors” [1]). The representation by virtual receptors inevitably introduces correlation between input dimensions. We reduced this correlation using lateral inhibition in the first neuronal layer, leading to a significant increase in classifier performance. We found that decorrelation was most efficient when we scaled the inhibitory weights according to the correlation between the connected populations. 
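The virtual-receptor encoding and the correlation-scaled lateral inhibition can be sketched as follows. This is a stand-in under stated assumptions (a Gaussian distance-to-rate mapping and Pearson correlation between receptor responses); the actual procedure is the one described in the text and in [1].

```python
import numpy as np

def virtual_receptor_rates(x, receptors, r_max=80.0, sigma=1.0):
    # Encode a real-valued data point x as bounded responses of "virtual receptors":
    # each receptor is a point in data space, and closer receptors respond more strongly.
    d = np.linalg.norm(receptors - x, axis=1)   # distances to all receptors
    return r_max * np.exp(-(d / sigma) ** 2)    # rates bounded to [0, r_max]

def lateral_inhibition_weights(responses, w_inh=0.5):
    # Scale inhibitory weights between receptor populations by the correlation of
    # their responses over a data set (responses: n_samples x n_receptors).
    corr = np.corrcoef(responses, rowvar=False)
    np.fill_diagonal(corr, 0.0)                 # no self-inhibition
    return -w_inh * np.clip(corr, 0.0, 1.0)     # inhibit strongly correlated pairs more
```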

We ran our classifier network on a neuromorphic hardware system that operates at ten thousand times the speed of biological neurons and is therefore well suited for high-performance computing [2]. However, the considerable variance of rate-response sensitivity across hardware neurons decreased classification performance. We therefore developed a calibration routine to counteract this neuronal variance.
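In outline, such a calibration could look like the sketch below. The callback `measure_output_rates` is a hypothetical placeholder for one emulation run on the hardware, and the reference rate, trial count and the choice of aligning every neuron to the mean response are illustrative assumptions, not the authors' routine. The resulting factors would then be applied, for example, to the synaptic weights or input rates targeting each neuron before training the classifier.

```python
import numpy as np

def calibrate_rate_response(measure_output_rates, reference_rate=30.0, n_trials=10):
    # Estimate per-neuron correction factors that equalize the rate response
    # across hardware neurons.
    #   measure_output_rates(input_rate) -> (n_neurons,) observed output rates
    # (hypothetical callback wrapping one hardware emulation; not a real API)
    observed = np.mean(
        [measure_output_rates(reference_rate) for _ in range(n_trials)], axis=0
    )
    target = observed.mean()                     # align all neurons to the mean response
    scale = target / np.maximum(observed, 1e-9)  # per-neuron multiplicative correction
    return np.clip(scale, 0.1, 10.0)             # limit corrections to a sensible range
```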

References

[1] Schmuker, M. and Schneider, G. (2007). Processing and classification of chemical data inspired by insect olfaction. Proc. Natl. Acad. Sci. USA 104, 20285–20289.
[2] Brüderle, D., Bill, J., Kaplan, B., Kremkow, J., Meier, K., Müller, E. and Schemmel, J. (2010). Simulator-like exploration of cortical network architectures with a mixed-signal VLSI system. In Proc. of IEEE Intern. Symp. on Circuits and Systems (ISCAS), 2784–2787.

    The effect of heterogeneity on decorrelation mechanisms in spiking neural networks: a neuromorphic-hardware study

    High-level brain function such as memory, classification or reasoning can be realized by means of recurrent networks of simplified model neurons. Analog neuromorphic hardware constitutes a fast and energy-efficient substrate for the implementation of such neural computing architectures in technical applications and neuroscientific research. The functional performance of neural networks is often critically dependent on the level of correlations in the neural activity. In finite networks, correlations are typically inevitable due to shared presynaptic input. Recent theoretical studies have shown that inhibitory feedback, abundant in biological neural networks, can actively suppress these shared-input correlations and thereby enable neurons to fire nearly independently. For networks of spiking neurons, the decorrelating effect of inhibitory feedback has so far been explicitly demonstrated only for homogeneous networks of neurons with linear sub-threshold dynamics. Theory, however, suggests that the effect is a general phenomenon, present in any system with sufficient inhibitory feedback, irrespective of the details of the network structure or the neuronal and synaptic properties. Here, we investigate the effect of network heterogeneity on correlations in sparse, random networks of inhibitory neurons with non-linear, conductance-based synapses. Emulations of these networks on the analog neuromorphic hardware system Spikey allow us to test the efficiency of decorrelation by inhibitory feedback in the presence of hardware-specific heterogeneities. The configurability of the hardware substrate enables us to modulate the extent of heterogeneity in a systematic manner. We selectively study the effects of shared input and recurrent connections on correlations in membrane potentials and spike trains. Our results confirm ...
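As a concrete handle on the correlation measure discussed in this abstract, the following sketch computes the mean pairwise Pearson correlation of binned spike counts. Representing spike trains as arrays of spike times and using a 10 ms bin width are illustrative assumptions; the paper's exact analysis of membrane-potential and spike-train correlations may differ.

```python
import numpy as np

def mean_pairwise_correlation(spike_trains, t_stop, bin_ms=10.0):
    # spike_trains: iterable of arrays of spike times (same unit as t_stop and bin_ms).
    # Bin each train into spike counts and average the Pearson correlation
    # coefficient over all distinct neuron pairs.
    edges = np.arange(0.0, t_stop + bin_ms, bin_ms)
    counts = np.array([np.histogram(st, bins=edges)[0] for st in spike_trains])
    c = np.corrcoef(counts)             # (n_neurons, n_neurons) correlation matrix
    iu = np.triu_indices_from(c, k=1)   # upper triangle = distinct pairs
    return float(np.nanmean(c[iu]))
```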

    Exploring the potential of brain-inspired computing

    The gap between brains and computers regarding both their cognitive capability and power efficiency is remarkably huge. Brains process information massively in parallel, and their constituents are intrinsically self-organizing, whereas in digital computers the execution of instructions is deterministic and rather serial. Recent progress in the development of dedicated hardware systems implementing physical models of neurons and synapses enables the efficient emulation of spiking neural networks. In this work, we verify the design of such an analog neuromorphic system, called Spikey, and explore its potential for brain-inspired computing. We demonstrate the versatility of this highly configurable substrate by implementing a rich repertoire of network models, including models for signal propagation and enhancement, general-purpose classifiers, cortical models and decorrelating feedback systems. Network emulations on Spikey are highly accelerated and consume less than 1 nJ per synaptic transmission. The Spikey system hence outperforms modern desktop computers in terms of fast and efficient network simulations, closing the gap to brains. During this thesis, the stability, performance and user-friendliness of the Spikey system were improved by integrating it into the neuroscientific tool chain and making it available to the community. The implementation of networks suitable for everyday tasks, like object or speech recognition, qualifies this technology as an alternative to conventional computers. Considering their compactness, computational capability and power efficiency, neuromorphic systems may qualify as a valuable complement to classical computation.