
    Memory and information processing in neuromorphic systems

    A striking difference between brain-inspired neuromorphic processors and current von Neumann processor architectures is the way in which memory and processing are organized. As Information and Communication Technologies continue to address the need for increased computational power by increasing the number of cores within a digital processor, neuromorphic engineers and scientists can complement this approach by building processor architectures in which memory is distributed with the processing. In this paper we present a survey of brain-inspired processor architectures that support models of cortical networks and deep neural networks. These architectures range from serial clocked implementations of multi-neuron systems to massively parallel asynchronous ones, and from purely digital systems to mixed analog/digital systems that implement more biologically realistic models of neurons and synapses, together with a suite of adaptation and learning mechanisms analogous to those found in biological nervous systems. We describe the advantages of the different approaches being pursued and present the challenges that need to be addressed for building artificial neural processing systems that can display the richness of behaviors seen in biological systems. Comment: Submitted to Proceedings of the IEEE; review of recently proposed neuromorphic computing platforms and systems.

    An efficient automated parameter tuning framework for spiking neural networks

    As the desire for biologically realistic spiking neural networks (SNNs) increases, tuning the enormous number of open parameters in these models becomes a difficult challenge. SNNs have been used to successfully model complex neural circuits that explore various neural phenomena such as neural plasticity, vision systems, auditory systems, neural oscillations, and many other important topics of neural function. Additionally, SNNs are particularly well-adapted to run on neuromorphic hardware that will support biological brain-scale architectures. Although the inclusion of realistic plasticity equations, neural dynamics, and recurrent topologies has increased the descriptive power of SNNs, it has also made the task of tuning these biologically realistic SNNs difficult. To meet this challenge, we present an automated parameter tuning framework capable of tuning SNNs quickly and efficiently using evolutionary algorithms (EAs) and inexpensive, readily accessible graphics processing units (GPUs). A sample SNN with 4104 neurons was tuned to give V1 simple cell-like tuning curve responses and produce self-organizing receptive fields (SORFs) when presented with a random sequence of counterphase sinusoidal grating stimuli. A performance analysis comparing the GPU-accelerated implementation to a single-threaded central processing unit (CPU) implementation showed a 65× speedup of the GPU implementation over the CPU implementation, or 0.35 h per generation for the GPU vs. 23.5 h per generation for the CPU. Additionally, the parameter value solutions found in the tuned SNN were studied and found to be stable and repeatable. The automated parameter tuning framework presented here will be of use to both the computational neuroscience and neuromorphic engineering communities, making the process of constructing and tuning large-scale SNNs much quicker and easier.
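
The core loop of EA-based parameter tuning can be illustrated with a deliberately tiny sketch: candidates are scored, the fittest half survives, and mutated copies of the survivors refill the population. This is a hypothetical two-parameter toy objective of our own, not the paper's GPU-accelerated SNN framework.

```python
import random

# Toy EA: tune (gain, bias) so a stand-in "firing rate" matches a target.
# The model and objective here are illustrative, not the paper's.

def fitness(params, target=10.0):
    gain, bias = params
    rate = max(0.0, gain * 5.0 + bias)  # stand-in for a simulated firing rate
    return -abs(rate - target)          # higher (closer to 0) is better

def evolve(pop_size=20, generations=50, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 4), rng.uniform(-5, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = [(g + rng.gauss(0, 0.1), b + rng.gauss(0, 0.1))
                    for g, b in parents]        # Gaussian mutation
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In the real framework each fitness evaluation is a full GPU-accelerated SNN simulation, which is what makes the 65× speedup matter: the EA itself is cheap, the evaluations are not.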

    Stochastic Resonance of Ensemble Neurons for Transient Spike Trains: A Wavelet Analysis

    By using the wavelet transformation (WT), we have analyzed the response of an ensemble of N (= 1, 10, 100 and 500) Hodgkin-Huxley (HH) neurons to transient M-pulse spike trains (M = 1-3) with independent Gaussian noises. The cross-correlation between the input and output signals is expressed in terms of the WT expansion coefficients. The signal-to-noise ratio (SNR) is evaluated by using the denoising method within the WT, by which the noise contribution is extracted from the output signals. Although the response of a single (N = 1) neuron to sub-threshold transient signals with noises is quite unreliable, the transmission fidelity assessed by the cross-correlation and SNR is shown to be much improved by increasing the value of N: a population of neurons plays an indispensable role in stochastic resonance (SR) for transient spike inputs. It is also shown that in a large-scale ensemble, the transmission fidelity for supra-threshold transient spikes is not significantly degraded by a weak noise that is responsible for SR for sub-threshold inputs. Comment: 20 pages, 4 figures.
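
The denoising idea (transform, suppress small detail coefficients attributable to noise, invert) can be sketched with a one-level Haar transform. This is a minimal illustration with our own function names and toy signal, not the paper's WT analysis of HH spike trains.

```python
import numpy as np

# One-level Haar wavelet denoising of a noisy spike-like signal.
# Hard thresholding zeroes small detail coefficients, which carry
# mostly noise for a smooth underlying signal.

def haar_dwt(x):
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail coefficients
    return a, d

def haar_idwt(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(x, threshold):
    a, d = haar_dwt(x)
    d = np.where(np.abs(d) > threshold, d, 0.0)  # hard thresholding
    return haar_idwt(a, d)

rng = np.random.default_rng(0)
t = np.arange(256)
clean = np.exp(-0.5 * ((t - 128) / 4.0) ** 2)  # a single "spike"
noisy = clean + 0.2 * rng.standard_normal(256)
recovered = denoise(noisy, threshold=0.4)
```

The paper's SNR is then computed by comparing the denoised output against the extracted noise contribution; here one would simply compare reconstruction error before and after thresholding.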

    Automated optimization of a reduced layer 5 pyramidal cell model based on experimental data.

    The construction of compartmental models of neurons involves tuning a set of parameters to make the model neuron behave as realistically as possible. While the parameter space of single-compartment models or other simple models can be exhaustively searched, the introduction of dendritic geometry causes the number of parameters to balloon. As parameter tuning is a daunting and time-consuming task when performed manually, reliable methods for automatically optimizing compartmental models are badly needed, as only optimized models can capture the behavior of real neurons. Here we present a three-step strategy to automatically build reduced models of layer 5 pyramidal neurons that closely reproduce experimental data. First, we reduce the pattern of dendritic branches of a detailed model to a set of equivalent primary dendrites. Second, the ion channel densities are estimated using a multi-objective optimization strategy to fit the voltage traces recorded under two conditions: with and without the apical dendrite occluded by pinching. Finally, we tune dendritic calcium channel parameters to model the initiation of dendritic calcium spikes and the coupling between soma and dendrite. More generally, this new method can be applied to construct families of models of different neuron types, with applications ranging from the study of information processing in single neurons to realistic simulations of large-scale network dynamics.
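
Fitting voltage traces under two recording conditions yields two error objectives, and a standard building block of multi-objective optimization is the Pareto dominance test: a candidate survives if no other candidate is at least as good in every objective and strictly better in one. A minimal sketch, with our own function names and made-up error values rather than the paper's fitting pipeline:

```python
# Pareto dominance for two-objective minimization, e.g. trace error
# with the apical dendrite intact vs. occluded (values are made up).

def dominates(a, b):
    """True if candidate a is no worse than b in all objectives and
    strictly better in at least one (minimization)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(candidates):
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

errors = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0), (2.5, 2.5)]
front = pareto_front(errors)  # (2.5, 2.5) is dominated by (2.0, 2.0)
```

The optimizer then selects parameter sets from the front, trading off fit quality between the two conditions instead of collapsing them into one weighted score.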

    The Islands Project for Managing Populations in Genetic Training of Spiking Neural Networks

    The TENNLab software framework enables researchers to explore spiking neuroprocessors, neuromorphic applications and how they are trained. The centerpiece of training in TENNLab has been a genetic algorithm called Evolutionary Optimization for Neuromorphic Systems (EONS). EONS optimizes a single population of spiking neural networks, and heretofore, many methods to train with multiple populations have been ad hoc, typically consisting of shell scripts that execute multiple independent EONS jobs, whose results are combined and analyzed in another ad hoc fashion. The Islands project seeks to manage and manipulate multiple EONS populations in a controlled way. With Islands, one may spawn off independent EONS populations, each of which is an “Island.” One may define characteristics of a “stagnated” island, where further optimization is unlikely to improve the fitness of the population on the island. The Islands software then allows one to create new islands by combining stagnated islands, or to migrate populations from one island to others, all in an attempt to increase diversity among the populations to improve their fitness. This thesis describes the software structure of Islands, its interface, and the functionalities that it implements. We then perform a case study with three neuromorphic control applications that demonstrate the wide variety of features of Islands.
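
The migration mechanism described above can be sketched generically: several populations evolve independently, and every few epochs each island's best individuals move to a neighbor. This is a toy ring-topology island model with our own names and objective, not the TENNLab Islands API.

```python
import random

# Toy island-model GA: four islands evolve a scalar toward a target,
# with periodic ring migration of each island's best individuals.

def fitness(x):
    return -abs(x - 3.14)  # toy objective: approach 3.14

def evolve_island(pop, rng, steps=30):
    for _ in range(steps):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: len(pop) // 2]              # elitism
        pop = survivors + [x + rng.gauss(0, 0.2) for x in survivors]
    return pop

def migrate(islands, k=2):
    # Move the k best of each island to the next island (ring topology).
    for i, src in enumerate(islands):
        src.sort(key=fitness, reverse=True)
        islands[(i + 1) % len(islands)].extend(src[:k])
        del src[:k]

rng = random.Random(42)
islands = [[rng.uniform(-10, 10) for _ in range(16)] for _ in range(4)]
for epoch in range(5):
    islands = [evolve_island(p, rng) for p in islands]
    migrate(islands)
best = max((x for p in islands for x in p), key=fitness)
```

Migration injects diversity into islands whose populations have converged, which is the same role the Islands software's stagnation detection and island-combining operations play at a larger scale.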

    A point process framework for modeling electrical stimulation of the auditory nerve

    Model-based studies of auditory nerve responses to electrical stimulation can provide insight into the functioning of cochlear implants. Ideally, these studies can identify limitations in sound processing strategies and lead to improved methods for providing sound information to cochlear implant users. To accomplish this, models must accurately describe auditory nerve spiking while avoiding excessive complexity that would preclude large-scale simulations of populations of auditory nerve fibers and obscure insight into the mechanisms that influence neural encoding of sound information. In this spirit, we develop a point process model of the auditory nerve that provides a compact and accurate description of neural responses to electric stimulation. Inspired by the framework of generalized linear models, the proposed model consists of a cascade of linear and nonlinear stages. We show how each of these stages can be associated with biophysical mechanisms and related to models of neuronal dynamics. Moreover, we derive a semi-analytical procedure that uniquely determines each parameter in the model on the basis of fundamental statistics from recordings of single fiber responses to electric stimulation, including threshold, relative spread, jitter, and chronaxie. The model also accounts for refractory and summation effects that influence the responses of auditory nerve fibers to high pulse rate stimulation. Throughout, we compare model predictions to published physiological data and explain differences in auditory nerve responses to high and low pulse rate stimulation. We close by performing an ideal observer analysis of simulated spike trains in response to sinusoidally amplitude modulated stimuli and find that carrier pulse rate does not affect modulation detection thresholds. Comment: 1 title page, 27 manuscript pages, 14 figures, 1 table, 1 appendix.
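
A linear-nonlinear cascade driving a probabilistic spike generator can be sketched as follows. This is a generic illustration using threshold and relative spread as parameters, with our own function names and numbers; it is not the paper's fitted auditory-nerve model.

```python
import math
import random

# Generic LN cascade: a linear stimulus stage feeds a Gaussian-CDF
# nonlinearity parameterized by threshold and relative spread
# (noise standard deviation as a fraction of threshold), followed by
# a Bernoulli spike generator with an absolute refractory period.

def spike_probability(pulse_amplitude, threshold=1.0, relative_spread=0.1):
    z = (pulse_amplitude - threshold) / (relative_spread * threshold)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))  # Gaussian CDF

def simulate_train(amplitudes, refractory=1, seed=0):
    rng = random.Random(seed)
    spikes, last = [], -10**9
    for t, amp in enumerate(amplitudes):
        # no spiking within the refractory window after the last spike
        if t - last > refractory and rng.random() < spike_probability(amp):
            spikes.append(t)
            last = t
    return spikes
```

In this parameterization, an at-threshold pulse fires with probability 0.5 and relative spread sets how steeply probability rises with amplitude, which is how those two single-fiber statistics pin down the nonlinear stage.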