Homogeneous Spiking Neuromorphic System for Real-World Pattern Recognition
A neuromorphic chip that combines CMOS analog spiking neurons and memristive
synapses offers a promising solution to brain-inspired computing, as it can
provide massive neural network parallelism and density. Previous hybrid analog
CMOS-memristor approaches required extensive CMOS circuitry for training, and
thus eliminated most of the density advantages gained by the adoption of
memristor synapses. Further, they used different waveforms for pre- and
post-synaptic spikes, adding undesirable circuit overhead. Here we describe
a hardware architecture that can feature a large number of memristor synapses
to learn real-world patterns. We present a versatile CMOS neuron that
exhibits integrate-and-fire behavior, drives passive memristors, implements
competitive learning in a compact circuit module, and enables in-situ
plasticity in the memristor synapses. We demonstrate handwritten-digit
recognition with the proposed architecture using transistor-level circuit
simulations. As the described neuromorphic architecture is homogeneous, it
realizes a fundamental building block for large-scale energy-efficient
brain-inspired silicon chips that could lead to next-generation cognitive
computing.
Comment: This is a preprint of an article accepted for publication in IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 5, no. 2, June 201
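The neuron-synapse interaction described in this abstract can be sketched in software. Below is a minimal, illustrative Python analogue of an integrate-and-fire cell driving a bounded memristive conductance with coincidence-based in-situ plasticity; all class names, constants, and the specific update rule are assumptions for illustration, not the paper's circuit.

```python
import random

class LIFNeuron:
    """Leaky integrate-and-fire neuron (software stand-in for the CMOS cell)."""
    def __init__(self, tau=20.0, v_thresh=1.0, v_reset=0.0):
        self.tau, self.v_thresh, self.v_reset = tau, v_thresh, v_reset
        self.v = 0.0

    def step(self, i_in, dt=1.0):
        # Leaky integration: dv/dt = (-v + i_in) / tau
        self.v += dt * (-self.v + i_in) / self.tau
        if self.v >= self.v_thresh:
            self.v = self.v_reset
            return True          # spike emitted
        return False

class MemristorSynapse:
    """Bounded conductance updated in situ from pre/post spike coincidence."""
    def __init__(self, g=0.5, g_min=0.05, g_max=1.0, lr=0.05):
        self.g, self.g_min, self.g_max, self.lr = g, g_min, g_max, lr

    def current(self, pre_spike):
        return self.g if pre_spike else 0.0

    def update(self, pre_spike, post_spike):
        # Potentiate on pre/post coincidence, weakly depress otherwise.
        if pre_spike and post_spike:
            self.g = min(self.g_max, self.g + self.lr)
        elif pre_spike:
            self.g = max(self.g_min, self.g - self.lr / 10.0)

random.seed(0)
neuron, syn = LIFNeuron(), MemristorSynapse()
n_spikes = 0
for t in range(1000):
    pre = random.random() < 0.5              # Bernoulli input spike train
    post = neuron.step(10.0 * syn.current(pre))
    syn.update(pre, post)
    n_spikes += post
```

Using the same event (a spike) for both integration and plasticity mirrors the abstract's point that identical pre- and post-synaptic waveforms avoid extra circuit overhead.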
Modeling networks of spiking neurons as interacting processes with memory of variable length
We consider a new class of non-Markovian processes with a countable number of
interacting components, both in discrete and continuous time. Each component is
represented by a point process indicating if it has a spike or not at a given
time. The system evolves as follows. For each component, the rate (in
continuous time) or the probability (in discrete time) of having a spike
depends on the entire time evolution of the system since the last spike time of
the component. In discrete time this class of systems extends in a non-trivial
way both Spitzer's interacting particle systems, which are Markovian, and
Rissanen's stochastic chains with memory of variable length, which have a finite
state space. In continuous time they can be seen as a Rissanen-type
variable-length-memory version of the class of self-exciting point processes
also called "Hawkes processes", but with infinitely many
components. These features make this class a good candidate to describe the
time evolution of networks of spiking neurons. In this article we present a
critical reader's guide to recent papers dealing with this class of models,
both in discrete and in continuous time. We briefly sketch results concerning
perfect simulation and existence issues, decorrelation between successive
interspike intervals, the long-time behavior of finite non-excited systems, and
propagation of chaos in mean-field systems.
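As a toy illustration of the discrete-time dynamics described above, the following sketch makes the spiking probability of each component depend only on the other components' spikes since its own last spike. The logistic rate function, random weights, and network size are arbitrary choices for illustration, not taken from the surveyed papers.

```python
import math
import random

def phi(x):
    """Logistic spiking probability (an arbitrary choice of rate function)."""
    return 1.0 / (1.0 + math.exp(-x))

random.seed(1)
N, T = 5, 200
w = [[random.uniform(-1.0, 1.0) for _ in range(N)] for _ in range(N)]
last_spike = [-1] * N      # time of each component's most recent spike
X = []                     # X[t][i] = 1 if component i spikes at time t

for t in range(T):
    new = []
    for i in range(N):
        # Variable-length memory: only spikes emitted since i's own last
        # spike contribute to i's spiking probability at time t.
        drive = sum(w[i][j] * sum(X[s][j] for s in range(last_spike[i] + 1, t))
                    for j in range(N))
        new.append(1 if random.random() < phi(drive - 1.0) else 0)
    for i in range(N):
        if new[i]:
            last_spike[i] = t  # memory is reset at the component's own spike
    X.append(new)

total_spikes = sum(sum(row) for row in X)
```

The reset of `last_spike` is what distinguishes this class from a fixed-order Markov chain: the relevant history window has variable length, growing until the component fires again.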
Frequency control in synchronized networks of inhibitory neurons
We analyze the control of frequency for a synchronized inhibitory neuronal
network. The analysis is done for a reduced membrane model with a
biophysically based synaptic influence. We argue that such a reduced model can
quantitatively capture the frequency behavior of a larger class of neuronal
models. We show that in different parameter regimes, the network frequency
depends in different ways on the intrinsic and synaptic time constants. Only in
one portion of the parameter space, called 'phasic', is the network period
proportional to the synaptic decay time. These results are discussed in
connection with previous work of the authors, which showed that for mildly
heterogeneous networks, the synchrony breaks down, but coherence is preserved
much more for systems in the phasic regime than in the other regimes. These
results imply that for mildly heterogeneous networks, the existence of a
coherent rhythm implies a linear dependence of the network period on synaptic
decay time, and a much weaker dependence on the drive to the cells. We give
experimental evidence for this conclusion.
Comment: 18 pages, 3 figures, Kluwer.sty. J. Comp. Neurosci. (in press). Originally submitted to the neuro-sys archive which was never publicly announced (was 9803001
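The claimed proportionality in the phasic regime can be illustrated with a drastically reduced toy model: a single self-inhibited integrate-and-fire cell standing in for a synchronized inhibitory population. The reduction itself and every parameter here are assumptions for illustration, not the authors' model.

```python
def network_period(tau_syn, drive=2.0, g_inh=4.0, tau_m=1.0,
                   v_th=1.0, dt=0.001, t_max=200.0):
    """Period of a self-inhibited LIF cell: each spike delivers a
    synchronized inhibitory conductance that decays with tau_syn."""
    v, s, t = 0.0, 0.0, 0.0
    spikes = []
    while t < t_max:
        v += dt * (-v + drive - g_inh * s) / tau_m   # membrane equation
        s += dt * (-s / tau_syn)                     # synaptic decay
        if v >= v_th:
            v = 0.0
            s = 1.0            # population spike resets the inhibition
            spikes.append(t)
        t += dt
    return spikes[-1] - spikes[-2]  # steady-state period estimate

p_short = network_period(1.0)
p_long = network_period(2.0)
```

In this reduction the cell cannot fire until the inhibition has decayed below roughly `(drive - v_th) / g_inh`, so the period grows approximately linearly with `tau_syn`, mirroring the phasic-regime dependence stated in the abstract.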
Desynchronizing effect of high-frequency stimulation in a generic cortical network model
Transcranial Electrical Stimulation (TCES) and Deep Brain Stimulation (DBS)
are two different applications of electrical current to the brain used in
different areas of medicine. Both have a similar frequency dependence of their
efficiency, with the most pronounced effects around 100 Hz. We apply
superthreshold electrical stimulation, specifically depolarizing DC current,
interrupted at different frequencies, to a simple model of a population of
cortical neurons which uses phenomenological descriptions of neurons by
Izhikevich and synaptic connections on a similar level of sophistication. With
this model, we are able to reproduce the optimal desynchronization around
100 Hz, as well as to predict the full frequency dependence of the efficiency of
desynchronization, and thereby to give a possible explanation for the action
mechanism of TCES.
Comment: 9 pages, figs included. Accepted for publication in Cognitive Neurodynamic
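A minimal sketch of this stimulation protocol for a single cell uses the standard Izhikevich regular-spiking model driven by depolarizing DC interrupted at a given frequency. The pulse amplitude, 50% duty cycle, and simulation length are illustrative assumptions, not the paper's population model.

```python
def izhikevich_spike_count(f_stim_hz, amp=20.0, t_max=2000.0, dt=0.25):
    """Count spikes of an Izhikevich regular-spiking neuron driven by
    depolarizing DC interrupted at f_stim_hz (50% duty). Times in ms."""
    a, b, c, d = 0.02, 0.2, -65.0, 8.0     # regular-spiking parameters
    v, u = -65.0, b * (-65.0)
    period = 1000.0 / f_stim_hz
    n_spikes, t = 0, 0.0
    while t < t_max:
        i_ext = amp if (t % period) < period / 2.0 else 0.0  # interrupted DC
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
        u += dt * a * (b * v - u)
        if v >= 30.0:                      # spike: reset v, bump recovery u
            v, u = c, u + d
            n_spikes += 1
        t += dt
    return n_spikes

n100 = izhikevich_spike_count(100.0)
```

Sweeping `f_stim_hz` and measuring spike-time dispersion across a coupled population, rather than counting spikes of one cell, would be the analogue of the paper's frequency-dependence study.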
Multi-Stability and Pattern-Selection in Oscillatory Networks with Fast Inhibition and Electrical Synapses
A model or hybrid network consisting of oscillatory cells interconnected by inhibitory and electrical synapses may express different stable activity patterns without any change of network topology or parameters, and switching between the patterns can be induced by specific transient signals. However, little is known about the properties of such signals. In the present study, we employ numerical simulations of neural networks of different sizes composed of relaxation oscillators, to investigate switching between in-phase (IP) and anti-phase (AP) activity patterns. We show that the time windows of susceptibility to switching between the patterns are similar in 2-, 4- and 6-cell fully-connected networks. Moreover, in a network (N = 4, 6) expressing a given AP pattern, a stimulus with a given profile, consisting of depolarizing and hyperpolarizing signals sent to different subpopulations of cells, can evoke switching to another AP pattern. Interestingly, the resulting pattern encodes the profile of the switching stimulus. These results can be extended to different network architectures. Indeed, relaxation oscillators are not only models of cellular pacemakers, bursting or spiking, but are also analogous to firing-rate models of neural activity. We show that rules of switching similar to those found for relaxation oscillators apply to oscillating circuits of excitatory cells interconnected by electrical synapses and cross-inhibition. Our results suggest that incoming information, arriving in a proper time window, may be stored in an oscillatory network in the form of a specific spatio-temporal activity pattern which is expressed until new pertinent information arrives.
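The oscillatory units in such studies can be sketched with two identical FitzHugh-Nagumo relaxation oscillators joined by an electrical (diffusive) synapse; with sufficiently strong coupling they settle into the in-phase pattern. The parameters and coupling strength below are illustrative assumptions, not those of the study.

```python
def simulate_pair(g_el=1.0, dt=0.05, steps=4000):
    """Two FitzHugh-Nagumo relaxation oscillators, electrically coupled,
    started from different initial voltages."""
    eps, a, b, i_ext = 0.08, 0.7, 0.8, 0.5
    v1, w1 = -1.0, 0.0
    v2, w2 = 1.0, 0.0
    for _ in range(steps):
        # Gap-junction term g_el*(v_other - v_self) models the electrical synapse.
        dv1 = v1 - v1**3 / 3.0 - w1 + i_ext + g_el * (v2 - v1)
        dv2 = v2 - v2**3 / 3.0 - w2 + i_ext + g_el * (v1 - v2)
        dw1 = eps * (v1 + a - b * w1)
        dw2 = eps * (v2 + a - b * w2)
        v1, w1 = v1 + dt * dv1, w1 + dt * dw1
        v2, w2 = v2 + dt * dv2, w2 + dt * dw2
    return v1, v2

v1, v2 = simulate_pair()
```

This sketch only shows convergence to the in-phase state; reproducing the study's IP/AP multistability and stimulus-induced switching would additionally require the cross-inhibitory synapses and timed transient pulses described in the abstract.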
Complex Dynamics in Dedicated / Multifunctional Neural Networks and Chaotic Nonlinear Systems
We study complex behaviors arising in neuroscience and other nonlinear systems by combining dynamical systems analysis with modern computational approaches, including GPU parallelization and unsupervised machine learning. To gain insights into the behaviors of brain networks and complex central pattern generators (CPGs), it is important to understand the dynamical principles regulating individual neurons as well as the basic structural and functional building blocks of neural networks. In the first section, we discuss how symbolic methods can help us analyze neural dynamics such as bursting, tonic spiking and chaotic mixed-mode oscillations in various models of individual neurons, the bifurcations that underlie transitions between activity types, as well as emergent network phenomena arising through synergistic interactions in realistic neural circuits, such as network bursting from non-intrinsic bursters. The second section is focused on the origin and coexistence of multistable rhythms in oscillatory neural networks of inhibitory-coupled cells. We discuss how network connectivity and intrinsic properties of the cells affect the dynamics, and how even simple circuits can exhibit a variety of mono/multi-stable rhythms including pacemakers, half-center oscillators, multiple traveling waves, fully synchronous states, as well as various chimeras. Our analyses can help generate verifiable hypotheses for neurophysiological experiments on central pattern generators. In the last section, we demonstrate the inter-disciplinary nature of this research through the applications of these techniques to identify the universal principles governing both simple and complex dynamics, and chaotic structure in diverse nonlinear systems.
Using a classical example from nonlinear laser optics, we elaborate on the multiplicity and self-similarity of key organizing structures in 2D parameter space such as homoclinic and heteroclinic bifurcation curves, Bykov T-point spirals, and inclination flips. This is followed by detailed computational reconstructions of the spatial organization and 3D embedding of bifurcation surfaces, parametric saddles, and isolated closed curves (isolas). The generality of our modeling approaches could lead to novel methodologies and nonlinear science applications in biological, medical and engineering systems.
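A serial miniature of the kind of parameter sweep described above can be built from a two-trajectory largest-Lyapunov-exponent estimate; here the classic Lorenz system stands in as the chaos indicator. The choice of system, step size, and transient cutoff are illustrative assumptions, not the authors' GPU pipeline.

```python
import math

def lyapunov_estimate(sigma, rho, beta=8.0 / 3.0, dt=0.005,
                      steps=8000, d0=1e-8):
    """Crude largest-Lyapunov estimate: evolve two nearby Lorenz
    trajectories, renormalizing their separation to d0 every step."""
    def deriv(x, y, z):
        return sigma * (y - x), x * (rho - z) - y, x * y - beta * z

    x1, y1, z1 = 1.0, 1.0, 1.0
    x2, y2, z2 = 1.0 + d0, 1.0, 1.0
    total, counted = 0.0, 0
    for n in range(steps):
        dx, dy, dz = deriv(x1, y1, z1)
        x1, y1, z1 = x1 + dt * dx, y1 + dt * dy, z1 + dt * dz
        dx, dy, dz = deriv(x2, y2, z2)
        x2, y2, z2 = x2 + dt * dx, y2 + dt * dy, z2 + dt * dz
        d = math.sqrt((x1 - x2)**2 + (y1 - y2)**2 + (z1 - z2)**2)
        if n >= 1000:                 # skip the transient
            total += math.log(d / d0)
            counted += 1
        # Renormalize the separation back to d0 along its current direction.
        f = d0 / d
        x2 = x1 + (x2 - x1) * f
        y2 = y1 + (y2 - y1) * f
        z2 = z1 + (z2 - z1) * f
    return total / (counted * dt)

chaotic = lyapunov_estimate(10.0, 28.0)   # classic chaotic regime
regular = lyapunov_estimate(10.0, 10.0)   # stable fixed-point regime
```

Evaluating such an indicator over a dense grid of two parameters, with each grid point independent of the others, is exactly the embarrassingly parallel workload that the GPU sweeps in this line of work exploit.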