4,879 research outputs found
Simulating Cortical Feedback Modulation as Changes in Excitation and Inhibition in a Cortical Circuit Model.
Cortical feedback pathways are hypothesized to distribute context-dependent signals during flexible behavior. Recent experimental work has attempted to understand the mechanisms by which cortical feedback inputs modulate their target regions. Within the mouse whisker sensorimotor system, cortical feedback stimulation modulates spontaneous activity and sensory responsiveness, leading to enhanced sensory representations. However, the cellular mechanisms underlying these effects are currently unknown. In this study, we use a simplified neural circuit model, which includes two recurrent excitatory populations and global inhibition, to simulate cortical modulation. First, we demonstrate how changes in the strengths of excitation and inhibition alter the input-output processing responses of our model. Second, we compare these responses with experimental findings from cortical feedback stimulation. Our analyses predict that enhanced inhibition underlies the changes in spontaneous and sensory evoked activity observed experimentally. More generally, these analyses provide a framework for relating cellular and synaptic properties to emergent circuit function and dynamic modulation.
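The circuit class described in this abstract (two recurrent excitatory populations sharing global inhibition) can be sketched as a minimal rate model. This is an illustrative reconstruction, not the authors' implementation: the gain function, weights `w_ee`, `w_ei`, `w_ie`, and time constants are all assumed values chosen only to show how stronger inhibition lowers the input-output curve.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def simulate(inp, w_ei, w_ee=1.2, w_ie=1.0, steps=5000, dt=0.01, tau=1.0):
    """Steady-state rates of two recurrent excitatory populations (r_e1, r_e2)
    coupled to one global inhibitory population (r_i). Euler integration."""
    r = np.zeros(3)  # [e1, e2, i]
    for _ in range(steps):
        drive = np.array([
            w_ee * r[0] - w_ei * r[2] + inp,   # E1: recurrent excitation minus global inhibition
            w_ee * r[1] - w_ei * r[2] + inp,   # E2: same structure as E1
            w_ie * (r[0] + r[1]),              # I pools both excitatory populations
        ])
        r += dt / tau * (-r + relu(drive))
    return r

# Input-output curve of population E1 under baseline vs. enhanced inhibition
inputs = np.linspace(0.0, 2.0, 9)
baseline = np.array([simulate(i, w_ei=1.0)[0] for i in inputs])
enhanced = np.array([simulate(i, w_ei=2.0)[0] for i in inputs])
```

With a linear-rectified gain the symmetric fixed point is analytic, r_e = inp / (1 - w_ee + 2*w_ei), so doubling `w_ei` visibly flattens the response curve, the kind of input-output change the abstract compares against feedback-stimulation data.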
Existence and Stability of Standing Pulses in Neural Networks: I. Existence
We consider the existence of standing pulse solutions of a neural network
integro-differential equation. These pulses are bistable with the zero state
and may be an analogue for short term memory in the brain. The network consists
of a single-layer of neurons synaptically connected by lateral inhibition. Our
work extends the classic Amari result by considering a non-saturating gain
function. We consider a specific connectivity function where the existence
conditions for single-pulses can be reduced to the solution of an algebraic
system. In addition to the two localized pulse solutions found by Amari, we
find that three or more pulses can coexist. We also show the existence of
nonconvex "dimpled" pulses and double pulses. We map out the pulse shapes and
maximum firing rates for different connection weights and gain functions.
Comment: 31 pages, 29 figures, submitted to SIAM Journal on Applied Dynamical Systems.
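The classic Amari existence condition that this paper extends can be checked numerically. In the saturating (Heaviside-gain) case, a standing pulse of width d exists iff W(d) = theta, where W is the integral of the lateral-inhibition kernel and theta the firing threshold. The kernel and its parameters below are illustrative assumptions, not the specific connectivity of the paper:

```python
import numpy as np

def w(x, A=3.0, a=1.0, B=1.5, b=0.3):
    """Lateral-inhibition ('Mexican hat') kernel: local excitation, distal inhibition."""
    return A * np.exp(-a * x**2) - B * np.exp(-b * x**2)

def W(x, n=2000):
    """W(x) = integral of w from 0 to x (trapezoid rule)."""
    xs = np.linspace(0.0, x, n)
    ys = w(xs)
    return float(np.sum(0.5 * (ys[1:] + ys[:-1]) * np.diff(xs)))

def pulse_widths(theta, xmax=10.0, n=20000):
    """All widths d where W(d) crosses theta, i.e. Amari's pulse-existence condition."""
    xs = np.linspace(0.0, xmax, n)
    ys = w(xs)
    # cumulative trapezoid integral of the kernel
    Wc = np.concatenate([[0.0], np.cumsum(0.5 * (ys[1:] + ys[:-1]) * np.diff(xs))])
    vals = Wc - theta
    idx = np.where(np.sign(vals[:-1]) != np.sign(vals[1:]))[0]
    return [0.5 * (xs[i] + xs[i + 1]) for i in idx]

widths = pulse_widths(theta=0.5)
```

For this kernel and threshold the condition yields the familiar Amari pair, a narrow (unstable) and a wide (stable) pulse; the paper's contribution is that with a non-saturating gain the analogous algebraic system can admit three or more coexisting pulses.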
Real time unsupervised learning of visual stimuli in neuromorphic VLSI systems
Neuromorphic chips embody, in microelectronic devices, computational
principles operating in the nervous system. In this domain it is important to
identify computational primitives that theory and experiments suggest as
generic and reusable cognitive elements. One such element is provided by
attractor dynamics in recurrent networks. Point attractors are equilibrium
states of the dynamics (up to fluctuations), determined by the synaptic
structure of the network; a `basin' of attraction comprises all initial states
leading to a given attractor upon relaxation, hence making attractor dynamics
suitable to implement robust associative memory. The initial network state is
dictated by the stimulus, and relaxation to the attractor state implements the
retrieval of the corresponding memorized prototypical pattern. In a previous
work we demonstrated that a neuromorphic recurrent network of spiking neurons
and suitably chosen, fixed synapses supports attractor dynamics. Here we focus
on learning: activating on-chip synaptic plasticity and using a theory-driven
strategy for choosing network parameters, we show that autonomous learning,
following repeated presentation of simple visual stimuli, shapes a synaptic
connectivity supporting stimulus-selective attractors. Associative memory
develops on chip as the result of the coupled stimulus-driven neural activity
and ensuing synaptic dynamics, with no artificial separation between learning
and retrieval phases.
Comment: submitted to Scientific Reports.
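The attractor-based associative memory that the chip implements with spiking neurons and on-chip plasticity can be illustrated, in software, by its simplest rate-free analogue: a binary Hopfield-style network with Hebbian outer-product storage. All sizes and the corruption level below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 3  # network size and number of stored prototypes (illustrative)

# Store P random binary (+/-1) patterns with a Hebbian outer-product rule
patterns = rng.choice([-1, 1], size=(P, N))
Wm = (patterns.T @ patterns) / N
np.fill_diagonal(Wm, 0.0)  # no self-connections

def retrieve(cue, steps=20):
    """Relax the network state toward the attractor whose basin contains the cue."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(Wm @ s)
        s[s == 0] = 1
    return s

# Corrupt 15% of a stored pattern and let the dynamics complete it
cue = patterns[0].copy()
flip = rng.choice(N, size=N * 15 // 100, replace=False)
cue[flip] *= -1
recalled = retrieve(cue)
overlap = float(recalled @ patterns[0]) / N  # 1.0 means perfect retrieval
```

The stimulus sets the initial state inside a basin of attraction, and relaxation restores the stored prototype, the same retrieval principle the abstract describes, minus the spiking dynamics and autonomous on-chip learning that are the paper's actual subject.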
Associative memory of phase-coded spatiotemporal patterns in leaky Integrate and Fire networks
We study the collective dynamics of a Leaky Integrate and Fire network in
which precise relative phase relationships of spikes among neurons are stored,
as attractors of the dynamics, and selectively replayed at different time
scales. Using an STDP-based learning process, we store in the connectivity
several phase-coded spike patterns, and we find that, depending on the
excitability of the network, different working regimes are possible, with
transient or persistent replay activity induced by a brief signal. We introduce
an order parameter to evaluate the similarity between stored and recalled
phase-coded pattern, and measure the storage capacity. Modulation of spiking
thresholds during replay changes the frequency of the collective oscillation or
the number of spikes per cycle while preserving the phase relationships. This
allows a coding scheme in which phase, rate and frequency are dissociable.
Robustness with respect to noise and heterogeneity of neuron parameters is
studied, showing that, since the dynamics is a retrieval process, the units
preserve stable, precise phase relationships, keeping a unique frequency of
oscillation, even in noisy conditions and with heterogeneity of their internal
parameters.
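An order parameter for phase-coded recall, of the kind this abstract introduces, can be written in one standard circular-statistics form (the paper's exact definition may differ; the phase data below are synthetic):

```python
import numpy as np

def phase_overlap(stored, recalled):
    """Order parameter M = | (1/N) * sum_j exp(i * (recalled_j - stored_j)) |.
    M = 1 when the relative phases are reproduced exactly (up to a global
    phase shift); M ~ 0 for an unrelated phase pattern."""
    diff = np.asarray(recalled) - np.asarray(stored)
    return float(np.abs(np.mean(np.exp(1j * diff))))

rng = np.random.default_rng(1)
stored = rng.uniform(0, 2 * np.pi, size=500)   # one stored phase-coded pattern

# Replay at a different collective frequency shifts every phase by the same
# global offset; the order parameter is insensitive to such a shift.
shifted = (stored + 1.3) % (2 * np.pi)
noisy = stored + rng.normal(0, 0.2, size=500)  # small per-neuron phase jitter
random = rng.uniform(0, 2 * np.pi, size=500)   # unrelated pattern
```

Because only phase *differences* enter, the measure stays near 1 when replay changes oscillation frequency or spikes per cycle while preserving the phase relationships, exactly the dissociation of phase, rate, and frequency the abstract emphasizes.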
Metabifurcation analysis of a mean field model of the cortex
Mean field models (MFMs) of cortical tissue incorporate salient features of
neural masses to model activity at the population level. One of the common
aspects of MFM descriptions is the presence of a high dimensional parameter
space capturing neurobiological attributes relevant to brain dynamics. We study
the physiological parameter space of a MFM of electrocortical activity and
discover robust correlations between physiological attributes of the model
cortex and its dynamical features. These correlations are revealed by the study
of bifurcation plots, which show that the model responses to changes in
inhibition belong to two families. After investigating and characterizing
these, we discuss their essential differences in terms of four important
aspects: power responses with respect to the modeled action of anesthetics,
reaction to exogenous stimuli, distribution of model parameters and oscillatory
repertoires when inhibition is enhanced. Furthermore, while the complexity of
sustained periodic orbits differs significantly between families, we are able
to show how metamorphoses between the families can be brought about by
exogenous stimuli. We unveil links between measurable physiological attributes
of the brain and dynamical patterns that are not accessible by linear methods.
They emerge when the parameter space is partitioned according to bifurcation
responses. This partitioning cannot be achieved by the investigation of only a
small number of parameter sets, but is the result of an automated bifurcation
analysis of a representative sample of 73,454 physiologically admissible sets.
Our approach generalizes straightforwardly and is well suited to probing the
dynamics of other models with large and complex parameter spaces.
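The automated partitioning of a parameter space by bifurcation response can be illustrated on a deliberately tiny system. The one-dimensional sigmoid rate equation below is a toy stand-in for the high-dimensional MFM, and the "inhibition" parameter g, the weight, and the gain slope are all illustrative assumptions; the point is only the mechanics of scanning a parameter and counting equilibria:

```python
import numpy as np

def S(x, slope=4.0):
    """Sigmoid population gain function."""
    return 1.0 / (1.0 + np.exp(-slope * x))

def fixed_points(g, w=2.0, vmax=3.0, n=20000):
    """Fixed points of the toy mean-field equation dv/dt = -v + S(w*v - g),
    located as sign changes of the right-hand side on a fine grid."""
    vs = np.linspace(-1.0, vmax, n)
    F = -vs + S(w * vs - g)
    idx = np.where(np.sign(F[:-1]) != np.sign(F[1:]))[0]
    return [0.5 * (vs[i] + vs[i + 1]) for i in idx]

# Sweep the inhibition-like parameter g and record how many equilibria exist:
# the scan separates a monostable family (1 fixed point) from a bistable
# family (3 fixed points), bounded by saddle-node bifurcations.
sweep = {round(g, 2): len(fixed_points(g)) for g in np.linspace(0.0, 2.0, 21)}
```

The 73,454-set analysis in the paper is the same idea at scale: classify each admissible parameter set by its bifurcation response, then look for physiological attributes that correlate with family membership.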
Acetylcholine neuromodulation in normal and abnormal learning and memory: vigilance control in waking, sleep, autism, amnesia, and Alzheimer's disease
This article provides a unified mechanistic neural explanation of how learning, recognition, and cognition break down during Alzheimer's disease, medial temporal amnesia, and autism. It also clarifies why there are often sleep disturbances during these disorders. A key mechanism is how acetylcholine modulates vigilance control in cortical layer
State-Dependent Computation Using Coupled Recurrent Networks
Although conditional branching between possible behavioral states is a hallmark of intelligent behavior, very little is known about the neuronal mechanisms that support this processing. In a step toward solving this problem, we demonstrate by theoretical analysis and simulation how
networks of richly interconnected neurons, such as those observed in the superficial layers of the neocortex, can embed reliable, robust finite state machines. We show how a multistable neuronal network containing a number of states can be created very simply by coupling two recurrent
networks whose synaptic weights have been configured for soft winner-take-all (sWTA) performance. These two sWTAs have simple, homogeneous, locally recurrent connectivity except for a small fraction of recurrent cross-connections between them, which are used to embed the required states. This coupling between the maps allows the network to continue to express the current state even after the input that elicited that state is withdrawn. In addition, a small number of transition neurons implement the necessary input-driven transitions between the embedded states. We provide simple rules to systematically design and construct neuronal state machines of this kind. The significance of our finding is that it offers a method whereby the cortex could construct networks supporting a broad range of sophisticated processing by applying only small specializations to the same generic neuronal circuit.
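The two core ingredients of the abstract's construction, a state that persists after its eliciting input is withdrawn, and an input-driven transition to another state, can be sketched with a single soft winner-take-all rate network. This is a reduced illustration, not the paper's two-coupled-sWTA architecture, and the saturating gain and parameters `alpha`, `beta` are assumptions:

```python
import numpy as np

def f(x):
    """Saturating gain: rectification followed by tanh saturation."""
    return np.tanh(np.maximum(x, 0.0))

def run(r, inp, steps, alpha=2.4, beta=0.6, dt=0.05):
    """Soft winner-take-all rate dynamics: each unit excites itself (alpha)
    while all units share a global subtractive inhibition (beta * sum)."""
    for _ in range(steps):
        drive = alpha * r - beta * np.sum(r) + inp
        r = r + dt * (-r + f(drive))
    return r

N = 3                 # three embeddable states (illustrative)
r = np.zeros(N)

cue = np.zeros(N); cue[0] = 1.0
r = run(r, cue, 200)           # a transient cue selects state 0 ...
r = run(r, np.zeros(N), 400)   # ... which persists after the cue is withdrawn
state_after_hold = int(np.argmax(r))

pulse = np.zeros(N); pulse[2] = 3.0
r = run(r, pulse, 200)         # a strong transition input switches the state
r = run(r, np.zeros(N), 400)   # the new state is then held autonomously
state_after_switch = int(np.argmax(r))
```

Because net self-excitation exceeds unity while the saturating gain bounds activity, the winning unit sustains itself as an attractor once input is removed; the paper's cross-connections and transition neurons generalize this holding-plus-switching motif into arbitrary finite state machines.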