Nonnormal amplification in random balanced neuronal networks
In dynamical models of cortical networks, the recurrent connectivity can
amplify the input given to the network in two distinct ways. One is induced by
the presence of near-critical eigenvalues in the connectivity matrix W,
producing large but slow activity fluctuations along the corresponding
eigenvectors (dynamical slowing). The other relies on W being nonnormal, which
allows the network activity to make large but fast excursions along specific
directions. Here we investigate the tradeoff between nonnormal amplification
and dynamical slowing in the spontaneous activity of large random neuronal
networks composed of excitatory and inhibitory neurons. We use a Schur
decomposition of W to separate the two amplification mechanisms. Assuming
linear stochastic dynamics, we derive an exact expression for the expected
amount of purely nonnormal amplification. We find that amplification is very
limited if dynamical slowing must be kept weak. We conclude that, to achieve
strong transient amplification with little slowing, the connectivity must be
structured. We show that unidirectional connections between neurons of the same type, together with reciprocal connections between neurons of different types, allow for amplification already in the fast dynamical regime. Finally, our results also shed light on the differences between balanced networks in which inhibition exactly cancels excitation and those where inhibition dominates.
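As an illustration of the separation described here, the short Python sketch below builds a random excitatory-inhibitory connectivity matrix and uses a real Schur decomposition to split it into a diagonal (eigenvalue, i.e. slowing) part and a strictly upper-triangular feed-forward part whose norm quantifies nonnormality. Network size, connection probability and weight scale are illustrative assumptions, not the paper's parameters.

```python
import numpy as np
from scipy.linalg import schur

# Build a random balanced-style connectivity matrix with N excitatory and N
# inhibitory columns (sizes, sparsity and weight scale are illustrative only).
rng = np.random.default_rng(0)
N, p, w = 200, 0.1, 0.5
W_exc = w * rng.binomial(1, p, size=(2 * N, N))     # excitatory outputs: positive columns
W_inh = -w * rng.binomial(1, p, size=(2 * N, N))    # inhibitory outputs: negative columns
W = np.hstack([W_exc, W_inh]) / np.sqrt(N)

# Real Schur decomposition W = Q T Q^T. The (block-)diagonal of T carries the
# eigenvalues (responsible for dynamical slowing); the strictly upper-triangular
# part is a hidden feed-forward structure responsible for nonnormal amplification.
T, Q = schur(W, output="real")
spectral_abscissa = np.max(np.real(np.linalg.eigvals(W)))   # slowing indicator
feedforward_part = np.triu(T, k=1)

print("spectral abscissa (slowing):", spectral_abscissa)
print("Frobenius norm of feed-forward part (nonnormality):", np.linalg.norm(feedforward_part))
```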
Extracting non-linear integrate-and-fire models from experimental data using dynamic I–V curves
The dynamic I–V curve method was recently introduced for the efficient experimental generation of reduced neuron models. The method extracts the response properties of a neuron while it is subject to a naturalistic stimulus that mimics in vivo-like fluctuating synaptic drive. The resulting history-dependent transmembrane current is then projected onto a one-dimensional current–voltage relation that provides the basis for a tractable non-linear integrate-and-fire model. An attractive feature of the method is that it can be used in spike-triggered mode to quantify the distinct patterns of post-spike refractoriness seen in different classes of cortical neuron. The method is first illustrated using a conductance-based model and is then applied experimentally to generate reduced models of cortical layer-5 pyramidal cells and interneurons, in injected-current and injected-conductance protocols. The resulting low-dimensional neuron models, of the refractory exponential integrate-and-fire type, provide highly accurate predictions for spike times. The method therefore provides a useful tool for the construction of tractable models and rapid experimental classification of cortical neurons.
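For readers who want to see the core step in code, here is a minimal sketch of how a dynamic I–V curve can be estimated from a recorded voltage trace and the injected current: the transmembrane current is recovered from the capacitive relation and averaged within voltage bins. Function name, bin settings and units are assumptions of this illustration; the exponential integrate-and-fire fit mentioned above would be applied to the resulting curve.

```python
import numpy as np

def dynamic_iv_curve(V, I_inj, dt, C, n_bins=60, v_range=(-80.0, -40.0)):
    """Estimate a dynamic I-V curve from a voltage trace V (mV) and the injected
    current I_inj (same sampling), by binning the ionic current against voltage.
    A sketch of the core idea only, not the authors' full pipeline (which also
    handles spike-triggered refractory properties)."""
    dVdt = np.gradient(V, dt)            # numerical derivative of the membrane potential
    I_ion = I_inj - C * dVdt             # transmembrane current from C dV/dt = I_inj - I_ion
    edges = np.linspace(v_range[0], v_range[1], n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mean_I = np.full(n_bins, np.nan)
    for k in range(n_bins):
        in_bin = (V >= edges[k]) & (V < edges[k + 1])
        if in_bin.any():
            mean_I[k] = I_ion[in_bin].mean()   # average ionic current at this voltage
    return centers, mean_I

# The resulting curve F(V) = -I_ion(V)/C can then be fitted with the exponential
# integrate-and-fire form F(V) = (E_L - V)/tau_m + (Delta_T/tau_m) * exp((V - V_T)/Delta_T).
```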
The spike train statistics for consonant and dissonant musical accords
We consider a simple system composed of three neural-like noisy elements. Two of them (sensory neurons, or sensors) are stimulated by noise and by periodic signals with different frequency ratios, and the third (an interneuron) receives the output of these two sensors together with noise. We propose an analytical approach to the interspike-interval (ISI) statistics of the spike train generated by the interneuron, taking the ISI distributions of the sensory neurons as known. The frequencies of the input sinusoidal signals stand in ratios that are common in music. We show that for small-integer ratios (musical consonance) the input pair of sinusoids yields an ISI distribution corresponding to a more regular output spike train than for large-integer ratios (musical dissonance) of the input frequencies. These effects are explained within the proposed theory.
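A toy simulation in the spirit of this setup is sketched below: two noisy leaky integrate-and-fire "sensors" are driven by sinusoids whose frequencies stand in a consonant (3:2) or dissonant (16:15) ratio, their spikes drive a third noisy "interneuron", and the regularity of the interneuron's output is summarised by the coefficient of variation (CV) of its ISIs. The neuron model, all parameter values and the scaled-down frequencies are assumptions for illustration; the paper's results are obtained analytically for known sensor ISI distributions.

```python
import numpy as np

rng = np.random.default_rng(1)

def lif_spike_indices(drive, dt=1e-4, tau=0.01, v_th=1.0, noise=0.4):
    """Noisy leaky integrate-and-fire unit; a toy stand-in for the sensors and
    the interneuron (not the model analysed in the paper)."""
    v, spikes = 0.0, []
    for i, d in enumerate(drive):
        v += dt * (-v / tau + d) + noise * np.sqrt(dt) * rng.standard_normal()
        if v >= v_th:
            spikes.append(i)
            v = 0.0
    return np.array(spikes, dtype=int)

def interneuron_isi_cv(f1, f2, T=10.0, dt=1e-4):
    """CV of the interneuron ISIs when the two sensors receive frequencies f1, f2 (Hz)."""
    t = np.arange(0.0, T, dt)
    s1 = lif_spike_indices(120.0 * (1.0 + np.sin(2 * np.pi * f1 * t)), dt)
    s2 = lif_spike_indices(120.0 * (1.0 + np.sin(2 * np.pi * f2 * t)), dt)
    drive = np.full_like(t, 40.0)                          # weak baseline input to the interneuron
    np.add.at(drive, np.concatenate([s1, s2]), 0.4 / dt)   # each presynaptic spike gives a brief kick
    isi = np.diff(lif_spike_indices(drive, dt)) * dt
    return isi.std() / isi.mean()

# Same lower "note", second frequency in a consonant vs dissonant ratio (scaled-down toy frequencies).
print("3:2   (consonant) ISI CV:", interneuron_isi_cv(3.0, 4.5))
print("16:15 (dissonant) ISI CV:", interneuron_isi_cv(3.0, 3.2))
```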
Generalized Rate-Code Model for Neuron Ensembles with Finite Populations
We have proposed a generalized Langevin-type rate-code model subject to multiplicative noise, in order to study the stationary and dynamical properties of an ensemble containing a finite number of neurons. Calculations using the Fokker-Planck equation (FPE) show that, owing to the multiplicative noise, the rate model yields various kinds of stationary non-Gaussian distributions, such as gamma, inverse-Gaussian-like and log-normal-like distributions, which have been observed experimentally. The dynamical properties of the rate model are studied with the augmented moment method (AMM), previously proposed by the author as a macroscopic description of finite-unit stochastic systems. In the AMM, the original N-dimensional stochastic differential equations (DEs) are transformed into three-dimensional deterministic DEs for the means and fluctuations of local and global variables. Dynamical responses of the neuron ensemble to pulse and sinusoidal inputs calculated by the AMM are in good agreement with those obtained by direct simulation. Synchronization in the neuronal ensemble is also discussed. The variability of the firing rate and of the interspike interval (ISI) is shown to increase with the magnitude of the multiplicative noise, which may be a conceivable origin of the large variability observed in cortical neurons.
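The flavour of the model can be illustrated with a minimal Euler-Maruyama simulation of a single rate unit driven by both multiplicative and additive noise; sampling the stationary state shows the skewed, non-Gaussian distribution referred to above. The specific drift term, noise strengths and the non-negativity clipping below are assumptions of this sketch, not the author's exact equations or the AMM itself.

```python
import numpy as np

# Euler-Maruyama sketch of a Langevin rate equation with multiplicative (alpha)
# and additive (beta) noise:  dr = (-lam * r + I) dt + alpha * r dW1 + beta dW2
rng = np.random.default_rng(2)
lam, I, alpha, beta = 1.0, 0.5, 0.8, 0.1
dt, n_steps, n_burn = 1e-3, 500_000, 50_000
r, samples = 0.5, []
for step in range(n_steps):
    r += ((-lam * r + I) * dt
          + alpha * r * np.sqrt(dt) * rng.standard_normal()
          + beta * np.sqrt(dt) * rng.standard_normal())
    r = max(r, 0.0)                       # keep the firing rate non-negative
    if step > n_burn and step % 10 == 0:
        samples.append(r)

samples = np.array(samples)
print("mean rate:", samples.mean(), "  CV:", samples.std() / samples.mean())
# The histogram of `samples` is strongly skewed (gamma / log-normal-like) rather than Gaussian.
hist, edges = np.histogram(samples, bins=60, density=True)
```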
Dynamical response of the Hodgkin-Huxley model in the high-input regime
The response of the Hodgkin-Huxley neuronal model subjected to stochastic
uncorrelated spike trains originating from a large number of inhibitory and
excitatory post-synaptic potentials is analyzed in detail. The model is
examined in its three fundamental dynamical regimes: silence, bistability and
repetitive firing. Its response is characterized in terms of statistical
indicators (interspike-interval distributions and their first moments) as well
as of dynamical indicators (autocorrelation functions and conditional
entropies). In the silent regime, the coexistence of two different coherence
resonances is revealed: one occurs at quite low noise and is related to the
stimulation of subthreshold oscillations around the rest state; the second one
(at intermediate noise variance) is associated with the regularization of the
sequence of spikes emitted by the neuron. Bistability in the low noise limit
can be interpreted in terms of jumping processes across barriers activated by
stochastic fluctuations. In the repetitive firing regime a maximization of
incoherence is observed at finite noise variance. Finally, the mechanisms
responsible for spike triggering in the various regimes are clearly identified.
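A minimal sketch of this kind of numerical experiment is given below: a standard Hodgkin-Huxley point neuron is bombarded with uncorrelated excitatory and inhibitory Poisson inputs delivered as brief current kicks, and the regularity of its output is summarised by the ISI coefficient of variation. The rate functions are the classic HH ones; the input rates and kick sizes are illustrative assumptions, and the paper's full analysis (coherence resonances, autocorrelations, conditional entropies) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Standard Hodgkin-Huxley gating rates (V in mV, rates in 1/ms).
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3               # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.4                     # mV
dt, T = 0.01, 1000.0                                 # ms
rate_e, rate_i, w_e, w_i = 15.0, 10.0, 1.0, -1.0     # kick rates (1/ms) and sizes (illustrative)

V, m, h, n = -65.0, 0.05, 0.6, 0.32
spike_times, above = [], False
for step in range(int(T / dt)):
    # Poisson barrage: numbers of excitatory/inhibitory kicks arriving in this time step.
    I_syn = (w_e * rng.poisson(rate_e * dt) + w_i * rng.poisson(rate_i * dt)) / dt
    I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
    V += dt * (I_syn - I_ion) / C
    m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
    if V > 0.0 and not above:                        # upward crossing of 0 mV counts as a spike
        spike_times.append(step * dt)
    above = V > 0.0

isi = np.diff(spike_times)
print("spikes:", len(spike_times),
      " ISI CV:", isi.std() / isi.mean() if len(isi) > 1 else float("nan"))
```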
Rhythmogenic neuronal networks, pacemakers, and k-cores
Neuronal networks are controlled by a combination of the dynamics of
individual neurons and the connectivity of the network that links them
together. We study a minimal model of the preBotzinger complex, a small
neuronal network that controls the breathing rhythm of mammals through periodic
firing bursts. We show that the properties of such a randomly connected network of identical excitatory neurons are fundamentally different from those of uniformly connected neuronal networks as described by mean-field theory. We show that (i) the connectivity of the network determines the location of the emergent pacemakers that trigger the firing bursts, and (ii) the collective desensitization that terminates the firing bursts is again determined by the network connectivity, through k-core clusters of neurons.
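The network-structural part of this analysis is easy to reproduce with standard graph tools. The sketch below builds a random directed graph as a stand-in for the preBotzinger connectivity, computes core numbers on its undirected skeleton, and lists the maximal k-core together with its highest in-degree nodes as candidate "pacemakers". Graph size, connection probability and the pacemaker criterion are assumptions of this illustration, not the paper's model.

```python
import networkx as nx

# Random directed network of excitatory neurons (300 nodes, 2% connection probability;
# both values are illustrative only).
G = nx.gnp_random_graph(300, 0.02, seed=4, directed=True)

# k-core analysis on the undirected skeleton (a modelling choice of this sketch):
# core_number gives, for each node, the largest k such that it belongs to a k-core.
core = nx.core_number(G.to_undirected())
k_max = max(core.values())
k_core_nodes = [v for v, k in core.items() if k == k_max]
print(f"maximal core: k = {k_max}, containing {len(k_core_nodes)} neurons")

# Candidate 'pacemakers' in this toy picture: the highest in-degree nodes inside the top core.
in_deg = dict(G.in_degree())
pacemakers = sorted(k_core_nodes, key=in_deg.get, reverse=True)[:5]
print("highest in-degree nodes in the top core:", pacemakers)
```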
Random Walks for Spike-Timing Dependent Plasticity
Random walk methods are used to calculate the moments of negative image
equilibrium distributions in synaptic weight dynamics governed by spike-timing
dependent plasticity (STDP). The neural architecture of the model is based on
the electrosensory lateral line lobe (ELL) of mormyrid electric fish, which
forms a negative image of the reafferent signal from the fish's own electric
discharge to optimize detection of sensory electric fields. Of particular
behavioral importance to the fish is the variance of the equilibrium
postsynaptic potential in the presence of noise, which is determined by the
variance of the equilibrium weight distribution. Recurrence relations are
derived for the moments of the equilibrium weight distribution, for arbitrary
postsynaptic potential functions and arbitrary learning rules. For the case of
homogeneous network parameters, explicit closed form solutions are developed
for the covariances of the synaptic weight and postsynaptic potential
distributions.
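To make the quantities concrete, the sketch below treats each synaptic weight as a bounded random walk under a toy non-associative-potentiation / pairing-dependent-depression rule (loosely in the spirit of anti-Hebbian ELL plasticity) and estimates the equilibrium weight mean and variance by Monte-Carlo simulation. All rule parameters are assumptions; the paper derives these moments analytically through recurrence relations rather than by simulation.

```python
import numpy as np

rng = np.random.default_rng(5)
n_synapses, n_trials = 100, 5000
alpha, beta, p_pair = 0.002, 0.004, 0.5   # potentiation step, depression step, pairing probability
w = np.full(n_synapses, 0.5)              # weights start at the middle of their allowed range

for _ in range(n_trials):
    paired = rng.random(n_synapses) < p_pair   # did a pre-post pairing occur at each synapse?
    w += alpha - beta * paired                 # potentiate every trial, depress only on pairing
    w = np.clip(w, 0.0, 1.0)                   # hard bounds keep the random walk finite

print("equilibrium mean weight:", w.mean())
print("equilibrium weight variance:", w.var())
# The variance of the equilibrium postsynaptic potential then follows by combining
# the weight (co)variances with the postsynaptic potential kernels.
```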
Comparison of Langevin and Markov channel noise models for neuronal signal generation
The stochastic opening and closing of voltage-gated ion channels produces
noise in neurons. The effect of this noise on neuronal performance has been modelled using either an approximate Langevin model, based on stochastic differential equations, or an exact model, based on a Markov-process description of channel gating. Yet whether the Langevin model accurately reproduces the channel noise produced by the Markov model remains unclear. Here we present a comparison between Langevin and Markov models of channel noise in neurons, using single-compartment Hodgkin-Huxley models containing either Na+ and K+, or only K+, voltage-gated ion channels. The performance of the
Langevin and Markov models was quantified over a range of stimulus statistics,
membrane areas and channel numbers. We find that in comparison to the Markov
model, the Langevin model underestimates the noise contributed by voltage-gated
ion channels, overestimating information rates for both spiking and non-spiking
membranes. Even with increasing numbers of channels the difference between the
two models persists. This suggests that the Langevin model may not be suitable
for accurately simulating channel noise in neurons, even in simulations with
large numbers of ion channels.
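The basic comparison can be sketched at the level of a single population of potassium activation gates held at a fixed voltage: an exact Markov (binomial jump) update of the number of open gates versus a Langevin (diffusion) approximation with state-dependent noise. The clamped voltage, gate count and time step are assumptions of this sketch; at this single-variable level the two schemes agree fairly closely, whereas the discrepancies reported in the paper concern full spiking and non-spiking membranes and their information rates.

```python
import numpy as np

rng = np.random.default_rng(6)
V = -50.0                                                          # clamped voltage, mV
alpha = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))     # HH n-gate opening rate (1/ms)
beta = 0.125 * np.exp(-(V + 65.0) / 80.0)                          # HH n-gate closing rate (1/ms)
N, dt, steps = 1000, 0.01, 100_000                                 # number of gates, ms

n_inf = alpha / (alpha + beta)
open_markov = int(N * n_inf)            # Markov state: number of open gates
n_langevin = n_inf                      # Langevin state: open fraction
trace_m, trace_l = np.empty(steps), np.empty(steps)
for i in range(steps):
    # Exact Markov update: binomial numbers of gates opening and closing in this step.
    opened = rng.binomial(N - open_markov, alpha * dt)
    closed = rng.binomial(open_markov, beta * dt)
    open_markov += opened - closed
    # Langevin update: deterministic drift plus state-dependent diffusion term.
    drift = alpha * (1.0 - n_langevin) - beta * n_langevin
    diffusion = np.sqrt((alpha * (1.0 - n_langevin) + beta * n_langevin) / N)
    n_langevin += drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
    n_langevin = min(max(n_langevin, 0.0), 1.0)
    trace_m[i], trace_l[i] = open_markov / N, n_langevin

print("open-fraction variance, Markov  :", trace_m.var())
print("open-fraction variance, Langevin:", trace_l.var())
```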
Adaptive multi‐index collocation for uncertainty quantification and sensitivity analysis
A comparative study of different integrate-and-fire neurons: spontaneous activity, dynamical response, and stimulus-induced correlation
Stochastic integrate-and-fire (IF) neuron models have found widespread
applications in computational neuroscience. Here we present results on the
white-noise-driven perfect, leaky, and quadratic IF models, focusing on the
spectral statistics (power spectra, cross spectra, and coherence functions) in
different dynamical regimes (noise-induced and tonic firing regimes with low or
moderate noise). We make the models comparable by tuning parameters such that
the mean value and the coefficient of variation of the interspike interval
match for all of them. We find that, under these conditions, the power spectrum
under white-noise stimulation is often very similar while the response
characteristics, described by the cross spectrum between a fraction of the
input noise and the output spike train, can differ drastically. We also
investigate how the spike trains of two neurons of the same kind (e.g. two
leaky IF neurons) correlate if they share a common noise input. We show that,
depending on the dynamical regime, either two quadratic IF models or two leaky
IFs are more strongly correlated. Our results suggest that, when choosing among
simple IF models for network simulations, the details of the model have a
strong effect on correlation and regularity of the output.
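The kind of comparison described can be sketched numerically: the code below drives a leaky and a quadratic integrate-and-fire neuron with the same white-noise input, reports their firing rates and ISI CVs, and computes spike-train power spectra with an FFT. The parameter values are ad hoc assumptions (the paper instead tunes each model so that the ISI mean and CV match), and its correlation analysis uses pairs of identical models with partially shared noise rather than the single shared realization used here.

```python
import numpy as np

rng = np.random.default_rng(7)
dt, T = 1e-3, 200.0                      # dimensionless time units
n_steps = int(T / dt)
xi = rng.standard_normal(n_steps)        # one frozen white-noise realization, shared by both models
D = 0.2                                  # noise intensity

def run(model, mu):
    """Simulate an integrate-and-fire model driven by mu plus white noise; return a binary spike train."""
    v, spikes = 0.0, np.zeros(n_steps)
    for i in range(n_steps):
        if model == "LIF":
            dv, v_th, v_reset = mu - v, 1.0, 0.0
        else:                            # QIF; +/-10 serves as a numerical proxy for +/- infinity
            dv, v_th, v_reset = mu + v * v, 10.0, -10.0
        v += dv * dt + np.sqrt(2.0 * D * dt) * xi[i]
        if v >= v_th:
            spikes[i], v = 1.0, v_reset
    return spikes

def rate_and_cv(spikes):
    times = np.nonzero(spikes)[0] * dt
    isi = np.diff(times)
    return len(times) / T, isi.std() / isi.mean()

s_lif, s_qif = run("LIF", mu=1.2), run("QIF", mu=1.0)
print("LIF rate, ISI CV:", rate_and_cv(s_lif))
print("QIF rate, ISI CV:", rate_and_cv(s_qif))

# Spike-train power spectra; the cross-spectrum is the analogous product of one FFT
# with the complex conjugate of the other.
freqs = np.fft.rfftfreq(n_steps, dt)
S_lif = np.abs(np.fft.rfft(s_lif - s_lif.mean()))**2 * dt / n_steps
S_qif = np.abs(np.fft.rfft(s_qif - s_qif.mean()))**2 * dt / n_steps
```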