
    Nonnormal amplification in random balanced neuronal networks

    In dynamical models of cortical networks, the recurrent connectivity can amplify the input given to the network in two distinct ways. One is induced by the presence of near-critical eigenvalues in the connectivity matrix W, producing large but slow activity fluctuations along the corresponding eigenvectors (dynamical slowing). The other relies on W being nonnormal, which allows the network activity to make large but fast excursions along specific directions. Here we investigate the tradeoff between nonnormal amplification and dynamical slowing in the spontaneous activity of large random neuronal networks composed of excitatory and inhibitory neurons. We use a Schur decomposition of W to separate the two amplification mechanisms. Assuming linear stochastic dynamics, we derive an exact expression for the expected amount of purely nonnormal amplification. We find that amplification is very limited if dynamical slowing must be kept weak. We conclude that, to achieve strong transient amplification with little slowing, the connectivity must be structured. We show that unidirectional connections between neurons of the same type, together with reciprocal connections between neurons of different types, allow for amplification already in the fast dynamical regime. Finally, our results also shed light on the differences between balanced networks in which inhibition exactly cancels excitation and those where inhibition dominates.
    Comment: 13 pages, 7 figures
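The Schur-based separation of the two mechanisms can be sketched numerically. The network size, the half-excitatory/half-inhibitory column split, the Gaussian magnitudes with 1/sqrt(N) scaling, and the Frobenius-norm measure of nonnormality below are all illustrative assumptions, not the paper's exact construction:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
N = 200  # e.g. 100 excitatory + 100 inhibitory neurons (illustrative)

# Columns 0..N/2-1 excitatory (positive weights), the rest inhibitory.
W = np.abs(rng.normal(0.0, 1.0, (N, N))) / np.sqrt(N)
W[:, N // 2:] *= -1.0

# Complex Schur decomposition: W = Q T Q^H with T upper triangular.
# The diagonal of T holds the eigenvalues (the "dynamical slowing"
# part); the strictly upper-triangular part carries the effective
# feed-forward structure responsible for nonnormal amplification.
T, Q = schur(W, output='complex')
eigs = np.diag(T)
nonnormal_part = np.triu(T, k=1)

print("spectral abscissa:", eigs.real.max())
print("nonnormality (Frobenius norm of upper part):",
      np.linalg.norm(nonnormal_part))
```

For a normal matrix the strictly upper-triangular part would vanish; for a random balanced E/I matrix it carries most of the Frobenius norm, which is one way to see that such networks are strongly nonnormal.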

    Extracting non-linear integrate-and-fire models from experimental data using dynamic I–V curves

    The dynamic I–V curve method was recently introduced for the efficient experimental generation of reduced neuron models. The method extracts the response properties of a neuron while it is subject to a naturalistic stimulus that mimics in vivo-like fluctuating synaptic drive. The resulting history-dependent transmembrane current is then projected onto a one-dimensional current–voltage relation that provides the basis for a tractable non-linear integrate-and-fire model. An attractive feature of the method is that it can be used in spike-triggered mode to quantify the distinct patterns of post-spike refractoriness seen in different classes of cortical neuron. The method is first illustrated using a conductance-based model and is then applied experimentally to generate reduced models of cortical layer-5 pyramidal cells and interneurons, in injected-current and injected-conductance protocols. The resulting low-dimensional neuron models, of the refractory exponential integrate-and-fire type, provide highly accurate predictions for spike times. The method therefore provides a useful tool for the construction of tractable models and rapid experimental classification of cortical neurons.
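The exponential integrate-and-fire model named above can be sketched in a few lines; the parameter values and the Euler integration scheme here are illustrative assumptions, not the fitted values from the paper, and the refractory extension is omitted:

```python
import numpy as np

def simulate_eif(I, T=0.5, dt=1e-4, C=1e-9, gL=50e-9, EL=-70e-3,
                 VT=-50e-3, DT=2e-3, Vre=-60e-3, Vcut=0.0):
    """Exponential integrate-and-fire neuron (illustrative parameters).

    The spike-generating current gL*DT*exp((V-VT)/DT) is the nonlinear
    part of the I-V relation that the dynamic I-V method extracts from
    data. A spike is registered when V crosses Vcut, then V is reset.
    """
    V = EL
    spikes = []
    for i in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) + I) / C
        V += dt * dV
        if V >= Vcut:            # spike: record time and reset
            spikes.append(i * dt)
            V = Vre
    return spikes

print(len(simulate_eif(1.5e-9)), "spikes in 0.5 s")
```

With the drive set below rheobase (e.g. `I = 0`) the model stays silent, which is a quick sanity check on the exponential term.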

    Desynchronization in diluted neural networks

    The dynamical behaviour of a weakly diluted, fully inhibitory network of pulse-coupled spiking neurons is investigated. Upon increasing the coupling strength, a transition from a regular to a stochastic-like regime is observed. In the weak-coupling phase, a periodic dynamics is rapidly approached, with all neurons firing at the same rate and mutually phase-locked. The strong-coupling phase is characterized by an irregular pattern, even though the maximum Lyapunov exponent is negative. The paradox is resolved by drawing an analogy with the phenomenon of "stable chaos", i.e. by observing that the stochastic-like behaviour is limited to a transient whose duration grows exponentially with the system size. Remarkably, the transient dynamics turns out to be stationary.
    Comment: 11 pages, 13 figures, submitted to Phys. Rev.
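A minimal simulation of this kind of network can be sketched as follows. The leaky integrate-and-fire form, the dilution probability, the coupling strength, and the clock-driven (rather than event-driven) integration are illustrative assumptions and much cruder than the model actually studied:

```python
import numpy as np

def run_network(N=50, g=0.1, p_dilute=0.05, T=2.0, dt=1e-3, seed=1):
    """Weakly diluted, fully inhibitory pulse-coupled LIF network
    (illustrative sketch). Each directed link is removed with
    probability p_dilute; g is the inhibitory kick per spike."""
    rng = np.random.default_rng(seed)
    A = (rng.random((N, N)) > p_dilute).astype(float)
    np.fill_diagonal(A, 0.0)          # no self-coupling
    v = rng.random(N)                 # membrane potentials in [0, 1)
    I_ext, tau = 1.5, 1.0             # suprathreshold constant drive
    counts = np.zeros(N)
    for _ in range(int(T / dt)):
        v += dt * (I_ext - v) / tau
        spiking = v >= 1.0
        if spiking.any():
            counts += spiking
            v[spiking] = 0.0          # reset
            v -= g * (A @ spiking) / N  # inhibitory pulses to targets
    return counts

counts = run_network()
print("mean rate:", counts.mean() / 2.0, "spikes/s")
```

In the weak-coupling regime sketched here every neuron fires at essentially the same rate; raising `g` is what drives the network toward the irregular, long-transient regime the abstract describes.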

    Event-driven simulations of a plastic, spiking neural network

    We consider a fully-connected network of leaky integrate-and-fire neurons with spike-timing-dependent plasticity. The plasticity is controlled by a parameter representing the expected weight of a synapse between neurons that are firing randomly with the same mean frequency. For low values of the plasticity parameter, the activities of the system are dominated by noise, while large values of the plasticity parameter lead to self-sustaining activity in the network. We perform event-driven simulations on finite-size networks with up to 128 neurons to find the stationary synaptic weight conformations for different values of the plasticity parameter. In both the low and high activity regimes, the synaptic weights are narrowly distributed around the plasticity parameter value, consistent with the predictions of mean-field theory. However, the distribution broadens in the transition region between the two regimes, representing emergent network structures. Using a pseudophysical approach for visualization, we show that the emergent structures are of "path" or "hub" type, observed at different values of the plasticity parameter in the transition region.
    Comment: 9 pages, 6 figures
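The pair-based STDP rule underlying such simulations can be sketched compactly. The amplitudes and time constant below are common illustrative choices, not the parameters used in the paper:

```python
import numpy as np

def stdp(dt_spike, A_plus=0.01, A_minus=0.012, tau=20e-3):
    """Pair-based STDP window (illustrative parameters).

    dt_spike = t_post - t_pre: a pre-before-post pairing (positive
    difference) potentiates the synapse, a post-before-pre pairing
    (negative difference) depresses it, each with exponential decay.
    """
    if dt_spike >= 0:
        return A_plus * np.exp(-dt_spike / tau)
    return -A_minus * np.exp(dt_spike / tau)

# Potentiation when pre leads post by 5 ms, depression when it lags.
print(stdp(5e-3), stdp(-5e-3))
```

In an event-driven simulation this function is evaluated only at spike times, which is exactly what makes the event-driven approach efficient for plastic networks.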

    Crossover between Lévy and Gaussian regimes in first passage processes

    We propose a new approach to the problem of the first passage time. Our method is applicable not only to the Wiener process but also to non-Gaussian Lévy flights and to more complicated stochastic processes whose distributions are stable. To show the usefulness of the method, we focus in particular on first passage time problems for truncated Lévy flights (the so-called KoBoL processes), in which the arbitrarily large tail of the Lévy distribution is cut off. We find that the asymptotic scaling law of the first passage time $t$ distribution changes from a $t^{-(\alpha+1)/\alpha}$ law (non-Gaussian Lévy regime) to a $t^{-3/2}$ law (Gaussian regime) at the crossover point. This means that an ultra-slow convergence from the non-Gaussian Lévy regime to the Gaussian regime is observed not only in the distribution of the real time step for the truncated Lévy flight but also in its first passage time distribution. The nature of the crossover in the scaling laws and the scaling relation of the crossover point with respect to the effective cut-off length of the Lévy distribution are discussed.
    Comment: 18 pages, 7 figures, using RevTeX4, to appear in Phys. Rev.
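The Gaussian limit of this picture is easy to probe by Monte Carlo. The sketch below simulates first-passage times of a plain Gaussian random walk (the $\alpha \to 2$ limit of a truncated Lévy flight); the walker count, barrier height, and step cap are illustrative assumptions:

```python
import numpy as np

def first_passage_times(n_walkers=2000, barrier=10.0,
                        max_steps=5000, seed=2):
    """Monte Carlo first-passage times of a Gaussian random walk to a
    fixed barrier. In this Gaussian regime the first-passage density
    decays as t**(-3/2), the classic Sparre Andersen / Levy-Smirnov
    tail; walkers not absorbed within max_steps are discarded."""
    rng = np.random.default_rng(seed)
    fpt = np.full(n_walkers, np.nan)
    x = np.zeros(n_walkers)
    alive = np.ones(n_walkers, dtype=bool)
    for t in range(1, max_steps + 1):
        x[alive] += rng.normal(0.0, 1.0, alive.sum())
        crossed = alive & (x >= barrier)
        fpt[crossed] = t
        alive &= ~crossed
        if not alive.any():
            break
    return fpt[~np.isnan(fpt)]

fpt = first_passage_times()
print("fraction absorbed:", len(fpt) / 2000,
      "median FPT:", np.median(fpt))
```

Replacing the Gaussian increments with truncated heavy-tailed ones would move the early-time statistics into the $t^{-(\alpha+1)/\alpha}$ regime before the crossover to this $t^{-3/2}$ behaviour.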

    Adaptation Reduces Variability of the Neuronal Population Code

    Sequences of events in noise-driven excitable systems with slow variables often show serial correlations among their inter-event intervals. Here, we employ a master equation for general non-renewal processes to calculate the interval and count statistics of superimposed processes governed by a slow adaptation variable. For an ensemble of spike-frequency adapting neurons, this results in a regularization of the population activity and enhanced post-synaptic signal decoding. We confirm our theoretical results in a population of cortical neurons.
    Comment: 4 pages, 2 figures
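The serial correlations that adaptation induces can be seen in a toy single-neuron simulation. The noisy leaky integrate-and-fire model with a spike-triggered adaptation variable below, and all its parameter values, are illustrative assumptions rather than the model analysed in the paper:

```python
import numpy as np

def lag1_isi_correlation(n_spikes=400, mu=1.5, tau_m=0.02, g_a=0.2,
                         tau_a=0.1, noise=0.5, dt=1e-4, seed=3):
    """Noisy LIF (threshold 1, reset 0) with a spike-triggered
    adaptation variable a. Because a outlives a single interspike
    interval, a long interval lets a decay and tends to be followed
    by a short one: negative lag-1 serial ISI correlations, which is
    the single-neuron signature of the population regularization."""
    rng = np.random.default_rng(seed)
    v, a, t, t_last = 0.0, 0.0, 0.0, 0.0
    isis = []
    for _ in range(10_000_000):       # hard cap on integration steps
        v += dt * (mu - v - a) / tau_m + noise * np.sqrt(dt) * rng.normal()
        a -= dt * a / tau_a
        t += dt
        if v >= 1.0:
            v = 0.0
            a += g_a                  # spike-triggered adaptation kick
            isis.append(t - t_last)
            t_last = t
            if len(isis) >= n_spikes:
                break
    isis = np.asarray(isis[50:])      # discard the transient
    return float(np.corrcoef(isis[:-1], isis[1:])[0, 1])

print("lag-1 ISI correlation:", lag1_isi_correlation())
```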

    Noise Induced Coherence in Neural Networks

    We investigate numerically the dynamics of large networks of $N$ globally pulse-coupled integrate-and-fire neurons in a noise-induced synchronized state. The power spectrum of an individual element within the network is shown to exhibit, in the thermodynamic limit ($N \to \infty$), a broadband peak and an additional delta-function peak that is absent from the power spectrum of an isolated element. The power spectrum of the mean output signal exhibits only the delta-function peak. These results are explained analytically in an exactly soluble oscillator model with global phase coupling.
    Comment: 4 pages RevTeX and 3 PostScript figures
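A sketch in the spirit of the globally phase-coupled model can illustrate the coherent spectral line. The Kuramoto-type coupling, noise strength, and oscillator count below are illustrative assumptions, not the exactly soluble model of the paper:

```python
import numpy as np

def mean_field_spectrum(N=200, K=2.0, omega=2*np.pi, D=0.5,
                        T=100.0, dt=1e-2, seed=4):
    """Globally phase-coupled noisy oscillators (illustrative sketch).
    Above the synchronization threshold the mean output oscillates
    coherently, so its power spectrum is dominated by a sharp line at
    the common frequency, while a single element retains an
    additional broadband component. Returns the mean-field peak
    frequency in Hz."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2*np.pi, N)
    steps = int(T / dt)
    xm = np.empty(steps)                      # mean output signal
    for i in range(steps):
        z = np.exp(1j * theta).mean()         # Kuramoto order parameter
        theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))
        theta += np.sqrt(2 * D * dt) * rng.normal(size=N)
        xm[i] = np.cos(theta).mean()
    f = np.fft.rfftfreq(steps, dt)
    Pm = np.abs(np.fft.rfft(xm)) ** 2
    return f[1:][np.argmax(Pm[1:])]           # skip the DC bin

print("mean-field spectral peak (Hz):", mean_field_spectrum())
```

With omega = 2*pi the coherent line sits at 1 Hz; recording a single element's `cos(theta[0])` instead of the mean shows the same line riding on a broadband noise floor.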

    Comparison of Langevin and Markov channel noise models for neuronal signal generation

    The stochastic opening and closing of voltage-gated ion channels produces noise in neurons. The effect of this noise on neuronal performance has been modelled using either an approximate Langevin model, based on stochastic differential equations, or an exact model, based on a Markov process description of channel gating. Yet whether the Langevin model accurately reproduces the channel noise produced by the Markov model remains unclear. Here we present a comparison between Langevin and Markov models of channel noise in neurons, using single-compartment Hodgkin-Huxley models containing either Na+ and K+, or only K+, voltage-gated ion channels. The performance of the Langevin and Markov models was quantified over a range of stimulus statistics, membrane areas and channel numbers. We find that, in comparison to the Markov model, the Langevin model underestimates the noise contributed by voltage-gated ion channels, overestimating information rates for both spiking and non-spiking membranes. The difference between the two models persists even as the number of channels increases. This suggests that the Langevin model may not be suitable for accurately simulating channel noise in neurons, even in simulations with large numbers of ion channels.
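The two modelling styles being compared can be sketched side by side for the simplest possible case: a population of two-state channels at fixed voltage. In this clamped setting the Langevin approximation matches the Markov statistics closely; the paper's discrepancies arise in the full voltage-coupled, multi-gate setting. The rates and channel count below are illustrative assumptions:

```python
import numpy as np

def channel_noise(N=1000, alpha=5.0, beta=10.0, T=50.0, dt=1e-3, seed=5):
    """Two-state ion channel population at fixed voltage: exact Markov
    simulation (binomial open/close transitions) vs. its Langevin
    approximation for the open fraction. Returns the two open-fraction
    time series for comparison of their means and variances."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    n_open = int(N * alpha / (alpha + beta))   # Markov: open channel count
    x = alpha / (alpha + beta)                 # Langevin: open fraction
    mk, lv = np.empty(steps), np.empty(steps)
    for i in range(steps):
        # Markov: each closed channel opens w.p. alpha*dt,
        # each open channel closes w.p. beta*dt.
        n_open += (rng.binomial(N - n_open, alpha * dt)
                   - rng.binomial(n_open, beta * dt))
        # Langevin: matching drift plus state-dependent diffusion.
        drift = alpha * (1 - x) - beta * x
        diff = np.sqrt((alpha * (1 - x) + beta * x) / N)
        x = np.clip(x + drift * dt + diff * np.sqrt(dt) * rng.normal(), 0, 1)
        mk[i], lv[i] = n_open / N, x
    return mk, lv

mk, lv = channel_noise()
print("Markov   mean/var:", mk.mean(), mk.var())
print("Langevin mean/var:", lv.mean(), lv.var())
```

Both series fluctuate around the equilibrium open fraction alpha/(alpha+beta) with variance near the binomial value p(1-p)/N; it is only once the gating rates depend on a fluctuating voltage that the two descriptions can diverge.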