1,397 research outputs found

    Signal buffering in random networks of spiking neurons: microscopic vs. macroscopic phenomena

    In randomly connected networks of pulse-coupled elements, a time-dependent input signal can be buffered over a short time. We studied the signal buffering properties of simulated networks as a function of the network's state, characterized by both the Lyapunov exponent of the microscopic dynamics and the macroscopic activity derived from mean-field theory. If all network elements receive the same signal, signal buffering over delays comparable to the intrinsic time constant of the network elements can be explained by macroscopic properties and works best at the phase transition to chaos. However, if only 20 percent of the network units receive a common time-dependent signal, signal buffering properties improve and can no longer be attributed to the macroscopic dynamics.
    Comment: 5 pages, 3 figures
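The buffering measure can be illustrated with a simplified rate-network sketch (not the paper's spiking model; the network size, spectral radius, and ridge readout below are all hypothetical choices): drive a random recurrent network with a signal and train a linear readout to reconstruct the input a few steps back.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 2000

# Random recurrent weights, rescaled near the transition to chaos,
# where buffering is expected to work best.
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9
w_in = rng.normal(0.0, 1.0, N)

u = rng.uniform(-1.0, 1.0, T)         # common time-dependent input signal
x = np.zeros(N)
X = np.empty((T, N))
for t in range(T):                    # simple rate dynamics as a stand-in
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x

def buffer_corr(delay, ridge=1e-4):
    """Correlation between a ridge readout and the input `delay` steps back."""
    Xd, y = X[delay:], u[:T - delay]
    w_out = np.linalg.solve(Xd.T @ Xd + ridge * np.eye(N), Xd.T @ y)
    return float(np.corrcoef(Xd @ w_out, y)[0, 1])

print([round(buffer_corr(d), 3) for d in (1, 5, 20)])
```

The decay of the readout correlation with delay is one simple proxy for the buffering capacity studied in the abstract.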

    Nonnormal amplification in random balanced neuronal networks

    In dynamical models of cortical networks, the recurrent connectivity can amplify the input given to the network in two distinct ways. One is induced by the presence of near-critical eigenvalues in the connectivity matrix W, producing large but slow activity fluctuations along the corresponding eigenvectors (dynamical slowing). The other relies on W being nonnormal, which allows the network activity to make large but fast excursions along specific directions. Here we investigate the tradeoff between nonnormal amplification and dynamical slowing in the spontaneous activity of large random neuronal networks composed of excitatory and inhibitory neurons. We use a Schur decomposition of W to separate the two amplification mechanisms. Assuming linear stochastic dynamics, we derive an exact expression for the expected amount of purely nonnormal amplification. We find that amplification is very limited if dynamical slowing must be kept weak. We conclude that, to achieve strong transient amplification with little slowing, the connectivity must be structured. We show that unidirectional connections between neurons of the same type, together with reciprocal connections between neurons of different types, allow for amplification already in the fast dynamical regime. Finally, our results also shed light on the differences between balanced networks in which inhibition exactly cancels excitation and those in which inhibition dominates.
    Comment: 13 pages, 7 figures
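The Schur-based separation of the two mechanisms can be sketched on a toy 2x2 connectivity (hypothetical numbers, not the paper's network): both eigenvalues are well inside the stable region, so any transient growth is purely nonnormal.

```python
import numpy as np
from scipy.linalg import schur, expm

# Toy connectivity: both eigenvalues equal -1 (no near-critical mode, hence no
# dynamical slowing), but a large feed-forward weight makes W strongly nonnormal.
W = np.array([[-1.0, 10.0],
              [ 0.0, -1.0]])

T, Q = schur(W)                  # W = Q T Q^T, with T (quasi-)upper triangular
slow_part = np.diag(np.diag(T))  # eigenvalue part: sets the decay rates
ff_part = T - slow_part          # strictly triangular part: nonnormal amplification

# Transient growth of dx/dt = W x from a unit initial condition:
x0 = np.array([0.0, 1.0])
norms = [np.linalg.norm(expm(W * t) @ x0) for t in np.linspace(0.0, 5.0, 101)]
print(np.abs(ff_part).max(), round(max(norms), 2))
```

Despite strictly stable eigenvalues, the activity norm transiently grows severalfold before decaying, which is the fast amplification mode the abstract contrasts with dynamical slowing.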

    Extracting non-linear integrate-and-fire models from experimental data using dynamic I–V curves

    The dynamic I–V curve method was recently introduced for the efficient experimental generation of reduced neuron models. The method extracts the response properties of a neuron while it is subject to a naturalistic stimulus that mimics in vivo-like fluctuating synaptic drive. The resulting history-dependent, transmembrane current is then projected onto a one-dimensional current–voltage relation that provides the basis for a tractable non-linear integrate-and-fire model. An attractive feature of the method is that it can be used in spike-triggered mode to quantify the distinct patterns of post-spike refractoriness seen in different classes of cortical neuron. The method is first illustrated using a conductance-based model and is then applied experimentally to generate reduced models of cortical layer-5 pyramidal cells and interneurons, in injected-current and injected-conductance protocols. The resulting low-dimensional neuron models, of the refractory exponential integrate-and-fire type, provide highly accurate predictions for spike times. The method therefore provides a useful tool for the construction of tractable models and rapid experimental classification of cortical neurons.
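The core projection step can be illustrated on synthetic data (a sketch, with hypothetical parameters): simulate a subthreshold exponential integrate-and-fire trace under a fluctuating injected current, then recover the current–voltage relation by binning C dV/dt - I(t) against voltage.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth exponential integrate-and-fire membrane (hypothetical parameters).
C, gL, EL, VT, DT = 200.0, 10.0, -65.0, -50.0, 2.0   # pF, nS, mV, mV, mV

def F(V):
    """Membrane current function (pA); minus the conventional ionic current."""
    return -gL * (V - EL) + gL * DT * np.exp((V - VT) / DT)

dt, n = 0.05, 200_000                                # ms, steps
xi = 60.0 * np.sqrt(dt) * rng.normal(0.0, 1.0, n)
I = np.empty(n); I[0] = 0.0
for t in range(1, n):                                # OU injected current (pA)
    I[t] = I[t-1] - dt * I[t-1] / 5.0 + xi[t]
V = np.empty(n); V[0] = EL
for t in range(1, n):                                # subthreshold voltage trace
    V[t] = V[t-1] + dt * (F(V[t-1]) + I[t-1]) / C

# Dynamic I-V step: project C dV/dt - I(t) onto voltage bins.
dVdt = np.gradient(V, dt)
Iion = C * dVdt - I
bins = np.arange(-72.0, -57.0, 2.0)
centers = bins[:-1] + 1.0
idx = np.digitize(V, bins) - 1
est = np.array([Iion[idx == k].mean() for k in range(len(centers))])
print(np.round(est - F(centers), 2))
```

On this synthetic trace, the binned averages recover the underlying I–V relation; on real recordings the same projection yields the refractory EIF model described in the abstract.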

    Desynchronization in diluted neural networks

    The dynamical behaviour of a weakly diluted, fully inhibitory network of pulse-coupled spiking neurons is investigated. Upon increasing the coupling strength, a transition from a regular to a stochastic-like regime is observed. In the weak-coupling phase, a periodic dynamics is rapidly approached, with all neurons firing at the same rate and mutually phase-locked. The strong-coupling phase is characterized by an irregular pattern, even though the maximum Lyapunov exponent is negative. The paradox is resolved by drawing an analogy with the phenomenon of "stable chaos", i.e. by observing that the stochastic-like behaviour is "limited" to an exponentially long (with the system size) transient. Remarkably, the transient dynamics turns out to be stationary.
    Comment: 11 pages, 13 figures, submitted to Phys. Rev.
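A negative maximum Lyapunov exponent can be probed numerically by running a reference simulation and a microscopically perturbed copy and tracking their distance. The sketch below uses a diluted inhibitory LIF network in what is assumed to be the weak-coupling (regular) regime, with hypothetical parameters; the perturbation should not be amplified.

```python
import numpy as np

rng = np.random.default_rng(2)
N, a, g, dt, steps = 50, 1.3, 0.01, 0.001, 50_000

# Weakly diluted (10 percent of links removed), fully inhibitory connectivity.
Cmat = (rng.random((N, N)) < 0.9).astype(float)
np.fill_diagonal(Cmat, 0.0)

def step(v):
    v = v + dt * (a - v)              # leaky integration, suprathreshold drive
    fired = v >= 1.0
    v = np.where(fired, 0.0, v)       # reset at threshold 1
    v = v - g * (Cmat @ fired)        # inhibitory delta pulses
    return v, int(fired.sum())

v1 = rng.random(N)
v2 = v1.copy()
v2[0] += 1e-6                         # microscopic perturbation of one neuron
nspikes, dist = 0, []
for _ in range(steps):
    v1, n1 = step(v1)
    v2, _ = step(v2)
    nspikes += n1
    dist.append(np.linalg.norm(v1 - v2))
print(nspikes, float(np.median(dist[-1000:])))
```

In the strong-coupling phase the same probe would show the irregular-but-contracting "stable chaos" transient the abstract describes.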

    Phase-locking in weakly heterogeneous neuronal networks

    We examine analytically the existence and stability of phase-locked states in a weakly heterogeneous neuronal network. We consider a model of N neurons with all-to-all synaptic coupling where the heterogeneity is in the firing frequency or intrinsic drive of the neurons. We consider both inhibitory and excitatory coupling. We derive the conditions under which stable phase-locking is possible. In homogeneous networks, many different periodic phase-locked states are possible. Their stability depends on the dynamics of the neuron and the coupling. For weak heterogeneity, the phase-locked states are perturbed from the homogeneous states and can remain stable if their homogeneous counterparts are stable. For sufficient heterogeneity, phase-locked solutions either lose stability or are destroyed completely. We analyze the possible states the network can take when phase-locking is broken.
    Comment: RevTex, 27 pages, 3 figures
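The way heterogeneity destroys locking can be sketched with the standard phase-difference reduction for two mutually coupled oscillators (a Kuramoto-type caricature, not the paper's neuron model): locking exists only while the drive mismatch is small compared with the coupling.

```python
import math

# Phase-difference reduction for two mutually coupled oscillators whose
# intrinsic drives differ by dw:
#   dphi/dt = dw - 2*K*sin(phi),  stable locking iff |dw| <= 2*K.
def phase_drift(dw, K, T=200.0, dt=0.01):
    phi = 0.0
    for _ in range(int(T / dt)):
        phi += dt * (dw - 2.0 * K * math.sin(phi))
    return phi

locked = phase_drift(dw=0.5, K=1.0)    # |dw| < 2K: settles where sin(phi) = dw/(2K)
drifting = phase_drift(dw=3.0, K=1.0)  # |dw| > 2K: phase slips accumulate
print(round(locked, 3), round(drifting, 1))
```

Below the critical mismatch the phase difference settles at a fixed point (the perturbed locked state); above it, phase slips accumulate, the analogue of the broken-locking states analyzed in the abstract.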

    Event-driven simulations of a plastic, spiking neural network

    We consider a fully-connected network of leaky integrate-and-fire neurons with spike-timing-dependent plasticity. The plasticity is controlled by a parameter representing the expected weight of a synapse between neurons that are firing randomly with the same mean frequency. For low values of the plasticity parameter, the activities of the system are dominated by noise, while large values of the plasticity parameter lead to self-sustaining activity in the network. We perform event-driven simulations on finite-size networks with up to 128 neurons to find the stationary synaptic weight conformations for different values of the plasticity parameter. In both the low and high activity regimes, the synaptic weights are narrowly distributed around the plasticity parameter value, consistent with the predictions of mean-field theory. However, the distribution broadens in the transition region between the two regimes, representing emergent network structures. Using a pseudophysical approach for visualization, we show that the emergent structures are of "path" or "hub" type, observed at different values of the plasticity parameter in the transition region.
    Comment: 9 pages, 6 figures
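The event-driven scheme itself can be sketched with a priority queue and the analytic inter-event solution of the leaky integrate-and-fire equation (plasticity omitted; the excitatory coupling and all parameters below are hypothetical, chosen only to exercise the queue).

```python
import heapq, math

# Event-driven LIF sketch: between events, dv/dt = (mu - v)/tau has a closed
# form, so the simulation jumps from spike to spike instead of stepping in time.
N, tau, mu, w, T = 5, 20.0, 1.2, 0.05, 200.0   # ms; threshold 1, reset 0
v = [0.15 * i for i in range(N)]               # staggered initial voltages
tlast = [0.0] * N                              # time of each neuron's last update
version = [0] * N                              # invalidates stale queue entries

def predict(i, tnow):
    """Analytic next threshold crossing for dv/dt = (mu - v)/tau."""
    if v[i] >= 1.0:
        return tnow
    return tnow + tau * math.log((mu - v[i]) / (mu - 1.0))

heap = [(predict(i, 0.0), i, 0) for i in range(N)]
heapq.heapify(heap)
spikes = []
while heap:
    t, i, ver = heapq.heappop(heap)
    if ver != version[i]:
        continue                               # stale prediction, skip
    if t > T:
        break
    spikes.append((t, i))
    for j in range(N):                         # decay everyone exactly to time t
        v[j] = mu + (v[j] - mu) * math.exp(-(t - tlast[j]) / tau)
        tlast[j] = t
    v[i] = 0.0                                 # reset the spiking neuron
    for j in range(N):
        if j != i:
            v[j] = min(v[j] + w, 1.0)          # deliver delta pulses
        version[j] += 1
        heapq.heappush(heap, (predict(j, t), j, version[j]))
print(len(spikes))
```

Each delivered pulse invalidates the target's queued spike prediction via the version counter, which is the standard bookkeeping trick for event-driven simulation.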

    Noise Induced Coherence in Neural Networks

    We investigate numerically the dynamics of large networks of N globally pulse-coupled integrate-and-fire neurons in a noise-induced synchronized state. The power spectrum of an individual element within the network is shown to exhibit, in the thermodynamic limit (N → ∞), a broadband peak and an additional delta-function peak that is absent from the power spectrum of an isolated element. The power spectrum of the mean output signal exhibits only the delta-function peak. These results are explained analytically in an exactly soluble oscillator model with global phase coupling.
    Comment: 4 pages ReVTeX and 3 postscript figures
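A caricature of the spectral result (a shared oscillation plus independent broadband noise per unit, not the pulse-coupled model itself): averaging over the population suppresses the broadband component by 1/N and leaves the sharp line.

```python
import numpy as np

rng = np.random.default_rng(3)
N, f0, dt, T = 500, 5.0, 0.001, 10.0        # units, Hz, s, s
n = int(T / dt)

t = np.arange(n) * dt
common = np.cos(2.0 * np.pi * f0 * t)             # shared periodic component
signals = common + rng.normal(0.0, 1.0, (N, n))   # plus independent noise
mean_sig = signals.mean(axis=0)

def spectrum(x):
    p = np.abs(np.fft.rfft(x - x.mean()))**2 / len(x)
    return np.fft.rfftfreq(len(x), dt), p

f, p_single = spectrum(signals[0])          # broadband floor plus sharp line
_, p_mean = spectrum(mean_sig)              # floor suppressed by 1/N
print(f[np.argmax(p_mean)])
```

The single-unit spectrum shows the line sitting on a broadband floor, while the population mean is dominated by the line alone, mirroring the delta-function peak in the abstract.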

    Adaptation Reduces Variability of the Neuronal Population Code

    Sequences of events in noise-driven excitable systems with slow variables often show serial correlations among their event intervals. Here, we employ a master equation for general non-renewal processes to calculate the interval and count statistics of superimposed processes governed by a slow adaptation variable. For an ensemble of spike-frequency-adapting neurons, this results in a regularization of the population activity and an enhanced post-synaptic signal decoding. We confirm our theoretical results in a population of cortical neurons.
    Comment: 4 pages, 2 figures
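The non-renewal ingredient can be sketched with a perfect integrate-and-fire neuron carrying a slow spike-triggered adaptation variable (all parameters hypothetical): the adaptation induces negative serial correlations between adjacent interspike intervals, the property behind the reduced population variability.

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, delta, tau_a = 1.0, 0.2, 0.3, 2.0   # drive, noise, adaptation jump/decay
dt, steps = 0.01, 400_000
xi = sigma * np.sqrt(dt) * rng.normal(0.0, 1.0, steps)

v, a, t = 0.0, 0.0, 0.0
spike_times = []
for k in range(steps):
    t += dt
    a -= dt * a / tau_a                        # slow adaptation variable decays
    v += dt * (mu - a) + xi[k]                 # adaptation subtracts from the drive
    if v >= 1.0:                               # threshold 1, reset 0
        v = 0.0
        a += delta                             # spike-triggered adaptation jump
        spike_times.append(t)

isi = np.diff(spike_times)
scc = np.corrcoef(isi[:-1], isi[1:])[0, 1]     # lag-1 serial correlation coefficient
print(len(isi), round(float(scc), 3))
```

A long interval lets the adaptation decay, shortening the next interval (and vice versa), so the lag-1 coefficient comes out negative; this is what regularizes the summed population activity.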

    Dynamical response of the Hodgkin-Huxley model in the high-input regime

    The response of the Hodgkin-Huxley neuronal model subjected to stochastic uncorrelated spike trains originating from a large number of inhibitory and excitatory post-synaptic potentials is analyzed in detail. The model is examined in its three fundamental dynamical regimes: silence, bistability and repetitive firing. Its response is characterized in terms of statistical indicators (interspike-interval distributions and their first moments) as well as of dynamical indicators (autocorrelation functions and conditional entropies). In the silent regime, the coexistence of two different coherence resonances is revealed: one occurs at quite low noise and is related to the stimulation of subthreshold oscillations around the rest state; the second one (at intermediate noise variance) is associated with the regularization of the sequence of spikes emitted by the neuron. Bistability in the low noise limit can be interpreted in terms of jumping processes across barriers activated by stochastic fluctuations. In the repetitive firing regime a maximization of incoherence is observed at finite noise variance. Finally, the mechanisms responsible for spike triggering in the various regimes are clearly identified.
    Comment: 14 pages, 24 figures in eps, submitted to Physical Review
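A minimal sketch of the repetitive-firing regime (standard Hodgkin-Huxley parameters; the high-input drive is approximated here by a constant current plus white noise, a diffusion-style stand-in for the stochastic spike trains, with hypothetical drive values):

```python
import numpy as np

rng = np.random.default_rng(5)

# Standard Hodgkin-Huxley parameters.
Cm, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3         # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387              # mV
I0, sig = 10.0, 1.0                             # uA/cm^2 (repetitive-firing regime)
dt, steps = 0.01, 50_000                        # ms -> 500 ms total
xi = sig * rng.normal(0.0, 1.0, steps) / np.sqrt(dt)

def rates(V):
    am = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(V + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(V + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    an = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(V + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

V = -65.0
am, bm, ah, bh, an, bn = rates(V)
m, h, n = am / (am + bm), ah / (ah + bh), an / (an + bn)  # steady-state gating
spikes, above = [], False
for k in range(steps):
    am, bm, ah, bh, an, bn = rates(V)
    m += dt * (am * (1.0 - m) - bm * m)
    h += dt * (ah * (1.0 - h) - bh * h)
    n += dt * (an * (1.0 - n) - bn * n)
    V += dt * (I0 + xi[k] - gNa * m**3 * h * (V - ENa)
               - gK * n**4 * (V - EK) - gL * (V - EL)) / Cm
    if V > 0.0 and not above:
        spikes.append(k * dt)       # upward crossing of 0 mV = spike time
    above = V > 0.0
isi = np.diff(spikes)
print(len(spikes), round(float(isi.mean()), 1))
```

From the recorded spike times one can build the interspike-interval distribution and its moments, the statistical indicators used in the abstract; the silent and bistable regimes are reached by lowering I0.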