
    Finite-size and correlation-induced effects in Mean-field Dynamics

    The brain's activity is characterized by the interaction of a very large number of neurons that are strongly affected by noise. However, signals often arise at macroscopic scales, integrating the effect of many neurons into a reliable pattern of activity. In order to study such large neuronal assemblies, one is often led to derive mean-field limits that summarize the effect of the interaction of a large number of neurons into an effective signal. Classical mean-field approaches consider the evolution of a deterministic variable, the mean activity, thus neglecting the stochastic nature of neural behavior. In this article, we build upon two recent approaches that include correlations and higher-order moments in mean-field equations, and study how these stochastic effects influence the solutions of the mean-field equations, both in the limit of an infinite number of neurons and for large yet finite networks. We introduce a new model, the infinite model, which arises from both sets of equations through a rescaling of the variables; the rescaling is invertible for finite-size networks, so the new model is equivalent to the previously derived ones. The study of this model allows us to understand the qualitative behavior of such large-scale networks. We show that, although the solutions of the deterministic mean-field equation constitute uncorrelated solutions of the new mean-field equations, the stability properties of limit cycles are modified by the presence of correlations, and additional non-trivial behaviors, including periodic orbits, appear where the mean field had none. The origin of all these behaviors is then explored in finite-size networks, where interesting mesoscopic-scale effects appear. This study leads us to show that the infinite-size system is a singular limit of the network equations: any finite network will differ from the infinite system.
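    The divergence between any finite network and its infinite-size limit can be illustrated numerically. The sketch below uses a generic one-population firing-rate network with tanh coupling and Euler-Maruyama integration; the nonlinearity and all parameter values are illustrative assumptions, not taken from the paper. It shows that fluctuations of the empirical mean activity, absent from the deterministic mean-field description, shrink with network size but never vanish:

```python
import numpy as np

def mean_fluctuations(N, T=20.0, dt=1e-3, J=0.5, sigma=0.5, I=1.0, seed=0):
    """Euler-Maruyama simulation of the one-population network
        dx_i = (-x_i + (J/N) * sum_j tanh(x_j) + I) dt + sigma dW_i,
    returning the stationary variance of the empirical mean activity."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 0.1, size=N)
    burn = int(5.0 / dt)                      # discard the transient
    means = []
    for k in range(int(T / dt)):
        drift = -x + J * np.mean(np.tanh(x)) + I
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=N)
        if k >= burn:
            means.append(x.mean())
    return np.var(means)

# fluctuations around the deterministic limit shrink with N but never vanish
var_small = mean_fluctuations(N=20)
var_large = mean_fluctuations(N=2000)
```

    For any finite N the variance of the mean stays strictly positive, which is one concrete sense in which the finite network always differs from the singular infinite limit.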

    Noise-induced behaviors in neural mean field dynamics

    The collective behavior of cortical neurons is strongly affected by the presence of noise at the level of individual cells. In order to study these phenomena in large-scale assemblies of neurons, we consider networks of firing-rate neurons with linear intrinsic dynamics and nonlinear coupling, belonging to a few types of cell populations and receiving noisy currents. Asymptotic equations as the number of neurons tends to infinity (mean-field equations) are rigorously derived based on a probabilistic approach. These equations are implicit in the probability distribution of the solutions, which generally makes their direct analysis difficult. However, in our case the solutions are Gaussian, and their moments satisfy a closed system of nonlinear ordinary differential equations (ODEs), which are much easier to study than the original stochastic network equations; moreover, the statistics of the empirical process converge uniformly towards the solutions of these ODEs. Based on this description, we analytically and numerically study the influence of noise on the collective behaviors, and compare these asymptotic regimes to simulations of the network. We observe that the mean-field equations provide an accurate description of the solutions of the network equations for network sizes as small as a few hundred neurons. In particular, we observe that the level of noise in the system qualitatively modifies its collective behavior, producing for instance synchronized oscillations of the whole network, desynchronization of oscillating regimes, and stabilization or destabilization of stationary solutions. These results shed new light on the role of noise in shaping the collective dynamics of neurons, and give us clues for understanding similar phenomena observed in biological networks.
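    A minimal sketch of this program, under assumptions of my choosing rather than the paper's: a probit (Gaussian CDF) rate function, for which the Gaussian expectation has a closed form, and hypothetical parameter values. The network's empirical mean is compared against the closed moment ODEs:

```python
import numpy as np
from math import erf, sqrt

verf = np.vectorize(erf)
S = lambda x: 0.5 * (1.0 + verf(x / sqrt(2.0)))   # probit rate function (CDF of N(0,1))

def moment_odes(J=2.0, sigma=0.4, T=8.0, dt=2e-3):
    """Closed Gaussian moment equations for the limit of
        dx_i = (-x_i + (J/N) * sum_j S(x_j)) dt + sigma dW_i:
        mu' = -mu + J * E[S(X)],   v' = -2 v + sigma^2,
    where E[S(X)] = Phi(mu / sqrt(1 + v)) for X ~ N(mu, v)."""
    mu, v = 0.0, 0.0
    for _ in range(int(T / dt)):
        ES = 0.5 * (1.0 + erf(mu / sqrt(2.0 * (1.0 + v))))
        mu, v = mu + dt * (-mu + J * ES), v + dt * (-2.0 * v + sigma ** 2)
    return mu

def network_mean(N=300, J=2.0, sigma=0.4, T=8.0, dt=2e-3, seed=0):
    """Direct Euler-Maruyama simulation of the N-neuron network."""
    rng = np.random.default_rng(seed)
    x = np.zeros(N)
    for _ in range(int(T / dt)):
        x = x + dt * (-x + J * np.mean(S(x))) + sigma * sqrt(dt) * rng.normal(size=N)
    return float(x.mean())

mu_mf, mu_net = moment_odes(), network_mean()
# the moment ODEs track the network mean already for a few hundred neurons
```

    The two values agree closely for N of a few hundred, consistent with the accuracy reported in the abstract for networks of that size.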

    Limits and dynamics of stochastic neuronal networks with random heterogeneous delays

    Realistic networks display heterogeneous transmission delays. We analyze here the limits of large stochastic multi-population networks with stochastic coupling and random interconnection delays. We show that, depending on the nature of the delay distributions, a quenched or averaged propagation of chaos takes place in these networks, and that the network equations converge towards a delayed McKean-Vlasov equation with distributed delays. Our approach is mostly fitted to neuroscience applications. We instantiate in particular a classical neuronal model, the Wilson-Cowan system, and show that the obtained limit equations have Gaussian solutions whose mean and standard deviation satisfy a closed set of coupled delay differential equations in which the distribution of delays and the noise levels appear as parameters. This allows us to uncover precisely the effects of noise, delays and coupling on the dynamics of such heterogeneous networks, in particular their role in the emergence of synchronized oscillations. We show in several examples that not only the averaged delay, but also its dispersion, governs the dynamics of such networks. Comment: Corrected a misprint (superfluous stopping time) in the proof of Lemma 1 and clarified a regularity hypothesis (Remark 1).
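    The limiting delay differential equation for the mean activity can be sketched as below, with a logistic rate function, inhibitory feedback, and a delay distribution discretized to a few atoms; all of these choices are hypothetical illustrations, not the paper's Wilson-Cowan instantiation:

```python
import numpy as np

def delayed_mean_field(delays, weights, J=-3.0, T=50.0, dt=0.01):
    """Euler scheme for the limiting delay differential equation of the mean,
        mu'(t) = -mu(t) + J * sum_k w_k * S(mu(t - tau_k)),
    where the pairs (tau_k, w_k) discretize the delay distribution."""
    S = lambda u: 1.0 / (1.0 + np.exp(-u))    # logistic rate function
    lags = [int(round(tau / dt)) for tau in delays]
    off = max(lags)
    steps = int(T / dt)
    mu = np.zeros(off + steps + 1)
    mu[:off + 1] = 0.1                        # constant initial history
    for k in range(off, off + steps):
        fb = sum(w * S(mu[k - l]) for w, l in zip(weights, lags))
        mu[k + 1] = mu[k] + dt * (-mu[k] + J * fb)
    return mu[off:]

# same mean delay (1.0) but different dispersion
narrow = delayed_mean_field([1.0], [1.0])
wide = delayed_mean_field([0.5, 1.5], [0.5, 0.5])
```

    At this modest inhibitory gain both delay distributions settle on the same equilibrium; sweeping the gain and the dispersion in such a scheme is one way to explore the abstract's claim that dispersion, not only the mean delay, governs the dynamics.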

    Propagation of chaos in neural fields

    We consider the problem of the limit of bio-inspired spatially extended neuronal networks including an infinite number of neuronal types (space locations), with space-dependent propagation delays modeling neural fields. The propagation-of-chaos property is proved in this setting, under mild assumptions on the neuronal dynamics valid for most models used in neuroscience, in a mesoscopic limit, the neural-field limit, in which we can resolve the fine structure of the neurons' activity in space and where averaging effects occur. The mean-field equations obtained are of a new type: they take the form of well-posed infinite-dimensional delayed integro-differential equations with a nonlocal mean-field term and a singular spatio-temporal Brownian motion. We also show how these intricate equations can be used in practice to uncover mathematically the precise mesoscopic dynamics of the neural field in a particular model where the mean-field equations exactly reduce to deterministic nonlinear delayed integro-differential equations. These results have several theoretical implications for neuroscience, which we review in the discussion. Comment: Updated to correct an erroneous suggestion of extension of the results in Appendix B, and to clarify some measurability questions in the proof of the theorem.
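    A deterministic delayed integro-differential neural field of the kind the mean-field equations reduce to can be sketched as follows. The ring geometry, Mexican-hat connectivity, tanh rate function, and distance-proportional delays are all illustrative assumptions, not the particular model of the paper:

```python
import numpy as np

def neural_field(n=100, T=30.0, dt=0.05, c=1.0):
    """Discretized deterministic neural field on a ring:
        u_t(x) = -u(x,t) + sum_y w(x-y) S(u(y, t - d(x,y)/c)),
    with Mexican-hat connectivity w and delays proportional to ring distance."""
    xs = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    d = np.abs(xs[:, None] - xs[None, :])
    d = np.minimum(d, 2.0 * np.pi - d)                    # distance on the ring
    w = (np.exp(-d ** 2 / 0.5) - 0.5 * np.exp(-d ** 2 / 2.0)) * (2.0 * np.pi / n)
    lag = np.round(d / c / dt).astype(int)                # delay in time steps
    S = np.tanh
    steps = int(T / dt)
    off = lag.max()
    U = np.zeros((off + steps + 1, n))
    U[:off + 1] = 0.1 * np.cos(xs)                        # initial history
    for k in range(off, off + steps):
        delayed = U[k - lag, np.arange(n)]                # u(y, t - d(x,y)/c)
        U[k + 1] = U[k] + dt * (-U[k] + np.sum(w * S(delayed), axis=1))
    return U[off:]

U = neural_field()
```

    The solution stays bounded by the integrated connectivity weight, and varying the delay speed c in such a sketch gives a feel for how space-dependent delays shape the mesoscopic dynamics.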

    A constructive mean field analysis of multi population neural networks with random synaptic weights and stochastic inputs

    We deal with the problem of bridging the gap between two scales in neuronal modeling. At the first (microscopic) scale, neurons are considered individually and their behavior is described by stochastic differential equations that govern the time variations of their membrane potentials. They are coupled by synaptic connections acting on their resulting activity, a nonlinear function of their membrane potential. At the second (mesoscopic) scale, interacting populations of neurons are described individually by similar equations. The equations describing the dynamical and stationary mean-field behaviors are considered as functional equations on a set of stochastic processes. This new point of view allows us to prove that these equations are well-posed on any finite time interval and to provide a constructive method for effectively computing their unique solution. This method is proved to converge to the unique solution, and we characterize its complexity and convergence rate. We also provide partial results for the stationary problem on infinite time intervals. These results shed new light on neural mass models such as that of Jansen and Rit (1995): their dynamics appears as a coarse approximation of the much richer dynamics that emerges from our analysis. Our numerical experiments confirm that the framework we propose, and the numerical methods we derive from it, provide a new and powerful tool for the exploration of neural behaviors at different scales. Comment: 55 pages, 4 figures, to appear in "Frontiers in Neuroscience".
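    The flavor of such a constructive method can be conveyed by a Monte Carlo Picard iteration: each iterate solves the SDE with the mean-field term frozen at the statistics of the previous iterate. This is a toy one-population model with hypothetical parameters, not the multi-population scheme analyzed in the paper:

```python
import numpy as np

def picard_iteration(n_iter=6, M=2000, T=1.0, dt=0.01, J=1.0, sigma=0.3, seed=1):
    """Monte Carlo Picard iteration for the implicit mean-field SDE
        dX_t = (-X_t + J * E[tanh(X_t)]) dt + sigma dW_t.
    Iterate k solves the SDE with t -> E[tanh(X_t)] frozen at iterate k-1."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    m = np.zeros(steps + 1)                   # initial guess for the mean-field term
    iterates = []
    for _ in range(n_iter):
        x = np.full(M, 0.5)                   # deterministic initial condition
        m_new = np.empty(steps + 1)
        m_new[0] = np.tanh(0.5)
        for k in range(steps):
            x = x + (-x + J * m[k]) * dt + sigma * np.sqrt(dt) * rng.normal(size=M)
            m_new[k + 1] = np.mean(np.tanh(x))
        iterates.append(m_new)
        m = m_new
    return iterates

its = picard_iteration()
d_first = np.max(np.abs(its[1] - its[0]))     # change after the first update
d_last = np.max(np.abs(its[-1] - its[-2]))    # change after the last update
```

    Successive iterates contract until the residual reaches the Monte Carlo noise floor, mirroring the convergence-rate analysis sketched in the abstract.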

    Mean Field description of and propagation of chaos in recurrent multipopulation networks of Hodgkin-Huxley and Fitzhugh-Nagumo neurons

    We derive the mean-field equations arising as the limit of a network of interacting spiking neurons, as the number of neurons goes to infinity. The neurons belong to a fixed number of populations and are represented either by the Hodgkin-Huxley model or by one of its simplified versions, the FitzHugh-Nagumo model. The synapses between neurons are either electrical or chemical. The network is assumed to be fully connected. The maximum conductances vary randomly. Under the condition that all neurons' initial conditions are drawn independently from the same law, which depends only on the population they belong to, we prove that a propagation-of-chaos phenomenon takes place, namely that in the mean-field limit any finite number of neurons become independent and, within each population, have the same probability distribution. This probability distribution is the solution of a set of implicit equations, either nonlinear stochastic differential equations resembling the McKean-Vlasov equations, or non-local partial differential equations resembling the McKean-Vlasov-Fokker-Planck equations. We prove the well-posedness of these equations, i.e. the existence and uniqueness of a solution. We also show the results of some preliminary numerical experiments indicating that the mean-field equations are a good representation of the mean activity of a finite-size network, even for modest sizes. These experiments also indicate that the McKean-Vlasov-Fokker-Planck equations may be a good way to understand the mean-field dynamics through, e.g., a bifurcation analysis. Comment: 55 pages, 9 figures.
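    Propagation of chaos can be probed numerically by estimating the correlation between two fixed neurons across many independent trials, which should vanish as the network grows. The sketch below uses FitzHugh-Nagumo neurons with electrical coupling; the coupling strength, noise level, and trial counts are hypothetical choices for illustration:

```python
import numpy as np

def pair_correlation(N, R=800, T=10.0, dt=0.02, J=2.0, sigma=0.15, seed=3):
    """R independent trials of a fully connected FitzHugh-Nagumo network with
    electrical coupling J*(vbar - v_i) and additive noise; returns the
    across-trial correlation between neurons 0 and 1 at time T."""
    rng = np.random.default_rng(seed)
    a, b, eps = 0.7, 0.8, 0.08                # classical FitzHugh-Nagumo constants
    v = rng.normal(-1.2, 0.05, size=(R, N))   # start near the stable rest state
    w = rng.normal(-0.62, 0.05, size=(R, N))
    for _ in range(int(T / dt)):
        vbar = v.mean(axis=1, keepdims=True)
        dv = (v - v ** 3 / 3.0 - w + J * (vbar - v)) * dt \
            + sigma * np.sqrt(dt) * rng.normal(size=(R, N))
        dw = eps * (v + a - b * w) * dt
        v, w = v + dv, w + dw
    return float(np.corrcoef(v[:, 0], v[:, 1])[0, 1])

rho_small = pair_correlation(N=5)
rho_large = pair_correlation(N=100)
# pairwise correlations decay as N grows: neurons become asymptotically independent
```

    The estimated correlation is clearly positive for a handful of neurons and near zero for a hundred, consistent with any finite subset of neurons becoming independent in the mean-field limit.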