27,729 research outputs found

    Random Recurrent Neural Networks Dynamics

    This paper reviews the study of large random recurrent neural networks. The connection weights are drawn according to a probability law, and the network dynamics can be predicted at a macroscopic scale using an averaging principle. After an introductory section, Section 1 reviews the various models from the point of view of both the single-neuron dynamics and the global network dynamics. A summary of notations, quite helpful for the sequel, is presented. In Section 2, mean-field dynamics is developed, and the probability distribution characterizing the global dynamics is computed. In Section 3, applications of mean-field theory to the prediction of chaotic regimes for Analog Formal Random Recurrent Neural Networks (AFRRNN) are displayed. The case of an AFRRNN with a homogeneous population of neurons is studied in Section 4, and a two-population model in Section 5, where the occurrence of cyclo-stationary chaos is displayed using the results of \cite{Dauce01}. In Section 6, an insight into the application of mean-field theory to integrate-and-fire networks is given using the results of \cite{BrunelHakim99}. Comment: Review paper, 36 pages, 5 figures
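    The macroscopic picture sketched in this abstract can be illustrated with a minimal discrete-time toy model (not the paper's exact formulation; the network size N, gain g, and tanh transfer function below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500          # network size (the mean-field regime assumes large N)
g = 2.0          # coupling gain; g well above 1 typically yields chaos
T = 200          # number of discrete-time steps

# Random connection weights drawn i.i.d. with variance g^2 / N,
# mimicking the "weights selected according to a probability law" setting.
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

x = rng.uniform(-1, 1, size=N)   # random initial state
mean_activity = []
for _ in range(T):
    x = np.tanh(J @ x)           # analog (sigmoidal) update rule
    mean_activity.append(x.mean())

# Individual units fluctuate irregularly, yet the population average
# self-averages to a value near zero for large N.
print(abs(np.mean(mean_activity[-50:])))
```

    For g well above 1 the single units fluctuate chaotically while the population average stays close to zero, which is the kind of macroscopic regularity an averaging principle is meant to capture.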

    Limits and dynamics of randomly connected neuronal networks

    Networks of the brain are composed of a very large number of neurons connected through a random graph and interacting after random delays, both of which depend on the anatomical distance between cells. In order to understand the role of these random architectures in the dynamics of such networks, we analyze the mesoscopic and macroscopic limits of networks with random correlated connectivity weights and delays. We address both averaged and quenched limits, and show propagation of chaos and convergence to complex integral McKean-Vlasov equations with distributed delays. We then instantiate a completely solvable model illustrating the role of such random architectures in the emerging macroscopic activity, focusing in particular on the role of connectivity levels in the emergence of periodic solutions.

    Propagation of chaos in neural fields

    We consider the problem of the limit of bio-inspired spatially extended neuronal networks including an infinite number of neuronal types (space locations), with space-dependent propagation delays modeling neural fields. The propagation-of-chaos property is proved in this setting, under mild assumptions on the neuronal dynamics valid for most models used in neuroscience, in a mesoscopic limit, the neural-field limit, in which the fine structure of the neurons' activity in space can be resolved and averaging effects occur. The mean-field equations obtained are of a new type: they take the form of well-posed infinite-dimensional delayed integro-differential equations with a nonlocal mean-field term and a singular spatio-temporal Brownian motion. We also show how these intricate equations can be used in practice to uncover mathematically the precise mesoscopic dynamics of the neural field in a particular model where the mean-field equations reduce exactly to deterministic nonlinear delayed integro-differential equations. These results have several theoretical implications in neuroscience, which we review in the discussion. Comment: Updated to correct an erroneous suggestion of extension of the results in Appendix B, and to clarify some measurability questions in the proof of Theorem

    Limits and dynamics of stochastic neuronal networks with random heterogeneous delays

    Realistic networks display heterogeneous transmission delays. We analyze here the limits of large stochastic multi-population networks with stochastic coupling and random interconnection delays. We show that, depending on the nature of the delay distributions, a quenched or averaged propagation of chaos takes place in these networks, and that the network equations converge towards a delayed McKean-Vlasov equation with distributed delays. Our approach is mostly fitted to neuroscience applications. We instantiate in particular a classical neuronal model, the Wilson-Cowan system, and show that the limit equations obtained have Gaussian solutions whose mean and standard deviation satisfy a closed set of coupled delay differential equations in which the distribution of delays and the noise levels appear as parameters. This allows us to uncover precisely the effects of noise, delays, and coupling on the dynamics of such heterogeneous networks, in particular their role in the emergence of synchronized oscillations. We show in several examples that not only the averaged delay but also its dispersion governs the dynamics of such networks. Comment: Corrected misprint (useless stopping time) in proof of Lemma 1 and clarified a regularity hypothesis (remark 1)
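    As a rough illustration of how a distribution of delays (not just its mean) shapes the dynamics, here is a minimal deterministic sketch: a single Wilson-Cowan-type rate equation with delayed inhibitory self-feedback drawn from a discrete delay distribution. All parameter values and the specific feedback form are illustrative, not taken from the paper, whose limit equations additionally track the standard deviation of the Gaussian solution.

```python
import numpy as np

def S(x):
    """Sigmoidal firing-rate function."""
    return 1.0 / (1.0 + np.exp(-x))

dt = 0.005
steps = 20000                        # total time T = 100
theta = 10.0                         # tonic input
J = 20.0                             # delayed inhibitory feedback strength
delays = np.array([1.0, 1.5, 2.0])   # delay values d_k
probs = np.array([0.25, 0.5, 0.25])  # delay distribution p_k
lags = np.round(delays / dt).astype(int)

# Euler integration of  dm/dt = -m + S(theta - J * sum_k p_k m(t - d_k))
m = np.zeros(steps)
m[:lags.max() + 1] = 0.6             # constant initial history
for t in range((lags.max()), steps - 1):
    drive = np.sum(probs * m[t - lags])   # distributed-delay feedback
    m[t + 1] = m[t] + dt * (-m[t] + S(theta - J * drive))

# Strong delayed inhibition destabilizes the fixed point m* = 0.5 and
# the rate settles into sustained oscillations around it.
print(np.std(m[-4000:]))
```

    Changing `probs` while keeping the mean delay fixed alters the effective feedback gain at the oscillation frequency, so the dispersion of the delays can push the system across the oscillatory instability, in the spirit of the paper's conclusion.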

    Chimera states in pulse coupled neural networks: the influence of dilution and noise

    We analyse the possible dynamical states emerging for two symmetrically pulse-coupled populations of leaky integrate-and-fire neurons. In particular, we observe broken-symmetry states in this set-up: namely, breathing chimeras, where one population is fully synchronized and the other is in a state of partial synchronization (PS), as well as generalized chimera states, where both populations are in PS but with different levels of synchronization. Symmetric macroscopic states are also present, ranging from quasi-periodic motions to collective chaos, and from splay states to population anti-phase partial synchronization. We then investigate the influence of disorder, in the form of random link removal or noise, on the dynamics of the collective solutions in this model. We observe that broken-symmetry chimera-like states, with both populations partially synchronized, persist up to 80% of broken links and up to noise amplitudes of 8% of the threshold-reset distance. Furthermore, the introduction of disorder in a symmetric chaotic state has a constructive effect, namely inducing the emergence of chimera-like states at intermediate dilution or noise levels. Comment: 15 pages, 7 figures, contribution for the Workshop "Nonlinear Dynamics in Computational Neuroscience: from Physics and Biology to ICT" held in Turin (Italy) in September 201
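    The basic set-up of two symmetrically pulse-coupled leaky integrate-and-fire populations can be sketched as follows. This is only an architectural illustration with a crude synchrony measure; the parameters are arbitrary and not tuned to reproduce the chimera regimes reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100                      # neurons per population
a = 1.3                      # suprathreshold drive (tonic firing)
g_s, g_c = 0.04, 0.02        # intra- / inter-population pulse strengths
dt, steps = 0.001, 50000

v = rng.uniform(0, 1, size=(2, N))       # membrane potentials, 2 populations
order = np.zeros((steps, 2))
for t in range(steps):
    v += dt * (a - v)                    # leaky integration toward a
    fired = v >= 1.0                     # threshold crossings
    n_f = fired.sum(axis=1)              # spikes per population this step
    v[fired] = 0.0                       # reset
    # instantaneous pulse coupling: intra- and cross-population kicks
    v[0] += (g_s * n_f[0] + g_c * n_f[1]) / N
    v[1] += (g_s * n_f[1] + g_c * n_f[0]) / N
    # crude synchrony measure: 1 minus the spread of membrane potentials
    order[t] = 1.0 - v.std(axis=1)

print(order[-1000:].mean(axis=0))        # per-population synchrony levels
```

    Dilution would correspond to zeroing a random fraction of the couplings, and noise to adding a stochastic term to the membrane dynamics; the broken-symmetry states of the paper show up as persistently different synchrony levels in the two rows of `order`.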

    Transient Information Flow in a Network of Excitatory and Inhibitory Model Neurons: Role of Noise and Signal Autocorrelation

    We investigate the performance of sparsely connected networks of integrate-and-fire neurons for ultra-short-term information processing. We exploit the fact that the population activity of networks with balanced excitation and inhibition can switch from an oscillatory firing regime to a state of asynchronous irregular firing or quiescence, depending on the rate of external background spikes. We find that in terms of information buffering the network performs best for a moderate, non-zero amount of noise. Analogously to the phenomenon of stochastic resonance, the performance decreases for higher and lower noise levels. The optimal amount of noise corresponds to the transition zone between a quiescent state and a regime of stochastic dynamics. This provides a potential explanation of the role of non-oscillatory population activity in a simplified model of cortical micro-circuits. Comment: 27 pages, 7 figures, to appear in J. Physiology (Paris) Vol. 9

    Stochastic firing rate models

    We review a recent approach to mean-field limits in neural networks that takes into account the stochastic nature of the input current and the uncertainty in synaptic coupling. This approach was proved to be a rigorous limit of the network equations in a general setting, and we express the results here in a more customary and simpler framework. We propose a heuristic argument to derive these equations, providing a more intuitive understanding of their origin. The equations are characterized by a strong coupling between the different moments of the solutions. We analyse the equations, present an algorithm to simulate their solutions, and investigate them numerically. In particular, we build a bridge between these equations and the approach of Sompolinsky and collaborators (1988, 1990), and show how the coupling between the mean and the covariance function deviates from customary approaches.

    Average activity of excitatory and inhibitory neural populations

    We develop an extension of the Ott-Antonsen method [E. Ott and T. M. Antonsen, Chaos 18(3), 037113 (2008)] that allows one to obtain the mean activity (spiking rate) of a population of excitable units. By means of the Ott-Antonsen method, equations for the dynamics of the order parameters of coupled excitatory and inhibitory populations of excitable units are obtained, and their mean activities are computed. Two different excitable systems are studied: Adler units and theta neurons. The resulting bifurcation diagrams are compared with those obtained from the phenomenological Wilson-Cowan model in some regions of the parameter space. Compatible behaviors, as well as higher-dimensional chaotic solutions, are observed. Numerical simulations are carried out to further validate the equations. Fil: Roulet, Javier. Consejo Nacional de Investigaciones Científicas y Técnicas. Instituto de Física de Buenos Aires. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales; Argentina. Fil: Mindlin, Bernardo Gabriel. Consejo Nacional de Investigaciones Científicas y Técnicas. Instituto de Física de Buenos Aires. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales; Argentina
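    As a rough companion to the reduction described above, the sketch below integrates the well-known Ott-Antonsen-based mean-field equations for a single population of quadratic integrate-and-fire neurons (the Montbrió-Pazó-Roxin form, which is equivalent to theta neurons). This is not the paper's excitatory-inhibitory model, and the parameter values are illustrative.

```python
import numpy as np

# Exact mean-field equations from the Ott-Antonsen (Lorentzian) ansatz for
# quadratic integrate-and-fire neurons, equivalent to theta neurons:
#   dr/dt = Delta/pi + 2 r v
#   dv/dt = v^2 + eta0 + J r - (pi r)^2
eta0 = 1.0     # center of the Lorentzian distribution of excitabilities
delta = 1.0    # half-width of the Lorentzian
J = 15.0       # recurrent coupling strength
dt, steps = 1e-3, 100_000

r, v = 1.5, -0.1         # mean firing rate and mean membrane potential
for _ in range(steps):
    dr = delta / np.pi + 2.0 * r * v
    dv = v * v + eta0 + J * r - (np.pi * r) ** 2
    r, v = r + dt * dr, v + dt * dv

# The macroscopic variables spiral into the stationary state r* ~ 1.58,
# the mean activity the Ott-Antonsen reduction gives direct access to.
print(r, v)
```

    Coupling an excitatory and an inhibitory copy of such equations is the kind of construction the paper compares against the phenomenological Wilson-Cowan model.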