
    Limits and dynamics of stochastic neuronal networks with random heterogeneous delays

    Realistic networks display heterogeneous transmission delays. We analyze here the limits of large stochastic multi-population networks with stochastic coupling and random interconnection delays. We show that, depending on the nature of the delay distributions, a quenched or averaged propagation of chaos takes place in these networks, and that the network equations converge towards a delayed McKean-Vlasov equation with distributed delays. Our approach is chiefly tailored to neuroscience applications. We instantiate in particular a classical neuronal model, the Wilson-Cowan system, and show that the resulting limit equations have Gaussian solutions whose mean and standard deviation satisfy a closed set of coupled delay differential equations in which the delay distribution and the noise levels appear as parameters. This allows us to uncover precisely the effects of noise, delays and coupling on the dynamics of such heterogeneous networks, in particular their role in the emergence of synchronized oscillations. We show in several examples that not only the average delay but also its dispersion governs the dynamics of such networks.
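    In the simplest deterministic caricature, the limit dynamics described above reduce to a scalar rate equation with distributed delayed feedback. The sketch below is my own illustration (not the paper's code): the sigmoid, the Gaussian delay distribution and all parameter values are assumptions chosen only so that the example runs, and varying delay_mean and delay_sd probes the separate roles of the average delay and its dispersion.

        import numpy as np

        def simulate(T=200.0, dt=0.01, tau=1.0, J=8.0, I0=4.0,
                     delay_mean=2.0, delay_sd=0.5, n_delays=21, seed=0):
            """Euler steps of  tau * dx/dt = -x + S(I0 - J * sum_k w_k * x(t - d_k))."""
            rng = np.random.default_rng(seed)
            delays = np.clip(rng.normal(delay_mean, delay_sd, n_delays), dt, None)
            lags = np.round(delays / dt).astype(int)   # sampled delays, in time steps
            w = np.full(n_delays, 1.0 / n_delays)      # equal weight per delay sample
            hist = lags.max()
            n = int(T / dt)
            x = np.full(n + hist, 0.6)                 # constant initial history
            S = lambda u: 1.0 / (1.0 + np.exp(-u))     # sigmoid rate function
            for t in range(hist, hist + n):
                delayed = np.dot(w, x[t - lags])       # distributed-delay feedback
                x[t] = x[t - 1] + dt / tau * (-x[t - 1] + S(I0 - J * delayed))
            return np.arange(n) * dt, x[hist:]

    With these illustrative values the delayed inhibitory feedback oscillates; increasing delay_sd smears the feedback over a wider range of lags and can damp the oscillation, one way in which the dispersion, not only the mean, of the delays shapes the dynamics.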

    Delay-induced patterns in a two-dimensional lattice of coupled oscillators

    We show how a variety of stable spatio-temporal periodic patterns can be created in two-dimensional lattices of coupled oscillators with non-homogeneous coupling delays. A "hybrid dispersion relation" is introduced, which allows the stability of time-periodic patterns to be studied analytically in the limit of large delay. The results are illustrated using coupled FitzHugh-Nagumo neurons as well as coupled limit-cycle (Stuart-Landau) oscillators.
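    As a companion illustration (not the analysis above, which rests on a hybrid dispersion relation in the large-delay limit), the sketch below simply integrates a small two-dimensional lattice of Stuart-Landau oscillators with site-dependent coupling delays; the lattice size, the uniform delay distribution, the nearest-neighbour coupling and all parameter values are assumptions made only to keep the example short and runnable.

        import numpy as np

        def lattice(N=16, T=200.0, dt=0.01, lam=0.5, omega=1.0, K=0.3, seed=1):
            rng = np.random.default_rng(seed)
            delays = rng.uniform(1.0, 3.0, size=(N, N))      # non-homogeneous: one delay per site
            lags = np.round(delays / dt).astype(int)
            hist = lags.max()
            steps = int(T / dt)
            z = np.zeros((steps + hist, N, N), dtype=complex)
            z[:hist] = 0.1 * np.exp(2j * np.pi * rng.random((N, N)))   # random-phase history
            rows = np.arange(N)[:, None]
            cols = np.arange(N)[None, :]
            for t in range(hist, steps + hist):
                zd = z[t - lags, rows, cols]                 # each site read out at its own delay
                # mean of the four delayed nearest neighbours (periodic boundaries)
                nb = sum(np.roll(zd, s, axis=a) for a, s in ((0, 1), (0, -1), (1, 1), (1, -1))) / 4.0
                zp = z[t - 1]
                z[t] = zp + dt * ((lam + 1j * omega - np.abs(zp) ** 2) * zp + K * (nb - zp))
            return z[hist:]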

    Mean-field equations for stochastic firing-rate neural fields with delays: Derivation and noise-induced transitions

    In this manuscript we analyze the collective behavior of mean-field limits of large-scale, spatially extended stochastic neuronal networks with delays. Rigorously, the asymptotic regime of such systems is characterized by a very intricate stochastic delayed integro-differential McKean-Vlasov equation that remains impenetrable, leaving the stochastic collective dynamics of such networks poorly understood. In order to study these macroscopic dynamics, we analyze networks of firing-rate neurons, i.e. neurons with linear intrinsic dynamics and sigmoidal interactions. In that case, we prove that the solution of the mean-field equation is Gaussian, hence characterized by its first two moments, and that these two quantities satisfy a set of coupled delayed integro-differential equations. These equations are similar to the usual neural field equations, and incorporate the noise level as a parameter, allowing analysis of noise-induced transitions. Through bifurcation analysis we identify several qualitative transitions due to noise in the mean-field limit. In particular, stabilization of spatially homogeneous solutions, synchronized oscillations, bumps, chaotic dynamics, and wave or bump splitting are exhibited and arise from static or dynamic Turing-Hopf bifurcations. These surprising phenomena open the way to further exploring the role of noise in the nervous system.
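    For a single population with one fixed delay, the Gaussian reduction described above takes a particularly transparent form: the mean obeys a delayed equation driven by the Gaussian expectation of the sigmoid, while the variance relaxes linearly. The sketch below is a hedged, non-spatial caricature of that structure (the equations in the paper are spatially extended integro-differential equations); the logistic sigmoid, the Gauss-Hermite quadrature and every parameter value are my own illustrative choices.

        import numpy as np

        xk, wk = np.polynomial.hermite.hermgauss(40)         # Gauss-Hermite nodes and weights

        def gaussian_mean_of_sigmoid(m, v):
            """E[S(X)] for X ~ N(m, v), with S the logistic sigmoid, by quadrature."""
            u = m + np.sqrt(2.0 * max(v, 0.0)) * xk
            return np.dot(wk, 1.0 / (1.0 + np.exp(-u))) / np.sqrt(np.pi)

        def moments(T=200.0, dt=0.01, tau=1.0, J=12.0, I0=6.0, d=1.0, sigma=0.5):
            """Euler steps of  d(mu)/dt = -mu/tau + I0 - J*E[S(X(t-d))]  and
                               d(v)/dt  = -2*v/tau + sigma**2,  with X(t-d) ~ N(mu(t-d), v(t-d))."""
            lag = int(round(d / dt))
            n = int(T / dt)
            mu = np.full(n + lag, 0.5)                       # constant initial history
            v = np.zeros(n + lag)
            for t in range(lag, n + lag):
                drive = gaussian_mean_of_sigmoid(mu[t - lag], v[t - lag])
                mu[t] = mu[t - 1] + dt * (-mu[t - 1] / tau + I0 - J * drive)
                v[t] = v[t - 1] + dt * (-2.0 * v[t - 1] / tau + sigma ** 2)
            return mu[lag:], v[lag:]

    With these assumed numbers, sigma = 0.5 leaves a delay-induced oscillation of the mean, while a larger sigma (say 2.0) broadens the Gaussian, flattens the effective gain and stabilizes the homogeneous solution, a toy analogue of the noise-induced stabilization mentioned above.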

    A unified view on weakly correlated recurrent networks

    The diversity of neuron models used in contemporary theoretical neuroscience to investigate specific properties of covariances raises the question of how these models relate to each other. In particular, it is hard to distinguish generic properties from peculiarities of the abstracted model. Here we present a unified view on pairwise covariances in recurrent networks in the irregular regime. We consider the binary neuron model, the leaky integrate-and-fire model, and the Hawkes process. We show that linear approximation maps each of these models to one of two classes of linear rate models, including the Ornstein-Uhlenbeck process as a special case. The classes differ in the location of additive noise in the rate dynamics, which is on the output side for spiking models and on the input side for the binary model. Both classes allow closed-form solutions for the covariance. For output noise, the covariance separates into an echo term and a term due to correlated input. The unified framework enables us to transfer results between models. For example, we generalize the binary model and the Hawkes process to the presence of conduction delays and simplify derivations of established results. Our approach is applicable to general network structures and suitable for population averages. The derived averages are exact for fixed out-degree network architectures and approximate for fixed in-degree. We demonstrate how taking fluctuations into account in the linearization procedure increases the accuracy of the effective theory, and we explain the class-dependent differences between covariances in the time and the frequency domain. Finally, we show that the oscillatory instability emerging in networks of integrate-and-fire models with delayed inhibitory feedback is a model-invariant feature: the same structure of poles in the complex frequency plane determines the population power spectra.
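    For the input-noise class, the linear rate model is an Ornstein-Uhlenbeck process, and its zero-lag covariance matrix follows in closed form from a Lyapunov equation. The snippet below illustrates only this single step, with an assumed random coupling matrix and unit white-noise input; it does not reproduce the echo-term decomposition or the mappings from the spiking and binary models discussed above.

        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        rng = np.random.default_rng(2)
        N, tau, g = 50, 1.0, 0.4
        W = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # random coupling (illustrative)
        A = (W - np.eye(N)) / tau                           # Jacobian of  tau*dx/dt = -x + W x + noise
        Q = np.eye(N)                                       # input-noise covariance (white, unit)
        C = solve_continuous_lyapunov(A, -Q)                # stationary covariance: A C + C A^T = -Q
        print("mean pairwise covariance:", C[~np.eye(N, dtype=bool)].mean())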

    Complex partial synchronization patterns in networks of delay-coupled neurons

    We study the spatio-temporal dynamics of a multiplex network of delay-coupled FitzHugh–Nagumo oscillators with non-local and fractal connectivities. Apart from chimera states, a new regime of coexistence of slow and fast oscillations is found. An analytical explanation for the emergence of such coexisting partial synchronization patterns is given. Furthermore, we propose a control scheme for the number of fast and slow neurons in each layer.
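    A minimal single-layer sketch of this kind of system (the study above uses a two-layer multiplex with non-local and fractal connectivities) is a ring of FitzHugh–Nagumo oscillators, each delay-coupled to its R nearest neighbours on either side through the activator variable; the parameters, initial conditions and coupling form below are illustrative assumptions, not the paper's.

        import numpy as np

        def fhn_ring(N=100, R=35, T=500.0, dt=0.005, eps=0.05, a=0.5,
                     sigma=0.4, delay=1.0, seed=3):
            rng = np.random.default_rng(seed)
            lag = int(round(delay / dt))
            steps = int(T / dt)
            u = np.zeros((steps + lag, N))
            v = np.zeros((steps + lag, N))
            u[:lag] = rng.uniform(-1, 1, N)                  # random but constant history
            v[:lag] = rng.uniform(-1, 1, N)
            idx = np.arange(N)
            nbrs = (idx[:, None] + np.arange(-R, R + 1)[None, :]) % N   # non-local ring neighbourhood
            for t in range(lag, steps + lag):
                coup = u[t - lag][nbrs].mean(axis=1) - u[t - 1]         # delayed non-local coupling
                up, vp = u[t - 1], v[t - 1]
                u[t] = up + dt / eps * (up - up ** 3 / 3.0 - vp + sigma * coup)
                v[t] = vp + dt * (up + a)
            return u[lag:], v[lag:]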

    Conditions for wave trains in spiking neural networks

    Spatiotemporal patterns such as traveling waves are frequently observed in recordings of neural activity. The mechanisms underlying the generation of such patterns are largely unknown. Previous studies have investigated the existence and uniqueness of different types of waves or bumps of activity using neural-field models, phenomenological coarse-grained descriptions of neural-network dynamics. But it remains unclear how these insights can be transferred to more biologically realistic networks of spiking neurons, where individual neurons fire irregularly. Here, we employ mean-field theory to reduce a microscopic model of leaky integrate-and-fire (LIF) neurons with distance-dependent connectivity to an effective neural-field model. In contrast to existing phenomenological descriptions, the dynamics in this neural-field model depend on the mean and the variance of the synaptic input, which together determine the amplitude and the temporal structure of the resulting effective coupling kernel. For the neural-field model we employ linear stability analysis to derive conditions for the existence of spatial and temporal oscillations and wave trains, that is, temporally and spatially periodic traveling waves. We first prove that wave trains cannot occur in a single homogeneous population of neurons, irrespective of the form of distance dependence of the connection probability. Compatible with the architecture of cortical neural networks, wave trains emerge in two-population networks of excitatory and inhibitory neurons as a combination of delay-induced temporal oscillations and spatial oscillations due to distance-dependent connectivity profiles. Finally, we demonstrate quantitative agreement between predictions of the analytically tractable neural-field model and numerical simulations of both networks of nonlinear rate-based units and networks of LIF neurons.
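    The linear stability step can be illustrated on a one-population caricature: linearizing a delayed neural-field equation du/dt = -u + g*(w * u)(x, t - d) around a homogeneous state, plane-wave perturbations exp(i*k*x + lambda*t) satisfy lambda + 1 = g * w_hat(k) * exp(-lambda*d), and the rightmost root follows from the Lambert W function. The kernel (a difference of Gaussians), gain and delay below are assumed values, not the effective kernel derived from the LIF network above; the sketch only shows how one scans wavenumbers for oscillatory (Turing-Hopf) versus static instabilities.

        import numpy as np
        from scipy.special import lambertw

        def leading_eigenvalue(k, g=2.5, d=1.0, sigma_e=1.0, sigma_i=2.0):
            # Assumed difference-of-Gaussians coupling kernel in Fourier space.
            w_hat = np.exp(-(k * sigma_e) ** 2 / 2) - 0.8 * np.exp(-(k * sigma_i) ** 2 / 2)
            c = g * w_hat
            # lambda + 1 = c * exp(-lambda*d)  =>  lambda = W(c*d*exp(d))/d - 1  (principal branch).
            return lambertw(c * d * np.exp(d), 0) / d - 1.0

        ks = np.linspace(0.0, 5.0, 500)
        lam = np.array([leading_eigenvalue(k) for k in ks])
        unstable = ks[lam.real > 0]
        # Unstable k with nonzero Im(lambda) would indicate a wave-train (Turing-Hopf) instability;
        # with this kernel c = g*w_hat(k) stays positive, so any unstable lambda is real (static
        # Turing instability), consistent with the absence of wave trains in a single population.
        if unstable.size:
            print(f"static instability for k in [{unstable.min():.2f}, {unstable.max():.2f}]")
        else:
            print("homogeneous state stable for all k")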

    Spike frequency adaptation affects the synchronization properties of networks of cortical oscillators

    Oscillations in many regions of the cortex have common temporal characteristics, with dominant frequencies centered around the 40 Hz (gamma) range and the 5–10 Hz (theta) range. Experimental results also reveal spatially synchronous oscillations, which are stimulus dependent (Gray & Singer, 1987; Gray, König, Engel, & Singer, 1989; Engel, König, Kreiter, Schillen, & Singer, 1992). This rhythmic activity suggests that the coherence of neural populations is a crucial feature of cortical dynamics (Gray, 1994). Using both simulations and a theoretical coupled-oscillator approach, we demonstrate that the spike-frequency adaptation seen in many pyramidal cells plays a subtle but important role in the dynamics of cortical networks. Without adaptation, excitatory connections among model pyramidal cells are desynchronizing. However, the slow processes associated with adaptation encourage stable synchronous behavior.
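    A minimal way to probe the effect described above is to simulate two mutually excitatory adaptive leaky integrate-and-fire neurons and compare how their spike-time difference drifts with the adaptation increment b switched on or off. The sketch below is such a toy setup with assumed parameters; it is not the coupled-oscillator reduction used in the study above.

        import numpy as np

        def two_cells(b=0.05, T=2000.0, dt=0.05, tau_m=20.0, tau_w=100.0,
                      tau_s=5.0, I=1.6, g=0.1, v_th=1.0, v_reset=0.0):
            v = np.array([0.0, 0.3])          # slightly desynchronized initial voltages
            w = np.zeros(2)                   # adaptation currents
            s = np.zeros(2)                   # synaptic traces
            spikes = [[], []]
            for step in range(int(T / dt)):
                syn = g * s[::-1]             # each cell is driven by the other's synapse
                v += dt * (-v + I - w + syn) / tau_m
                w += dt * (-w) / tau_w
                s += dt * (-s) / tau_s
                fired = v >= v_th
                v[fired] = v_reset
                w[fired] += b                 # spike-frequency adaptation increment
                s[fired] += 1.0
                for i in np.where(fired)[0]:
                    spikes[i].append(step * dt)
            return spikes                     # compare the drift of spike-time differences for b=0.05 vs b=0.0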

    Computational study of resting state network dynamics

    The aim of this thesis is to show, through a simulation with the software The Virtual Brain, the most important properties of brain dynamics during the resting state, that is, when one is not engaged in any specific task and is not exposed to any particular stimulus. We begin by explaining what the resting state is through a brief historical review of its discovery, then survey some experimental methods used in the analysis of brain activity, and highlight the difference between structural and functional connectivity. We then briefly summarize the concepts of dynamical systems theory, which is indispensable for understanding a complex system such as the brain. In the next chapter, following a bottom-up approach, the main structures of the nervous system are described from a biological point of view, from the neuron to the cerebral cortex. All of this is also presented from the point of view of dynamical systems, illustrating the pioneering Hodgkin-Huxley model and then the concept of population dynamics. After this preliminary part, the simulation is described in detail. First, more information is given about the software The Virtual Brain, the resting-state network model used in the simulation is defined, and the connectome employed is described. The results of the analysis carried out on the obtained data are then presented, showing that criticality and noise play a key role in the emergence of this background activity of the brain. These results are then compared with the most important recent research in this field, which confirms the findings of our work. Finally, we briefly outline the consequences that a full understanding of the resting-state phenomenon and the possibility of virtualizing brain activity would have in the medical and clinical fields.

    The complexity of dynamics in small neural circuits

    Mean-field theory is a powerful tool for studying large neural networks. However, when the system is composed of only a few neurons, macroscopic differences between the mean-field approximation and the real behavior of the network can arise. Here we introduce a study of the dynamics of a small firing-rate network with excitatory and inhibitory populations, in terms of local and global bifurcations of the neural activity. Our approach is analytically tractable in many respects, and sheds new light on the finite-size effects of the system. In particular, we focus on the formation of multiple branching solutions of the neural equations through spontaneous symmetry breaking, since this phenomenon considerably increases the complexity of the dynamical behavior of the network. For these reasons, branching points may reveal important mechanisms through which neurons interact and process information, which are not accounted for by the mean-field approximation.
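    The symmetry-breaking mechanism can be caricatured with two identical rate units under mutual inhibition: beyond a critical coupling the symmetric fixed point loses stability and a pair of asymmetric fixed points branches off in a pitchfork bifurcation. The sketch below (illustrative gain, input and coupling values; not the excitatory-inhibitory network analyzed above) simply locates the fixed points numerically from a few initial guesses.

        import numpy as np
        from scipy.optimize import fsolve

        S = lambda x: 1.0 / (1.0 + np.exp(-4.0 * (x - 0.5)))     # assumed sigmoid gain function

        def fixed_points(J, I=1.0):
            """Fixed points of  dx_i/dt = -x_i + S(I - J * x_j)  for two mutually inhibiting units."""
            f = lambda x: np.array([-x[0] + S(I - J * x[1]),
                                    -x[1] + S(I - J * x[0])])
            sols = set()
            for guess in [(0.5, 0.5), (0.9, 0.1), (0.1, 0.9)]:
                x = fsolve(f, guess)
                if np.allclose(f(x), 0.0, atol=1e-8):
                    sols.add(tuple(round(float(xi), 4) for xi in x))
            return sorted(sols)

        for J in (0.5, 3.0):
            print(f"J = {J}: fixed points {fixed_points(J)}")

    For weak coupling only the symmetric fixed point is found; for strong coupling asymmetric "winner-take-all" solutions appear as well, a two-unit analogue of the branching solutions discussed above.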