
    Homeostatic plasticity and external input shape neural network dynamics

    In vitro and in vivo spiking activity clearly differ. Whereas networks in vitro develop strong bursts separated by periods of very little spiking activity, in vivo cortical networks show continuous activity. This is puzzling, considering that both networks presumably share similar single-neuron dynamics and plasticity rules. We propose that the defining difference between in vitro and in vivo dynamics is the strength of external input. In vitro, networks are virtually isolated, whereas in vivo every brain area receives continuous input. We analyze a model of spiking neurons in which the input strength, mediated by spike rate homeostasis, determines the characteristics of the dynamical state. In more detail, our analytical and numerical results on various network topologies consistently show that under increasing input, homeostatic plasticity generates distinct dynamic states, from bursting, to close-to-critical, reverberating and irregular states. This implies that the dynamic state of a neural network is not fixed but can readily adapt to the input strength. Indeed, our results match experimental spike recordings in vitro and in vivo: the in vitro bursting behavior is consistent with a state generated by very low network input (< 0.1%), whereas in vivo activity suggests that on the order of 1% of recorded spikes are input-driven, resulting in reverberating dynamics. Importantly, this predicts that one can abolish the ubiquitous bursts of in vitro preparations, and instead impose dynamics comparable to in vivo activity, by exposing the system to weak long-term stimulation, thereby opening new paths to establish an in vivo-like assay in vitro for basic as well as neurological studies.
    Comment: 14 pages, 8 figures, accepted at Phys. Rev.
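    The input-dependence described above can be illustrated with a deliberately minimal sketch: a linear branching-style rate model with a homeostatic rule, not the paper's spiking network. The function name, parameters, and the homeostatic update below are our own assumptions, chosen only to show how the stationary recurrent coupling depends on external drive.

```python
def homeostatic_branching(h, target=0.1, steps=50000, eta=1e-3):
    """Toy rate model: activity a_{t+1} = m * a_t + h, with a slow
    homeostatic rule adjusting the recurrent coupling m so that the
    activity a matches `target`. The fixed point is m* = 1 - h / target:
    weak external input h leaves the coupling close to critical (m ~ 1),
    stronger input pushes the network into a driven, subcritical regime.
    """
    m, a = 0.5, target
    for _ in range(steps):
        a = m * a + h                          # fast activity dynamics, external drive h
        m = max(0.0, m + eta * (target - a))   # slow homeostatic regulation of coupling
    return m
```

    With `target = 0.1`, an input of `h = 0.01` drives `m` toward 0.9, while `h = 0.05` yields `m` near 0.5, mirroring the abstract's claim that stronger input produces less critical dynamics.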

    Integration of continuous-time dynamics in a spiking neural network simulator

    Contemporary modeling approaches to the dynamics of neural networks consider two main classes of models: biologically grounded spiking neurons and functionally inspired rate-based units. The unified simulation framework presented here supports the combination of the two for multi-scale modeling approaches, the quantitative validation of mean-field approaches by spiking network simulations, and an increase in reliability through use of the same simulation code and the same network model specifications for both model classes. While the most efficient spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. We further demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
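    The delayed, time-continuous interactions the abstract refers to can be mimicked in their simplest form by explicit Euler integration of delayed rate equations with a ring buffer of past rates. This is our toy sketch under an assumed tanh gain function; it is not the simulator's actual communication or waveform-relaxation scheme, and all names are invented here.

```python
import math

def simulate_delayed_rates(w, delay_steps=5, dt=0.1, tau=1.0, steps=500, u0=0.1):
    """Explicit-Euler integration of delayed rate units
        tau * du_i/dt = -u_i + sum_j w[i][j] * tanh(u_j(t - d)),
    where the delay d = delay_steps * dt is realized by a ring buffer
    that keeps the last delay_steps + 1 rate vectors (oldest first).
    """
    n = len(w)
    buffer = [[u0] * n for _ in range(delay_steps + 1)]
    for _ in range(steps):
        u, u_delayed = buffer[-1], buffer[0]
        new = [u[i] + dt / tau * (-u[i] + sum(w[i][j] * math.tanh(u_delayed[j])
                                              for j in range(n)))
               for i in range(n)]
        buffer.append(new)   # newest state at the back
        buffer.pop(0)        # discard the state that fell out of the delay window
    return buffer[-1]
```

    For example, with no coupling (`w = [[0.0]]`) the rate decays to zero, while a single self-exciting unit (`w = [[2.0]]`) settles at the fixed point u = 2 tanh(u), about 1.915.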

    Efficiency characterization of a large neuronal network: a causal information approach

    When inhibitory neurons constitute about 40% of all neurons, they could have an important antinociceptive role, as they would easily regulate the level of activity of other neurons. We consider a simple network of cortical spiking neurons with axonal conduction delays and spike-timing-dependent plasticity, representative of a cortical column or hypercolumn with a large proportion of inhibitory neurons. Each neuron fires following Hodgkin-Huxley-like dynamics and is interconnected randomly to other neurons. The network dynamics are investigated by estimating the Bandt and Pompe probability distribution function associated with the interspike intervals, taking different degrees of inter-connectivity across neurons. More specifically, we take into account the fine temporal "structures" of the complex neuronal signals not just by using the probability distributions associated with the interspike intervals, but instead considering much more subtle measures accounting for their causal information: the Shannon permutation entropy, Fisher permutation information and permutation statistical complexity. This allows us to investigate how the information of the system might saturate to a finite value as the degree of inter-connectivity across neurons grows, inferring the emergent dynamical properties of the system.
    Comment: 26 pages, 3 figures; Physica A, in press
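    The Bandt and Pompe construction used above is simple to sketch: slide a window over the series (here it would be the interspike intervals), record the ordinal pattern of each window, and take the Shannon entropy of the resulting pattern distribution. The minimal version below is our own (the normalization by log(order!) is a common convention, and the function name is ours), not the paper's code.

```python
import itertools
import math

def permutation_entropy(series, order=3):
    """Normalized Shannon entropy of the Bandt-Pompe ordinal-pattern
    distribution: 0 for a fully predictable (monotone) series, close to 1
    when all order! patterns are equally likely.
    """
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: the argsort of the values inside the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))
```

    A monotone series yields exactly one pattern and hence entropy 0; an irregular interspike-interval sequence yields a value between 0 and 1. The Fisher permutation information and statistical complexity mentioned in the abstract are further functionals of the same pattern distribution.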

    Self-Organized Supercriticality and Oscillations in Networks of Stochastic Spiking Neurons

    Networks of stochastic spiking neurons are interesting models in the area of theoretical neuroscience, presenting both continuous and discontinuous phase transitions. Here we study fully connected networks analytically, numerically and by computational simulations. The neurons have dynamic gains that enable the network to converge to a stationary, slightly supercritical state (self-organized supercriticality, or SOSC) in the presence of the continuous transition. We show that SOSC, which presents power laws for neuronal avalanches plus some large events, is robust as a function of the main parameter of the neuronal gain dynamics. We discuss possible applications of the idea of SOSC to biological phenomena like epilepsy and dragon-king avalanches. We also find that neuronal gains can produce collective oscillations that coexist with neuronal avalanches, with frequencies compatible with characteristic brain rhythms.
    Comment: 16 pages, 16 figures divided into 7 figures in the article
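    A caricature of the dynamic-gain mechanism can be written in a few lines. The simplifications below are ours, not the authors': one shared gain instead of per-neuron gains, a linear saturating firing function, and an ad-hoc membrane update. The gain slowly recovers and is depressed by network activity, which is the qualitative ingredient behind self-organization.

```python
import random

def simulate_dynamic_gain(n=1000, steps=5000, tau=500.0, u=0.1, seed=0):
    """Stochastic neurons fire with probability min(1, gain * V).
    The shared gain recovers slowly (+ gain / tau) and drops with the
    fraction rho of firing neurons (- u * gain * rho), pushing the network
    toward low self-organized activity near rho ~ 1 / (u * tau).
    Returns the per-step fraction of firing neurons.
    """
    rng = random.Random(seed)
    V = [rng.random() for _ in range(n)]
    gain = 0.5
    activity = []
    for _ in range(steps):
        spikes = [rng.random() < min(1.0, gain * v) for v in V]
        rho = sum(spikes) / n
        activity.append(rho)
        # Reset fired neurons; others leak and integrate the mean network input
        V = [0.0 if s else 0.5 * v + rho for v, s in zip(V, spikes)]
        gain += gain / tau - u * gain * rho
    return activity
```

    With the defaults, activity self-organizes around a couple of percent of the network per step; avalanche statistics would be read off from the excursions of this activity trace.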

    Fluctuations and information filtering in coupled populations of spiking neurons with adaptation

    Finite-sized populations of spiking elements are fundamental to brain function, but are also used in many areas of physics. Here we present a theory of the dynamics of finite-sized populations of spiking units, based on a quasi-renewal description of neurons with adaptation. We derive an integral equation with colored noise that governs the stochastic dynamics of the population activity in response to time-dependent stimulation, and calculate the spectral density in the asynchronous state. We show that systems of coupled populations with adaptation can generate a frequency band in which sensory information is preferentially encoded. The theory is applicable to fully as well as randomly connected networks, and to leaky integrate-and-fire as well as to generalized spiking neurons with adaptation on multiple time scales.
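    To make the object of the theory concrete, here is a toy simulation of a finite population with spike-frequency adaptation. This is our assumed escape-rate caricature, not the paper's quasi-renewal formalism: each neuron's rate is its base rate minus a per-neuron adaptation variable that jumps on spikes and decays exponentially. The finite-size fluctuations of the returned activity trace are exactly the quantity the spectral-density theory characterizes.

```python
import random

def population_activity(n=500, steps=2000, dt=0.001, base_rate=20.0,
                        j_adapt=5.0, tau_adapt=0.1, seed=1):
    """Finite population of Poisson-like neurons with adaptation.
    Per time bin, neuron i fires with probability rate_i * dt, where
    rate_i = max(0, base_rate - j_adapt * adapt_i); adapt_i jumps by 1 on
    a spike and decays with time constant tau_adapt. Returns the
    population activity (instantaneous rate in Hz) for each bin.
    """
    rng = random.Random(seed)
    adapt = [0.0] * n
    activity = []
    for _ in range(steps):
        count = 0
        for i in range(n):
            rate = max(0.0, base_rate - j_adapt * adapt[i])
            if rng.random() < rate * dt:
                count += 1
                adapt[i] += 1.0                      # spike-triggered adaptation
            adapt[i] -= adapt[i] * dt / tau_adapt    # exponential decay
        activity.append(count / (n * dt))
    return activity
```

    With the defaults, the self-consistent mean rate lands near base_rate / (1 + j_adapt * tau_adapt) ≈ 13 Hz, below the 20 Hz base rate, because adaptation feeds back negatively on firing.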

    Entropy-based parametric estimation of spike train statistics

    We consider the evolution of a network of neurons, focusing on the asymptotic behavior of spike dynamics instead of membrane potential dynamics. The spike response is not sought as a deterministic response in this context, but as a conditional probability: "reading out the code" consists of inferring such a probability. This probability is computed from empirical raster plots, by using the framework of thermodynamic formalism in ergodic theory. This gives us a parametric statistical model where the probability has the form of a Gibbs distribution. In this respect, this approach generalizes the seminal and profound work of Schneidman and collaborators. A minimal presentation of the formalism is reviewed here, while a general algorithmic estimation method is proposed, yielding fast convergent implementations. It is also made explicit how several spike observables (entropy, rate, synchronizations, correlations) are given in closed form from the parametric estimation. This paradigm not only allows us to estimate the spike statistics, given a design choice, but also to compare different models, thus answering comparative questions about the neural code such as: "are correlations (or time synchrony, or a given set of spike patterns, ...) significant with respect to rate coding only?" A numerical validation of the method is proposed and the perspectives regarding spike-train code analysis are also discussed.
    Comment: 37 pages, 8 figures, submitted
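    The model-comparison question quoted above ("are correlations significant with respect to rate coding only?") can be posed in its most elementary form by comparing the empirical distribution of instantaneous spike patterns against an independent Bernoulli (rate-only) model. The KL-divergence sketch below is our toy stand-in for that comparison, not the paper's Gibbs-distribution machinery; the function name is ours.

```python
from collections import Counter
import math

def pattern_divergence(raster):
    """KL divergence (in nats) between the empirical distribution of
    instantaneous spike patterns and the independent rate-only model.
    raster: list of equal-length tuples of 0/1, one tuple per time bin.
    A value near 0 means firing rates alone explain the patterns;
    a large value means correlations carry additional structure.
    """
    T = len(raster)
    n = len(raster[0])
    emp = Counter(raster)
    rates = [sum(p[i] for p in raster) / T for i in range(n)]
    kl = 0.0
    for pattern, count in emp.items():
        p_emp = count / T
        p_ind = 1.0
        for i, s in enumerate(pattern):
            p_ind *= rates[i] if s else (1.0 - rates[i])
        kl += p_emp * math.log(p_emp / p_ind)
    return kl
```

    Two neurons that always spike together at rate 0.5 give a divergence of log 2 ≈ 0.69 nats, while an independent raster with the same rates gives 0, which is the sense in which the comparison isolates correlations from rate coding.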