
    System-Size Effects on the Collective Dynamics of Cell Populations with Global Coupling

    Phase-transition-like behavior is found to occur in globally coupled systems with a finite number of elements, and a theoretical explanation is provided. The system studied is a population of globally pulse-coupled integrate-and-fire cells subject to small additive noise. As the population size is changed, the system shows a phase-transition-like behavior: there exists a well-defined critical system size above which the system stays in a monostable state with high-frequency activity, while below it a new phase appears, characterized by alternation of high- and low-frequency activities. The mean-field motion obeys a stochastic process with state-dependent noise, and the above phenomenon can be interpreted as a noise-induced transition characteristic of such processes. Coexistence of high- and low-frequency activities in finite-size systems was reported by N. Cohen, Y. Soen and E. Braun [Physica A 249, 600 (1998)] in experiments on cultivated heart cells. The present report gives the first qualitative interpretation of their experimental results.
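    As a rough illustration of the kind of dynamics described above, the following Python sketch simulates a population of N globally pulse-coupled leaky integrate-and-fire cells with small additive noise and compares the population-rate fluctuations of a small and a large network. All parameter values and the specific coupling rule are illustrative assumptions, not the authors' model.

```python
# Minimal sketch (not the paper's exact model): N globally pulse-coupled
# leaky integrate-and-fire cells with small additive noise.
import numpy as np

def simulate(N=100, T=2000.0, dt=0.1, I=1.2, tau=10.0,
             coupling=0.5, noise=0.02, v_th=1.0, v_reset=0.0, seed=0):
    """Return the instantaneous population firing rate per time bin."""
    rng = np.random.default_rng(seed)
    v = rng.uniform(v_reset, v_th, size=N)         # random initial voltages
    steps = int(T / dt)
    pop_rate = np.zeros(steps)
    for t in range(steps):
        # leaky integration with constant drive and additive white noise
        v += dt / tau * (-v + I) + noise * np.sqrt(dt) * rng.standard_normal(N)
        fired = v >= v_th
        n_fired = fired.sum()
        pop_rate[t] = n_fired / (N * dt)           # population rate in this bin
        v[fired] = v_reset
        # global pulse coupling: every spike kicks all cells equally
        if n_fired:
            v += coupling * n_fired / N
            v[fired] = v_reset                      # keep just-reset cells at reset
    return pop_rate

if __name__ == "__main__":
    for N in (20, 200):                             # small vs. large population
        r = simulate(N=N)
        print(N, r.mean(), r.std())                 # compare rate fluctuations
```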

    Noise-Induced Synchronization and Clustering in Ensembles of Uncoupled Limit-Cycle Oscillators

    We study the synchronization properties of general uncoupled limit-cycle oscillators driven by common and independent Gaussian white noises. Using phase reduction and averaging methods, we analytically derive the stationary distribution of the phase difference between oscillators for weak noise intensity. We demonstrate that, in addition to synchronization, clustering (or, more generally, coherence) always results from arbitrary initial conditions, irrespective of the details of the oscillators.
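    A minimal sketch of the setting (illustrative phase-sensitivity function and noise intensities, not the paper's analytical treatment): uncoupled phase oscillators receiving a shared Gaussian white noise plus weak independent noises, with the Kuramoto order parameter used to detect the resulting synchronization or clustering.

```python
# Uncoupled phase oscillators driven by common + independent white noise.
# With strong common noise the phases cluster/synchronize.
import numpy as np

def run(n_osc=50, omega=1.0, d_common=0.05, d_indep=0.001,
        dt=0.01, steps=100_000, seed=1):
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, n_osc)       # arbitrary initial phases
    z = lambda th: np.sin(th)                       # assumed phase sensitivity function
    for _ in range(steps):
        xi_c = rng.standard_normal()                # common noise, same for all
        xi_i = rng.standard_normal(n_osc)           # independent noises
        theta += (omega * dt
                  + z(theta) * (np.sqrt(2 * d_common * dt) * xi_c
                                + np.sqrt(2 * d_indep * dt) * xi_i))
        theta %= 2 * np.pi
    # Kuramoto order parameter: values near 1 indicate synchronized/clustered phases
    return np.abs(np.exp(1j * theta).mean())

if __name__ == "__main__":
    print("order parameter:", run())
```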

    Finite-size and correlation-induced effects in Mean-field Dynamics

    The brain's activity is characterized by the interaction of a very large number of neurons that are strongly affected by noise. However, signals often arise at macroscopic scales, integrating the effect of many neurons into a reliable pattern of activity. In order to study such large neuronal assemblies, one is often led to derive mean-field limits summarizing the effect of the interaction of a large number of neurons into an effective signal. Classical mean-field approaches consider the evolution of a deterministic variable, the mean activity, thus neglecting the stochastic nature of neural behavior. In this article, we build upon two recent approaches that include correlations and higher-order moments in mean-field equations, and study how these stochastic effects influence the solutions of the mean-field equations, both in the limit of an infinite number of neurons and for large yet finite networks. We introduce a new model, the infinite model, which arises from both sets of equations by a rescaling of the variables and which, for finite-size networks, is invertible and hence provides equations equivalent to those of the previously derived models. The study of this model allows us to understand the qualitative behavior of such large-scale networks. We show that, although the solutions of the deterministic mean-field equation constitute uncorrelated solutions of the new mean-field equations, the stability properties of limit cycles are modified by the presence of correlations, and additional non-trivial behaviors, including periodic orbits, appear where there were none in the mean field. The origin of all these behaviors is then explored in finite-size networks, where interesting mesoscopic-scale effects appear. This study leads us to show that the infinite-size system appears as a singular limit of the network equations: for any finite network, the system will differ from the infinite system.
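    The following is a generic illustration, not the specific equations studied in the paper: a small noisy firing-rate network with global coupling, whose empirical mean activity can be compared against the corresponding deterministic mean-field ODE to expose finite-size deviations.

```python
# Compare the mean activity of a finite noisy rate network with the
# deterministic mean-field prediction (all parameters are illustrative).
import numpy as np

def phi(x):
    return np.tanh(x)                               # assumed transfer function

def finite_network(N=200, J=1.5, sigma=0.3, dt=0.01, steps=5000, seed=2):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(N)
    means = np.empty(steps)
    for t in range(steps):
        m = phi(x).mean()                           # instantaneous mean activity
        x += dt * (-x + J * m) + sigma * np.sqrt(dt) * rng.standard_normal(N)
        means[t] = m
    return means

def mean_field(J=1.5, dt=0.01, steps=5000, x0=0.1):
    x = x0
    traj = np.empty(steps)
    for t in range(steps):
        x += dt * (-x + J * phi(x))                 # deterministic mean-field ODE
        traj[t] = phi(x)
    return traj

if __name__ == "__main__":
    emp = finite_network()
    mf = mean_field()
    print("finite-N mean:", emp[-1000:].mean(), " mean-field:", mf[-1])
```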

    Uncertainty Principle for Control of Ensembles of Oscillators Driven by Common Noise

    We discuss control techniques for noisy self-sustained oscillators with a focus on reliability (stability of the response to noisy driving) and oscillation coherence, understood in the sense of constancy of the oscillation frequency. For any kind of linear feedback control (single and multiple delay feedback, linear frequency filter, etc.), the phase diffusion constant, quantifying coherence, and the Lyapunov exponent, quantifying reliability, can be efficiently controlled, but their ratio remains constant. Thus, an "uncertainty principle" can be formulated: the loss of reliability occurs when coherence is enhanced and, vice versa, coherence is weakened when reliability is enhanced. Treatment of this principle for ensembles of oscillators synchronized by common noise or global coupling reveals a substantial difference between the cases of slightly non-identical oscillators and identical ones with intrinsic noise.
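    A rough sketch of one half of the picture (illustrative parameters and control law, not the authors' setup): a noisy phase oscillator with a single delayed-feedback term, with the phase diffusion constant estimated from the growth of the phase variance across independent realizations. The Lyapunov exponent quantifying reliability could be estimated analogously from two copies driven by the same noise.

```python
# Noisy phase oscillator with one delayed-feedback term; the phase diffusion
# constant D is estimated from Var[phi(T)] ~ 2 D T over many realizations.
import numpy as np

def phase_diffusion(k=0.3, tau=1.0, omega=1.0, sigma=0.2, dt=0.01, T=200.0,
                    n_real=200, seed=3):
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    lag = int(tau / dt)
    phi = np.zeros((n_real, steps + lag))           # history buffer included
    for t in range(lag, steps + lag - 1):
        xi = rng.standard_normal(n_real)
        feedback = k * np.sin(phi[:, t - lag] - phi[:, t])   # delayed feedback
        phi[:, t + 1] = (phi[:, t] + dt * (omega + feedback)
                         + sigma * np.sqrt(dt) * xi)
    return phi[:, -1].var() / (2 * T)               # diffusion constant estimate

if __name__ == "__main__":
    for k in (0.0, 0.3):
        print("feedback gain", k, "-> phase diffusion D ~", phase_diffusion(k=k))
```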

    A Measurement of the D*± Cross Section in Two-Photon Processes

    We have measured the inclusive D*± production cross section in two-photon collisions at the TRISTAN e+e− collider. The mean √s of the collider was 57.16 GeV and the integrated luminosity was 150 pb⁻¹. The differential cross section dσ(D*±)/dP_T was obtained in the P_T range between 1.6 and 6.6 GeV and compared with theoretical predictions, such as those involving direct and resolved photon processes. Published in Phys. Rev. D 50, 187 (1994).

    Evaluation of the Performance of Information Theory-Based Methods and Cross-Correlation to Estimate the Functional Connectivity in Cortical Networks

    The functional connectivity of in vitro neuronal networks was estimated by applying different statistical algorithms to data collected by Micro-Electrode Arrays (MEAs). First, we tested these "connectivity methods" on neuronal network models of increasing complexity and evaluated their performance in terms of the ROC (Receiver Operating Characteristic) and the PPC (Positive Precision Curve), a newly defined complementary method developed specifically for the identification of functional links. Then, the algorithms that best estimated the actual connectivity of the network models were used to extract functional connectivity from cultured cortical networks coupled to MEAs. Among the proposed approaches, Transfer Entropy and Joint-Entropy showed the best results, suggesting these methods as good candidates for extracting functional links in actual neuronal networks from multi-site recordings.
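    As a simple illustration of the kind of estimators compared above (not the paper's exact implementations), the sketch below computes a lagged cross-correlation and a one-bin-history transfer entropy for a pair of binarized spike trains.

```python
# Pairwise cross-correlation and one-bin-lag transfer entropy on binary
# spike trains, as one might compute from binned MEA recordings.
import numpy as np

def cross_correlation(x, y):
    """Normalized cross-correlation at lag 1 (x leading y)."""
    x = x - x.mean()
    y = y - y.mean()
    return np.dot(x[:-1], y[1:]) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)

def transfer_entropy(x, y):
    """TE(x -> y) in bits, with one-bin history, for binary (0/1) series."""
    yt1, yt, xt = y[1:], y[:-1], x[:-1]
    te = 0.0
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                p_abc = np.mean((yt1 == a) & (yt == b) & (xt == c))
                if p_abc == 0:
                    continue
                p_a_given_bc = p_abc / np.mean((yt == b) & (xt == c))
                p_a_given_b = np.mean((yt1 == a) & (yt == b)) / np.mean(yt == b)
                te += p_abc * np.log2(p_a_given_bc / p_a_given_b)
    return te

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    x = (rng.random(10_000) < 0.1).astype(int)      # presynaptic spike train
    noise = (rng.random(10_000) < 0.02).astype(int)
    y = np.roll(x, 1) | noise                       # y follows x with a one-bin delay
    print("CC :", cross_correlation(x, y))
    print("TE :", transfer_entropy(x, y), "bits")
```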

    From Spiking Neuron Models to Linear-Nonlinear Models

    Neurons transform time-varying inputs into action potentials emitted stochastically at a time-dependent rate. The mapping from current input to output firing rate is often represented with the help of phenomenological models such as the linear-nonlinear (LN) cascade, in which the output firing rate is estimated by applying to the input, successively, a linear temporal filter and a static non-linear transformation. These simplified models leave out the biophysical details of action potential generation. It is not a priori clear to what extent the input-output mapping of biophysically more realistic, spiking neuron models can be reduced to such a simple linear-nonlinear cascade. Here we investigate this question for the leaky integrate-and-fire (LIF), exponential integrate-and-fire (EIF), and conductance-based Wang-Buzsáki models in the presence of background synaptic activity. We exploit available analytic results for these models to determine the corresponding linear filter and static non-linearity in a parameter-free form. We show that the obtained functions are identical to the linear filter and static non-linearity determined using standard reverse-correlation analysis. We then quantitatively compare the output of the corresponding linear-nonlinear cascade with numerical simulations of spiking neurons, systematically varying the parameters of the input signal and background noise. We find that the LN cascade provides accurate estimates of the firing rates of spiking neurons in most of parameter space. For the EIF and Wang-Buzsáki models, we show that the LN cascade can be reduced to a firing rate model, the timescale of which we determine analytically. Finally, we introduce an adaptive-timescale rate model in which the timescale of the linear filter depends on the instantaneous firing rate. This model leads to highly accurate estimates of instantaneous firing rates.
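    A minimal sketch of an LN cascade with a generic filter and nonlinearity chosen for illustration (the paper derives both analytically for specific spiking models): the input is passed through a linear temporal filter and then through a static nonlinearity to produce an instantaneous firing rate.

```python
# Linear-nonlinear (LN) cascade: linear temporal filtering followed by a
# static nonlinearity. Filter shape and nonlinearity are assumptions.
import numpy as np

def ln_cascade(stimulus, dt=0.001, tau=0.02, r0=5.0, gain=40.0):
    """Return an estimated instantaneous firing rate (Hz)."""
    t = np.arange(0, 10 * tau, dt)
    linear_filter = (t / tau**2) * np.exp(-t / tau)          # assumed temporal filter
    linear_filter /= linear_filter.sum() * dt                 # unit area
    drive = np.convolve(stimulus, linear_filter, mode="full")[:len(stimulus)] * dt
    # static nonlinearity: threshold-linear transfer (an assumption)
    return np.maximum(r0 + gain * drive, 0.0)

if __name__ == "__main__":
    dt = 0.001
    t = np.arange(0, 2.0, dt)
    stimulus = np.sin(2 * np.pi * 3 * t)                      # slow sinusoidal input
    rate = ln_cascade(stimulus, dt=dt)
    print("mean rate %.1f Hz, peak %.1f Hz" % (rate.mean(), rate.max()))
```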

    History-Dependent Excitability as a Single-Cell Substrate of Transient Memory for Information Discrimination

    Neurons react differently to incoming stimuli depending upon their previous history of stimulation. This property can be considered a single-cell substrate for transient memory, or context-dependent information processing: depending upon the current context that the neuron "sees" through the subset of the network impinging on it in the immediate past, the same synaptic event can evoke a postsynaptic spike or just a subthreshold depolarization. We propose a formal definition of History-Dependent Excitability (HDE) as a measure of the propensity to fire at any moment in time, linking the subthreshold history-dependent dynamics with spike generation. This definition allows a quantitative assessment of the intrinsic memory for different single-neuron dynamics and input statistics. We illustrate the concept of HDE by considering two general dynamical mechanisms: the passive behavior of an integrate-and-fire (IF) neuron, and the inductive behavior of a generalized integrate-and-fire (GIF) neuron with subthreshold damped oscillations. This framework allows us to characterize the sensitivity of different model neurons to the detailed temporal structure of incoming stimuli. While a neuron with intrinsic oscillations discriminates equally well between input trains with the same or different frequency, a passive neuron discriminates better between inputs with different frequencies. This suggests that passive neurons are better suited to rate-based computation, while neurons with subthreshold oscillations are advantageous in a temporal coding scheme. We also address the influence of intrinsic properties on single-cell processing as a function of input statistics, and show that intrinsic oscillations enhance discrimination sensitivity at high input rates. Finally, we discuss how the recognition of these cell-specific discrimination properties might further our understanding of neuronal network computations and their relationships to the distribution and functional connectivity of different neuronal types.
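    A toy sketch of the two mechanisms (illustrative parameters, not the paper's calibrated models): the subthreshold response of a passive IF cell and of a GIF cell whose extra recovery variable produces damped subthreshold oscillations, so that the response to an input train depends on the inter-stimulus interval.

```python
# Subthreshold responses of a passive IF cell and a resonant GIF cell to
# brief current pulses; the GIF response depends on the input timing.
import numpy as np

def respond(pulse_times, model="IF", dt=0.1, T=200.0,
            tau=10.0, g=4.0, tau_w=10.0, pulse=1.0):
    steps = int(T / dt)
    v = np.zeros(steps)
    w = 0.0                                          # recovery variable (GIF only)
    inputs = np.zeros(steps)
    for tp in pulse_times:
        inputs[int(tp / dt)] += pulse / dt           # brief current pulses
    for t in range(1, steps):
        i_w = g * w if model == "GIF" else 0.0
        v[t] = v[t - 1] + dt / tau * (-v[t - 1] - i_w + inputs[t - 1])
        if model == "GIF":
            w += dt / tau_w * (v[t - 1] - w)         # slow variable tracking v
    return v

if __name__ == "__main__":
    for isi in (15.0, 30.0):                         # two input trains, different ISIs
        times = [50.0 + k * isi for k in range(4)]
        v_if = respond(times, "IF")
        v_gif = respond(times, "GIF")
        print("ISI %4.0f ms  peak IF %.3f   peak GIF %.3f"
              % (isi, v_if.max(), v_gif.max()))
```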

    Efficient Network Reconstruction from Dynamical Cascades Identifies Small-World Topology of Neuronal Avalanches

    Cascading activity is commonly found in complex systems with directed interactions, such as metabolic networks, neuronal networks, or disease spreading in social networks. Substantial insight into a system's organization can be obtained by reconstructing the underlying functional network architecture from the observed activity cascades. Here we focus on Bayesian approaches and reduce their computational demands by introducing the Iterative Bayesian (IB) and Posterior Weighted Averaging (PWA) methods. We introduce a special case of PWA, cast in nonparametric form, which we call the normalized count (NC) algorithm. NC efficiently reconstructs random and small-world functional network topologies and architectures from subcritical, critical, and supercritical cascading dynamics and yields significant improvements over commonly used correlation methods. With experimental data, NC identified a functional and structural small-world topology and its corresponding traffic in cortical networks with neuronal avalanche dynamics.
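    The following is a heavily simplified illustration of the general idea of scoring directed links from observed cascades; it is not the paper's NC, IB, or PWA algorithm, just a normalized co-activation count applied to a toy ring network.

```python
# Score directed links by counting, for each source node, how activity at the
# next frame is distributed over candidate targets (toy illustration only).
import numpy as np

def normalized_count(cascades, n_nodes):
    """cascades: list of binary arrays, each of shape (frames, n_nodes)."""
    score = np.zeros((n_nodes, n_nodes))
    norm = np.zeros(n_nodes)
    for c in cascades:
        for t in range(len(c) - 1):
            active = np.where(c[t] == 1)[0]
            nxt = c[t + 1].astype(float)
            for i in active:
                if nxt.sum() > 0:
                    score[i] += nxt / nxt.sum()      # distribute credit over targets
                norm[i] += 1.0
    norm[norm == 0] = 1.0
    return score / norm[:, None]                     # row i: putative targets of i

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    n = 10                                           # ground truth: ring 0 -> 1 -> ... -> 9 -> 0
    cascades = []
    for _ in range(200):
        frames = np.zeros((6, n), dtype=int)
        frames[0, rng.integers(n)] = 1               # cascade starts at a random node
        for t in range(5):
            for p in np.where(frames[t] == 1)[0]:
                if rng.random() < 0.9:               # propagate along the ring
                    frames[t + 1, (p + 1) % n] = 1
        cascades.append(frames)
    w = normalized_count(cascades, n)
    print("recovered strongest target of node 0:", int(w[0].argmax()))
```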

    A reafferent and feed-forward model of song syntax generation in the Bengalese finch

    Adult Bengalese finches generate a variable song that obeys a distinct and individual syntax. The syntax is gradually lost over a period of days after deafening and is recovered when hearing is restored. We present a spiking neuronal network model of song syntax generation and its loss, based on the assumption that the syntax is stored in reafferent connections from the auditory to the motor control area. Propagating synfire activity in HVC codes for individual syllables of the song, and priming signals from the auditory network reduce the competition between syllables so as to allow only those transitions that are permitted by the syntax. Both the imprinting of song syntax within HVC and the interaction of the reafferent signal with an efference copy of the motor command are sufficient to explain the gradual loss of syntax in the absence of auditory feedback. The model also reproduces, for the first time, experimental findings on the influence of altered auditory feedback on song syntax generation, and predicts song- and species-specific low-frequency components in the LFP. This study illustrates how sequential compositionality following a defined syntax can be realized in networks of spiking neurons.
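    A toy abstraction of the proposed mechanism (the actual model is a spiking synfire network; here the idea is reduced to its bare logic): candidate next syllables compete, and a priming signal encoding the learned syntax biases the competition, so removing the priming, as after deafening, degrades the syntax.

```python
# Syllable sequence generation with an auditory "priming" bias on allowed
# transitions; setting the priming to zero mimics loss of auditory feedback.
import numpy as np

def sing(syntax, n_syllables=100, priming=4.0, seed=6):
    """syntax[s] is the set of syllables allowed to follow syllable s."""
    rng = np.random.default_rng(seed)
    labels = sorted(syntax)
    song, s = [], labels[0]
    for _ in range(n_syllables):
        # every syllable competes with a baseline drive; allowed transitions
        # receive an extra priming boost from the auditory pathway
        drive = np.ones(len(labels))
        for j, cand in enumerate(labels):
            if cand in syntax[s]:
                drive[j] += priming
        s = rng.choice(labels, p=drive / drive.sum())
        song.append(s)
    return "".join(song)

if __name__ == "__main__":
    syntax = {"a": {"b"}, "b": {"c", "a"}, "c": {"a"}}
    print("intact  :", sing(syntax, priming=4.0)[:40])
    print("deafened:", sing(syntax, priming=0.0)[:40])    # syntax washes out
```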