
    A three-tiered study of differences in murine intrahost immune response to multiple pneumococcal strains

    We apply a previously developed 4-variable ordinary differential equation model of the in-host immune response to pneumococcal pneumonia to study the variability of the immune response of MF1 mice and to explore bacteria-driven differences in disease progression and outcome. In particular, we study the immune response to D39 bacteria missing the portions of the pneumolysin protein that control either hemolytic activity or complement-activating activity, the response to D39 bacteria deficient in either neuraminidase A or neuraminidase B, and the differences in the response to D39 (serotype 2), 0100993 (serotype 3), and TIGR4 (serotype 4) bacteria. The model accurately reproduces infection kinetics in all cases and indicates which mechanisms in the immune response have the greatest effect in each case. The results suggest that differences in the ability of bacteria to defeat the immune response are due primarily to their ability to elude nonspecific clearance in the lung tissue and to damage the lung epithelium.
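
    The abstract does not reproduce the model equations themselves; the sketch below shows only the general shape of such a 4-variable in-host ODE model. The choice of state variables (bacterial load, phagocytes, epithelial damage, cytokines), the interaction terms, and every parameter value are illustrative assumptions, not the published model.

```python
# Minimal sketch of a 4-variable in-host infection model integrated with SciPy.
# State variables (P: bacterial load, N: phagocytes, D: epithelial damage, C: cytokines)
# and all parameter values are illustrative placeholders, not the published model.
from scipy.integrate import solve_ivp

def rhs(t, y, r=1.5, K=1e8, kill=1e-6, act=1e-7, decay=0.5,
        dmg=1e-9, repair=0.1, c_prod=1e-8, c_decay=1.0):
    P, N, D, C = y
    dP = r * P * (1 - P / K) - kill * N * P   # logistic growth minus phagocytic killing
    dN = act * C * (1 - D) - decay * N        # recruitment driven by cytokines, impaired by damage
    dD = dmg * P * (1 - D) - repair * D       # damage caused by bacteria, with slow repair
    dC = c_prod * P - c_decay * C             # cytokine production and clearance
    return [dP, dN, dD, dC]

# Inoculate with 1e4 CFU and integrate 72 hours of the infection
sol = solve_ivp(rhs, (0, 72), [1e4, 0.0, 0.0, 0.0])
print("state at 72 h (P, N, D, C):", sol.y[:, -1])
```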

    Impact of Adaptation Currents on Synchronization of Coupled Exponential Integrate-and-Fire Neurons

    The ability of spiking neurons to synchronize their activity in a network depends on the response behavior of these neurons, as quantified by the phase response curve (PRC), and on coupling properties. The PRC characterizes the effects of transient inputs on spike timing and can be measured experimentally. Here we use the adaptive exponential integrate-and-fire (aEIF) neuron model to determine how subthreshold and spike-triggered slow adaptation currents shape the PRC. Based on this, we predict how synchrony and phase-locked states of coupled neurons change in the presence of synaptic delays and unequal coupling strengths. We find that increased subthreshold adaptation currents cause a transition of the PRC from only phase advances to phase advances and delays in response to excitatory perturbations. Increased spike-triggered adaptation currents, on the other hand, predominantly skew the PRC to the right. Both adaptation-induced changes of the PRC are modulated by spike frequency, being more prominent at lower frequencies. Applying phase reduction theory, we show that subthreshold adaptation stabilizes synchrony for pairs of coupled excitatory neurons, while spike-triggered adaptation causes locking with a small phase difference, as long as synaptic heterogeneities are negligible. For inhibitory pairs synchrony is stable and robust against conduction delays, and adaptation can mediate bistability of in-phase and anti-phase locking. We further demonstrate that stable synchrony and bistable in-/anti-phase locking of pairs carry over to synchronization and clustering of larger networks. The effects of adaptation in aEIF neurons on PRCs and network dynamics qualitatively reflect those of biophysical adaptation currents in detailed Hodgkin-Huxley-based neurons, which underscores the utility of the aEIF model for investigating the dynamical behavior of networks. Our results suggest neuronal spike-frequency adaptation as a mechanism for synchronizing low-frequency oscillations in local excitatory networks, but indicate that inhibition rather than excitation generates coherent rhythms at higher frequencies.
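
    As a concrete reference for the model class, here is a minimal Euler integration of the aEIF equations, with a the subthreshold adaptation conductance and b the spike-triggered adaptation increment. The parameter values are generic textbook-style aEIF defaults and the input current is arbitrary; they are not the settings used in this study.

```python
# Euler integration of the adaptive exponential integrate-and-fire (aEIF) neuron.
# a (nS): subthreshold adaptation conductance; b (pA): spike-triggered adaptation jump.
# Parameters are generic textbook-style values, not those used in the study.
import numpy as np

def simulate_aeif(I=1200.0, T=1000.0, dt=0.05, a=4.0, b=80.5):
    C, gL, EL = 281.0, 30.0, -70.6          # pF, nS, mV
    VT, dT, tau_w = -50.4, 2.0, 144.0       # mV, mV, ms
    Vr, Vcut = -70.6, 20.0                  # reset and numerical spike cutoff (mV)
    V, w, spikes = EL, 0.0, []
    for step in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * dT * np.exp((V - VT) / dT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= Vcut:                       # spike: reset voltage, increment adaptation
            V = Vr
            w += b
            spikes.append(step * dt)
    return np.array(spikes)

spikes = simulate_aeif()
print(f"{spikes.size} spikes, mean ISI = {np.diff(spikes).mean():.1f} ms")
```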

    Reduced dynamics and symmetric solutions for globally coupled weakly dissipative oscillators

    This is a preprint of an article whose final and definitive form has been published in DYNAMICAL SYSTEMS © 2005 Taylor & Francis; DYNAMICAL SYSTEMS is available online at: http://www.informaworld.com/openurl?genre=article&issn=1468-9367&volume=20&issue=3&spage=333
    Systems of coupled oscillators may exhibit spontaneous dynamical formation of attracting synchronized clusters with broken symmetry; this can be helpful in modelling various physical processes. Analytical computation of the stability of synchronized cluster states is usually impossible for arbitrary nonlinear oscillators. In this paper we examine a particular class of strongly nonlinear oscillators that are analytically tractable. We examine the effect of isochronicity (a turning point in the dependence of period on energy) of periodic oscillators on clustered states of globally coupled oscillator networks. We extend previous work on networks of weakly dissipative, globally coupled nonlinear Hamiltonian oscillators to give conditions for the existence and stability of certain clustered periodic states under the assumption that dissipation and coupling are small and of similar order. This is verified by numerical simulations on an example system of oscillators that are weakly dissipative perturbations of a planar Hamiltonian oscillator with a quartic potential. Finally, we use the reduced phase-energy model derived from the weakly dissipative case to motivate a new class of phase-energy models that can be usefully employed for understanding effects such as clustering and torus breakup in more general coupled oscillator systems. We see that the property of isochronicity usefully generalizes to such systems, and we examine some examples of their attracting dynamics.
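
    A small numerical illustration of this setting, assuming a quartic potential H = p^2/2 + q^4/4 with dissipation and all-to-all coupling both scaled by a single small parameter; the exact dissipative and coupling terms used in the paper are not given in the abstract, so this is only a plausible instance for experimentation.

```python
# Illustrative system: N planar oscillators with quartic potential H = p**2/2 + q**4/4,
# weak linear damping and weak all-to-all (mean-field) coupling, both of order eps.
# The exact dissipative and coupling terms of the paper are not specified in the abstract.
import numpy as np
from scipy.integrate import solve_ivp

N, eps, gamma, K = 8, 0.01, 1.0, 1.0

def rhs(t, y):
    q, p = y[:N], y[N:]
    dq = p
    dp = -q**3 - eps * gamma * p + eps * K * (q.mean() - q)
    return np.concatenate([dq, dp])

rng = np.random.default_rng(0)
y0 = np.concatenate([rng.uniform(-1.0, 1.0, N), np.zeros(N)])
sol = solve_ivp(rhs, (0, 500), y0, max_step=0.05)

# Crude clustering check: oscillators that have synchronized end up at similar (q, p)
final = np.round(sol.y[:, -1].reshape(2, N).T, 2)
print(final)
```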

    Intrinsic gain modulation and adaptive neural coding

    In many cases, the computation of a neural system can be reduced to a receptive field, or a set of linear filters, and a thresholding function, or gain curve, which determines the firing probability; this is known as a linear/nonlinear model. In some forms of sensory adaptation, these linear filters and gain curve adjust very rapidly to changes in the variance of a randomly varying driving input. An apparently similar but previously unrelated issue is the observation of gain control by background noise in cortical neurons: the slope of the firing rate vs. current (f-I) curve changes with the variance of background random input. Here, we show a direct correspondence between these two observations by relating variance-dependent changes in the gain of f-I curves to characteristics of the changing empirical linear/nonlinear model obtained by sampling. In the case that the underlying system is fixed, we derive relationships between the change of the gain with respect to both mean and variance and the receptive fields obtained from reverse correlation on a white-noise stimulus. Using two conductance-based model neurons that display distinct gain modulation properties through a simple change in parameters, we show that the coding properties of both models quantitatively satisfy the predicted relationships. Our results describe how both variance-dependent gain modulation and adaptive neural computation result from intrinsic nonlinearity.
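
    The linear/nonlinear characterization referred to above can be sampled roughly as follows: the linear filter is estimated as the spike-triggered average of a white-noise stimulus, and the gain curve as the ratio of spike-conditional to prior histograms of the filtered stimulus. The toy threshold unit below stands in for the conductance-based models of the paper; its filter, threshold, and noise level are arbitrary assumptions.

```python
# Sampling an empirical linear/nonlinear model by reverse correlation: the filter is the
# spike-triggered average (STA) of a white-noise stimulus and the gain curve is the ratio
# of spike-conditional to prior histograms of the filtered stimulus. The spiking "neuron"
# here is a toy noisy threshold on a known filter, not the conductance-based models above.
import numpy as np

rng = np.random.default_rng(1)
T, L = 200_000, 40                                  # stimulus samples, filter length
stim = rng.normal(0.0, 1.0, T)

true_filter = np.exp(-np.arange(L) / 8.0) * np.sin(np.arange(L) / 4.0)
drive = np.convolve(stim, true_filter)[:T]          # causal filtering of the stimulus
spikes = (drive + 0.5 * rng.normal(size=T)) > 2.0   # toy noisy threshold nonlinearity

# Linear stage: spike-triggered average of the stimulus segment preceding each spike
spike_times = np.nonzero(spikes)[0]
spike_times = spike_times[spike_times >= L - 1]
sta = np.mean([stim[t - L + 1:t + 1][::-1] for t in spike_times], axis=0)

# Nonlinear stage: P(spike | filtered stimulus) from a histogram ratio
proj = np.convolve(stim, sta)[:T]
bins = np.linspace(proj.min(), proj.max(), 30)
prior, _ = np.histogram(proj, bins)
cond, _ = np.histogram(proj[spikes], bins)
gain = cond / np.maximum(prior, 1)

print("correlation of STA with true filter:", round(np.corrcoef(sta, true_filter)[0, 1], 3))
print("gain in lowest/highest occupied bins:", gain[prior > 0][0].round(3), gain[prior > 0][-1].round(3))
```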

    Finite-size and correlation-induced effects in Mean-field Dynamics

    The brain's activity is characterized by the interaction of a very large number of neurons that are strongly affected by noise. However, signals often arise at macroscopic scales, integrating the effect of many neurons into a reliable pattern of activity. In order to study such large neuronal assemblies, one is often led to derive mean-field limits summarizing the effect of the interaction of a large number of neurons into an effective signal. Classical mean-field approaches consider the evolution of a deterministic variable, the mean activity, thus neglecting the stochastic nature of neural behavior. In this article, we build upon two recent approaches that include correlations and higher-order moments in mean-field equations, and study how these stochastic effects influence the solutions of the mean-field equations, both in the limit of an infinite number of neurons and for large yet finite networks. We introduce a new model, the infinite model, which arises from both sets of equations by a rescaling of the variables; the rescaling is invertible for finite-size networks and hence yields equations equivalent to the previously derived models. The study of this model allows us to understand the qualitative behavior of such large-scale networks. We show that, though the solutions of the deterministic mean-field equation constitute uncorrelated solutions of the new mean-field equations, the stability properties of limit cycles are modified by the presence of correlations, and additional non-trivial behaviors, including periodic orbits, appear where the mean field has none. The origin of all these behaviors is then explored in finite-size networks, where interesting mesoscopic-scale effects appear. This study shows that the infinite-size system is a singular limit of the network equations, and that any finite network differs from the infinite system.
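
    A toy comparison of a finite network with its deterministic mean-field limit, in the spirit of the finite-size analysis described above. The network of globally coupled noisy rate units, the tanh transfer function, and all parameters are assumptions made for illustration; the moment equations actually studied in the paper are not reproduced here.

```python
# Toy illustration of finite-size deviations from a deterministic mean-field limit:
# N globally coupled noisy rate units are simulated for several N and the empirical
# population mean is compared with the N -> infinity deterministic equation.
# This is not the moment hierarchy of the paper, only the qualitative finite-size setup.
import numpy as np

J, sigma, dt, T = 2.0, 0.3, 0.01, 2000

def simulate(N, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.5, 1.0, N)                 # start the population near the positive branch
    means = np.empty(T)
    for t in range(T):
        m = x.mean()
        x += dt * (-x + np.tanh(J * m)) + np.sqrt(dt) * sigma * rng.normal(size=N)
        means[t] = m
    return means

def mean_field(m0=0.5):
    m, out = m0, np.empty(T)
    for t in range(T):
        m += dt * (-m + np.tanh(J * m))         # deterministic limit: unit noise averages out
        out[t] = m
    return out

mf = mean_field()
for N in (10, 100, 1000):
    dev = np.abs(simulate(N)[-500:] - mf[-500:]).mean()
    print(f"N = {N:4d}: mean |population mean - mean-field| over last 500 steps = {dev:.3f}")
```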

    A Model for the Origin and Properties of Flicker-Induced Geometric Phosphenes

    We present a model for flicker phosphenes, the spontaneous appearance of geometric patterns in the visual field when a subject is exposed to diffuse flickering light. We suggest that the phenomenon results from the interaction of cortical lateral inhibition with resonant periodic stimuli. We find that the best temporal frequency for eliciting phosphenes is a multiple of intrinsic (damped) oscillatory rhythms in the cortex. We show how both the quantitative and qualitative aspects of the patterns change with the frequency of stimulation and provide an explanation for these differences. We use Floquet theory combined with the theory of pattern formation to derive the parameter regimes where the phosphenes occur. We use symmetric bifurcation theory to show why low-frequency flicker should produce hexagonal patterns while high-frequency flicker produces pinwheels, targets, and spirals.
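
    A compressed sketch of the kind of linear-stability calculation involved: each spatial Fourier mode of a lateral-inhibition (difference-of-Gaussians) kernel is linearized about the uniform state and its Floquet multiplier over one flicker period is computed numerically. The kernel, sigmoidal gain, drive, and the shortcut of evaluating the gain slope directly on the drive are all illustrative assumptions; this scalar toy does not reproduce the resonance with intrinsic rhythms analyzed in the paper.

```python
# Sketch of a Floquet stability check for flicker-driven pattern formation: a spatial
# perturbation of wavenumber k about the uniform state obeys
#     dv/dt = (-1 + f'(drive(t)) * what(k)) * v
# under periodic forcing, and its multiplier over one flicker period decides whether a
# pattern with that wavenumber grows. All parameters are illustrative; time is measured
# in units of the cortical time constant.
import numpy as np
from scipy.integrate import solve_ivp

def what(k, ae=3.0, se=1.0, ai=1.5, si=2.0):
    """Fourier transform of a Mexican-hat kernel: local excitation minus broader inhibition."""
    return ae * se * np.exp(-(k * se) ** 2 / 2) - ai * si * np.exp(-(k * si) ** 2 / 2)

def fprime(x, gain=8.0):
    s = 1.0 / (1.0 + np.exp(-gain * (x - 0.5)))
    return gain * s * (1.0 - s)

def floquet_multiplier(k, freq, I0=0.5, amp=0.4):
    period = 1.0 / freq
    rhs = lambda t, v: (-1.0 + fprime(I0 + amp * np.sin(2 * np.pi * freq * t)) * what(k)) * v
    sol = solve_ivp(rhs, (0.0, period), [1.0], max_step=period / 200)
    return sol.y[0, -1]                         # v(period) / v(0) with v(0) = 1

ks = np.linspace(0.0, 4.0, 41)
for freq in (0.2, 0.5, 1.0):                    # flicker frequency in cycles per time constant
    mults = np.array([floquet_multiplier(k, freq) for k in ks])
    print(f"freq {freq:.1f}: largest multiplier {mults.max():8.2f} at k = {ks[mults.argmax()]:.2f}")
```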

    A New Approach for Determining Phase Response Curves Reveals that Purkinje Cells Can Act as Perfect Integrators

    Cerebellar Purkinje cells display complex intrinsic dynamics. They fire spontaneously, exhibit bistability, and via mutual network interactions are involved in the generation of high-frequency oscillations and travelling waves of activity. To probe the dynamical properties of Purkinje cells we measured their phase response curves (PRCs). PRCs quantify the change in spike phase caused by a stimulus as a function of its temporal position within the interspike interval, and are widely used to predict neuronal responses to more complex stimulus patterns. Significant variability in the interspike interval during spontaneous firing can lead to PRCs with a low signal-to-noise ratio, requiring averaging over thousands of trials. We show using electrophysiological experiments and simulations that the PRC calculated in the traditional way, by sampling the interspike interval with brief current pulses, is biased. We introduce a corrected approach for calculating PRCs which eliminates this bias. Using our new approach, we show that Purkinje cell PRCs change qualitatively depending on the firing frequency of the cell. At high firing rates, Purkinje cells exhibit single-peaked, or monophasic, PRCs. Surprisingly, at low firing rates, Purkinje cell PRCs are largely independent of phase, resembling PRCs of ideal non-leaky integrate-and-fire neurons. These results indicate that Purkinje cells can act as perfect integrators at low firing rates, and that the integration mode of Purkinje cells depends on their firing rate.
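
    For reference, the traditional pulse-based PRC protocol discussed above can be written down in a few lines for a tonically firing leaky integrate-and-fire model: a brief current pulse is injected at a chosen phase of the interspike interval and the normalized advance of the next spike is recorded. The model and parameter values are generic stand-ins, not the Purkinje-cell recordings or the corrected estimator of the paper.

```python
# Traditional pulse-based PRC measurement on a tonically firing leaky integrate-and-fire
# neuron: a brief depolarizing pulse is injected at a chosen phase of the interspike
# interval and the normalized advance of the next spike is recorded. Model and parameter
# values are generic stand-ins, not the Purkinje-cell data or corrected estimator above.
import numpy as np

def lif_isi(period=None, pulse_phase=None, pulse_amp=2.0, pulse_width=0.5,
            I=1.5, tau=20.0, vth=1.0, vreset=0.0, dt=0.01):
    """Integrate one interspike interval (ms), optionally perturbed by a current pulse."""
    V, t = vreset, 0.0
    while V < vth:
        Iext = I
        if pulse_phase is not None:
            t_on = pulse_phase * period
            if t_on <= t < t_on + pulse_width:
                Iext += pulse_amp
        V += dt * (-V + Iext) / tau
        t += dt
    return t

T0 = lif_isi()                                       # unperturbed period (about 22 ms here)
phases = np.linspace(0.05, 0.95, 10)
prc = [(T0 - lif_isi(period=T0, pulse_phase=phi)) / T0 for phi in phases]
for phi, advance in zip(phases, prc):
    print(f"stimulus phase {phi:.2f}: normalized spike advance {advance:+.3f}")
```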