
    Complementary responses to mean and variance modulations in the perfect integrate-and-fire model

    In the perfect integrate-and-fire model (PIF), the membrane voltage is proportional to the integral of the input current since the time of the previous spike. It has been shown that the firing rate within a noise-free ensemble of PIF neurons responds instantaneously to dynamic changes in the input current, whereas in the presence of white noise, model neurons preferentially pass low-frequency modulations of the mean current. Here, we prove that when the input variance is perturbed while the mean current is held constant, the PIF responds preferentially to high-frequency modulations. Moreover, the linear filters for mean and variance modulations are complementary, adding exactly to one. Since changes in the rate of Poisson-distributed inputs lead to proportional changes in the mean and variance, these results imply that an ensemble of PIF neurons transmits a perfect replica of the time-varying input rate for Poisson-distributed input. A more general argument shows that this property holds for any signal leading to proportional changes in the mean and variance of the input current.
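
    A minimal numerical sketch of the setup described above (all parameter values are illustrative assumptions, not taken from the paper): an ensemble of PIF neurons is driven by Gaussian noise whose mean and variance are modulated proportionally, as for Poisson-distributed input, and the population rate is compared with the input rate.

```python
# Sketch: ensemble of perfect integrate-and-fire (PIF) neurons driven by a
# current whose mean and variance follow the same time-varying "input rate",
# as happens for Poisson-distributed input. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
dt, T, N = 1e-4, 1.0, 10000            # time step (s), duration (s), ensemble size
theta = 1.0                             # spike threshold; reset by subtraction
t = np.arange(0, T, dt)

rate_in = 200.0 * (1.0 + 0.5 * np.sin(2 * np.pi * 5.0 * t))   # input rate (Hz)
mu = 0.05 * rate_in                     # mean drive, proportional to the rate
sigma2 = 0.05 * rate_in                 # variance, proportional to the same rate

v = rng.uniform(0.0, theta, N)          # desynchronized initial conditions
pop_rate = np.empty(len(t))
for i in range(len(t)):
    v += mu[i] * dt + np.sqrt(sigma2[i] * dt) * rng.standard_normal(N)
    spiked = v >= theta
    pop_rate[i] = spiked.mean() / dt    # instantaneous population rate (Hz)
    v[spiked] -= theta

# If the mean- and variance-evoked filters add to one, the population rate
# should track the input rate; check with a simple correlation.
print("correlation(pop_rate, rate_in) =", np.corrcoef(pop_rate, rate_in)[0, 1])
```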

    The Dynamics of Integrate-and-Fire: Mean vs. Variance Modulations and Dependence on Baseline Parameters

    The leaky integrate-and-fire (LIF) model is the simplest neuron model that captures the essential properties of neuronal signaling. Yet common intuitions are inadequate to explain basic properties of LIF responses to sinusoidal modulations of the input. Here we examine responses to low- and moderate-frequency modulations of both the mean and variance of the input current and quantify how these responses depend on baseline parameters. Across parameters, responses to modulations in the mean current are low-pass, approaching zero in the limit of high frequencies. For very low baseline firing rates, the response cutoff frequency matches that expected from membrane integration. However, the cutoff shows a rapid, supralinear increase with firing rate, with a steeper increase for lower noise. For modulations of the input variance, the gain at high frequency remains finite. Here, we show that the low-frequency responses depend strongly on baseline parameters and derive an analytic condition specifying the parameters at which responses switch from being dominated by low versus high frequencies. Additionally, we show that the resonant responses for variance modulations have properties not expected for common oscillatory resonances: they peak at frequencies higher than the baseline firing rate and persist when oscillatory spiking is disrupted by high noise. Finally, the responses to mean and variance modulations are shown to have a complementary dependence on baseline parameters at higher frequencies, resulting in responses to modulations of Poisson input rates that are independent of baseline input statistics.
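
    As a rough illustration of how such responses can be probed numerically (a sketch with assumed parameters, not the paper's method), one can drive an ensemble of LIF neurons with a sinusoidally modulated mean current and read off the modulation amplitude and phase of the population rate at the stimulus frequency:

```python
# Sketch: population-rate response of LIF neurons to a sinusoidal modulation
# of the mean input current at one frequency. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
dt, T, N = 1e-4, 2.0, 5000
tau_m, v_th, v_reset = 0.02, 1.0, 0.0       # membrane time constant (s), threshold, reset
mu0, eps, sigma, f = 1.2, 0.1, 0.3, 20.0    # baseline mean, modulation depth, noise, frequency (Hz)
t = np.arange(0, T, dt)

v = rng.uniform(v_reset, v_th, N)           # desynchronized initial conditions
rate = np.zeros(len(t))
for i, ti in enumerate(t):
    mu = mu0 * (1.0 + eps * np.sin(2 * np.pi * f * ti))
    v += (-v + mu) * dt / tau_m + sigma * np.sqrt(dt / tau_m) * rng.standard_normal(N)
    spiked = v >= v_th
    rate[i] = spiked.mean() / dt
    v[spiked] = v_reset

# Project the rate onto sine/cosine at f to extract modulation amplitude and phase.
s, c = np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)
a, b = 2 * np.mean(rate * s), 2 * np.mean(rate * c)
print("baseline rate:", rate.mean(), "Hz")
print("modulation amplitude:", np.hypot(a, b), "Hz, phase:", np.arctan2(b, a), "rad")
```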

    Representation of Dynamical Stimuli in Populations of Threshold Neurons

    Many sensory or cognitive events are associated with dynamic current modulations in cortical neurons. This creates a pressing demand for tractable model approaches that address the merits and limits of potential encoding strategies. Yet current theoretical approaches addressing the response to mean- and variance-encoded stimuli rarely provide complete response functions for both modes of encoding in the presence of correlated noise. Here, we investigate the neuronal population response to dynamical modifications of the mean or variance of the synaptic bombardment using an alternative threshold-model framework. For both the mean and variance channels, we provide explicit expressions for the linear and nonlinear frequency-response functions in the presence of correlated noise and use them to derive the population-rate response to step-like stimuli. For mean-encoded signals, we find that the complete response function depends only on the temporal width of the input correlation function, but not on other functional specifics. Furthermore, we show that both mean- and variance-encoded signals can relay high-frequency inputs, and in both schemes step-like changes can be detected instantaneously. Finally, we obtain the pairwise spike-correlation function and the spike-triggered average from the linear mean-evoked response function. These results provide a maximally tractable limiting case that complements and extends previous results obtained in the integrate-and-fire framework.

    Rate Dynamics of Leaky Integrate-and-Fire Neurons with Strong Synapses

    Firing-rate models provide a practical tool for studying the dynamics of trial- or population-averaged neuronal signals. A wealth of theoretical and experimental studies has been dedicated to the derivation or extraction of such models by investigating the firing-rate response characteristics of ensembles of neurons. The majority of these studies assume that neurons receive input spikes at a high rate through weak synapses (diffusion approximation). For many biological neural systems, however, this assumption cannot be justified. So far, it is unclear how time-varying presynaptic firing rates are transmitted by a population of neurons if the diffusion assumption is dropped. Here, we numerically investigate the stationary and non-stationary firing-rate response properties of leaky integrate-and-fire neurons receiving input spikes through excitatory synapses with alpha-function-shaped postsynaptic currents and strong synaptic weights. Input spike trains are modeled by inhomogeneous Poisson point processes with sinusoidal rate. Average rates, modulation amplitudes, and phases of the period-averaged spike responses are measured for a broad range of stimulus, synapse, and neuron parameters. Across wide parameter regions, the resulting transfer functions can be approximated by a linear first-order low-pass filter. Below a critical synaptic weight, the cutoff frequencies are approximately constant and determined by the synaptic time constants. Only for synapses with unrealistically strong weights are the cutoff frequencies significantly increased. To account for stimuli with larger modulation depths, we combine the measured linear transfer function with the nonlinear response characteristics obtained for stationary inputs. The resulting linear-nonlinear model accurately predicts the population response for a variety of non-sinusoidal stimuli.
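
    The final step described above, combining a first-order low-pass transfer function with a stationary response curve, can be sketched as follows (the filter time constant and the static nonlinearity F below are invented placeholders; in the study the cutoff and the stationary curve are measured from simulations):

```python
# Sketch of a linear-nonlinear (LN) prediction: filter the presynaptic rate
# with a first-order low-pass (cutoff set by an assumed synaptic time
# constant), then apply a placeholder stationary rate function F.
import numpy as np

dt = 1e-4
t = np.arange(0, 1.0, dt)
# Non-sinusoidal stimulus: a square-wave modulation of the presynaptic rate (Hz)
nu_in = 8000.0 * (1.0 + 0.8 * np.sign(np.sin(2 * np.pi * 3.0 * t)))

tau_syn = 2e-3                         # assumed synaptic time constant (s)
filtered = np.empty_like(nu_in)
x = nu_in[0]
for i, nu in enumerate(nu_in):
    x += (nu - x) * dt / tau_syn       # first-order low-pass filter
    filtered[i] = x

def F(nu):
    """Placeholder stationary rate function (threshold-linear); in the paper
    this curve is obtained from simulations with constant input rate."""
    return np.maximum(0.0, 0.01 * (nu - 4000.0))

rate_prediction = F(filtered)          # LN prediction of the population rate
print("predicted rate range:", rate_prediction.min(), "-", rate_prediction.max(), "spikes/s")
```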

    Decorrelation of neural-network activity by inhibitory feedback

    Correlations in spike-train ensembles can seriously impair the encoding of information by their spatio-temporal structure. An inevitable source of correlation in finite neural networks is common presynaptic input to pairs of neurons. Recent theoretical and experimental studies demonstrate that spike correlations in recurrent neural networks are considerably smaller than expected based on the amount of shared presynaptic input. By means of a linear network model and simulations of networks of leaky integrate-and-fire neurons, we show that shared-input correlations are efficiently suppressed by inhibitory feedback. To elucidate the effect of feedback, we compare the responses of the intact recurrent network with those of systems in which the statistics of the feedback channel are perturbed. The suppression of spike-train correlations and population-rate fluctuations by inhibitory feedback can be observed both in purely inhibitory and in excitatory-inhibitory networks. The effect is fully explained by a linear theory and is already apparent at the macroscopic level of the population-averaged activity. At the microscopic level, shared-input correlations are suppressed by spike-train correlations: in purely inhibitory networks, they are canceled by negative spike-train correlations. In excitatory-inhibitory networks, spike-train correlations are typically positive. Here, the suppression of input correlations is not a result of the mere existence of correlations between excitatory (E) and inhibitory (I) neurons, but a consequence of a particular structure of correlations among the three possible pairings (EE, EI, II).
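
    A toy linear-model sketch of the mechanism (two units with shared input and invented parameters, not the paper's spiking network): negative feedback of the summed activity suppresses, and can even invert, the correlation induced by the common input.

```python
# Sketch: two linear rate units receive a common input plus private noise.
# Inhibitory feedback of the summed activity suppresses the shared-input
# correlation. Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
T = 100000
shared = rng.standard_normal(T)
private = rng.standard_normal((2, T))

def output_correlation(g):
    """Correlation between the two units for feedback gain g (g=0: open loop)."""
    a = 0.9                                  # leak / self-coupling of each unit
    x = np.zeros((2, T))
    for t in range(1, T):
        feedback = -g * x[:, t - 1].sum()    # inhibitory feedback of summed activity
        x[:, t] = a * x[:, t - 1] + shared[t] + private[:, t] + feedback
    return np.corrcoef(x)[0, 1]

print("correlation without feedback:", output_correlation(0.0))
print("correlation with feedback   :", output_correlation(0.45))
```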

    Inhibitory synchrony as a mechanism for attentional gain modulation

    Recordings from area V4 of monkeys have revealed that when the focus of attention is on a visual stimulus within the receptive field of a cortical neuron, two distinct changes can occur: the firing rate of the neuron can change, and there can be an increase in the coherence between spikes and the local field potential in the gamma-frequency range (30-50 Hz). The hypothesis explored here is that these observed effects of attention could be a consequence of changes in the synchrony of local interneuron networks. We performed computer simulations of a Hodgkin-Huxley-type neuron driven by a constant depolarizing current, I, representing visual stimulation, and by a modulatory inhibitory input representing the effects of attention via local interneuron networks. We observed that the neuron's firing rate and the coherence of its output spike train with the synaptic inputs were modulated by the degree of synchrony of the inhibitory inputs. The model suggests that the observed changes in firing rate and coherence of neurons in the visual cortex could be controlled by top-down inputs that regulate the coherence in the activity of a local inhibitory network discharging at gamma frequencies. (Comment: J. Physiology (Paris), in press, 11 figures.)
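
    The gist of the simulation can be sketched with a simplified stand-in model (a leaky integrate-and-fire cell instead of the paper's Hodgkin-Huxley neuron; all parameters are assumptions): inhibitory inputs fire at a 40 Hz gamma rhythm, and a jitter parameter sets how synchronized they are.

```python
# Sketch: a driven cell receives constant excitation plus inhibition from 50
# inputs locked to a 40 Hz rhythm; the jitter of their spike times controls
# the synchrony of inhibition. LIF stand-in, illustrative parameters.
import numpy as np

rng = np.random.default_rng(3)
dt, T = 1e-4, 5.0
t = np.arange(0, T, dt)

def output_rate(jitter):
    """Firing rate (Hz) of the driven cell for a given inhibitory jitter (s)."""
    cycle_times = np.arange(0.05, T, 1.0 / 40.0)         # gamma cycle times
    spikes_per_bin = np.zeros(len(t))
    for _ in range(50):                                  # 50 inhibitory inputs
        s = cycle_times + rng.normal(0.0, jitter, len(cycle_times))
        idx = np.clip(np.round(s / dt).astype(int), 0, len(t) - 1)
        np.add.at(spikes_per_bin, idx, 1.0)

    tau_syn, tau_m, v_th, I, E_inh, g_scale = 5e-3, 0.02, 1.0, 1.5, -0.5, 0.05
    g, v, n_spikes = 0.0, 0.0, 0
    for i in range(len(t)):
        g += -g * dt / tau_syn + spikes_per_bin[i]       # exponential synapse
        v += (-v + I + g_scale * g * (E_inh - v)) * dt / tau_m
        if v >= v_th:
            v, n_spikes = 0.0, n_spikes + 1
    return n_spikes / T

print("output rate, 1 ms jitter (synchronous)  :", output_rate(0.001), "Hz")
print("output rate, 12 ms jitter (asynchronous):", output_rate(0.012), "Hz")
```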

    Learning Contrast-Invariant Cancellation of Redundant Signals in Neural Systems

    Cancellation of redundant information is a highly desirable feature of sensory systems, since it would potentially lead to more efficient detection of novel information. However, biologically plausible mechanisms responsible for such selective cancellation, and especially those robust to realistic variations in the intensity of the redundant signals, are mostly unknown. In this work, we study, via in vivo experimental recordings and computational models, the behavior of a cerebellar-like circuit in the weakly electric fish which is known to perform cancellation of redundant stimuli. We experimentally observe contrast invariance in the cancellation of spatially and temporally redundant stimuli in such a system. Our model, which incorporates heterogeneously delayed feedback, bursting dynamics, and burst-induced STDP, is in agreement with our in vivo observations. In addition, the model gives insight into the activity of granule cells and parallel fibers involved in the feedback pathway, and provides a strong prediction for the parallel-fiber potentiation time scale. Finally, our model predicts the existence of an optimal learning contrast of around 15%, a level commonly experienced by interacting fish.

    Near-Perfect Synaptic Integration by Na(v)1.7 in Hypothalamic Neurons Regulates Body Weight

    Neurons are well suited for computations on millisecond timescales, but some neuronal circuits set behavioral states over long time periods, such as those involved in energy homeostasis. We found that multiple types of hypothalamic neurons, including those that oppositely regulate body weight, are specialized as near-perfect synaptic integrators that summate inputs over extended timescales. Excitatory postsynaptic potentials (EPSPs) are greatly prolonged, outlasting the neuronal membrane time constant up to 10-fold. This is due to the voltage-gated sodium channel Nav1.7 (Scn9a), previously associated with pain sensation but not synaptic integration. Scn9a deletion in AGRP, POMC, or paraventricular hypothalamic neurons reduced EPSP duration and synaptic integration, and altered body weight in mice. In vivo whole-cell recordings in the hypothalamus confirmed near-perfect synaptic integration. These experiments show that integration of synaptic inputs over time by Nav1.7 is critical for body weight regulation and reveal a mechanism for synaptic control of circuits regulating long-term homeostatic functions.
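
    As a purely illustrative sketch of the summation effect (the numbers below are assumptions, not measurements from the study): with sparse excitatory input, prolonging the EPSP decay roughly 10-fold relative to the membrane time constant lets successive inputs summate into a sustained depolarization.

```python
# Sketch: temporal summation of unit EPSPs arriving at 2 Hz, for a decay set
# by the membrane time constant versus a ~10-fold prolonged decay, as
# described for Nav1.7-expressing hypothalamic neurons. Illustrative numbers.
import numpy as np

dt = 1e-3
t = np.arange(0.0, 10.0, dt)
spike_idx = set(range(250, len(t), 500))           # one input every 0.5 s

def summed_depolarization(tau_decay):
    v = np.empty(len(t))
    x = 0.0
    for i in range(len(t)):
        if i in spike_idx:
            x += 1.0                               # unit EPSP amplitude
        x -= x * dt / tau_decay                    # exponential decay
        v[i] = x
    return v

tau_m = 0.03                                       # assumed membrane time constant (s)
fast = summed_depolarization(tau_m)                # EPSP decays with tau_m
slow = summed_depolarization(10 * tau_m)           # prolonged EPSP

print("time-averaged depolarization, ordinary EPSP :", fast.mean())
print("time-averaged depolarization, prolonged EPSP:", slow.mean())
```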

    Context-dependent selection of visuomotor maps

    Behavior results from the integration of ongoing sensory signals and contextual information in various forms, such as past experience, expectations, and current goals. Thus, the response to a specific stimulus, say the ringing of a doorbell, varies depending on whether you are at home or in someone else's house. What is the neural basis of this flexibility? What mechanism is capable of selecting, in a context-dependent way, an adequate response to a given stimulus? One possibility is based on a nonlinear neural representation in which context information regulates the gain of stimulus-evoked responses. Here I explore the properties of this mechanism. By means of three hypothetical visuomotor tasks, I study a class of neural network models in which any one of several possible stimulus-response maps or rules can be selected according to context. The underlying mechanism based on gain modulation has three key features: (1) modulating the sensory responses is equivalent to switching on or off different subpopulations of neurons, (2) context does not need to be represented continuously, although this is advantageous for generalization, and (3) context-dependent selection is independent of the discriminability of the stimuli. In all cases, the contextual cues can quickly turn on or off a sensory-motor map, effectively changing the functional connectivity between inputs and outputs in the networks. The model predicts that sensory responses that are nonlinearly modulated by arbitrary context signals should be found in behavioral situations that involve choosing or switching between multiple sensory-motor maps. (Comment: 22 pages, 10 figures. Article available from: http://www.biomedcentral.com/1471-2202/5/4)
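
    A toy illustration of the gain-modulation idea (an invented four-unit network, not the model used in the paper): context multiplicatively gates two subpopulations with opposite stimulus tuning, so the same stimulus maps onto different motor outputs depending on context.

```python
# Sketch: context acts as a multiplicative gain on two subpopulations of
# stimulus-tuned units, effectively selecting which stimulus-response map
# drives the output. All numbers are illustrative.
import numpy as np

stimuli = np.array([-1.0, 1.0])                  # two possible stimuli

def sensory_layer(s, context):
    """Gain-modulated responses: context switches subpopulations on or off."""
    tuning = np.array([s, s, -s, -s])            # units 0-1 prefer +s, units 2-3 prefer -s
    gain = np.array([1.0, 1.0, 0.0, 0.0]) if context == "A" else np.array([0.0, 0.0, 1.0, 1.0])
    return np.maximum(0.0, gain * tuning)        # rectified, gain-modulated rates

W_out = np.array([0.5, 0.5, 0.5, 0.5])           # fixed readout weights

for context in ("A", "B"):
    for s in stimuli:
        motor = W_out @ sensory_layer(s, context)
        print(f"context {context}, stimulus {s:+.0f} -> motor output {motor:+.1f}")
```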