
    Time Resolution Dependence of Information Measures for Spiking Neurons: Atoms, Scaling, and Universality

    The mutual information between stimulus and spike-train response is commonly used to monitor neural coding efficiency, but neuronal computation broadly conceived requires more refined and targeted information measures of input-output joint processes. A first step towards that larger goal is to develop information measures for individual output processes, including information generation (entropy rate), stored information (statistical complexity), predictable information (excess entropy), and active information accumulation (bound information rate). We calculate these for spike trains generated by a variety of noise-driven integrate-and-fire neurons as a function of time resolution and for alternating renewal processes. We show that their time-resolution dependence reveals coarse-grained structural properties of interspike interval statistics; e.g., τ-entropy rates that diverge less quickly than the firing rate indicate interspike interval correlations. We also find evidence that the excess entropy and regularized statistical complexity of different types of integrate-and-fire neurons are universal in the continuous-time limit in the sense that they do not depend on mechanism details. This suggests a surprising simplicity in the spike trains generated by these model neurons. Interestingly, neurons with gamma-distributed ISIs and neurons whose spike trains are alternating renewal processes do not fall into the same universality class. These results lead to two conclusions. First, the dependence of information measures on time resolution reveals mechanistic details about spike train generation. Second, information measures can be used as model selection tools for analyzing spike train processes. Comment: 20 pages, 6 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/trdctim.ht
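
    To make the resolution dependence concrete, the sketch below estimates an entropy rate h(τ) from a spike train binned at resolution τ, using a naive plug-in block-entropy estimator. The Poisson surrogate spike train, the word length, and the estimator are illustrative assumptions of this write-up, not the processes or estimators used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def bin_spikes(spike_times, tau, t_max):
    """Discretize a spike train into 0/1 symbols at time resolution tau."""
    edges = np.arange(0.0, t_max + tau, tau)
    counts, _ = np.histogram(spike_times, bins=edges)
    return (counts > 0).astype(int)

def block_entropy(symbols, length):
    """Plug-in (naive) Shannon entropy of length-`length` words, in bits."""
    words = np.lib.stride_tricks.sliding_window_view(symbols, length)
    codes = words @ (1 << np.arange(length))   # encode each binary word as an int
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def entropy_rate(symbols, length=8):
    """Estimate h(tau) as H(L) - H(L-1), in bits per time bin."""
    return block_entropy(symbols, length) - block_entropy(symbols, length - 1)

# homogeneous Poisson spike train as a surrogate for the model spike trains
rate, t_max = 20.0, 1000.0                     # Hz, seconds
spikes = np.sort(rng.uniform(0.0, t_max, rng.poisson(rate * t_max)))

for tau in (0.02, 0.01, 0.005, 0.002):         # time resolution in seconds
    h = entropy_rate(bin_spikes(spikes, tau, t_max))
    print(f"tau = {tau*1e3:4.0f} ms   h(tau) = {h:.4f} bits/bin = {h/tau:7.1f} bits/s")
```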

    Response variability in balanced cortical networks

    We study the spike statistics of neurons in a network with dynamically balanced excitation and inhibition. Our model, intended to represent a generic cortical column, comprises randomly connected excitatory and inhibitory leaky integrate-and-fire neurons, driven by excitatory input from an external population. The high connectivity permits a mean-field description in which synaptic currents can be treated as Gaussian noise, the mean and autocorrelation function of which are calculated self-consistently from the firing statistics of single model neurons. Within this description, we find that the irregularity of spike trains is controlled mainly by the strength of the synapses relative to the difference between the firing threshold and the post-firing reset level of the membrane potential. For moderately strong synapses we find spike statistics very similar to those observed in primary visual cortex. Comment: 22 pages, 7 figures, submitted to Neural Computation
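
    As a minimal illustration of the irregularity statistic discussed here, the sketch below simulates a single leaky integrate-and-fire neuron driven by Gaussian white noise (a stand-in for the self-consistently determined network input) and reports the coefficient of variation of the interspike intervals as the noise strength grows relative to the threshold-reset distance. All parameter values are arbitrary and the mean-field self-consistency step is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def lif_isi_cv(mu, sigma, v_th=1.0, v_reset=0.0, tau_m=0.02, dt=1e-4, t_max=50.0):
    """Leaky integrate-and-fire neuron driven by Gaussian white noise.

    Euler scheme for  dV = (mu - V) dt / tau_m + sigma sqrt(dt / tau_m) xi,
    with threshold v_th and reset v_reset.  Returns the coefficient of
    variation of the interspike intervals.
    """
    n_steps = int(t_max / dt)
    noise = sigma * np.sqrt(dt / tau_m) * rng.standard_normal(n_steps)
    v, last_spike, isis = v_reset, None, []
    for i in range(n_steps):
        v += (mu - v) * dt / tau_m + noise[i]
        if v >= v_th:
            t = i * dt
            if last_spike is not None:
                isis.append(t - last_spike)
            last_spike, v = t, v_reset
    isis = np.array(isis)
    return isis.std() / isis.mean()

# weak vs. strong fluctuations relative to the threshold-reset distance (here 1.0)
for sigma in (0.1, 0.3, 1.0):
    print(f"sigma = {sigma:.1f}   CV(ISI) = {lif_isi_cv(mu=1.2, sigma=sigma):.2f}")
```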

    Stochastic and deterministic dynamics of intrinsically irregular firing in cortical inhibitory interneurons

    Most cortical neurons fire regularly when excited by a constant stimulus. In contrast, irregular-spiking (IS) interneurons are remarkable for the intrinsic variability of their spike timing, which can synchronize amongst IS cells via specific gap junctions. Here, we have studied the biophysical mechanisms of this irregular spiking in mice, and how IS cells fire in the context of synchronous network oscillations. Using patch-clamp recordings, artificial dynamic conductance injection, pharmacological analysis and computational modeling, we show that spike time irregularity is generated by a nonlinear dynamical interaction of voltage-dependent sodium and fast-inactivating potassium channels just below spike threshold, amplifying channel noise. This active irregularity may help IS cells synchronize with each other at gamma range frequencies, while resisting synchronization to lower input frequencies. Biotechnology and Biological Sciences Research Council, Coordenação de Aperfeiçoamento de Pessoal de Nível Superior, Cambridge Overseas Trust
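
    The central idea, that noise amplified just below spike threshold can make firing irregular even under constant drive, can be caricatured with a generic exponential integrate-and-fire neuron plus additive noise. This toy model and its parameters are assumptions of this write-up, not the authors' conductance-based sodium/potassium mechanism.

```python
import numpy as np

rng = np.random.default_rng(2)

def eif_isis(i_const, sigma, t_max=60.0, dt=1e-4, e_l=-65.0, v_t=-50.0,
             delta_t=2.0, tau_m=0.02, v_spike=-30.0, v_reset=-60.0):
    """Exponential integrate-and-fire neuron with constant drive plus noise.

    Near v_t the exponential term steepens sharply, so small voltage
    fluctuations just below threshold are amplified; a crude stand-in for
    the channel nonlinearity described in the paper, not the authors' model.
    """
    n_steps = int(t_max / dt)
    noise = sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
    v, last, isis = e_l, None, []
    for i in range(n_steps):
        dv = (-(v - e_l) + delta_t * np.exp((v - v_t) / delta_t) + i_const) / tau_m
        v += dv * dt + noise[i]
        if v >= v_spike:
            t = i * dt
            if last is not None:
                isis.append(t - last)
            last, v = t, v_reset
    return np.array(isis)

# constant drive just above rheobase (about 13 with these parameters): the
# deterministic trajectory crawls through the near-threshold region, so noise
# dominates the spike timing and the ISIs become irregular
isis = eif_isis(i_const=13.5, sigma=15.0)
print(f"{isis.size + 1} spikes, mean ISI = {isis.mean()*1e3:.0f} ms, "
      f"CV = {isis.std()/isis.mean():.2f}")
```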

    Predicting spike timing of neocortical pyramidal neurons by simple threshold models

    Neurons generate spikes reliably with millisecond precision if driven by a fluctuating current. Is it then possible to predict the spike timing knowing the input? We determined parameters of an adapting threshold model using data recorded in vitro from 24 layer 5 pyramidal neurons from rat somatosensory cortex, stimulated intracellularly by a fluctuating current simulating synaptic bombardment in vivo. The model generates output spikes whenever the membrane voltage (a filtered version of the input current) reaches a dynamic threshold. We find that for input currents with large fluctuation amplitude, up to 75% of the spike times can be predicted with a precision of ±2 ms. Some of the intrinsic neuronal unreliability can be accounted for by a noisy threshold mechanism. Our results suggest that, under random current injection into the soma, (i) neuronal behavior in the subthreshold regime can be well approximated by a simple linear filter; and (ii) most of the nonlinearities are captured by a simple threshold process.
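
    To make this model class concrete, here is a minimal sketch of an adapting-threshold model in the same spirit: the "membrane voltage" is a first-order low-pass filter of the injected current, and a spike is emitted whenever it crosses a threshold that jumps after each spike and relaxes back. The filter, the threshold kinetics, and all parameter values are placeholders, not the model fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def threshold_model(i_ext, dt=1e-4, tau_m=0.01, r_m=1.0,
                    theta0=0.5, d_theta=0.6, tau_theta=0.05):
    """Toy adapting-threshold model: filtered input versus dynamic threshold.

    v is a first-order low-pass filter of the input current; theta jumps by
    d_theta at each spike and relaxes back to theta0 with time constant
    tau_theta.  Returns the spike times.
    """
    v, theta, spikes = 0.0, theta0, []
    for i, i_t in enumerate(i_ext):
        v += (-v + r_m * i_t) * dt / tau_m
        theta += (theta0 - theta) * dt / tau_theta
        if v >= theta:
            spikes.append(i * dt)
            theta += d_theta              # spike-triggered threshold jump
    return np.array(spikes)

# fluctuating input: exponentially filtered white noise (~3 ms correlation time)
dt, t_max = 1e-4, 5.0
t = np.arange(0.0, t_max, dt)
kernel = np.exp(-np.arange(300) * dt / 0.003)
i_ext = 0.6 + 0.4 * np.convolve(rng.standard_normal(t.size), kernel, mode="same")
spike_times = threshold_model(i_ext, dt=dt)
print(f"{spike_times.size} spikes in {t_max:.0f} s "
      f"(mean rate {spike_times.size / t_max:.1f} Hz)")
```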

    A comparative study of different integrate-and-fire neurons: spontaneous activity, dynamical response, and stimulus-induced correlation

    Stochastic integrate-and-fire (IF) neuron models have found widespread applications in computational neuroscience. Here we present results on the white-noise-driven perfect, leaky, and quadratic IF models, focusing on the spectral statistics (power spectra, cross spectra, and coherence functions) in different dynamical regimes (noise-induced and tonic firing regimes with low or moderate noise). We make the models comparable by tuning parameters such that the mean value and the coefficient of variation of the interspike interval match for all of them. We find that, under these conditions, the power spectrum under white-noise stimulation is often very similar, while the response characteristics, described by the cross spectrum between a fraction of the input noise and the output spike train, can differ drastically. We also investigate how the spike trains of two neurons of the same kind (e.g. two leaky IF neurons) correlate if they share a common noise input. We show that, depending on the dynamical regime, either two quadratic IF models or two leaky IF models are more strongly correlated. Our results suggest that, when choosing among simple IF models for network simulations, the details of the model have a strong effect on correlation and regularity of the output. Comment: 12 pages
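
    A minimal sketch of the shared-noise setup: two identical leaky IF neurons receive a mixture of common and private white noise, and their output correlation is summarized by the correlation coefficient of spike counts in short windows. The perfect and quadratic variants, the spectral measures, and the parameter-matching procedure from the paper are omitted; all values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def lif_pair_counts(c, mu=1.1, sigma=0.4, tau_m=0.02, v_th=1.0, v_reset=0.0,
                    dt=1e-4, t_max=50.0, win=0.1):
    """Two identical LIF neurons sharing a fraction c of their noise input.

    Each neuron receives sqrt(c) * common + sqrt(1 - c) * private white noise.
    Returns spike counts of both neurons in windows of length `win` seconds.
    """
    n_steps, n_win = int(t_max / dt), int(t_max / win)
    v = np.full(2, v_reset)
    counts = np.zeros((2, n_win))
    for i in range(n_steps):
        xi = (np.sqrt(c) * rng.standard_normal()
              + np.sqrt(1.0 - c) * rng.standard_normal(2))
        v += (mu - v) * dt / tau_m + sigma * np.sqrt(dt / tau_m) * xi
        fired = v >= v_th
        if fired.any():
            w = min(int(i * dt / win), n_win - 1)
            counts[fired, w] += 1
            v[fired] = v_reset
    return counts

for c in (0.2, 0.5, 0.8):
    n1, n2 = lif_pair_counts(c)
    rho = np.corrcoef(n1, n2)[0, 1]
    print(f"common-noise fraction c = {c:.1f}   spike-count correlation = {rho:.2f}")
```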

    Membrane Properties and the Balance between Excitation and Inhibition Control Gamma-Frequency Oscillations Arising from Feedback Inhibition

    Computational studies as well as in vivo and in vitro results have shown that many cortical neurons fire in a highly irregular manner and at low average firing rates. These patterns seem to persist even when highly rhythmic signals are recorded by local field potential electrodes or other methods that quantify the summed behavior of a local population. Models of the 30–80 Hz gamma rhythm in which network oscillations arise through ‘stochastic synchrony’ capture the variability observed in the spike output of single cells while preserving network-level organization. We extend these results by constructing model networks constrained by experimental measurements and using them to probe the effect of biophysical parameters on network-level activity. We find in simulations that gamma-frequency oscillations are enabled by a high level of incoherent synaptic conductance input, similar to the barrage of noisy synaptic input that cortical neurons have been shown to receive in vivo. This incoherent synaptic input increases the emergent network frequency by shortening the time scale of the membrane in excitatory neurons and by reducing the temporal separation between excitation and inhibition due to decreased spike latency in inhibitory neurons. These mechanisms are demonstrated in simulations and in vitro current-clamp and dynamic-clamp experiments. Simulation results further indicate that the membrane potential noise amplitude has a large impact on network frequency and that the balance between excitatory and inhibitory currents controls network stability and sensitivity to external inputs.
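
    One of the stated mechanisms, the shortening of the membrane time scale by incoherent synaptic conductance, follows directly from the effective time constant τ_eff = C_m / (g_L + g_e + g_i). The snippet below simply evaluates this relation for a few background conductance levels; the numbers are illustrative placeholders, not the values measured in the paper.

```python
# Effective membrane time constant under a background conductance barrage:
#   tau_eff = C_m / (g_L + g_e + g_i)
# All values below are illustrative placeholders, not measurements.
C_m = 0.2      # membrane capacitance, nF
g_L = 10.0     # leak conductance, nS  (at rest: tau_m = C_m / g_L = 20 ms)

for scale in (0.0, 1.0, 3.0):                    # strength of the synaptic barrage
    g_e, g_i = 5.0 * scale, 20.0 * scale         # excitatory / inhibitory, nS
    tau_eff_ms = C_m / (g_L + g_e + g_i) * 1e3   # nF / nS = s, converted to ms
    print(f"barrage x{scale:.0f}: total conductance = {g_L + g_e + g_i:5.1f} nS, "
          f"tau_eff = {tau_eff_ms:4.1f} ms")
```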

    Information Encoding by Individual Neurons and Groups of Neurons in the Primary Visual Cortex

    How is information about visual stimuli encoded into the responses of neurons in the cerebral cortex? In this thesis, I describe the analysis of data recorded simultaneously from groups of up to eight nearby neurons in the primary visual cortices of anesthetized macaque monkeys. The goal is to examine the degree to which visual information is encoded into the times of action potentials in those responses (as opposed to the overall rate), and also into the identity of the neuron that fires each action potential (as opposed to the average activity across a group of nearby neurons). The data are examined with techniques modified from systems analysis, statistics, and information theory. The results are compared with expectations from simple statistical models of action-potential firing and from models that are more physiologically realistic. The major findings are: (1) that cortical responses are not renewal processes with time-varying firing rates, which means that information can indeed be encoded in the detailed timing of action potentials; (2) that these neurons encode the contrast of visual stimuli primarily into the time difference between stimulus and response onset, which is known as the latency; (3) that this so-called temporal coding serves as a mechanism by which the brain might discriminate among stimuli that evoke similar firing rates; (4) that action potentials preceded by interspike intervals of different durations can encode different features of a stimulus; (5) that the rate of overall information transmission can depend on the type of stimulus in a manner that differs from one neuron to the next; (6) that the rate at which information is transmitted specifically about stimulus contrast depends little on stimulus type; (7) that a substantial fraction of the information rate can be confounded among multiple stimulus attributes; and, most importantly, (8) that averaging together the responses of multiple nearby neurons leads to a significant loss of information that increases as more neurons are considered. These results should serve as a basis for direct investigation into the cellular mechanisms by which the brain extracts and processes the information carried in neuronal responses.