
    Detecting and Estimating Signals in Noisy Cable Structures, II: Information Theoretical Analysis

    This is the second in a series of articles that seek to recast classical single-neuron biophysics in information-theoretical terms. Classical cable theory focuses on analyzing the voltage or current attenuation of a synaptic signal as it propagates from its dendritic input location to the spike initiation zone. On the other hand, we are interested in analyzing the amount of information lost about the signal in this process due to the presence of various noise sources distributed throughout the neuronal membrane. We use a stochastic version of the linear one-dimensional cable equation to derive closed-form expressions for the second-order moments of the fluctuations of the membrane potential associated with different membrane current noise sources: thermal noise, noise due to the random opening and closing of sodium and potassium channels, and noise due to the presence of “spontaneous” synaptic input. We consider two different scenarios. In the signal estimation paradigm, the time course of the membrane potential at a location on the cable is used to reconstruct the detailed time course of a random, band-limited current injected some distance away. Estimation performance is characterized in terms of the coding fraction and the mutual information. In the signal detection paradigm, the membrane potential is used to determine whether a distant synaptic event occurred within a given observation interval. In light of our analytical results, we speculate that the length of weakly active apical dendrites might be limited by the information loss due to the accumulated noise between distal synaptic input sites and the soma and that the presence of dendritic nonlinearities probably serves to increase dendritic information transfer.
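
    As a toy illustration of the estimation paradigm above, the sketch below computes the coding fraction and the Gaussian-channel mutual information for a memoryless signal-plus-noise model; the signal and noise scales are assumed for illustration and are not the paper's cable parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) signal and noise scales -- not the paper's parameters.
n = 100_000
sigma_s, sigma_n = 1.0, 0.5
s = rng.normal(0.0, sigma_s, n)          # injected "current" signal
v = s + rng.normal(0.0, sigma_n, n)      # noisy "membrane potential" observation

# Optimal linear (MMSE) estimate for this memoryless Gaussian channel
g = sigma_s**2 / (sigma_s**2 + sigma_n**2)
s_hat = g * v

# Coding fraction: 1 - (rms reconstruction error) / (signal std)
eps = np.sqrt(np.mean((s - s_hat) ** 2))
coding_fraction = 1.0 - eps / sigma_s

# Mutual information in bits per sample for the Gaussian channel
info_bits = 0.5 * np.log2(1.0 + sigma_s**2 / sigma_n**2)
```

    In the paper these quantities are computed per frequency band along the cable; the scalar version here only shows how the two measures relate to the reconstruction error.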

    Detecting and Estimating Signals over Noisy and Unreliable Synapses: Information-Theoretic Analysis

    The temporal precision with which neurons respond to synaptic inputs has a direct bearing on the nature of the neural code. A characterization of the neuronal noise sources associated with different sub-cellular components (synapse, dendrite, soma, axon, and so on) is needed to understand the relationship between noise and information transfer. Here we study the effect of the unreliable, probabilistic nature of synaptic transmission on information transfer in the absence of interaction among presynaptic inputs. We derive theoretical lower bounds on the capacity of a simple model of a cortical synapse under two different paradigms. In signal estimation, the signal is assumed to be encoded in the mean firing rate of the presynaptic neuron, and the objective is to estimate the continuous input signal from the postsynaptic voltage. In signal detection, the input is binary, and the presence or absence of a presynaptic action potential is to be detected from the postsynaptic voltage. The efficacy of information transfer in synaptic transmission is characterized by deriving optimal strategies under these two paradigms. On the basis of parameter values derived from neocortex, we find that single cortical synapses cannot transmit information reliably, but redundancy obtained using a small number of multiple synapses leads to a significant improvement in the information capacity of synaptic transmission.
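
    The benefit of redundancy in the detection paradigm can be sketched with elementary probability; the release probability below is an assumed value in the neocortical range, not a fitted parameter:

```python
# Redundancy across unreliable synapses: if each of n independent synapses
# releases with probability p, a presynaptic spike is missed only when all fail.
p_release = 0.35                 # assumed per-synapse release probability
p_miss = {n: (1 - p_release) ** n for n in (1, 2, 4, 8)}
```

    Even a handful of redundant contacts drives the miss probability from well over a half down to a few percent, which is the qualitative effect the capacity analysis quantifies.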

    Detecting and Estimating Signals in Noisy Cable Structures, I: Neuronal Noise Sources

    In recent theoretical approaches addressing the problem of neural coding, tools from statistical estimation and information theory have been applied to quantify the ability of neurons to transmit information through their spike outputs. These techniques, though fairly general, ignore the specific nature of neuronal processing in terms of its known biophysical properties. However, a systematic study of processing at various stages in a biophysically faithful model of a single neuron can identify the role of each stage in information transfer. Toward this end, we carry out a theoretical analysis of the information loss of a synaptic signal propagating along a linear, one-dimensional, weakly active cable due to neuronal noise sources along the way, using both a signal reconstruction and a signal detection paradigm. Here we begin such an analysis by quantitatively characterizing three sources of membrane noise: (1) thermal noise due to the passive membrane resistance, (2) noise due to stochastic openings and closings of voltage-gated membrane channels (Na^+ and K^+), and (3) noise due to random, background synaptic activity. Using analytical expressions for the power spectral densities of these noise sources, we compare their magnitudes in the case of a patch of membrane from a cortical pyramidal cell and explore their dependence on different biophysical parameters.
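
    For the first of these sources, the thermal (Johnson) voltage noise of a passive patch follows directly from the Nyquist formula S_V = 4kTR; the resistance and bandwidth below are assumed illustrative values, not the paper's pyramidal-cell parameters:

```python
# Johnson (thermal) noise of the passive membrane resistance:
# a minimal numerical check using assumed patch parameters.
k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 310.0                   # physiological temperature, K
R = 1e9                     # input resistance of a small patch, ohms (assumed)
B = 1e3                     # observation bandwidth, Hz (assumed)
S_V = 4 * k_B * T * R       # one-sided voltage PSD, V^2/Hz
v_rms = (S_V * B) ** 0.5    # rms thermal voltage noise over the band
```

    For a gigaohm patch over a kilohertz bandwidth this gives an rms noise on the order of a tenth of a millivolt, a useful yardstick against which the channel and synaptic noise sources can be compared.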

    The impact of spike timing variability on the signal-encoding performance of neural spiking models

    It remains unclear whether the variability of neuronal spike trains in vivo arises due to biological noise sources or represents highly precise encoding of temporally varying synaptic input signals. Determining the variability of spike timing can provide fundamental insights into the nature of strategies used in the brain to represent and transmit information in the form of discrete spike trains. In this study, we employ a signal estimation paradigm to determine how variability in spike timing affects encoding of random time-varying signals. We assess this for two types of spiking models: an integrate-and-fire model with random threshold and a more biophysically realistic stochastic ion channel model. Using the coding fraction and mutual information as information-theoretic measures, we quantify the efficacy of optimal linear decoding of random inputs from the model outputs and study the relationship between efficacy and variability in the output spike train. Our findings suggest that variability does not necessarily hinder signal decoding for the biophysically plausible encoders examined and that the functional role of spiking variability depends intimately on the nature of the encoder and the signal processing task; variability can either enhance or impede decoding performance.
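
    A minimal sketch of the first encoder type, under assumed illustrative parameters: a leaky integrate-and-fire unit whose threshold is redrawn at random after each spike, which is one simple way to inject spike-timing variability:

```python
import numpy as np

rng = np.random.default_rng(1)

# Integrate-and-fire encoder with a threshold redrawn after every spike.
# All parameter values are illustrative assumptions, not fitted values.
dt, tau, steps = 1e-4, 20e-3, 50_000         # 5 s of simulated time
I = 1.2 + 0.3 * rng.standard_normal(steps)   # noisy time-varying drive
v, theta = 0.0, rng.normal(1.0, 0.1)
spike_times = []
for t in range(steps):
    v += dt / tau * (I[t] - v)               # leaky integration
    if v >= theta:
        spike_times.append(t * dt)
        v = 0.0
        theta = rng.normal(1.0, 0.1)         # random threshold: timing jitter
mean_rate = len(spike_times) / (steps * dt)
```

    In the study itself the output spike train would then be decoded with an optimal linear filter and scored by coding fraction; this block only shows the encoder.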

    Journal of Near-Death Studies

    Here we derive measures quantifying the information loss of a synaptic signal due to the presence of neuronal noise sources, as it electrotonically propagates along a weakly-active dendrite. We model the dendrite as an infinite linear cable, with noise sources distributed along its length. The noise sources we consider are thermal noise, channel noise arising from the stochastic nature of voltage-dependent ionic channels (K^+ and Na^+) and synaptic noise due to spontaneous background activity. We assess the efficacy of information transfer using a signal detection paradigm where the objective is to detect the presence/absence of a presynaptic spike from the postsynaptic membrane voltage. This allows us to analytically assess the role of each of these noise sources in information transfer. For our choice of parameters, we find that the synaptic noise is the dominant noise source, which limits the maximum length over which information can be reliably transmitted.

    Channel noise in excitable neuronal membranes

    Stochastic fluctuations of voltage-gated ion channels generate current and voltage noise in neuronal membranes. This noise may be a critical determinant of the efficacy of information processing within neural systems. Using Monte Carlo simulations, we carry out a systematic investigation of the relationship between channel kinetics and the resulting membrane voltage noise using a stochastic Markov version of the Mainen-Sejnowski model of dendritic excitability in cortical neurons. Our simulations show that kinetic parameters which lead to an increase in membrane excitability (increasing channel densities, decreasing temperature) also lead to an increase in the magnitude of the sub-threshold voltage noise. Noise also increases as the membrane is depolarized from rest towards threshold. This suggests that channel fluctuations may interfere with a neuron’s ability to function as an integrator of its synaptic inputs and may limit the reliability and precision of neural information processing.
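
    The link between channel kinetics and open-count fluctuations can be sketched with a Monte Carlo simulation of a population of two-state (closed/open) channels; the rates and channel count below are assumed round numbers, not the Mainen-Sejnowski kinetics:

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo gating of N independent two-state channels (closed <-> open).
# Rates and channel count are illustrative assumptions.
N, alpha, beta = 1000, 50.0, 150.0   # opening/closing rates, 1/s
dt, steps = 1e-4, 100_000            # 10 s of simulated time
p_inf = alpha / (alpha + beta)       # steady-state open probability
n_open = int(N * p_inf)
trace = np.empty(steps)
for t in range(steps):
    opened = rng.binomial(N - n_open, alpha * dt)   # closed channels opening
    closed = rng.binomial(n_open, beta * dt)        # open channels closing
    n_open += opened - closed
    trace[t] = n_open

var_pred = N * p_inf * (1 - p_inf)   # binomial variance of the open count
var_sim = trace[steps // 10:].var()  # sample variance after a burn-in
```

    The simulated open-count variance should approach the binomial prediction Np(1-p); converting the open count into a conductance and voltage is the additional step the full stochastic membrane model performs.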

    Variability and coding efficiency of noisy neural spike encoders

    Encoding synaptic inputs as a train of action potentials is a fundamental function of nerve cells. Although spike trains recorded in vivo have been shown to be highly variable, it is unclear whether variability in spike timing represents faithful encoding of temporally varying synaptic inputs or noise inherent in the spike encoding mechanism. It has been reported that spike timing variability is more pronounced for constant, unvarying inputs than for inputs with rich temporal structure. This could have significant implications for the nature of neural coding, particularly if precise timing of spikes and temporal synchrony between neurons is used to represent information in the nervous system. To study the potential functional role of spike timing variability, we estimate the fraction of spike timing variability which conveys information about the input for two types of noisy spike encoders: an integrate-and-fire model with randomly chosen thresholds and a model of a patch of neuronal membrane containing stochastic Na^+ and K^+ channels obeying Hodgkin–Huxley kinetics. The quality of signal encoding is assessed by reconstructing the input stimuli from the output spike trains using optimal linear mean square estimation. A comparison of the estimation performance of noisy neuronal models of spike generation enables us to assess the impact of neuronal noise on the efficacy of neural coding. The results for both models suggest that spike timing variability reduces the ability of spike trains to encode rapid time-varying stimuli. Moreover, contrary to expectations based on earlier studies, we find that the noisy spike encoding models encode slowly varying stimuli more effectively than rapidly varying ones.

    Subthreshold Voltage Noise Due to Channel Fluctuations in Active Neuronal Membranes

    Voltage-gated ion channels in neuronal membranes fluctuate randomly between different conformational states due to thermal agitation. Fluctuations between conducting and nonconducting states give rise to noisy membrane currents and subthreshold voltage fluctuations and may contribute to variability in spike timing. Here we study subthreshold voltage fluctuations due to active voltage-gated Na+ and K+ channels as predicted by two commonly used kinetic schemes: the Mainen et al. (1995) (MJHS) kinetic scheme, which has been used to model dendritic channels in cortical neurons, and the classical Hodgkin-Huxley (1952) (HH) kinetic scheme for the squid giant axon. We compute the magnitudes, amplitude distributions, and power spectral densities of the voltage noise in isopotential membrane patches predicted by these kinetic schemes. For both schemes, noise magnitudes increase rapidly with depolarization from rest. Noise is larger for smaller patch areas but is smaller for increased model temperatures. We contrast the results from Monte Carlo simulations of the stochastic nonlinear kinetic schemes with analytical, closed-form expressions derived using passive and quasi-active linear approximations to the kinetic schemes. For all subthreshold voltage ranges, the quasi-active linearized approximation is accurate within 8% and may thus be used in large-scale simulations of realistic neuronal geometries.
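
    The simpler of the two linearizations, the passive approximation, can be sketched as an RC filter acting on white channel-current noise, which yields a Lorentzian voltage PSD; all parameter values below are assumptions for illustration, not the MJHS or HH fits:

```python
import numpy as np

# Passive (RC) linearization of a patch: white channel-current noise filtered
# through the membrane impedance gives a Lorentzian voltage PSD.
# All parameter values are illustrative assumptions.
G = 10e-9                    # patch conductance, S
C = 150e-12                  # patch capacitance, F
S_I = 1e-27                  # white current-noise PSD, A^2/Hz
tau = C / G                  # membrane time constant, s
f = np.logspace(0, 4, 200)   # 1 Hz .. 10 kHz
S_V = S_I / (G**2 * (1.0 + (2.0 * np.pi * f * tau) ** 2))   # V^2/Hz

# Total rms voltage noise: integral of the one-sided Lorentzian is S_I/(4 G^2 tau)
sigma_V = np.sqrt(S_I / (4.0 * G**2 * tau))
```

    The quasi-active approximation replaces the fixed conductance G with a frequency-dependent linearized impedance, which is what buys the reported few-percent accuracy across the subthreshold range.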

    Transient Responses to Rapid Changes in Mean and Variance in Spiking Models

    The mean input and variance of the total synaptic input to a neuron can vary independently, suggesting two distinct information channels. Here we examine the impact of rapidly varying signals, delivered via these two information conduits, on the temporal dynamics of neuronal firing rate responses. We examine the responses of model neurons to step functions in either the mean or the variance of the input current. Our results show that the temporal dynamics governing response onset depend on the choice of model. Specifically, the existence of a hard threshold introduces an instantaneous component into the response onset of a leaky-integrate-and-fire model that is not present in other models studied here. Other response features, for example a decaying oscillatory approach to a new steady-state firing rate, appear to be more universal among neuronal models. The decay time constant of this approach is a power-law function of noise magnitude over a wide range of input parameters. Understanding how specific model properties underlie these response features is important for understanding how neurons will respond to rapidly varying signals, as the temporal dynamics of the response onset and the decay to the new steady state determine what range of signal frequencies a population of neurons can respond to and faithfully encode.
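
    The step-in-mean experiment can be sketched by simulating a population of leaky integrate-and-fire neurons and reading off the instantaneous population firing rate before and after the step; the parameters below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Population of LIF neurons driven by noisy input whose mean steps up at t_step.
# All parameter values are illustrative assumptions.
n_neur, dt, tau, theta = 2000, 1e-4, 20e-3, 1.0
steps, t_step = 4000, 2000
mu = np.where(np.arange(steps) < t_step, 0.8, 1.5)   # step in the input mean
sigma = 0.5                                          # input noise std (fixed)
v = np.zeros(n_neur)
rate = np.empty(steps)
for t in range(steps):
    noise = sigma * np.sqrt(dt / tau) * rng.standard_normal(n_neur)
    v += dt / tau * (mu[t] - v) + noise              # leaky integration + noise
    fired = v >= theta
    rate[t] = fired.mean() / dt                      # population rate, Hz
    v[fired] = 0.0                                   # reset after a spike

rate_before = rate[t_step - 500:t_step].mean()
rate_after = rate[-500:].mean()
```

    Plotting `rate` around `t_step` exposes the feature discussed above: the hard threshold lets part of the population cross immediately when the mean jumps, giving the instantaneous onset component specific to the LIF model.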

    Information-theoretic analysis of neuronal communication

    One of the most fundamental functions of brains is to process information. Whether we are engaged in tasks like reading a book, listening to our favorite music station on the radio, smelling a flower in bloom or relishing our favorite gourmet cuisine, we invariably employ our brains to process the information received through our senses and create a perception of the world around us. The physical signals incident on our sensory organs, either in the form of photon fluxes, acoustic vibrations, or plumes of chemical concentrations, are transduced, represented and processed as electrical signals within our brains. One of the essential inquiries in neuroscience is the nature of this representation of information in the brain. This is often referred to as the "neural coding" problem, which has been, and continues to be, the object of much theoretical and experimental effort. In most theoretical approaches that address the problem, nerve cells are characterized empirically by a collection of their input-output responses. Knowledge of the constraints imposed on information processing by the biophysics of the underlying biological hardware is generally ignored. This thesis reports the outcome of our efforts to combine techniques from stochastic processes, information theory and single-neuron biophysics to unravel the neural coding problem. We believe that a systematic reductionist analysis which takes into account the extant noise due to biological processes specific to neuronal processing will provide fundamental insights overlooked in earlier approaches. We analytically characterize the sources of biological noise associated with different stages in the neuronal information pathway, namely the synapse, the dendritic tree and the spike-initiation zone, and employ information-theoretic measures to compute the ability of these components to transmit information in specific signal processing tasks. For analytical tractability, we demonstrate our results using abstract and simplified mathematical models. However, our approach can be readily applied to realistic and complicated descriptions of single neurons to provide a greater understanding of the role of noise in neuronal communication.