
    Noise in neurons is message-dependent

    Neuronal responses are conspicuously variable. We focus on one particular aspect of that variability: the precision of action potential timing. We show that for common models of noisy spike generation, elementary considerations imply that such variability is a function of the input, and can be made arbitrarily large or small by a suitable choice of inputs. Our considerations are expected to extend to virtually any mechanism of spike generation, and we illustrate them with data from the visual pathway. Thus, a simplification usually made in the application of information theory to neural processing is violated: noise is not independent of the message. However, we also show the existence of error-correcting topologies, which can achieve better timing reliability than their components. Comment: 6 pages, 6 figures. Proceedings of the National Academy of Sciences (in press).
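The message-dependence of timing noise can be illustrated with a minimal noisy leaky-integrator sketch (all parameters here are invented for illustration, not taken from the paper): with the same noise source, spike timing is precise when the voltage approaches threshold steeply and imprecise when it approaches slowly.

```python
import numpy as np

def first_spike_times(stim, n_trials=200, dt=1e-4, tau=0.02,
                      thresh=1.0, noise_sd=0.2, seed=0):
    """First threshold-crossing times of a noisy leaky integrator driven by stim."""
    rng = np.random.default_rng(seed)
    times = []
    for _ in range(n_trials):
        v = 0.0
        for i, s in enumerate(stim):
            v += dt * (-v / tau + s) + noise_sd * np.sqrt(dt) * rng.standard_normal()
            if v >= thresh:
                times.append(i * dt)
                break
    return np.array(times)

t = np.arange(0.0, 0.2, 1e-4)
step = np.where(t > 0.05, 400.0, 0.0)  # abrupt step: steep approach to threshold
ramp = 2000.0 * t                      # shallow ramp: slow approach to threshold

jitter_step = first_spike_times(step).std()
jitter_ramp = first_spike_times(ramp).std()
print(jitter_step < jitter_ramp)  # same noise source, input-dependent timing precision
```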

    Information and Discriminability as Measures of Reliability of Sensory Coding

    Response variability is a fundamental issue in neural coding because it limits all information processing. The reliability of neuronal coding is quantified by various approaches in different studies. In most cases it is largely unclear to what extent the conclusions depend on the applied reliability measure, making comparisons across studies almost impossible. We demonstrate that different reliability measures can lead to very different conclusions even when applied to the same set of data. In particular, we applied information-theoretic measures (Shannon information capacity and Kullback-Leibler divergence) as well as a discrimination measure derived from signal-detection theory to the responses of blowfly photoreceptors, a well-established model system for sensory information processing. We stimulated the photoreceptors with white-noise-modulated light intensity fluctuations of different contrasts. Surprisingly, the signal-detection approach leads to reliable discrimination of the photoreceptor response even when the response signal-to-noise ratio (SNR) is well below unity, whereas Shannon information capacity and Kullback-Leibler divergence indicate very low performance. Applying different measures can therefore lead to very different interpretations of a system's coding performance. As a consequence of their lower sensitivity compared with the signal-detection approach, the information-theoretic measures overestimate internal noise sources and underestimate the importance of photon shot noise. We stress that none of the measures used, and most likely no other measure alone, allows for an unbiased estimation of a neuron's coding properties. The applied measure therefore needs to be selected with respect to the scientific question and the analyzed neuron's functional context.
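The gap between the two families of measure can be made concrete with textbook formulas (the numbers below are illustrative, not the blowfly data): for a Gaussian channel the per-sample capacity 0.5*log2(1 + SNR) is small when SNR is below unity, yet the signal-detection sensitivity index d' for discriminating two fixed response templates grows with the number of integrated samples, so discrimination can still be nearly error-free.

```python
import math

# Illustrative numbers only: per-sample SNR well below 1.
snr = 0.25            # signal variance / noise variance
n_samples = 400       # independent samples integrated per decision

# Information view: Gaussian-channel capacity per sample looks poor.
capacity_per_sample = 0.5 * math.log2(1.0 + snr)      # ~0.16 bit

# Signal-detection view: sensitivity accumulates across samples,
# d' = sqrt(N * SNR), so the two templates are easy to tell apart.
d_prime = math.sqrt(n_samples * snr)                  # 10.0
p_correct = 0.5 * (1.0 + math.erf(d_prime / (2.0 * math.sqrt(2.0))))

print(capacity_per_sample, d_prime, p_correct)
```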

    Network adaptation improves temporal representation of naturalistic stimuli in the Drosophila eye: II. Mechanisms

    Retinal networks must adapt constantly to best present the ever-changing visual world to the brain. Here we test the hypothesis that adaptation results from different mechanisms acting at several synaptic connections within the network. In a companion paper (Part I), we showed that adaptation in the photoreceptors (R1-R6) and large monopolar cells (LMCs) of the Drosophila eye improves sensitivity to under-represented signals within seconds by enhancing both the amplitude and the frequency distribution of the LMCs' voltage responses to repeated naturalistic contrast series. In this paper, we show that such adaptation requires both the light-mediated conductance and the feedback-mediated synaptic conductance. A faulty feedforward pathway in histamine-receptor mutant flies speeds up the LMC output, mimicking extreme light adaptation. A faulty feedback pathway from L2 LMCs to photoreceptors slows down the LMC output, mimicking dark adaptation. These results underline the importance of network adaptation for efficient coding and as a mechanism for selectively regulating the size and speed of signals in neurons. We suggest that the concerted action of many different mechanisms and neural connections is responsible for adaptation to visual stimuli. Further, our results demonstrate the need for detailed circuit reconstructions, like that of the Drosophila lamina, to understand how networks process information.

    Visual Coding in Locust Photoreceptors

    Information capture by photoreceptors ultimately limits the quality of visual processing in the brain. Using conventional sharp microelectrodes, we studied how locust photoreceptors encode random (white-noise, WN) and naturalistic (1/f, NS) light patterns in vivo and how this coding changes with mean illumination and ambient temperature. We also examined the role of their plasma membrane in shaping voltage responses. We found that brightening or warming increases and accelerates voltage responses but reduces noise, enabling photoreceptors to encode more information. For WN stimuli, this was accompanied by a broadening of the linear frequency range. By contrast, with NS the signaling took place within a constant bandwidth, possibly revealing a ‘preference’ for inputs with 1/f statistics. The faster signaling was caused by acceleration of the elementary phototransduction currents (bumps) and of their latency distribution. The membrane linearly translated phototransduction currents into voltage responses without limiting the throughput of these messages. As the bumps reflected fast changes in membrane resistance, the data suggest that their shape is predominantly driven by fast changes in the light-gated conductance. On the other hand, the slower bump latency distribution is likely to represent slower enzymatic intracellular reactions. Furthermore, the Q10s of bump duration and latency distribution depended on light intensity. Altogether, this study suggests that the biochemical constraints imposed upon signaling change continuously as locust photoreceptors adapt to environmental light and temperature conditions.
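The temperature dependence mentioned above is conventionally summarized by the Q10 coefficient, Q10 = (R_warm / R_cold) ** (10 / (T_warm - T_cold)). A minimal sketch with invented example values (not the locust measurements):

```python
def q10(rate_cold, rate_warm, temp_cold, temp_warm):
    """Temperature coefficient: Q10 = (R_warm / R_cold) ** (10 / (T_warm - T_cold))."""
    return (rate_warm / rate_cold) ** (10.0 / (temp_warm - temp_cold))

# Hypothetical example: a bump that shortens from 50 ms at 20 degC to
# 25 ms at 30 degC doubles its rate over a 10-degree span, so Q10 = 2.
print(q10(1.0 / 0.050, 1.0 / 0.025, 20.0, 30.0))  # → 2.0
```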

    Second Order Dimensionality Reduction Using Minimum and Maximum Mutual Information Models

    Conventional methods used to characterize multidimensional neural feature selectivity, such as spike-triggered covariance (STC) or maximally informative dimensions (MID), are limited to Gaussian stimuli or are only able to identify a small number of features due to the curse of dimensionality. To overcome these issues, we propose two new dimensionality reduction methods that use minimum and maximum information models. These methods are information-theoretic extensions of STC that can be used with non-Gaussian stimulus distributions to find relevant linear subspaces of arbitrary dimensionality. We compare these new methods to the conventional methods in two ways: with biologically inspired simulated neurons responding to natural images and with recordings from macaque retinal and thalamic cells responding to naturalistic time-varying stimuli. With non-Gaussian stimuli, the minimum and maximum information methods significantly outperform STC in all cases, whereas MID performs best in the regime of low-dimensional feature spaces.
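As background for the comparison, the baseline STC method itself fits in a few lines: with Gaussian white-noise stimuli, the eigenvectors of the change in stimulus covariance conditioned on a spike recover variance-raising features. This toy simulation (filter, spike probability, and dimensions are all made up for illustration) shows the idea:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, dim = 50_000, 20

# Hypothetical ground-truth filter; the simulated neuron fires more when
# the stimulus projection onto it is large in either direction.
w = np.zeros(dim)
w[5:10] = 1.0
w /= np.linalg.norm(w)

stim = rng.standard_normal((n_samples, dim))        # Gaussian white noise
proj = stim @ w
spikes = rng.random(n_samples) < np.clip(0.05 * proj**2, 0.0, 1.0)

# STC: eigendecompose the change in stimulus covariance given a spike.
delta_cov = np.cov(stim[spikes].T) - np.cov(stim.T)
eigval, eigvec = np.linalg.eigh(delta_cov)
recovered = eigvec[:, np.argmax(np.abs(eigval))]    # most perturbed axis

print(abs(recovered @ w))  # close to 1: the filter is recovered
```

Because the spike probability here is symmetric in the projection, the spike-triggered average would be near zero; the covariance change is what carries the feature.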

    Effect of sampling effort and sampling frequency on the composition of the planktonic crustacean assemblage: a case study of the river Danube

    Although numerous studies have focused on the seasonal dynamics of riverine zooplankton, little is known about its short-term variation. In order to examine the effects of sampling frequency and sampling effort, microcrustacean samples were collected at daily intervals between 13 June and 21 July 2007 in a parapotamal side arm of the river Danube, Hungary. Samples were also taken at biweekly intervals from November 2006 to May 2008. After presenting the community dynamics, the effect of sampling effort was evaluated with two different methods; the minimal sample size was also estimated. We introduced a single index (potential dynamic information loss) to determine the potential loss of information when sampling frequency is reduced. The formula was calculated for the total abundance, the densities of the dominant taxa, the adult/larva ratios of copepods, and two different diversity measures. Results suggest that abundances may experience notable fluctuations even within one week, as do diversities and adult/larva ratios.
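The paper's potential-dynamic-information-loss index is not reproduced here, but the underlying point, that sparser sampling hides short-term fluctuations, can be illustrated with a toy abundance series (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily abundance series (a multiplicative random walk),
# standing in for six weeks of daily zooplankton counts.
days = 42
daily = 100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.3, days)))

weekly = daily[::7]  # keep every seventh sample, as with weekly sampling

daily_range = daily.max() / daily.min()
weekly_range = weekly.max() / weekly.min()
print(weekly_range <= daily_range)  # sparser sampling can only shrink the observed range
```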

    Modeling convergent ON and OFF pathways in the early visual system

    For understanding the computation and function of single neurons in sensory systems, one needs to investigate how sensory stimuli are related to a neuron’s response and which biological mechanisms underlie this relationship. Mathematical models of the stimulus–response relationship have proved very useful in approaching these issues in a systematic, quantitative way. A starting point for many such analyses has been provided by phenomenological “linear–nonlinear” (LN) models, which comprise a linear filter followed by a static nonlinear transformation. The linear filter is often associated with the neuron’s receptive field. However, the structure of the receptive field is generally a result of inputs from many presynaptic neurons, which may form parallel signal processing pathways. In the retina, for example, certain ganglion cells receive excitatory inputs from ON-type as well as OFF-type bipolar cells. Recent experiments have shown that the convergence of these pathways leads to intriguing response characteristics that cannot be captured by a single linear filter. One approach to adjust the LN model to the biological circuit structure is to use multiple parallel filters that capture ON and OFF bipolar inputs. Here, we review these new developments in modeling neuronal responses in the early visual system and provide details about one particular technique for obtaining the required sets of parallel filters from experimental data.
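The parallel-filter idea can be sketched as a toy model (the filters and rectifying nonlinearity below are invented for illustration, not a fitted model from the review): each pathway filters the stimulus and half-wave rectifies it before the two signals are summed at the downstream cell.

```python
import numpy as np

def two_pathway_ln(stim, f_on, f_off):
    """Toy ON/OFF convergence model: filter, rectify, then sum the pathways."""
    on = np.maximum(np.convolve(stim, f_on, mode="valid"), 0.0)
    off = np.maximum(np.convolve(stim, f_off, mode="valid"), 0.0)
    return on + off

rng = np.random.default_rng(0)
stim = rng.standard_normal(1000)
f_on = np.array([0.2, 0.5, 0.3])  # hypothetical ON filter
f_off = -f_on                     # OFF pathway: sign-inverted copy

response = two_pathway_ln(stim, f_on, f_off)

# With mirror-image pathways the summed output equals |filtered stimulus|,
# an even function of the filtered stimulus; its spike-triggered average
# vanishes, so a single linear-filter estimate misses the structure.
print(np.allclose(response, np.abs(np.convolve(stim, f_on, mode="valid"))))
```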

    Information Transmission in Cercal Giant Interneurons Is Unaffected by Axonal Conduction Noise

    What are the fundamental constraints on the precision and accuracy with which nervous systems can process information? One constraint must reflect the intrinsic “noisiness” of the mechanisms that transmit information between nerve cells. Most neurons transmit information through the probabilistic generation and propagation of spikes along axons, and recent modeling studies suggest that noise from spike propagation might pose a significant constraint on the rate at which information could be transmitted between neurons. However, the magnitude and functional significance of this noise source in actual cells remains poorly understood. We measured variability in conduction time along the axons of identified neurons in the cercal sensory system of the cricket Acheta domesticus, and used information theory to calculate the effects of this variability on sensory coding. We found that the variability in spike propagation speed is not large enough to constrain the accuracy of neural encoding in this system.
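The logic of the conclusion can be sketched numerically (the timing values below are invented, not the cricket measurements): when conduction jitter is small relative to the timing noise already present at spike initiation, adding it barely changes how well two stimuli can be discriminated from spike times.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Hypothetical spike times: two stimuli separated by 1 ms at the
# spike-initiation zone, with 0.3 ms of encoding noise; axonal
# conduction adds a much smaller jitter on top.
t_a = rng.normal(10.0, 0.3, n)   # ms, responses to stimulus A
t_b = rng.normal(11.0, 0.3, n)   # ms, responses to stimulus B
conduction_jitter = 0.05         # ms

t_a_out = t_a + rng.normal(0.0, conduction_jitter, n)
t_b_out = t_b + rng.normal(0.0, conduction_jitter, n)

def d_prime(x, y):
    """Discriminability index between two spike-time distributions."""
    return abs(x.mean() - y.mean()) / np.sqrt(0.5 * (x.var() + y.var()))

# Discriminability is essentially unchanged after propagation.
print(d_prime(t_a, t_b), d_prime(t_a_out, t_b_out))
```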