
    Optimal local estimates of visual motion in a natural environment

    Many organisms, from flies to humans, use visual signals to estimate their motion through the world. To explore the motion estimation problem, we have constructed a camera/gyroscope system that allows us to sample, at high temporal resolution, the joint distribution of input images and rotational motions during a long walk in the woods. From these data we construct the optimal estimator of velocity based on spatial and temporal derivatives of image intensity in small patches of the visual world. Over the bulk of the naturally occurring dynamic range, the optimal estimator exhibits the same systematic errors seen in neural and behavioral responses, including the confounding of velocity and contrast. These results suggest that apparent errors of sensory processing may reflect an optimal response to the physical signals in the environment.
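    The estimator described above is built from spatial and temporal derivatives of image intensity in small patches. As a rough illustration only (the paper's estimator is derived from natural-scene statistics, not this fixed formula), a classic least-squares gradient-based velocity estimate for a 1-D patch can be sketched as:

    ```python
    import numpy as np

    def local_motion_estimate(frame0, frame1, eps=1e-6):
        """Least-squares velocity (pixels/frame) from spatial and temporal
        intensity derivatives, solving Ix * v + It = 0 over the patch."""
        Ix = np.gradient(frame0, axis=1)   # spatial derivative
        It = frame1 - frame0               # temporal derivative
        return -np.sum(Ix * It) / (np.sum(Ix**2) + eps)

    # A sinusoidal pattern shifted by one sample per frame yields v close to 1
    x = np.linspace(0, 2 * np.pi, 64)
    f0 = np.sin(np.outer(np.ones(8), x))
    f1 = np.sin(np.outer(np.ones(8), x - x[1]))  # shifted by one sample
    v = local_motion_estimate(f0, f1)
    ```

    Note that this simple form already exhibits the contrast dependence the abstract mentions: scaling the image contrast changes both numerator and denominator, and in the presence of noise the estimate is pulled toward zero at low contrast.
    
    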

    Universal Statistical Behavior of Neural Spike Trains

    We construct a model that predicts the statistical properties of spike trains generated by a sensory neuron. The model describes the combined effects of the neuron's intrinsic properties, the surrounding noise, and the external driving stimulus. We show that the spike trains exhibit universal statistical behavior over short times, modulated by a strongly stimulus-dependent behavior over long times. These predictions are confirmed in experiments on H1, a motion-sensitive neuron in the fly visual system.

    Entropy and information in neural spike trains: Progress on the sampling problem

    The major problem in information theoretic analysis of neural responses and other biological data is the reliable estimation of entropy-like quantities from small samples. We apply a recently introduced Bayesian entropy estimator to synthetic data inspired by experiments, and to real experimental spike trains. The estimator performs admirably even very deep in the undersampled regime, where other techniques fail. This opens new possibilities for the information theoretic analysis of experiments, and may be of general interest as an example of learning from limited data.
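    For context on the sampling problem, the naive "plug-in" estimator sketched below (an illustration of the bias, not the Bayesian estimator the paper applies) is systematically biased downward when the number of samples is small compared to the number of possible responses:

    ```python
    import numpy as np

    def plugin_entropy(samples, n_bins):
        """Naive 'plug-in' entropy estimate in bits from observed frequencies."""
        counts = np.bincount(samples, minlength=n_bins)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(0)
    n_bins = 1024                             # uniform source: true entropy = 10 bits
    small = rng.integers(0, n_bins, 100)      # deep undersampling: N << n_bins
    large = rng.integers(0, n_bins, 100_000)  # well-sampled regime

    h_small = plugin_entropy(small, n_bins)   # badly biased below 10 bits
    h_large = plugin_entropy(large, n_bins)   # close to the true 10 bits
    ```

    With 100 samples spread over 1024 bins, most responses are seen at most once, so the plug-in estimate cannot exceed roughly log2(100) bits no matter what the true entropy is; this is the undersampled regime where more careful estimators are needed.
    
    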

    The metabolic cost of neural information.

    Neural processing is metabolically expensive. The human brain accounts for 20% of resting oxygen consumption, and half of this energy drives the pumps that exchange sodium and potassium ions across cell membranes [1]. Because these pumps maintain the ionic concentration gradients that power electrical signaling by neurons, 10% of a resting human's energy is used to keep the brain's batteries charged. In the rabbit retina, second messenger systems, synapses and ion pumps make large contributions to a high metabolic rate [2,3]. Therefore, when the basic cellular mechanisms for signaling and information processing are concentrated in brains and sense organs, the metabolic demands are considerable. These metabolic demands could be large enough to influence the design, function and evolution of brains and behavior. Comparative studies suggest that the metabolic expense of maintaining the brain throughout life [4], or the demands made by the developing brain on the maternal energy budget [5], have limited the sizes of primate brains. The human brain's susceptibility to anoxia and its precise local regulation of cerebral blood flow also suggest that the supply of energy limits neural function. If metabolic energy is limiting, then neurons, neural codes and neural circuits will have evolved to reduce metabolic demands. Two elegant theoretical analyses show that metabolic efficiency can profoundly influence neural coding: the minimization of metabolic cost promotes the distribution of signals over a population of weakly active cells. Although metabolic energy is clearly important in determining neural function, we lack basic data on the quantitative relationships between energy and information in nervous systems. Precisely how much energy must a neuron consume to do a given amount of useful work, transmitting and processing information? How does energy consumption scale with the quantity of information that neurons handle?
    We can now address these fundamental questions because we have recently measured the quantities of information transmitted by photoreceptors and interneurons of the intact blowfly retina [8] and can use biophysical data to estimate the amount of energy required to transmit these signals. We find that information is expensive and that, for a given communication channel, the cost per bit increases with bit rate. Thus metabolic cost can have a profound influence on the structure, function and evolution of cell signaling systems, neurons, neural circuits and neural codes. Information transmission rate, measured in bits per second, is a useful measure of the neural work done by photoreceptors and interneurons of the fly compound eye, for the following reasons. Increasing the number of bits transmitted per cell improves the retinal image by increasing the number of gray levels coded per second per pixel. A number of studies conclusively demonstrate that the large monopolar cell (LMC), the second-order retinal neuron, is optimized to maximize bit rate [9]. We have recently measured the rates at which retinal cells transmit information under daylight conditions [8]. Cells were driven by randomly modulating the light intensity of an LED. We estimated I = 1000 bits per second for the fully light-adapted cell, but we expect lower rates under natural conditions. We derive experimentally based estimates of the energy used by neural mechanisms to code known quantities of information. Biophysical measurements from cells in the blowfly retina yield estimates of the ATP required to generate graded (analog) electrical signals that transmit known amounts of information. Energy consumption is several orders of magnitude greater than the thermodynamic minimum: it costs 10^4 ATP molecules to transmit a bit at a chemical synapse, and 10^6-10^7 ATP for graded signals in an interneuron or a photoreceptor, or for spike coding.
    Therefore, in noise-limited signaling systems, a weak pathway of low capacity transmits information more economically, which promotes the distribution of information among multiple pathways.
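    The figures quoted above support a back-of-envelope comparison (order-of-magnitude arithmetic only, using the numbers as stated in the text):

    ```python
    # Per-bit costs quoted in the text (orders of magnitude, not precise values):
    # ~10^4 ATP/bit at a chemical synapse, ~10^6-10^7 ATP/bit for graded
    # signals in an interneuron or photoreceptor, or for spike coding.
    atp_per_bit_synapse = 1e4
    atp_per_bit_graded = 1e6       # lower end of the quoted range
    bit_rate = 1000                # ~1000 bits/s for a fully light-adapted cell

    # ATP consumed per second of signaling at the measured bit rate
    synapse_cost = atp_per_bit_synapse * bit_rate   # ATP/s at a synapse
    graded_cost = atp_per_bit_graded * bit_rate     # ATP/s for graded signaling

    # Even at the low end, graded signaling is ~100x more expensive per bit
    ratio = graded_cost / synapse_cost
    ```

    This spread of two or more orders of magnitude per bit is what makes the economy argument bite: distributing traffic over several low-capacity pathways can move the same information for less total ATP.
    
    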