
    Neural population coding: combining insights from microscopic and mass signals

    Behavior relies on the distributed and coordinated activity of neural populations. Population activity can be measured using multi-neuron recordings and neuroimaging. Neural recordings reveal how the heterogeneity, sparseness, timing, and correlation of population activity shape information processing in local networks, whereas neuroimaging shows how long-range coupling and brain states affect local activity and perception. To obtain an integrated perspective on neural information processing, we need to combine knowledge from both levels of investigation. We review recent progress in how neural recordings, neuroimaging, and computational approaches are beginning to elucidate how interactions between local neural population activity and large-scale dynamics shape the structure and coding capacity of local information representations, make them state-dependent, and control distributed populations that collectively shape behavior.

    Timescale-invariant representation of acoustic communication signals by a bursting neuron

    Acoustic communication often involves complex sound motifs in which the relative durations of individual elements, but not their absolute durations, convey meaning. Decoding such signals requires an explicit or implicit calculation of the ratios between time intervals. Using grasshopper communication as a model, we demonstrate how this seemingly difficult computation can be solved in real time by a small set of auditory neurons. One of these cells, an ascending interneuron, generates bursts of action potentials in response to the rhythmic syllable-pause structure of grasshopper calls. Our data show that these bursts are preferentially triggered at syllable onset; the number of spikes within the burst is linearly correlated with the duration of the preceding pause. Integrating the number of spikes over a fixed time window therefore leads to a total spike count that reflects the characteristic syllable-to-pause ratio of the species while being invariant to playing back the call faster or slower. Such a timescale-invariant recognition is essential under natural conditions, because grasshoppers do not thermoregulate; the call of a sender sitting in the shade will be slower than that of a grasshopper in the sun. Our results show that timescale-invariant stimulus recognition can be implemented at the single-cell level without directly calculating the ratio between pulse and interpulse durations
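
    As a toy illustration of the readout principle described above (not the recorded interneuron; the gain and window length are hypothetical parameters): if each burst contributes a spike count proportional to the preceding pause and counts are summed over a fixed window, the total reflects the syllable-to-pause ratio but changes little under uniform time-warping of the call.

        def integrated_spike_count(syllable_ms, pause_ms, window_ms=1000.0,
                                   spikes_per_ms_pause=0.1):
            """Total spikes in a fixed window, assuming one burst per syllable onset
            whose spike count grows linearly with the preceding pause duration."""
            period = syllable_ms + pause_ms        # one syllable-pause cycle
            n_bursts = int(window_ms // period)    # bursts falling inside the window
            spikes_per_burst = spikes_per_ms_pause * pause_ms
            return n_bursts * spikes_per_burst

        # The same call played back slower or faster: the syllable-to-pause ratio is
        # unchanged, only the absolute durations scale with the (hypothetical) warp factor.
        for warp in (0.7, 1.0, 1.5):
            total = integrated_spike_count(syllable_ms=80 * warp, pause_ms=20 * warp)
            print(f"time-warp {warp:.1f}: integrated spike count = {total:.1f}")

        # A call with a different syllable-to-pause ratio gives a clearly different count.
        print("different ratio:", integrated_spike_count(syllable_ms=80, pause_ms=40))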

    Neurons with stereotyped and rapid responses provide a reference frame for relative temporal coding in primate auditory cortex

    The precise timing of spikes of cortical neurons relative to stimulus onset carries substantial sensory information. To access this information, the sensory systems would need to maintain an internal temporal reference that reflects the precise stimulus timing. Whether and how sensory systems implement such reference frames to decode time-dependent responses, however, remains debated. Studying the encoding of naturalistic sounds in primate (Macaca mulatta) auditory cortex, we investigate potential intrinsic references for decoding temporally precise information. Within the population of recorded neurons, we found one subset responding with stereotyped fast latencies that varied little across trials or stimuli, while the remaining neurons had stimulus-modulated responses with longer and variable latencies. Computational analysis demonstrated that the neurons with stereotyped short latencies constitute an effective temporal reference for relative coding. Using the response onset of a simultaneously recorded stereotyped neuron allowed decoding of most of the stimulus information carried by onset latencies and the full spike train of stimulus-modulated neurons. Computational modeling showed that a few tens of such stereotyped reference neurons suffice to recover nearly all information that would be available when decoding the same responses relative to the actual stimulus onset. These findings reveal an explicit neural signature of an intrinsic reference for decoding temporal response patterns in the auditory cortex of alert animals. Furthermore, they highlight a role for apparently unselective neurons as an early saliency signal that provides a temporal reference for extracting stimulus information from other neurons.
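
    A minimal sketch of the relative-coding idea above, using toy latencies rather than the recorded responses: a stereotyped neuron with a short, low-jitter latency serves as an internal onset reference, and stimulus identity is decoded from the latency of a stimulus-modulated neuron measured relative to that reference spike. All latency values and jitter levels are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        stim_latency = np.array([30.0, 45.0, 60.0, 75.0])   # ms after true onset (assumed)
        ref_latency = 12.0                                   # stereotyped, short (assumed)
        n_trials = 200

        def simulate_trial(stim):
            onset = rng.uniform(0.0, 500.0)                  # onset unknown to the decoder
            ref_spike = onset + ref_latency + rng.normal(0.0, 0.5)        # low jitter
            mod_spike = onset + stim_latency[stim] + rng.normal(0.0, 3.0)
            return ref_spike, mod_spike

        templates = stim_latency - ref_latency               # expected relative latencies
        correct_rel = correct_raw = 0
        for _ in range(n_trials):
            stim = rng.integers(len(stim_latency))
            ref_spike, mod_spike = simulate_trial(stim)
            # Relative code: latency referenced to the stereotyped neuron's spike.
            rel = mod_spike - ref_spike
            correct_rel += int(np.argmin(np.abs(templates - rel)) == stim)
            # Raw spike time without access to the true stimulus onset.
            correct_raw += int(np.argmin(np.abs(stim_latency - mod_spike)) == stim)

        print("accuracy with reference neuron:", correct_rel / n_trials)
        print("accuracy without an onset reference:", correct_raw / n_trials)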

    Representation of Time-Varying Stimuli by a Network Exhibiting Oscillations on a Faster Time Scale

    Sensory processing is associated with gamma frequency oscillations (30–80 Hz) in sensory cortices. This raises the question of whether gamma oscillations can be directly involved in the representation of time-varying stimuli, including stimuli whose time scale is longer than a gamma cycle. We are interested in the ability of the system to reliably distinguish different stimuli while being robust to stimulus variations such as uniform time-warp. We address this issue with a dynamical model of spiking neurons and study the response to an asymmetric sawtooth input current over a range of shape parameters. These parameters describe how fast the input current rises and falls in time. Our network consists of inhibitory and excitatory populations that are sufficient for generating oscillations in the gamma range. The oscillation period is about one-third of the stimulus duration. Embedded in this network is a subpopulation of excitatory cells that respond to the sawtooth stimulus and a subpopulation of cells that respond to an onset cue. The intrinsic gamma oscillations generate a temporally sparse code for the external stimuli. In this code, an excitatory cell may fire a single spike during a gamma cycle, depending on its tuning properties and on the temporal structure of the specific input; the identity of the stimulus is coded by the list of excitatory cells that fire during each cycle. We quantify the properties of this representation in a series of simulations and show that the sparseness of the code makes it robust to uniform warping of the time scale. We find that resetting of the oscillation phase at stimulus onset is important for a reliable representation of the stimulus and that there is a tradeoff between the resolution of the neural representation of the stimulus and robustness to time-warp.

    Author Summary: Sensory processing of time-varying stimuli, such as speech, is associated with high-frequency oscillatory cortical activity, the functional significance of which is still unknown. One possibility is that the oscillations are part of a stimulus-encoding mechanism. Here, we investigate a computational model of such a mechanism, a spiking neuronal network whose intrinsic oscillations interact with external input (waveforms simulating short speech segments in a single acoustic frequency band) to encode stimuli that extend over a time interval longer than the oscillation's period. The network implements a temporally sparse encoding, whose robustness to time warping and neuronal noise we quantify. To our knowledge, this study is the first to demonstrate that a biophysically plausible model of oscillations occurring in the processing of auditory input may generate a representation of signals that span multiple oscillation cycles.

    Funding: National Science Foundation (DMS-0211505); Burroughs Wellcome Fund; U.S. Air Force Office of Scientific Research
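
    The input drive described above can be sketched as follows (assumed amplitude, cycle length, and parameter names; this is not the authors' code): an asymmetric sawtooth current whose shape parameter sets how quickly it rises versus falls within each cycle, together with a uniformly time-warped version of the same stimulus.

        import numpy as np

        def asymmetric_sawtooth(t, period=100.0, rise_fraction=0.2, amplitude=1.0):
            """Input current at times t (ms): a linear rise over `rise_fraction` of each
            cycle followed by a linear fall over the remainder of the cycle."""
            phase = (t % period) / period
            return np.where(phase < rise_fraction,
                            amplitude * phase / rise_fraction,                  # fast rise
                            amplitude * (1.0 - phase) / (1.0 - rise_fraction))  # slow fall

        t = np.arange(0.0, 300.0, 0.1)              # ms, three cycles of the original
        i_original = asymmetric_sawtooth(t)
        i_warped = asymmetric_sawtooth(t / 1.5)     # uniformly time-warped (slower) version
        print("peak current (original, warped):", i_original.max(), i_warped.max())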

    A unified coding strategy for processing faces and voices

    Both faces and voices are rich in socially-relevant information, which humans are remarkably adept at extracting, including a person's identity, age, gender, affective state, personality, etc. Here, we review accumulating evidence from behavioral, neuropsychological, electrophysiological, and neuroimaging studies which suggest that the cognitive and neural processing mechanisms engaged by perceiving faces or voices are highly similar, despite the very different nature of their sensory input. The similarity between the two mechanisms likely facilitates the multi-modal integration of facial and vocal information during everyday social interactions. These findings emphasize a parsimonious principle of cerebral organization, where similar computational problems in different modalities are solved using similar solutions

    Pre-stimulus influences on auditory perception arising from sensory representations and decision processes

    The qualities of perception depend not only on the sensory inputs but also on the brain state before stimulus presentation. Although the collective evidence from neuroimaging studies for a relation between prestimulus state and perception is strong, the interpretation in the context of sensory computations or decision processes has remained difficult. In the auditory system, for example, previous studies have reported a wide range of effects in terms of the perceptually relevant frequency bands and state parameters (phase/power). To dissociate influences of state on earlier sensory representations and higher-level decision processes, we collected behavioral and EEG data in human participants performing two auditory discrimination tasks relying on distinct acoustic features. Using single-trial decoding, we quantified the relation between prestimulus activity, relevant sensory evidence, and choice in different task-relevant EEG components. Within auditory networks, we found that phase had no direct influence on choice, whereas power in task-specific frequency bands affected the encoding of sensory evidence. Within later-activated frontoparietal regions, theta and alpha phase had a direct influence on choice, without involving sensory evidence. These results delineate two consistent mechanisms by which prestimulus activity shapes perception. However, the timescales of the relevant neural activity depend on the specific brain regions engaged by the respective task
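
    A hedged sketch of a generic single-trial analysis in the spirit of the study above, not its actual pipeline: band-limit the prestimulus EEG, extract phase and power with the Hilbert transform, and relate them to choice with logistic regression. The frequency band, sampling rate, and placeholder data are all assumptions made for illustration.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert
        from sklearn.linear_model import LogisticRegression

        fs = 250                                       # Hz, assumed sampling rate
        rng = np.random.default_rng(1)
        n_trials = 300
        eeg = rng.normal(size=(n_trials, fs))          # placeholder 1 s prestimulus epochs
        choice = rng.integers(0, 2, size=n_trials)     # placeholder binary choices

        # Band-pass filter in an assumed task-relevant band (8-12 Hz).
        b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
        analytic = hilbert(filtfilt(b, a, eeg, axis=1), axis=1)

        # Prestimulus phase and power at the last sample before "stimulus onset".
        phase = np.angle(analytic[:, -1])
        power = np.log(np.abs(analytic[:, -1]) ** 2)

        # Relate prestimulus power and phase (as circular components) to the choice;
        # with placeholder data the accuracy stays near chance, the point is the skeleton.
        X = np.column_stack([power, np.cos(phase), np.sin(phase)])
        model = LogisticRegression().fit(X, choice)
        print("choice prediction accuracy:", model.score(X, choice))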

    Impaired Auditory Temporal Selectivity in the Inferior Colliculus of Aged Mongolian Gerbils

    Aged humans show severe difficulties in temporal auditory processing tasks (e.g., speech recognition in noise, low-frequency sound localization, gap detection). A degradation of auditory function with age is also evident in experimental animals. To investigate age-related changes in temporal processing, we compared extracellular responses to temporally variable pulse trains and human speech in the inferior colliculus of young adult (3 months) and aged (3 years) Mongolian gerbils. We observed a significant decrease of selectivity to the pulse trains in neuronal responses from aged animals. At the population level, this decrease in selectivity led to an increase in signal correlations, and therefore to a decrease in the heterogeneity of temporal receptive fields and a decreased efficiency in the encoding of speech signals. A decrease in selectivity to temporal modulations is consistent with a downregulation of the inhibitory transmitter system in aged animals. These alterations in temporal processing could underlie declines in the aging auditory system that are unrelated to peripheral hearing loss. Such declines cannot be compensated for by traditional hearing aids (which rely on amplification of sound) but may rather require pharmacological treatment.
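
    The population-level measure mentioned above, signal correlation, can be sketched on toy data (not the recorded gerbil responses; array shapes and firing rates are assumed): correlate trial-averaged tuning curves across stimuli for every pair of neurons; a higher average value indicates more similar tuning, i.e. less heterogeneous temporal receptive fields.

        import numpy as np

        rng = np.random.default_rng(2)
        n_neurons, n_stimuli, n_trials = 20, 12, 30
        # responses[i, s, t]: spike count of neuron i to stimulus s on trial t.
        responses = rng.poisson(lam=5.0, size=(n_neurons, n_stimuli, n_trials))

        tuning = responses.mean(axis=2)          # trial-averaged tuning curves
        signal_corr = np.corrcoef(tuning)        # n_neurons x n_neurons matrix

        # Average pairwise signal correlation over the off-diagonal entries.
        off_diag = signal_corr[~np.eye(n_neurons, dtype=bool)]
        print("mean signal correlation:", off_diag.mean())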

    Neural codes formed by small and temporally precise populations in auditory cortex

    The encoding of sensory information by populations of cortical neurons forms the basis for perception but remains poorly understood. To understand the constraints of cortical population coding we analyzed neural responses to natural sounds recorded in auditory cortex of primates (Macaca mulatta). We estimated stimulus information while varying the composition and size of the considered population. Consistent with previous reports we found that when choosing subpopulations randomly from the recorded ensemble, the average population information increases steadily with population size. This scaling was explained by a model assuming that each neuron carried equal amounts of information, and that any overlap between the information carried by each neuron arises purely from random sampling within the stimulus space. However, when studying subpopulations selected to optimize information for each given population size, the scaling of information was strikingly different: a small fraction of temporally precise cells carried the vast majority of information. This scaling could be explained by an extended model, assuming that the amount of information carried by individual neurons was highly nonuniform, with few neurons carrying large amounts of information. Importantly, these optimal populations can be determined by a single biophysical marker—the neuron's encoding time scale—allowing their detection and readout within biologically realistic circuits. These results show that extrapolations of population information based on random ensembles may overestimate the population size required for stimulus encoding, and that sensory cortical circuits may process information using small but highly informative ensembles
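
    A hedged toy model of the scaling argument above (not the paper's information estimator): per-neuron information values are drawn from a skewed distribution, population information is approximated as union-style coverage of the stimulus space so that overlap between neurons arises only from random sampling, and randomly drawn subpopulations are compared with subpopulations built from the most informative cells. The distribution, coverage rule, and stimulus entropy are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        total_entropy = 4.0                      # bits, assumed stimulus entropy
        n_neurons = 100
        # Skewed per-neuron information: few cells carry large amounts.
        info = np.clip(rng.lognormal(mean=-2.5, sigma=1.2, size=n_neurons),
                       0.0, total_entropy)

        def population_info(subset):
            """Union-style coverage: each neuron covers info_i / total_entropy of the space."""
            return total_entropy * (1.0 - np.prod(1.0 - info[subset] / total_entropy))

        ranked = np.argsort(info)[::-1]          # most informative neurons first
        for k in (1, 2, 5, 10, 20, 50, 100):
            random_subsets = [rng.choice(n_neurons, size=k, replace=False)
                              for _ in range(200)]
            avg_random = np.mean([population_info(s) for s in random_subsets])
            best = population_info(ranked[:k])
            print(f"size {k:3d}: random {avg_random:.2f} bits, optimized {best:.2f} bits")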