    Neural processing of natural sounds

    Natural sounds include animal vocalizations, environmental sounds such as wind, water, and fire noises, and non-vocal sounds made by animals and humans for communication. These natural sounds have characteristic statistical properties that make them perceptually salient and that drive auditory neurons in optimal regimes for information transmission. Recent advances in statistics and computer science have allowed neurophysiologists to extract the stimulus-response functions of complex auditory neurons from responses to natural sounds. These studies have revealed a hierarchical processing scheme in which progressively more complex natural sound features are detected, and have demonstrated the importance of the acoustical and behavioral contexts for neural responses. High-level auditory neurons have been shown to be exquisitely selective for conspecific calls. This fine selectivity could play an important role in species recognition, in vocal learning in songbirds and, in the case of bats, in the processing of the sounds used in echolocation. Research that investigates how communication sounds are categorized into behaviorally meaningful groups (e.g. call types in animals, words in human speech) remains in its infancy. Animals and humans also excel at separating communication sounds from each other and from background noise. Neurons that detect communication calls in noise have been found, but the neural computations involved in sound source separation and natural auditory scene analysis remain poorly understood. Thus, future auditory research will have to focus not only on how natural sounds are processed by the auditory system but also on the computations that allow this processing to occur in natural listening situations. The complexity of the computations needed in natural hearing may require a high-dimensional representation provided by ensembles of neurons, and natural sounds may be the best stimuli for understanding this ensemble neural code.

    Integrating information from different senses in the auditory cortex

    Multisensory integration was once thought to be the domain of brain areas high in the cortical hierarchy, with early sensory cortical fields devoted to unisensory processing of inputs from their given set of sensory receptors. More recently, a wealth of evidence documenting visual and somatosensory responses in auditory cortex, even as early as the primary fields, has changed this view of cortical processing. These multisensory inputs may serve to enhance responses to sounds that are accompanied by other sensory cues, effectively making them easier to hear, but may also act more selectively to shape the receptive field properties of auditory cortical neurons to the location or identity of these events. We discuss the new, converging evidence that multiplexing of neural signals may play a key role in informatively encoding and integrating signals in auditory cortex across multiple sensory modalities. We highlight some of the many open research questions about the neural mechanisms that give rise to multisensory integration in auditory cortex, which should be addressed in future experimental and theoretical studies.

    Local and global spatial organization of interaural level difference and frequency preferences in auditory cortex

    Despite decades of microelectrode recordings, fundamental questions remain about how auditory cortex represents sound-source location. Here, we used in vivo two-photon calcium imaging to measure the sensitivity of layer II/III neurons in mouse primary auditory cortex (A1) to interaural level differences (ILDs), the principal spatial cue in this species. Although most ILD-sensitive neurons preferred ILDs favoring the contralateral ear, neurons with either midline or ipsilateral preferences were also present. An opponent-channel decoder accurately classified ILDs using the difference in responses between populations of neurons that preferred contralateral-ear-greater and ipsilateral-ear-greater stimuli. We also examined the spatial organization of binaural tuning properties across the imaged neurons with unprecedented resolution. Neurons driven exclusively by contralateral ear stimuli or by binaural stimulation occasionally formed local clusters, but their binaural categories and ILD preferences were not spatially organized on a more global scale. In contrast, the sound frequency preferences of most neurons within local cortical regions fell within a restricted frequency range, and a tonotopic gradient was observed across the cortical surface of individual mice. These results indicate that the representation of ILDs in mouse auditory cortex is comparable to that of most other mammalian species, and appears to lack systematic or consistent spatial order.
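
    The opponent-channel scheme described above can be illustrated with a minimal sketch; the sigmoid tuning curves, noise level, and ILD values below are illustrative assumptions, not parameters from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel_rate(ild, sign, noise=0.02):
    """Mean response of a population preferring contralateral-ear-greater (+1)
    or ipsilateral-ear-greater (-1) ILDs. Sigmoid tuning is an assumption."""
    return 1.0 / (1.0 + np.exp(-0.3 * sign * ild)) + noise * rng.standard_normal()

def opponent_decode(ild):
    """Classify the side of the ILD from the contra-minus-ipsi channel difference."""
    diff = channel_rate(ild, +1) - channel_rate(ild, -1)
    return "contralateral" if diff > 0 else "ipsilateral"

# With modest noise, the opponent signal recovers the louder side reliably.
for ild in (-15, -5, 5, 15):
    print(ild, opponent_decode(ild))
```

    The key design point is that the decoder never reads out single neurons: only the difference between the two pooled channels carries the spatial estimate.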

    Development of perceptual correlates of reading performance.

    Performance on perceptual tasks requiring the discrimination of brief, temporally proximate or temporally varying sensory stimuli (temporal processing tasks) is impaired in some individuals with developmental language disorder and/or dyslexia. Little is known about how these temporal processes in perception develop and how they relate to language and reading performance in the normal population. The present study examined performance on 8 temporal processing tasks and 5 language/reading tasks in 120 unselected readers who varied in age over a range in which reading and phonological awareness were developing. Performance on all temporal processing tasks except coherent motion detection improved from age 7 years to adulthood (p<0.01), especially between ages 7 and 13 years. Independent of these age effects, performance on all 8 temporal processing tasks predicted phonological awareness and reading performance (p<0.05), and three auditory temporal processing tasks predicted receptive language function (p<0.05). Furthermore, all temporal processing measures except within-channel gap detection and coherent motion detection predicted unique variance in phonological scores within subjects, whereas only within-channel gap detection performance explained unique variance in orthographic reading performance. These findings partially support the notion of Farmer and Klein (1995, Psychon. Bull. Rev. 2, 460-493) that there are separable auditory and visual perceptual contributions to phonological and orthographic reading development. The data are also compatible with the view that the umbrella term "temporal processing" encompasses fundamentally different sensory or cognitive processes that may contribute differentially to language and reading performance, have different developmental trajectories, and be differentially susceptible to pathology.

    Across-species differences in pitch perception are consistent with differences in cochlear filtering

    Pitch perception is critical for recognizing speech, music and animal vocalizations, but its neurobiological basis remains unsettled, in part because of divergent results across species. We investigated whether species-specific differences exist in the cues used to perceive pitch and whether these can be accounted for by differences in the auditory periphery. Ferrets accurately generalized pitch discriminations to untrained stimuli whenever temporal envelope cues were robust in the probe sounds, but not when resolved harmonics were the main available cue. By contrast, human listeners exhibited the opposite pattern of results on an analogous task, consistent with previous studies. Simulated cochlear responses in the two species suggest that differences in the relative salience of the two pitch cues can be attributed to differences in cochlear filter bandwidths. The results support the view that cross-species variation in pitch perception reflects the constraints of estimating a sound’s fundamental frequency given species-specific cochlear tuning.
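
    The link between filter bandwidth and cue salience can be sketched with a toy calculation; the quality factors and fundamental frequency below are illustrative assumptions, not measured values from either species.

```python
def harmonics_in_filter(cf, f0, q):
    """Count harmonics of f0 falling inside a filter of bandwidth cf/q centred at cf."""
    bw = cf / q
    lo, hi = cf - bw / 2, cf + bw / 2
    return sum(1 for k in range(1, 100) if lo <= k * f0 <= hi)

f0 = 200.0  # assumed fundamental frequency in Hz
for cf in (400.0, 1600.0, 3200.0):
    narrow = harmonics_in_filter(cf, f0, q=9.0)  # assumed sharper (human-like) tuning
    broad = harmonics_in_filter(cf, f0, q=3.0)   # assumed broader (ferret-like) tuning
    print(f"cf={cf:.0f} Hz: narrow filter passes {narrow}, broad filter passes {broad}")
```

    When several harmonics fall inside one broad filter, they beat at the fundamental and yield a strong temporal envelope cue; a narrow filter that passes a single harmonic resolves it instead, which matches the pattern of cue use described above.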

    Cortical adaptation to sound reverberation

    In almost every natural environment, sounds are reflected by nearby objects, producing many delayed and distorted copies of the original sound, known as reverberation. Our brains usually cope well with reverberation, allowing us to recognize sound sources regardless of their environments. In contrast, reverberation can cause severe difficulties for speech recognition algorithms and hearing-impaired people. The present study examines how the auditory system copes with reverberation. We trained a linear model to recover a rich set of natural, anechoic sounds from their simulated reverberant counterparts. The model neurons achieved this by extending the inhibitory component of their receptive filters for more reverberant spaces, and did so in a frequency-dependent manner. These predicted effects were observed in the responses of auditory cortical neurons of ferrets in the same simulated reverberant environments. Together, these results suggest that auditory cortical neurons adapt to reverberation by adjusting their filtering properties in a manner consistent with dereverberation.
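
    The idea of a linear model trained to recover anechoic sounds can be sketched in a few lines; the reflection delays and gains below form a toy room model, not the study's simulated environments, and the model operates on a raw noise waveform rather than natural sounds.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 'anechoic' signal and a reverberant version built from the direct
# sound plus two delayed, attenuated reflections (illustrative values).
clean = rng.standard_normal(4000)
reverb = clean.copy()
reverb[5:] += 0.5 * clean[:-5]
reverb[12:] += 0.25 * clean[:-12]

# Fit a linear dereverberation filter by least squares: predict each clean
# sample from a short history of reverberant samples.
taps = 32
n = len(clean) - taps + 1
X = np.stack([reverb[t:t + n] for t in range(taps)], axis=1)  # rows = histories
y = clean[taps - 1:]                                          # aligned clean samples
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Relative reconstruction error of the recovered anechoic signal.
err = np.mean((X @ w - y) ** 2) / np.mean(y ** 2)
print(f"relative reconstruction error: {err:.3f}")
```

    The fitted weights approximate the inverse of the room's echo pattern; stronger or longer reverberation would require longer, more strongly inhibitory filters, which is the qualitative effect the abstract describes.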

    Complexity of frequency receptive fields predicts tonotopic variability across species

    Primary cortical areas contain maps of sensory features, including sound frequency in primary auditory cortex (A1). Two-photon calcium imaging in mice has confirmed the presence of these global tonotopic maps, while uncovering an unexpected local variability in the stimulus preferences of individual neurons in A1 and other primary regions. Here we show that local heterogeneity of frequency preferences is not unique to rodents. Using two-photon calcium imaging in layers 2/3, we found that local variance in frequency preferences is equivalent in ferrets and mice. Neurons with multipeaked frequency tuning are less spatially organized than those tuned to a single frequency in both species. Furthermore, we show that microelectrode recordings may describe a smoother tonotopic arrangement due to a sampling bias towards neurons with simple frequency tuning. These results help explain previous inconsistencies in cortical topography across species and recording techniques.
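
    The proposed sampling bias can be illustrated with a toy simulation; the gradient, the 40% multipeaked fraction, and the scatter values below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy cortical sheet: each neuron's best frequency follows a global tonotopic
# gradient plus local scatter; multipeaked neurons scatter more widely.
n = 500
position = np.sort(rng.uniform(0, 1, n))        # location along the gradient
multipeaked = rng.uniform(size=n) < 0.4          # assumed fraction of cells
scatter = np.where(multipeaked, 1.5, 0.3)        # local scatter in octaves
best_freq = 2.0 ** (1 + 5 * position + scatter * rng.standard_normal(n))

def map_smoothness(bf, pos):
    """Correlation between position and log best frequency: higher = smoother map."""
    return np.corrcoef(pos, np.log2(bf))[0, 1]

all_cells = map_smoothness(best_freq, position)
# A method biased toward simply tuned cells reports a smoother gradient.
biased = map_smoothness(best_freq[~multipeaked], position[~multipeaked])
print(f"all cells r={all_cells:.2f}, simple-tuned only r={biased:.2f}")
```

    Excluding the widely scattered multipeaked cells raises the position-frequency correlation, reproducing in miniature how a recording technique's sampling bias could make the same cortex look more precisely tonotopic.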

    Temporal processing performance, reading performance, and auditory processing disorder in learning-impaired children and controls

    This paper examines the relations between temporal processing and reading performance by comparing the performance of 38 children with learning impairments (LI) to 32 age-matched, typically developing subjects (controls) on these tasks. Subjects were tested on four auditory and four visual temporal processing tasks, and four language/reading tasks. Subjects in the LI group were also tested for auditory processing disorder (APD). Kruskal-Wallis tests and Spearman correlation coefficients were used to evaluate the differences and relations between group test scores (alpha = 0.05, Bonferroni corrected). LI subjects performed more poorly than controls on reading and phonological awareness tasks, as well as on the subset of temporal processing tasks that required the relative timing of two stimulus events. There was a trend for performance on language/reading and several auditory temporal processing tasks to drop from control subjects, to those with LI alone, to those with both APD and LI. Scores on a subset of relative timing tasks were positively correlated with reading scores for controls, but not LI subjects. The results suggest that relative timing judgements of auditory and visual stimuli, rather than the identification of a single, brief stimulus event, may play a key role in reading development.