Erratum: Relative sound localisation abilities in human listeners [J. Acoust. Soc. Am. 138, 674–686 (2015)]
Where are multisensory signals combined for perceptual decision-making?
Multisensory integration is observed in many subcortical and cortical locations, including primary and non-primary sensory cortex and higher cortical areas such as frontal and parietal cortex. During unisensory perceptual tasks many of these same brain areas show neural signatures associated with decision-making. It is unclear whether multisensory representations in sensory cortex directly inform decision-making in a multisensory task, or whether cross-modal signals are only combined after the accumulation of unisensory evidence at a final decision-making stage in higher cortical areas. Manipulations of neuronal activity are required to establish causal roles for given brain regions in multisensory perceptual decision-making; those performed so far indicate that distributed networks underlie multisensory decision-making. Understanding multisensory integration therefore requires a synthesis of small-scale, pathway-specific manipulations with large-scale, network-level ones
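The early-versus-late combination question raised in this abstract can be made concrete with a toy evidence-accumulation model. The sketch below is purely illustrative and is not the authors' model; all drift, noise, and threshold parameters are hypothetical. Early fusion combines the auditory and visual streams sample by sample before accumulating to a bound, whereas late fusion lets each modality accumulate to its own bound and combines them only at the decision stage.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_evidence(drift, noise, n_steps):
    """Noisy momentary evidence for one modality (arbitrary units)."""
    return drift + noise * rng.standard_normal(n_steps)

def decide_early_fusion(aud, vis, threshold):
    """Fuse the two streams sample by sample, then accumulate to a bound."""
    dv = np.cumsum((aud + vis) / 2.0)          # single fused decision variable
    crossed = np.nonzero(np.abs(dv) >= threshold)[0]
    return np.sign(dv[crossed[0]] if crossed.size else dv[-1])

def decide_late_fusion(aud, vis, threshold):
    """Accumulate each modality to its own bound; combine only at the end."""
    terminal = []
    for stream in (aud, vis):
        dv = np.cumsum(stream)
        crossed = np.nonzero(np.abs(dv) >= threshold)[0]
        terminal.append(dv[crossed[0]] if crossed.size else dv[-1])
    return np.sign(sum(terminal))

aud = sample_evidence(drift=0.05, noise=1.0, n_steps=500)   # rightward signal
vis = sample_evidence(drift=0.05, noise=1.0, n_steps=500)
print(decide_early_fusion(aud, vis, threshold=10.0))        # +1.0 = "right"
print(decide_late_fusion(aud, vis, threshold=10.0))
```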
Thermodynamic and kinetic properties of interpolymer complexes assessed by isothermal titration calorimetry and surface plasmon resonance
Interpolymer complexes (IPCs) formed between complementary polymers in solution have shown a wide range of applications from drug delivery to biosensors. This work describes the combined use of isothermal titration calorimetry and surface plasmon resonance to investigate the thermodynamic and kinetic processes during hydrogen-bonded interpolymer complexation. Varied polymers commonly used in layer-by-layer coatings and pharmaceutical preparations were selected to span a range of chemical functionalities, including some known IPCs previously characterized by other techniques and other polymer combinations with unknown outcomes. This work is the first to comprehensively detail the thermodynamic and kinetic data of hydrogen-bonded IPCs, aiding understanding and detailed characterization of the complexes. The applicability of the two techniques in determining thermodynamic, gravimetric and kinetic properties of IPCs is considered
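The quantities these two techniques yield are linked by standard thermodynamic relations. As a rough illustration (all values below are hypothetical, not taken from this work), an ITC-derived binding constant gives the Gibbs free energy via ΔG° = −RT ln K, the entropy change follows from ΔG° = ΔH° − TΔS°, and SPR association/dissociation rates give the equilibrium dissociation constant K_D = k_off / k_on:

```python
import math

R = 8.314            # gas constant, J mol^-1 K^-1
T = 298.15           # temperature, K

# Hypothetical ITC outputs for an interpolymer complex
K  = 2.0e5           # binding constant, M^-1 (relative to standard state)
dH = -35.0e3         # enthalpy change, J mol^-1

dG = -R * T * math.log(K)        # Gibbs free energy, J mol^-1
dS = (dH - dG) / T               # entropy change from dG = dH - T*dS

# Hypothetical SPR kinetic rate constants
k_on  = 1.5e4        # association rate, M^-1 s^-1
k_off = 3.0e-3       # dissociation rate, s^-1
K_D   = k_off / k_on # equilibrium dissociation constant, M

print(f"dG = {dG/1e3:.1f} kJ/mol, dS = {dS:.1f} J/(mol K), K_D = {K_D:.2e} M")
```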
Relative sound localisation abilities in human listeners
Spatial acuity varies with sound-source azimuth, signal-to-noise ratio, and the spectral characteristics of the sound source. Here, the spatial localisation abilities of listeners were assessed using a relative localisation task. This task tested localisation ability at fixed angular separations throughout space using a two-alternative forced-choice design across a variety of listening conditions. Subjects were required to determine whether a target sound originated to the left or right of a preceding reference in the presence of a multi-source noise background. Experiment 1 demonstrated that subjects' ability to determine the relative location of two sources declined with less favourable signal-to-noise ratios and at peripheral locations. Experiment 2 assessed performance with both broadband and spectrally restricted stimuli designed to limit localisation cues to predominantly interaural level differences (ILDs) or interaural time differences (ITDs). Predictions generated from topographic, modified topographic, and two-channel models of sound localisation suggest that for low-pass stimuli, where ITD cues were dominant, the two-channel model provides an adequate description of the experimental data, whereas for broadband and high-frequency bandpass stimuli none of the models was able to fully account for performance. Experiment 3 demonstrated that relative localisation performance was unaffected by shifts in gaze direction
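The two-channel (opponent-hemifield) idea tested here can be sketched in a few lines. In the toy model below (tuning slope, noise level, and readout are illustrative, not the fitted models from the paper), azimuth is encoded by two broadly tuned sigmoidal channels mirrored about the midline, space is read out from their difference, and a left/right judgement compares noisy readouts of target and reference. Because the sigmoids saturate laterally, a fixed angular separation is easier to resolve near the midline than in the periphery, consistent with the decline at peripheral locations reported in Experiment 1.

```python
import numpy as np

def channel_pair(azimuth_deg, slope=0.05):
    """Two broadly tuned channels, sigmoids mirrored about the midline
    (tuning parameters are illustrative, not fitted)."""
    right = 1.0 / (1.0 + np.exp(-slope * azimuth_deg))
    return 1.0 - right, right

def readout(azimuth_deg):
    """Two-channel code: location is carried by the channel difference."""
    left, right = channel_pair(azimuth_deg)
    return right - left

def relative_judgement(reference_deg, target_deg, sigma=0.05):
    """Predict 'left'/'right' of reference from noisy readouts of both sources."""
    rng = np.random.default_rng()
    r_ref = readout(reference_deg) + sigma * rng.standard_normal()
    r_tgt = readout(target_deg) + sigma * rng.standard_normal()
    return "right" if r_tgt > r_ref else "left"

# A 15-degree separation is easy near the midline, where the sigmoids are
# steep, and hard in the periphery, where they saturate.
print(relative_judgement(reference_deg=0, target_deg=15))
print(relative_judgement(reference_deg=75, target_deg=90))
```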
Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding
How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound. We demonstrate that visual information conveyed from visual cortex via the phase of the local field potential is combined with auditory information within auditory cortex. These data provide evidence that early cross-sensory binding provides a bottom-up mechanism for the formation of cross-sensory objects and that one role for multisensory binding in auditory cortex is to support auditory scene analysis
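Temporal coherence, the binding cue reported here, can be illustrated with a toy calculation on synthetic envelopes; nothing below reproduces the study's stimuli or analysis. When a luminance signal tracks the amplitude envelope of one sound in a mixture, a simple normalised correlation singles out the coherent sound:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, dur = 100, 10.0                      # envelope sample rate (Hz), duration (s)
t = np.arange(0, dur, 1 / fs)

def slow_envelope(seed):
    """Synthetic slowly varying amplitude envelope (smoothed noise)."""
    e = np.convolve(np.random.default_rng(seed).standard_normal(t.size),
                    np.ones(25) / 25, mode="same")
    return e - e.mean()

env_a = slow_envelope(seed=1)            # sound A's amplitude envelope
env_b = slow_envelope(seed=2)            # sound B's amplitude envelope
luminance = env_a + 0.3 * rng.standard_normal(t.size)  # vision tracks sound A

def coherence(x, y):
    """Zero-lag normalised correlation as a simple temporal-coherence index."""
    return float(np.dot(x - x.mean(), y - y.mean()) /
                 (np.std(x) * np.std(y) * x.size))

print(f"luminance vs sound A: {coherence(luminance, env_a):.2f}")  # high
print(f"luminance vs sound B: {coherence(luminance, env_b):.2f}")  # near zero
```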
Auditory Neuroscience: Unravelling How the Brain Gives Sound Meaning
The brain must be able to assign sounds in the world to behaviourally meaningful categories. A new study has revealed that sensory pathways represent category information, but that selectivity for sound classes emerges first in the frontal cortex
Egocentric and allocentric representations in auditory cortex
A key function of the brain is to provide a stable representation of an object’s location in the world. In hearing, sound azimuth and elevation are encoded by neurons throughout the auditory system, and auditory cortex is necessary for sound localization. However, the coordinate frame in which neurons represent sound space remains undefined: classical spatial receptive fields in head-fixed subjects can be explained either by sensitivity to sound source location relative to the head (egocentric) or relative to the world (allocentric encoding). This coordinate frame ambiguity can be resolved by studying freely moving subjects; here we recorded spatial receptive fields in the auditory cortex of freely moving ferrets. We found that most spatially tuned neurons represented sound source location relative to the head across changes in head position and direction. We also recorded a small number of neurons in which sound location was represented in a world-centered coordinate frame. We used measurements of spatial tuning across changes in head position and direction to explore the influence of sound source distance and speed of head movement on auditory cortical activity and spatial tuning. Modulation depth of spatial tuning increased with distance for egocentric but not allocentric units, whereas, for both populations, modulation was stronger at faster movement speeds. Our findings suggest that early auditory cortex primarily represents sound source location relative to ourselves but that a minority of cells can represent sound location in the world independent of our own position
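The ego/allocentric distinction reduces to a geometric relation that only a freely moving subject can disambiguate: head-centred azimuth depends jointly on the world-centred source position and the animal's head position and direction, so with a fixed head the two frames are confounded. A minimal 2-D sketch of that transform (coordinates and values are illustrative):

```python
import numpy as np

def egocentric_azimuth(source_xy, head_xy, head_direction_deg):
    """World-centred source position -> head-centred azimuth (degrees).
    Positive azimuth means the source lies to the right of the head axis."""
    dx, dy = np.asarray(source_xy) - np.asarray(head_xy)
    world_bearing = np.degrees(np.arctan2(dx, dy))    # bearing from head to source
    azimuth = world_bearing - head_direction_deg
    return (azimuth + 180.0) % 360.0 - 180.0          # wrap to (-180, 180]

source = (1.0, 1.0)   # fixed speaker position in the room (metres)

# An egocentric unit keeps its tuning in these head-centred coordinates as the
# head moves; an allocentric unit keeps firing for the fixed room position.
print(egocentric_azimuth(source, head_xy=(0, 0), head_direction_deg=0))    # 45.0
print(egocentric_azimuth(source, head_xy=(0, 0), head_direction_deg=90))   # -45.0
print(egocentric_azimuth(source, head_xy=(1.0, 0), head_direction_deg=0))  # 0.0
```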
Non-auditory processing in the central auditory pathway
Multisensory responses have been observed throughout the central auditory pathway, yet the origin, function and perceptual consequences of cross-modal integration remain unresolved. Recent studies have applied modern neuroanatomical and functional perturbation techniques to dissect the circuits that might enable multisensory information to sculpt the processing of sound. These highlight in particular the role that subcortical pathways might play in relaying multisensory information to, and between, sensory cortical fields. We also examine the consequences of integrating non-auditory information into auditory processing, identifying situations where this may be critical for successful listening, as well as potential roles for visual information in augmenting auditory scene analysis and for non-auditory information in facilitating coordinate-frame transformations
Physiology of Higher Central Auditory Processing and Plasticity
Binaural cue processing requires central auditory function, as damage to the auditory cortex and other cortical regions impairs sound localization. Sound localization cues are initially extracted by brainstem nuclei, but how the cerebral cortex supports spatial sound perception remains unclear. This chapter reviews the evidence that spatial encoding within and beyond the auditory cortex supports sound localization, including the integration of information across sound frequencies and localization cues. In particular, this chapter discusses the role of brain regions across the cerebral cortex, extending from sensory to parietal and prefrontal cortices, that may be specialized for extracting and transforming the spatial aspects of sounds. The chapter considers how the encoding of spatial information changes with attention and how spatial processing fits within the broader context of auditory scene analysis by cortical networks. The importance of neural plasticity in binaural processing is outlined, including a discussion of how changes in the mapping of localization cues to spatial position allow listeners to adapt to changes in auditory input throughout life and after hearing loss. The chapter ends by summarizing some of the open questions about the central processing of binaural cues and how they may be answered
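The brainstem extraction of binaural cues mentioned here can be anchored with a worked example. Under the classic spherical-head (Woodworth) approximation, and assuming a typical head radius of about 8.75 cm and a far-field source, the interaural time difference grows with azimuth as ITD = (r/c)(sin θ + θ), reaching roughly 650 µs at 90 degrees:

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Far-field ITD from the spherical-head (Woodworth) approximation:
    ITD = (r / c) * (sin(theta) + theta), with theta in radians."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (math.sin(theta) + theta)

for az in (0, 15, 45, 90):
    print(f"{az:3d} deg -> ITD = {woodworth_itd(az) * 1e6:6.1f} us")
```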
