The neural basis of auditory temporal discrimination in girls with fragile X syndrome
Fragile X syndrome (FXS) is a common genetic disorder in which temporal processing may be impaired. To our knowledge, however, no studies have examined the neural basis of temporal discrimination in individuals with FXS using functional magnetic resonance imaging (fMRI). Ten girls with fragile X syndrome and ten developmental age-matched typically developing controls performed an auditory temporal discrimination task in a 3T scanner. Girls with FXS showed significantly greater brain activation in a left-lateralized network, comprising the left medial frontal gyrus, left superior and middle temporal gyrus, left cerebellum, and left brainstem (pons), when compared to a developmental age-matched typically developing group of subjects who had similar in-scanner task performance. No regions showed significantly greater brain activation in the control group compared to individuals with FXS. These data indicate that networks of brain regions involved in auditory temporal processing may be dysfunctional in FXS. In particular, it is possible that girls with FXS employ left hemispheric resources to overcompensate for relative right hemispheric dysfunction.
Global timing: a conceptual framework to investigate the neural basis of rhythm perception in humans and non-human species
Timing cues are an essential feature of music. To understand how the brain gives rise to our experience of music, we must appreciate how acoustical temporal patterns are integrated over the range of several seconds in order to extract global timing. In music perception, global timing comprises three distinct but often interacting percepts: temporal grouping, beat, and tempo. What directions may we take to further elucidate where and how the global timing of music is processed in the brain? The present perspective addresses this question and describes our current understanding of the neural basis of global timing perception.
Hearing through your eyes: neural basis of audiovisual cross-activation, revealed by transcranial alternating current stimulation
Some people experience auditory sensations when seeing visual flashes or movements. This prevalent synaesthesia-like ‘visual-evoked auditory response’ (vEAR) could result either from over-exuberant cross-activation between brain areas, and/or reduced inhibition of normally-occurring cross-activation. We have used transcranial alternating current stimulation (tACS) to test these theories. We applied tACS at 10Hz (alpha-band frequency) or 40Hz (gamma-band), bilaterally either to temporal or occipital sites, while measuring same/different discrimination of paired auditory (A) versus visual (V) 'Morse code' sequences. At debriefing, participants were classified as vEAR or non-vEAR depending on whether they reported 'hearing' the silent flashes.
In non-vEAR participants, temporal 10Hz tACS caused impairment of A performance, which correlated with improved V; conversely under occipital tACS, poorer V performance correlated with improved A. This reciprocal pattern suggests that sensory cortices are normally mutually inhibitory, and that alpha-frequency tACS may bias the balance of competition between them. vEAR participants showed no tACS effects, consistent with reduced inhibition, or enhanced cooperation between modalities. In addition, temporal 40Hz tACS impaired V performance, specifically in individuals who showed a performance advantage for V (relative to A). Gamma-frequency tACS may therefore modulate the ability of these individuals to benefit from recoding flashes into the auditory modality, possibly by disrupting cross-activation of auditory areas by visual stimulation.
Our results support both theories, suggesting that vEAR may depend on disinhibition of normally-occurring sensory cross-activation, which may be expressed more strongly in some individuals. Furthermore, endogenous alpha- and gamma-frequency oscillations may function respectively to inhibit or promote this cross-activation.
Representation of Sound Categories in Auditory Cortical Maps
We used functional magnetic resonance imaging (fMRI) to investigate the representation of sound categories in human auditory cortex. Experiment 1 investigated the representation of prototypical and non-prototypical examples of a vowel sound. Listening to prototypical examples of a vowel resulted in less auditory cortical activation than listening to non-prototypical examples. Experiments 2 and 3 investigated the effects of categorization training and discrimination training with novel non-speech sounds on auditory cortical representations. The two training tasks were shown to have opposite effects on the auditory cortical representation of sounds experienced during training: discrimination training led to an increase in the amount of activation caused by the training stimuli, whereas categorization training led to decreased activation. These results indicate that the brain efficiently shifts neural resources away from regions of acoustic space where discrimination between sounds is not behaviorally important (e.g., near the center of a sound category) and toward regions where accurate discrimination is needed. The results also provide a straightforward neural account of learned aspects of categorical perception: sounds from the center of a category are more difficult to discriminate from each other than sounds near category boundaries because they are represented by fewer cells in the auditory cortical areas.
National Institute on Deafness and Other Communication Disorders (R01 DC02852)
Awake fMRI Reveals Brain Regions for Novel Word Detection in Dogs
How do dogs understand human words? At a basic level, understanding would require the discrimination of words from non-words. To determine the mechanisms of such a discrimination, we trained 12 dogs to retrieve two objects based on object names, then probed the neural basis for these auditory discriminations using awake fMRI. We compared the neural response to these trained words relative to “oddball” pseudowords the dogs had not heard before. Consistent with novelty detection, we found greater activation for pseudowords relative to trained words bilaterally in the parietotemporal cortex. To probe the neural basis for representations of trained words, searchlight multivoxel pattern analysis (MVPA) revealed that a subset of dogs had clusters of informative voxels that discriminated between the two trained words. These clusters included the left temporal cortex and amygdala, left caudate nucleus, and thalamus. These results demonstrate that dogs’ processing of human words utilizes basic processes like novelty detection and, for some dogs, may also include auditory and hedonic representations.
The role of HG in the analysis of temporal iteration and interaural correlation
Auditory neuroscience: the salience of looming sounds
Sounds that move towards us have a greater biological salience than those that move away. Recent studies in human and non-human primates demonstrate a perceptual and behavioural priority for such looming sounds that is also reflected in an asymmetric pattern of cortical activation.
Audio-visual detection benefits in the rat
Human psychophysical studies have described multisensory perceptual benefits, such as enhanced detection rates and faster reaction times, in great detail. However, the neural circuits and mechanisms underlying multisensory integration remain difficult to study in the primate brain. While rodents offer the advantage of a range of experimental methodologies to study the neural basis of multisensory processing, rodent studies are still limited by the small number of available multisensory protocols. Here we demonstrate the feasibility of an audio-visual stimulus detection task for rats, in which the animals detect lateralized uni- and multi-sensory stimuli in a two-response forced choice paradigm. We show that animals reliably learn and perform this task. Reaction times were significantly faster and behavioral performance levels higher in multisensory compared to unisensory conditions. This benefit was strongest for dim visual targets, in agreement with classical patterns of multisensory integration, and was specific to task-informative sounds, while uninformative sounds speeded reaction times with little cost to detection performance. Importantly, multisensory benefits for stimulus detection and reaction times appeared at different levels of task proficiency and training experience, suggesting distinct mechanisms underlying these two multisensory benefits. Our results demonstrate behavioral multisensory enhancement in rats in analogy to behavioral patterns known from other species, such as humans. In addition, our paradigm enriches the set of behavioral tasks on which future studies can rely, for example to combine behavioral measurements with imaging or pharmacological studies in the behaving animal, or to study changes of integration properties in disease models.
The upper frequency limit for the use of phase locking to code temporal fine structure in humans: A compilation of viewpoints
The relative importance of neural temporal and place coding in auditory perception is still a matter of much debate. The current article is a compilation of viewpoints from leading auditory psychophysicists and physiologists regarding the upper frequency limit for the use of neural phase locking to code temporal fine structure in humans. While phase locking is used for binaural processing up to about 1500 Hz, there is disagreement regarding the use of monaural phase-locking information at higher frequencies. Estimates of the general upper limit proposed by the contributors range from 1500 to 10000 Hz. The arguments depend on whether or not phase locking is needed to explain psychophysical discrimination performance at frequencies above 1500 Hz, and whether or not the phase-locked neural representation is sufficiently robust at these frequencies to provide usable information. The contributors suggest key experiments that may help to resolve this issue, and experimental findings that may cause them to change their minds. This issue is of crucial importance to our understanding of the neural basis of auditory perception in general, and of pitch perception in particular.
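The degree of phase locking discussed above is conventionally quantified by vector strength: each spike is mapped to its phase within the stimulus cycle, and the phases are averaged as unit vectors. The sketch below is illustrative only (not from the article); the spike-time distributions and jitter level are assumed for demonstration.

```python
import numpy as np

def vector_strength(spike_times, freq):
    """Vector strength of phase locking: 1.0 = perfect locking, near 0 = none."""
    phases = 2 * np.pi * freq * np.asarray(spike_times)  # spike phase in radians
    return np.abs(np.mean(np.exp(1j * phases)))          # length of mean unit vector

rng = np.random.default_rng(0)
freq = 500.0   # Hz tone (well below the ~1500 Hz binaural limit)
n = 1000

# Spikes locked to a preferred phase, with Gaussian jitter of 0.05 cycles
locked = (np.arange(n) + 0.05 * rng.standard_normal(n)) / freq
# Spikes at uniformly random times: no locking
unlocked = rng.uniform(0.0, n / freq, n)

print(vector_strength(locked, freq))    # high, close to 1
print(vector_strength(unlocked, freq))  # low, close to 0
```

As jitter (in cycles) grows with stimulus frequency relative to fixed neural timing precision, vector strength falls, which is one way to frame the debated upper frequency limit.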