Cerebral substrates of musical imagery
Musical imagery refers to the experience of replaying music by imagining it inside the head. Whereas visual imagery has been extensively studied, relatively few researchers have investigated imagery in the auditory domain. This article reviews a program of research that has tried to characterize auditory imagery for music using both behavioral and cognitive neuroscientific tools. I begin by describing some of my behavioral studies of the mental analogues of musical tempo, pitch, and temporal extent. I then describe four studies using three techniques that examine the correspondence of brain involvement in actually perceiving vs. imagining familiar music: one lesion study with epilepsy surgery patients, two positron emission tomography (PET) studies, and one study using transcranial magnetic stimulation (TMS). The studies converge on the importance of the right temporal neocortex and other right-hemisphere structures in the processing of both perceived and imagined nonverbal music. Perceiving and imagining songs that have words also involve structures in the left hemisphere. The supplementary motor area (SMA) is activated during musical imagery; it may mediate rehearsal that involves motor programs, such as imagined humming. Future studies involving imagery of sounds that cannot be produced by the vocal tract are suggested to clarify the role of the SMA in auditory imagery.
Interhemispheric Connectivity Influences the Degree of Modulation of TMS-Induced Effects during Auditory Processing
Repetitive transcranial magnetic stimulation (rTMS) has been shown to interfere with many components of language processing, including semantic, syntactic, and phonological processing. However, little is known about its effects on nonlinguistic auditory processing, especially its action on Heschl's gyrus (HG). We aimed to investigate the behavioral and neural basis of rTMS during a melody processing task while targeting the left HG, the right HG, and the Vertex as a control site. Response times (RT) were normalized relative to the baseline rTMS condition (Vertex) and expressed as percentage change from baseline (%RT change). We also examined sex differences in the rTMS-induced response and in functional connectivity during melody processing using rTMS and functional magnetic resonance imaging (fMRI). fMRI results showed greater activity in the right HG than in the left HG during the melody task, as well as sex differences in functional connectivity, indicating greater interhemispheric connectivity between left and right HG in females than in males. TMS results showed that 10 Hz rTMS targeting the right HG induced differential effects according to sex, with a facilitation of performance in females and an impairment of performance in males. We also found a differential correlation between the %RT change after 10 Hz rTMS targeting the right HG and the interhemispheric functional connectivity between right and left HG, indicating that greater interhemispheric functional connectivity was associated with a facilitation of performance. This is the first study to report a differential rTMS-induced interference with melody processing depending on sex. In addition, we showed a relationship between the interference induced by rTMS on behavioral performance and the neural activity in the network connecting left and right HG, suggesting that interhemispheric functional connectivity could determine the degree of modulation of behavioral performance.
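The %RT-change normalization described in this abstract is a simple percentage-change computation. A minimal sketch follows; the function name and the response-time values are hypothetical, not taken from the study.

```python
# Illustration of a %RT-change normalization: response times under rTMS at a
# target site are expressed as percentage change relative to the control
# (Vertex) baseline. All numbers here are made up for the example.

def percent_rt_change(rt_target_ms: float, rt_baseline_ms: float) -> float:
    """Percentage change of RT at the stimulated site relative to baseline."""
    return 100.0 * (rt_target_ms - rt_baseline_ms) / rt_baseline_ms

rt_vertex = 850.0    # hypothetical baseline: rTMS over the Vertex (control site)
rt_right_hg = 790.0  # hypothetical RT after 10 Hz rTMS over the right HG

print(percent_rt_change(rt_right_hg, rt_vertex))  # ≈ -7.1, i.e. faster responses (facilitation)
```

A negative %RT change corresponds to faster responses than baseline (facilitation), a positive value to slower responses (impairment), matching the direction of the sex effects reported above.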
Effect of unilateral temporal-lobe excision on perception and imagery of songs
Auditory imagery for songs was studied in two groups of patients with left or right temporal-lobe excision for control of epilepsy, and a group of matched normal control subjects. Two tasks were used. In the perceptual task, subjects saw the text of a familiar song and simultaneously heard it sung. On each trial they judged if the second of two capitalized lyrics was higher or lower in pitch than the first. The imagery task was identical in all respects except that no song was presented, so that subjects had to generate an auditory image of the song. The results indicated that all subjects found the imagery task more difficult than the perceptual task, but patients with right temporal-lobe damage performed significantly worse on both tasks than either patients with left temporal-lobe lesions or normal control subjects. These results support the idea that imagery arises from activation of a neural substrate shared with perceptual mechanisms, and provide evidence for a right temporal-lobe specialization for this type of auditory imaginal processing.
Identification, discrimination, and selective adaptation of simultaneous musical intervals
Four experiments investigated perception of major and minor thirds whose component tones were sounded simultaneously. Effects akin to categorical perception of speech sounds were found. In the first experiment, musicians demonstrated relatively sharp category boundaries in identification, and peaks near the boundary in discrimination, for an interval continuum in which the bottom note was always an F and the top note varied from A to A-flat in seven equal logarithmic steps. Nonmusicians showed these effects only to a small extent. The musicians showed higher than predicted discrimination performance overall, and reaction-time increases at category boundaries. In the second experiment, musicians failed to consistently identify or discriminate thirds that varied in absolute pitch but retained the proper interval ratio. In the last two experiments, using selective adaptation, consistent shifts were found in both identification and discrimination, similar to those found in speech experiments. Manipulations of the adapting and test stimuli showed that the mechanism underlying the effect appears to be centrally mediated and confined to a frequency-specific level. A multistage model of interval perception, in which the first stages deal only with specific pitches, may account for the results.
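For illustration, the continuum in the first experiment amounts to equal steps in log frequency between a major and a minor third above a fixed F. The sketch below assumes the F4/A4/A-flat4 register, standard tuning, and that the seven steps include both endpoints; these details are mine, not the paper's.

```python
# Sketch of a major/minor-third continuum over a fixed bottom note: the top
# note moves from A4 (major third above F4) to A-flat4 (minor third above F4)
# in equal logarithmic steps. Register and tuning are assumed for the example.
import numpy as np

F4 = 349.23        # fixed bottom note, Hz (assumed register)
A4 = 440.00        # top note forming a major third
A_FLAT4 = 415.30   # top note forming a minor third

top_notes = np.geomspace(A4, A_FLAT4, 7)  # seven equal steps in log frequency

for f in top_notes:
    cents_above_f4 = 1200 * np.log2(f / F4)
    print(f"top = {f:7.2f} Hz, interval = {cents_above_f4:6.1f} cents")
```

The printed intervals run from roughly 400 cents (major third) down to roughly 300 cents (minor third), which is the kind of physical continuum over which the sharp identification boundary was observed.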
Mental Concerts: Musical Imagery and Auditory Cortex
Most people intuitively understand what it means to “hear a tune in your head.” Converging evidence now indicates that auditory cortical areas can be recruited even in the absence of sound and that this corresponds to the phenomenological experience of imagining music. We discuss these findings as well as some methodological challenges. We also consider the role of core versus belt areas in musical imagery, the relation between auditory and motor systems during imagery of music performance, and practical implications of this research.
Mental reversal of imagined melodies: A role for the posterior parietal cortex
Two fMRI experiments explored the neural substrates of a musical imagery task that required manipulation of the imagined sounds: temporal reversal of a melody. Musicians were presented with the first few notes of a familiar tune (Experiment 1) or its title (Experiment 2), followed by a string of notes that was either an exact or an inexact reversal. The task was to judge whether the second string was correct by mentally reversing all its notes, thus requiring both maintenance and manipulation of the represented string. Both experiments showed considerable activation of the superior parietal lobe (intraparietal sulcus) during the reversal process. Ventrolateral and dorsolateral frontal cortices were also activated, consistent with the memory load required during the task. We also found weaker evidence for some activation of right auditory cortex in both studies, congruent with results from previous, simpler music imagery tasks. We interpret these results in the context of other mental transformation tasks, such as mental rotation in the visual domain, which are known to recruit the intraparietal sulcus region, and we propose that this region subserves general computations that require transformations of a sensory input. Mental imagery tasks may thus have both task- or modality-specific components and components that supersede any specific code, instead representing amodal mental manipulation.
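The judgment required of the participants has a simple formal core: a probe string is "correct" only if it is the exact retrograde of the cue melody. A minimal sketch with hypothetical note sequences (MIDI pitch numbers chosen purely for illustration):

```python
# Toy illustration of the reversal judgment: decide whether a probe string of
# notes is an exact temporal reversal (retrograde) of the cue melody.
# The note values are hypothetical MIDI pitches, not the study's stimuli.

def is_exact_reversal(cue: list[int], probe: list[int]) -> bool:
    """True if the probe is the cue played backwards, note for note."""
    return probe == cue[::-1]

cue = [60, 62, 64, 65, 67]       # e.g. the opening notes of a familiar tune
exact = [67, 65, 64, 62, 60]     # correct retrograde
inexact = [67, 65, 63, 62, 60]   # one note altered: an "inexact" reversal

print(is_exact_reversal(cue, exact))    # True
print(is_exact_reversal(cue, inexact))  # False
```

Performing this check mentally, without any written notation, is what demands both maintenance and manipulation of the imagined string.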
Cortical Correlates of the Auditory Frequency-Following and Onset Responses: EEG and fMRI Evidence
The frequency-following response (FFR) is a measure of the brain's periodic sound encoding. It is of increasing importance for studying the human auditory nervous system, owing to its numerous associations with auditory cognition and dysfunction. Although the FFR is widely interpreted as originating from brainstem nuclei, a recent study using MEG suggested that there is also a right-lateralized contribution from the auditory cortex at the fundamental frequency (Coffey et al., 2016b). Our objectives in the present work were to validate and better localize this result using a completely different neuroimaging modality and to document the relationships between the FFR, the onset response, and cortical activity. Using a combination of EEG, fMRI, and diffusion-weighted imaging, we show that activity in the right auditory cortex is related to individual differences in FFR–fundamental frequency (f0) strength, a finding that was replicated with two independent stimulus sets, with and without acoustic energy at the fundamental frequency. We demonstrate a dissociation between this FFR–f0-sensitive response in the right auditory cortex and an area in the left auditory cortex that is sensitive to individual differences in the timing of the initial response to sound onset. Relationships to timing and their lateralization are supported by parallels in the microstructure of the underlying white matter, implicating a mechanism involving neural conduction efficiency. These data confirm that the FFR has a cortical contribution and suggest ways in which auditory neuroscience may be advanced by connecting early sound representation to measures of higher-level sound processing and cognitive function.
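To make the notion of "FFR strength at f0" concrete, here is a hedged sketch of one common way such a metric is computed: the spectral amplitude of a trial-averaged response at the stimulus fundamental. The sampling rate, analysis window, f0 value, and simulated signal are assumptions for illustration, not the authors' pipeline.

```python
# Illustrative only (not the authors' pipeline): estimate FFR strength at the
# fundamental frequency as the spectral amplitude of a trial-averaged EEG
# response at f0. Sampling rate, window, and f0 = 100 Hz are assumed.
import numpy as np

fs = 16000                     # sampling rate in Hz (assumed)
f0 = 100.0                     # stimulus fundamental frequency in Hz (assumed)
t = np.arange(0, 0.2, 1 / fs)  # 200 ms analysis window

# Stand-in for a trial-averaged epoch: a weak f0-locked component in noise.
rng = np.random.default_rng(0)
avg_response = 0.05 * np.sin(2 * np.pi * f0 * t) + 0.02 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(avg_response)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

f0_bin = np.argmin(np.abs(freqs - f0))          # bin closest to the fundamental
print(f"FFR amplitude at {freqs[f0_bin]:.1f} Hz: {spectrum[f0_bin]:.4f}")
```

Individual differences in a quantity of this kind are what the study relates to right auditory cortex activity and to white-matter microstructure.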
Music, memory and mechanisms in Alzheimer’s disease
This scientific commentary refers to ‘Why musical memory can be preserved in advanced Alzheimer’s disease’, by Jacobsen et al. (doi:10.1093/brain/awv135).
Common parietal activation in musical mental transformations across pitch and time
We previously observed that mental manipulation of the pitch level or temporal organization of melodies results in functional activation in the human intraparietal sulcus (IPS), a region also associated with visuospatial transformation and numerical calculation. Two outstanding questions about these musical transformations are whether pitch and time depend on separate or common processing in the IPS, and whether IPS recruitment in melodic tasks varies with the degree of transformation required (as it does in mental rotation). In the present study we sought to answer these questions by applying functional magnetic resonance imaging while musicians performed closely matched mental transposition (pitch transformation) and melody reversal (temporal transformation) tasks. A voxel-wise conjunction analysis showed that, in individual subjects, both tasks activated overlapping regions in bilateral IPS, suggesting that a common neural substrate subserves both types of mental transformation. Varying the magnitude of mental pitch transposition resulted in variation of the IPS BOLD signal in correlation with the musical key distance of the transposition, but not with the pitch distance, indicating that the cognitive metric relevant for this type of operation is an abstract one, well described by music-theoretic concepts. These findings support a general role for the IPS in systematically transforming auditory stimulus representations in a nonspatial context.
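The contrast between pitch distance and key distance in the transposition result can be made concrete: pitch distance counts semitones moved, whereas key distance counts steps around the circle of fifths. A brief sketch, with key names and example transpositions chosen by me for illustration:

```python
# Illustration of the two transposition metrics contrasted above: pitch
# distance (semitones between tonics) vs. key distance (steps on the circle
# of fifths). The example keys are arbitrary choices, not the study's stimuli.

CIRCLE_OF_FIFTHS = ['C', 'G', 'D', 'A', 'E', 'B', 'F#', 'C#', 'G#', 'D#', 'A#', 'F']
SEMITONE_INDEX = {'C': 0, 'C#': 1, 'D': 2, 'D#': 3, 'E': 4, 'F': 5,
                  'F#': 6, 'G': 7, 'G#': 8, 'A': 9, 'A#': 10, 'B': 11}

def pitch_distance(key_a: str, key_b: str) -> int:
    """Smallest number of semitones between the two tonics."""
    d = abs(SEMITONE_INDEX[key_a] - SEMITONE_INDEX[key_b]) % 12
    return min(d, 12 - d)

def key_distance(key_a: str, key_b: str) -> int:
    """Smallest number of steps between the two keys on the circle of fifths."""
    d = abs(CIRCLE_OF_FIFTHS.index(key_a) - CIRCLE_OF_FIFTHS.index(key_b)) % 12
    return min(d, 12 - d)

# C -> G: 5 semitones away (nearest direction) but only 1 step of key distance;
# C -> C#: only 1 semitone away but 5 steps of key distance.
for target in ('G', 'C#'):
    print(target, pitch_distance('C', target), key_distance('C', target))
```

The finding that IPS BOLD signal tracked the second of these metrics, not the first, is what motivates the claim that the operative cognitive metric is an abstract, music-theoretic one.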