
    The gray matter volume of the amygdala is correlated with the perception of melodic intervals: a voxel-based morphometry study

    Music is not simply a series of organized pitches, rhythms, and timbres; it is capable of evoking emotions. In the present study, voxel-based morphometry (VBM) was employed to explore the neural basis that may link music to emotion. To do this, we identified the neuroanatomical correlates of the ability to extract pitch interval size in a music segment (i.e., interval perception) in a large population of healthy young adults (N = 264). Behaviorally, we found that interval perception was correlated with daily emotional experiences, indicating an intrinsic link between music and emotion. Neurally, and as expected, we found that interval perception was positively correlated with the gray matter volume (GMV) of the bilateral temporal cortex. More importantly, a larger GMV of the bilateral amygdala was associated with better interval perception, suggesting that the amygdala, which is the neural substrate of emotional processing, is also involved in music processing. In sum, our study provides some of the first neuroanatomical evidence of an association between the amygdala and music, which contributes to our understanding of exactly how music evokes emotional responses.
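
    A minimal sketch of the kind of voxelwise analysis described above, assuming a simple Pearson correlation between each voxel's gray matter volume and the behavioral interval-perception score. The array shapes, placeholder data, and Bonferroni correction below are illustrative assumptions, not the authors' actual VBM pipeline.

```python
# Illustrative sketch only: mass-univariate, VBM-style correlation of gray
# matter volume with a behavioral score. All data here are random placeholders.
import numpy as np
from scipy import stats

n_subjects, n_voxels = 264, 10000            # assumed dimensions
gmv = np.random.rand(n_subjects, n_voxels)   # placeholder GMV maps (subjects x voxels)
interval_score = np.random.rand(n_subjects)  # placeholder behavioral scores

r_map = np.empty(n_voxels)
p_map = np.empty(n_voxels)
for v in range(n_voxels):
    r_map[v], p_map[v] = stats.pearsonr(gmv[:, v], interval_score)

# Naive Bonferroni control for illustration; VBM studies typically use
# cluster-level family-wise error correction instead.
significant = p_map < (0.05 / n_voxels)
print(f"{significant.sum()} voxels survive Bonferroni correction")
```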

    Structural Integration in Language and Music: Evidence for a Shared System.

    In this study, we investigate whether language and music share cognitive resources for structural processing. We report an experiment that used sung materials and manipulated linguistic complexity (subject-extracted relative clauses, object-extracted relative clauses) and musical complexity (in-key critical note, out-of-key critical note, auditory anomaly on the critical note involving a loudness increase). The auditory-anomaly manipulation was included to test whether the difference between in-key and out-of-key conditions might be due to any salient, unexpected acoustic event. The critical dependent measure was comprehension accuracy for questions about the propositional content of the sentences, asked at the end of each trial. The results revealed an interaction between linguistic and musical complexity such that the difference between the subject- and object-extracted relative clause conditions was larger in the out-of-key condition than in the in-key and auditory-anomaly conditions. These results provide evidence for an overlap in structural processing between language and music.
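
    As a rough illustration of the interaction test described above, the sketch below runs a two-way ANOVA on comprehension accuracy with clause type and musical condition as factors. The column names, simulated data, and the use of a plain OLS model (which ignores the repeated-measures structure of the real experiment) are assumptions for illustration only.

```python
# Illustrative sketch only: clause type x musical condition interaction on
# comprehension accuracy, using simulated long-format data.
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
data = pd.DataFrame({
    "accuracy": rng.uniform(0.5, 1.0, 180),
    "clause":   np.tile(["subject_extracted", "object_extracted"], 90),
    "music":    np.repeat(["in_key", "out_of_key", "anomaly"], 60),
})

model = ols("accuracy ~ C(clause) * C(music)", data=data).fit()
print(anova_lm(model, typ=2))   # the C(clause):C(music) row is the interaction term
```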

    Enhanced Syllable Discrimination Thresholds in Musicians

    Speech processing inherently relies on the perception of specific, rapidly changing spectral and temporal acoustic features. Advanced acoustic perception is also integral to musical expertise, and accordingly several studies have demonstrated a significant relationship between musical training and superior processing of various aspects of speech. Speech and music appear to overlap in spectral and temporal features; however, it remains unclear which of these acoustic features, crucial for speech processing, are most closely associated with musical training. The present study examined the perceptual acuity of musicians to the acoustic components of speech necessary for intra-phonemic discrimination of synthetic syllables. We compared musicians and non-musicians on discrimination thresholds of three synthetic speech syllable continua that varied in their spectral and temporal discrimination demands, specifically voice onset time (VOT) and amplitude envelope cues in the temporal domain. Musicians demonstrated superior discrimination only for syllables that required resolution of temporal cues. Furthermore, performance on the temporal syllable continua positively correlated with the length and intensity of musical training. These findings support one potential mechanism by which musical training may selectively enhance speech perception, namely by reinforcing temporal acuity and/or perception of amplitude rise time, and have implications for the translation of musical training to long-term linguistic abilities.
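
    A hedged sketch of how a discrimination threshold might be estimated from responses along a synthetic VOT continuum: a logistic psychometric function is fit and the 75%-correct point is read off. The continuum steps, response proportions, and threshold criterion are assumed for illustration and are not the study's actual materials or procedure.

```python
# Illustrative sketch only: psychometric-function fit along a VOT continuum.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Proportion of correct discriminations as a function of VOT step."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

vot_ms = np.array([0, 10, 20, 30, 40, 50, 60])                  # assumed VOT steps (ms)
p_correct = np.array([0.05, 0.1, 0.3, 0.55, 0.8, 0.92, 0.97])   # assumed response data

(x0, k), _ = curve_fit(logistic, vot_ms, p_correct, p0=[30, 0.1])

# Threshold defined here as the VOT at which the fitted curve reaches 75% correct.
threshold = x0 + np.log(0.75 / 0.25) / k
print(f"estimated 75% threshold: {threshold:.1f} ms VOT")
```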

    fMRI scanner noise interaction with affective neural processes

    The purpose of the present study was to investigate interaction effects between functional MRI scanner noise and affective neural processes. Stimuli comprised psychoacoustically balanced musical pieces expressing three different emotions (fear, neutral, joy). Participants (N = 34, 19 female) were split into two groups, one subjected to continuous scanning and another subjected to sparse temporal scanning, which features decreased scanner noise. Tests for interaction effects between scanning group (sparse/quieter vs. continuous/noisier) and emotion (fear, neutral, joy) were performed. Results revealed interactions between the affective expression of stimuli and scanning group localized in bilateral auditory cortex, insula, and visual cortex (calcarine sulcus). Post-hoc comparisons revealed that during sparse scanning, but not during continuous scanning, BOLD signals were significantly stronger for joy than for fear, as well as stronger for fear than for neutral, in bilateral auditory cortex. During continuous scanning, but not during sparse scanning, BOLD signals were significantly stronger for joy than for neutral in the left auditory cortex and for joy than for fear in the calcarine sulcus. To the authors' knowledge, this is the first study to show a statistical interaction effect between scanner noise and affective processes; it extends evidence suggesting that scanner noise is an important factor in functional MRI research that can affect and distort affective brain processes.
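
    As an illustration of the post-hoc comparisons described above, the sketch below runs paired t-tests between emotion conditions within each scanning group on region-of-interest BOLD values. The data layout, sample sizes, and simulated values are assumptions; in the study, such comparisons followed a significant group-by-emotion interaction.

```python
# Illustrative sketch only: within-group, pairwise emotion comparisons on
# simulated ROI BOLD values for a sparse and a continuous scanning group.
import numpy as np
from scipy import stats
from itertools import combinations

rng = np.random.default_rng(1)
emotions = ["fear", "neutral", "joy"]
# assumed: one mean BOLD value per subject in an auditory-cortex ROI, 17 per group
bold = {
    "sparse":     {e: rng.normal(size=17) for e in emotions},
    "continuous": {e: rng.normal(size=17) for e in emotions},
}

for group, cond in bold.items():
    for a, b in combinations(emotions, 2):
        t, p = stats.ttest_rel(cond[a], cond[b])   # paired: same subjects per group
        print(f"{group:10s} {a} vs {b}: t = {t:5.2f}, p = {p:.3f}")
```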

    Evidence for Shared Cognitive Processing of Pitch in Music and Language

    Language and music epitomize the complex representational and computational capacities of the human mind. The two are strikingly similar in their structural and expressive features, and a longstanding question is whether the perceptual and cognitive mechanisms underlying these abilities are shared or distinct, either from each other or from other mental processes. One prominent feature shared between language and music is signal encoding using pitch, which conveys pragmatics and semantics in language and melody in music. We investigated how pitch processing is shared between language and music by measuring consistency in individual differences in pitch perception across language, music, and three control conditions intended to assess basic sensory and domain-general cognitive processes. Individuals' pitch perception abilities in language and music were most strongly related, even after accounting for performance in all control conditions. These results provide behavioral evidence, based on patterns of individual differences, that is consistent with the hypothesis that cognitive mechanisms for pitch processing may be shared between language and music.
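
    A minimal sketch of one way the language-music relationship might be assessed after accounting for control conditions, assuming a residualization-based partial correlation. The variable names, sample size, and placeholder data are illustrative and may differ from the original analysis.

```python
# Illustrative sketch only: partial correlation between language- and music-pitch
# scores after regressing out three control-condition scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 100                                    # assumed sample size
controls = rng.normal(size=(n, 3))         # assumed control-condition scores
language_pitch = rng.normal(size=n)
music_pitch = rng.normal(size=n)

def residualize(y, X):
    """Remove variance in y explained by X (plus an intercept) via least squares."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return y - X1 @ beta

r, p = stats.pearsonr(residualize(language_pitch, controls),
                      residualize(music_pitch, controls))
print(f"partial correlation (controls removed): r = {r:.2f}, p = {p:.3f}")
```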

    Electromagnetic Correlates of Musical Expertise in Processing of Tone Patterns

    Using magnetoencephalography (MEG), we investigated the influence of long-term musical training on the processing of partly imagined tone patterns (imagery condition) compared to the same perceived patterns (perceptual condition). The magnetic counterpart of the mismatch negativity (MMNm) was recorded and compared between musicians and non-musicians in order to assess the effect of musical training on the detection of deviants to tone patterns. The results indicated a clear MMNm in the perceptual condition as well as in a simple pitch oddball (control) condition in both groups. However, there was no significant mismatch response in either group in the imagery condition, despite above-chance behavioral performance in the task of detecting deviant tones. The latency and the laterality of the MMNm in the perceptual condition differed significantly between groups, with an earlier MMNm in musicians, especially in the left hemisphere. In contrast, the MMNm amplitudes did not differ significantly between groups. The behavioral results revealed a clear effect of long-term musical training in both experimental conditions. The obtained results represent new evidence that the processing of tone patterns is faster and more strongly lateralized in musically trained subjects, which is consistent with other findings, in different paradigms, of enhanced auditory neural system functioning due to long-term musical training.
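
    A brief sketch of how MMNm peak latency might be extracted per subject from a deviant-minus-standard difference waveform and then compared between groups. The sampling rate, search window, group sizes, and simulated data are assumptions for illustration, not the study's MEG pipeline.

```python
# Illustrative sketch only: peak-latency extraction and group comparison on
# simulated difference waveforms.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
times = np.arange(-0.1, 0.4, 0.001)            # seconds, assumed 1 kHz sampling
window = (times >= 0.1) & (times <= 0.25)      # assumed MMN search window

def peak_latency(diff_wave):
    """Latency of the most negative point of the difference wave within the window."""
    idx = np.argmin(diff_wave[window])
    return times[window][idx]

musicians    = np.array([peak_latency(rng.normal(size=times.size)) for _ in range(20)])
nonmusicians = np.array([peak_latency(rng.normal(size=times.size)) for _ in range(20)])

t, p = stats.ttest_ind(musicians, nonmusicians)
print(f"group latency difference: t = {t:.2f}, p = {p:.3f}")
```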

    Faster maturation of selective attention in musically trained children and adolescents: Converging behavioral and event-related potential evidence

    Previous work suggests that musical training in childhood is associated with enhanced executive functions. However, it is unknown whether this advantage extends to selective attention, another central aspect of executive control. We recorded a well-established event-related potential (ERP) marker of distraction, the P3a, during an audio-visual task to investigate the maturation of selective attention in musically trained children and adolescents aged 10-17 years and a control group of untrained peers. The task required categorization of visual stimuli while a sequence of standard sounds and distracting novel sounds was presented in the background. The music group outperformed the control group in the categorization task, and the younger children in the music group showed a smaller P3a to the distracting novel sounds than their peers in the control group. Also, a negative response elicited by the novel sounds in the N1/MMN time range (approximately 150-200 ms) was smaller in the music group. These results indicate that the music group was less easily distracted by the task-irrelevant sound stimulation and gated the neural processing of the novel sounds more efficiently than the control group. Furthermore, we replicated our previous finding that, relative to the control group, the musically trained children and adolescents performed faster in standardized tests of inhibition and set shifting. These results provide novel converging behavioral and electrophysiological evidence from a cross-modal paradigm for accelerated maturation of selective attention in musically trained children and adolescents and corroborate the association between musical training and enhanced inhibition and set shifting.
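
    As a rough illustration of how the response in the N1/MMN time range might be quantified, the sketch below averages a difference waveform over an assumed 150-200 ms window per subject and compares the groups with an independent-samples t-test. Sampling rate, window, group sizes, and simulated data are placeholders, not values from the study.

```python
# Illustrative sketch only: mean amplitude in a fixed time window per subject,
# compared between a music group and a control group on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
times = np.arange(-0.1, 0.6, 0.002)            # seconds, assumed 500 Hz sampling
window = (times >= 0.150) & (times <= 0.200)   # assumed N1/MMN window

def mean_amplitude(erp):
    """Average voltage of the novel-minus-standard difference wave in the window."""
    return erp[window].mean()

music_group   = np.array([mean_amplitude(rng.normal(size=times.size)) for _ in range(30)])
control_group = np.array([mean_amplitude(rng.normal(size=times.size)) for _ in range(30)])

t, p = stats.ttest_ind(music_group, control_group)
print(f"group difference in 150-200 ms amplitude: t = {t:.2f}, p = {p:.3f}")
```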