
    Cognitive Components of Regularity Processing in the Auditory Domain

    BACKGROUND: The processing of music-syntactic irregularities often co-occurs with the processing of physical irregularities. In this study we constructed chord sequences such that perceived differences in the cognitive processing of regular and irregular chords could not be due to the sensory processing of acoustic factors like pitch repetition or pitch commonality (the major component of 'sensory dissonance'). METHODOLOGY/PRINCIPAL FINDINGS: Two groups of subjects (musicians and nonmusicians) were investigated with electroencephalography (EEG). Irregular chords elicited an early right anterior negativity (ERAN) in the event-related brain potentials (ERPs). The ERAN had a latency of around 180 ms after the onset of the music-syntactically irregular chords and reached its maximum amplitude over right anterior electrode sites. CONCLUSIONS/SIGNIFICANCE: Because the irregular chords were hardly detectable on the basis of acoustic factors (such as pitch repetition and sensory dissonance), this ERAN effect reflects for the most part cognitive (not sensory) components of regularity-based, music-syntactic processing. Our study represents a methodological advance over previous ERP studies investigating the neural processing of music-syntactically irregular chords.
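    As a rough illustration of how such an effect is typically quantified, the sketch below computes a difference wave (irregular minus regular chords) at a single right anterior channel and scores its mean amplitude and peak latency in a window around 180 ms. The data, sampling rate, channel, and analysis window are placeholder assumptions, not the authors' pipeline.

```python
# Hypothetical sketch: quantify an ERAN-like effect as the difference wave
# (irregular minus regular chords) at one right anterior electrode.
# Shapes, sampling rate, and window are illustrative assumptions.
import numpy as np

fs = 500                              # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.6, 1 / fs)      # epoch from -200 ms to 600 ms around chord onset

# Condition-average ERPs for one channel, in microvolts.
# In a real analysis these come from averaging many artifact-free epochs.
erp_regular = np.random.randn(t.size) * 0.5                     # placeholder data
erp_irregular = np.random.randn(t.size) * 0.5 - \
    1.5 * np.exp(-((t - 0.18) ** 2) / (2 * 0.03 ** 2))          # simulated negativity near 180 ms

difference_wave = erp_irregular - erp_regular

# Mean amplitude and peak latency in a window around the expected ERAN latency (150-250 ms).
window = (t >= 0.15) & (t <= 0.25)
eran_amplitude = difference_wave[window].mean()
peak_latency = t[window][np.argmin(difference_wave[window])]

print(f"ERAN mean amplitude: {eran_amplitude:.2f} uV, peak latency: {peak_latency * 1000:.0f} ms")
```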

    The periodic repolarization dynamics index identifies changes in ventricular repolarization oscillations associated with music-induced emotions

    The effect of music on cardiovascular dynamics may be useful in a variety of clinical settings. The aim of this study was to assess whether listening to music of different emotional valence affected ventricular periodic repolarization dynamics (PRD), a recently proposed non-invasive index of sympathetic ventricular modulation. The 12-lead ECG was recorded in 71 healthy volunteers exposed to six 90 s excerpts of pleasant music and unpleasant acoustic stimuli, as well as six 90 s intervals of silence. A 20 s interval was allowed between excerpts, during which the participants were asked to evaluate the previous excerpt. A simulation study was carried out to assess the capability of the algorithm to track fast, small changes in PRD; it showed that the algorithm implemented here has a time-frequency resolution sufficient to capture the fast dynamics observed in this study. PRD was higher during listening to both pleasant and unpleasant music than during silence. There was a (weak) trend for PRD to be higher during listening to pleasant than to unpleasant music, which may indicate a (weak) interaction between the valence of music-induced emotions and the sympathetic ventricular response. PRD increased significantly during the 20 s intervals between conditions, possibly reflecting a sympathetic response to the evaluation task and/or to the expectation of the following excerpt.
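    The abstract does not spell out the PRD algorithm; the sketch below only illustrates the general idea of quantifying slow (< 0.1 Hz) oscillations in a beat-to-beat repolarization series. The dT-angle values, beat timing, resampling rate, and spectral settings are all assumptions and differ from the authors' published method.

```python
# Illustrative sketch (not the authors' algorithm): estimate low-frequency
# (< 0.1 Hz) oscillatory content of a beat-to-beat dT-angle series, in the
# spirit of PRD. All values below are assumptions for illustration.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

# Hypothetical beat times (s) and per-beat dT angles (degrees) over a ~90 s excerpt.
beat_times = np.cumsum(np.full(100, 0.9))          # ~67 bpm, evenly spaced for simplicity
dT = 3.0 + 0.8 * np.sin(2 * np.pi * 0.05 * beat_times) + 0.2 * np.random.randn(beat_times.size)

# Resample the beat-indexed series onto a uniform 2 Hz grid.
fs_uniform = 2.0
t_uniform = np.arange(beat_times[0], beat_times[-1], 1 / fs_uniform)
dT_uniform = interp1d(beat_times, dT, kind="cubic")(t_uniform)

# Power spectral density and power below 0.1 Hz as a simple PRD-like summary.
f, pxx = welch(dT_uniform - dT_uniform.mean(), fs=fs_uniform, nperseg=min(128, t_uniform.size))
lf_band = f <= 0.1
lf_power = pxx[lf_band].sum() * (f[1] - f[0])
print(f"Low-frequency (<0.1 Hz) dT power: {lf_power:.3f} deg^2")
```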

    Effects of Unexpected Chords and of Performer's Expression on Brain Responses and Electrodermal Activity

    BACKGROUND: There is a lack of neuroscientific studies investigating music processing with naturalistic stimuli, and brain responses to real music are thus largely unknown. METHODOLOGY/PRINCIPAL FINDINGS: This study investigates event-related brain potentials (ERPs), skin conductance responses (SCRs), and heart rate (HR) elicited by unexpected chords of piano sonatas as they were originally arranged by the composers and as they were played by professional pianists. From the musical excerpts played by the pianists (with emotional expression), we also created versions without variations in tempo and loudness (without musical expression) to investigate effects of musical expression on ERPs and SCRs. Compared to expected chords, unexpected chords elicited an early right anterior negativity (ERAN, reflecting music-syntactic processing) and an N5 (reflecting processing of meaning information) in the ERPs, as well as clear changes in the SCRs (reflecting that unexpected chords also elicited emotional responses). The ERAN was not influenced by emotional expression, whereas N5 potentials elicited by chords in general (regardless of their chord function) differed between the expressive and the non-expressive condition. CONCLUSIONS/SIGNIFICANCE: These results show that the neural mechanisms of music-syntactic processing operate independently of the emotional qualities of a stimulus, justifying the use of stimuli without emotional expression to investigate the cognitive processing of musical structure. Moreover, the data indicate that musical expression affects the neural mechanisms underlying the processing of musical meaning. Our data are the first to reveal influences of musical performance on ERPs and SCRs, and to show physiological responses to unexpected chords in naturalistic music.
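    A minimal sketch of how an SCR to a chord onset might be scored, assuming a continuous skin conductance trace, a 1 s pre-onset baseline, and a conventional 1-4 s post-onset response window; the trace, sampling rate, and windows are illustrative assumptions rather than the study's actual analysis.

```python
# Minimal sketch (assumed parameters): score a skin conductance response (SCR)
# to a chord onset as the peak rise in a 1-4 s post-onset window relative to a
# 1 s pre-onset baseline. The trace and all settings are placeholders.
import numpy as np

fs = 50                                                 # skin conductance sampling rate (assumed)
sc = np.random.randn(fs * 30).cumsum() * 0.001 + 5.0    # placeholder trace in microsiemens
onset_s = 10.0                                          # chord onset time within the trace (assumed)

onset = int(onset_s * fs)
baseline = sc[onset - fs:onset].mean()                  # 1 s pre-onset baseline
post = sc[onset + fs:onset + 4 * fs]                    # 1-4 s post-onset window
scr_amplitude = max(post.max() - baseline, 0.0)         # SCRs are scored as non-negative rises

print(f"SCR amplitude: {scr_amplitude:.3f} uS")
```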

    Pitch discrimination accuracy in musicians vs. nonmusicians: an event-related potential and behavioral study

    Previously, professional violin players were found to automatically discriminate tiny pitch changes that are not discriminable by nonmusicians. The present study addressed pitch processing accuracy in musicians with expertise in playing a wide selection of instruments (e.g., piano, wind, and string instruments). Of specific interest was whether musicians with such divergent backgrounds also show facilitated accuracy at automatic and/or attentive levels of auditory processing. Thirteen professional musicians and 13 nonmusicians were presented with frequent standard sounds and rare deviant sounds (0.8, 2, or 4% higher in frequency). Auditory event-related potentials evoked by these sounds were recorded while the subjects first read a self-chosen book and then behaviorally indicated the detection of sounds with deviant frequency. Musicians detected the pitch changes faster and more accurately than nonmusicians. The N2b and P3 responses recorded during attentive listening had larger amplitudes in musicians than in nonmusicians. Interestingly, the superiority of musicians over nonmusicians in pitch discrimination accuracy was observed not only with the 0.8% but also with the 2% frequency changes. Moreover, nonmusicians also detected the smallest pitch changes of 0.8% quite reliably. However, the mismatch negativity (MMN) and P3a recorded during the reading condition did not differentiate musicians and nonmusicians. These results suggest that musical expertise may exert its effects only at attentive levels of processing and not necessarily already at preattentive levels.
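    For orientation, the deviant magnitudes translate into concrete frequencies once a standard frequency is fixed; the sketch below assumes a 500 Hz standard (not stated in the abstract) and compares hypothetical hit rates between the two groups with a simple t-test. All behavioral values are simulated placeholders.

```python
# Illustration with an assumed 500 Hz standard tone (the abstract does not state it):
# compute the deviant frequencies and compare hypothetical hit rates between groups.
import numpy as np
from scipy import stats

standard_hz = 500.0                         # assumed standard frequency
deviance = np.array([0.008, 0.02, 0.04])    # 0.8%, 2%, 4% upward deviants
deviant_hz = standard_hz * (1 + deviance)
print("Deviant frequencies (Hz):", np.round(deviant_hz, 1))   # [504. 510. 520.]

# Hypothetical per-subject hit rates for the 0.8% deviant (13 subjects per group).
rng = np.random.default_rng(0)
hits_musicians = rng.normal(0.85, 0.08, 13).clip(0, 1)
hits_nonmusicians = rng.normal(0.70, 0.12, 13).clip(0, 1)

t, p = stats.ttest_ind(hits_musicians, hits_nonmusicians)
print(f"Group difference in hit rate: t = {t:.2f}, p = {p:.3f}")
```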

    Music perception in cochlear implant users: An event-related potential study

    Objective: To compare the processing of music-syntactic irregularities and physical oddballs between cochlear implant (CI) users and matched controls. Methods: Musical chord sequences were presented, some of which contained functionally irregular chords or a chord with an instrumental timbre that deviated from the standard timbre. Results: In both controls and CI users, functionally irregular chords elicited early (around 200 ms) and late (around 500 ms) negative electric brain responses (early right anterior negativity, ERAN, and N5). Amplitudes of the effects depended on the degree of music-syntactic irregularity in both groups; effects elicited in CI users were distinctly smaller than in controls. Physically deviant chords elicited a timbre mismatch negativity (MMN) and a P3 in both groups, again with smaller amplitudes in CI users. Conclusions: ERAN and N5 (as well as timbre MMN and P3) can be elicited in CI users. Although the amplitudes of the effects were considerably smaller in the CI group, the presence of the MMN and ERAN indicates that neural mechanisms of both physical and music-syntactic irregularity detection were active in this group.
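    A sketch of how such a group contrast could be tested once per-subject ERAN amplitudes (irregular minus regular mean amplitudes) have been extracted; the amplitude values, group sizes, and tests below are placeholder assumptions, not the study's data or statistics.

```python
# Sketch (placeholder data): compare per-subject ERAN amplitudes between
# cochlear implant users and matched controls with an independent-samples t-test,
# and test the presence of the effect within each group against zero.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
eran_controls = rng.normal(-1.8, 0.6, 20)   # hypothetical difference-wave amplitudes (uV)
eran_ci_users = rng.normal(-0.7, 0.6, 20)   # smaller (less negative) effect assumed for CI users

t, p = stats.ttest_ind(eran_ci_users, eran_controls)
print(f"CI users vs controls, ERAN amplitude: t = {t:.2f}, p = {p:.4f}")

for name, values in [("controls", eran_controls), ("CI users", eran_ci_users)]:
    t1, p1 = stats.ttest_1samp(values, 0.0)
    print(f"{name}: mean = {values.mean():.2f} uV, t = {t1:.2f}, p = {p1:.4f}")
```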

    Comparing the Processing of Music and Language Meaning Using EEG and fMRI Provides Evidence for Similar and Distinct Neural Representations

    Recent demonstrations that music is capable of conveying semantically meaningful information have raised several questions as to what the underlying mechanisms of establishing meaning in music are, and whether the meaning of music is represented in a comparable fashion to language meaning. This paper presents evidence that expressed affect is a primary pathway to musical meaning and that meaning in music is represented in a very similar fashion to language meaning. In two experiments using EEG and fMRI, it was shown that single chords varying in harmonic roughness (consonance/dissonance), and thus in perceived affect, could prime the processing of subsequently presented affective target words, as indicated by an increased N400 and activation of the right middle temporal gyrus (MTG). Most importantly, however, when primed by affective words, single chords incongruous with the preceding affect also elicited an N400 and activated the right posterior STS, an area implicated in processing the meaning of a variety of signals (e.g., prosody, voices, motion). This provides an important piece of evidence that music meaning is represented in a very similar but also distinct fashion to language meaning: both elicit an N400, but they activate different portions of the right temporal lobe.
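    The following sketch illustrates, with simulated single-trial waveforms, how an N400 priming effect of this kind is commonly scored as the mean amplitude in a 300-500 ms post-target window; the data, channel, window, and effect sizes are assumptions for illustration only.

```python
# Illustrative sketch (assumed data and window): score the N400 as the mean amplitude
# in a 300-500 ms window and compare congruent vs incongruent prime-target pairs.
import numpy as np

fs = 250
t = np.arange(-0.2, 0.8, 1 / fs)
n_trials = 40
rng = np.random.default_rng(2)

def simulate(n400_uv):
    # Single-trial waveforms containing an N400-like negativity of the given size.
    noise = rng.normal(0, 2.0, (n_trials, t.size))
    component = n400_uv * np.exp(-((t - 0.4) ** 2) / (2 * 0.06 ** 2))
    return noise + component

congruent = simulate(-0.5)
incongruent = simulate(-2.5)     # larger N400 assumed for affectively incongruous targets

window = (t >= 0.3) & (t <= 0.5)
n400_congruent = congruent[:, window].mean(axis=1)
n400_incongruent = incongruent[:, window].mean(axis=1)
print(f"Mean N400: congruent {n400_congruent.mean():.2f} uV, "
      f"incongruent {n400_incongruent.mean():.2f} uV")
```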

    The gray matter volume of the amygdala is correlated with the perception of melodic intervals: a voxel-based morphometry study

    Music is not simply a series of organized pitches, rhythms, and timbres; it is also capable of evoking emotions. In the present study, voxel-based morphometry (VBM) was employed to explore the neural basis that may link music to emotion. To do this, we identified the neuroanatomical correlates of the ability to extract pitch interval size in a music segment (i.e., interval perception) in a large population of healthy young adults (N = 264). Behaviorally, we found that interval perception was correlated with daily emotional experiences, indicating an intrinsic link between music and emotion. Neurally, and as expected, we found that interval perception was positively correlated with the gray matter volume (GMV) of the bilateral temporal cortex. More importantly, a larger GMV of the bilateral amygdala was associated with better interval perception, suggesting that the amygdala, a neural substrate of emotional processing, is also involved in music processing. In sum, our study provides some of the first neuroanatomical evidence for an association between the amygdala and music, which contributes to our understanding of exactly how music evokes emotional responses.
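    A brain-behavior association of this kind is often tested as a correlation between regional GMV and the behavioral score while controlling for global covariates; the sketch below uses simulated data and assumes total intracranial volume as the nuisance covariate, which is an illustrative choice rather than the authors' exact model.

```python
# Sketch with simulated data: correlate amygdala gray matter volume with an
# interval-perception score, controlling for total intracranial volume (TIV),
# a common nuisance covariate in VBM (the covariate and all values are assumptions).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 264
tiv = rng.normal(1500, 120, n)                          # total intracranial volume (cm^3)
amygdala_gmv = 0.001 * tiv + rng.normal(1.6, 0.15, n)   # amygdala GMV (cm^3), scales with TIV
interval_score = 0.5 * stats.zscore(amygdala_gmv) + rng.normal(0, 1, n)

def residualize(y, x):
    # Remove the linear effect of x from y.
    beta = np.polyfit(x, y, 1)
    return y - np.polyval(beta, x)

# Partial correlation: residualize both variables on TIV, then correlate the residuals.
r, p = stats.pearsonr(residualize(amygdala_gmv, tiv), residualize(interval_score, tiv))
print(f"Partial correlation (GMV ~ interval perception | TIV): r = {r:.2f}, p = {p:.4f}")
```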

    Analog Computer Research

    Contains reports on three research projects.
