16 research outputs found

    Independent component processes underlying emotions during natural music listening

    The aim of this study was to investigate the brain processes underlying emotions during natural music listening. To address this, we recorded high-density electroencephalography (EEG) from 22 subjects while presenting a set of individually matched whole musical excerpts varying in valence and arousal. Independent component analysis was applied to decompose the EEG data into functionally distinct brain processes. A k-means cluster analysis, calculated on the basis of a combination of spatial (scalp topography and dipole location mapped onto the Montreal Neurological Institute brain template) and functional (spectra) characteristics, revealed 10 clusters corresponding to brain areas typically involved in music and emotion processing, namely in the proximity of thalamic-limbic and orbitofrontal regions as well as at frontal, fronto-parietal, parietal, parieto-occipital, temporo-occipital, and occipital areas. This analysis revealed that arousal was associated with a suppression of power in the alpha frequency range, whereas valence was associated with an increase in theta frequency power in response to excerpts inducing happiness compared to sadness. These findings are partly compatible with the model proposed by Heller, which argues that the frontal lobe is involved in modulating valenced experiences (the left frontal hemisphere for positive emotions) whereas the right parieto-temporal region contributes to emotional arousal.
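
    The ICA-plus-clustering pipeline described above can be sketched in a few lines. This is a minimal, self-contained illustration on synthetic data using scikit-learn's FastICA and KMeans; the array sizes, the spectra-only feature vector, and the number of clusters are assumptions for demonstration, not the study's actual parameters.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Simulate 8 "channels" of EEG-like data as mixtures of 4 latent sources.
n_samples, n_sources, n_channels = 2000, 4, 8
sources = rng.standard_normal((n_samples, n_sources)) ** 3  # non-Gaussian
mixing = rng.standard_normal((n_sources, n_channels))
eeg = sources @ mixing

# Step 1: ICA unmixes the channel data into independent components.
ica = FastICA(n_components=n_sources, random_state=0)
components = ica.fit_transform(eeg)  # shape: (n_samples, n_sources)

# Step 2: cluster the components by a functional feature vector. Here each
# component is summarized by its normalized power spectrum (a stand-in for
# the combined spatial + spectral features used in the study).
spectra = np.abs(np.fft.rfft(components, axis=0)) ** 2
features = spectra.T / spectra.T.sum(axis=1, keepdims=True)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
print(components.shape, kmeans.labels_.shape)
```

    In the study itself, clustering was run across components pooled over subjects, with dipole locations and scalp topographies included in the feature vector; the sketch only shows the two-stage structure of the analysis.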

    Auditory aversion in absolute pitch possessors

    Absolute pitch (AP) refers to the ability to identify the pitch of a given tone without reliance on any reference pitch. The downside of possessing AP may be the experience of disturbance when exposed to out-of-tune tones. Here, we investigated this so-far unexplored phenomenon in AP, which we refer to as auditory aversion. Electroencephalography (EEG) was recorded in a sample of AP possessors and matched control musicians without AP while they performed a task based on an affective priming paradigm: participants judged valenced pictures preceded by musical primes as quickly and accurately as possible. The primes were bimodal, presented as tones in combination with visual notations that either matched or mismatched the actually presented tone. Both samples performed better in judging unpleasant pictures than pleasant ones. In comparison with the control musicians, the AP possessors revealed a more profound discrepancy between the two valence conditions, and their EEG revealed later peaks at around 200 ms (P200) after prime onset. Their performance dropped when responding to pleasant pictures preceded by incongruent primes, especially when mistuned by one semitone. This interference was also reflected in an EEG deflection at around 400 ms (N400) after picture onset, preceding the behavioral responses. These findings suggest that AP possessors process mistuned musical stimuli and pleasant pictures as affectively unrelated to each other, supporting an aversion towards out-of-tune tones in AP possessors. The longer prime-related P200 latencies exhibited by AP possessors suggest a delay in integrating musical stimuli, reflecting an altered affinity towards pitch-label associations.
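
    Peak latency measures such as the P200 reported above are conventionally read off as the maximum of the averaged ERP within a component-specific search window. A hypothetical sketch on a synthetic ERP follows; the sampling rate, window bounds, and waveform are illustrative assumptions, not the study's values.

```python
import numpy as np

fs = 500  # Hz, assumed sampling rate
t = np.arange(-0.1, 0.6, 1 / fs)  # epoch: -100 to 600 ms around prime onset

# Synthetic ERP: a Gaussian positivity peaking near 210 ms.
erp = 2.0 * np.exp(-((t - 0.21) ** 2) / (2 * 0.03 ** 2))

# P200 latency: largest positivity within a 150-280 ms search window.
win = (t >= 0.15) & (t <= 0.28)
peak_idx = np.argmax(erp[win])
p200_latency_ms = t[win][peak_idx] * 1000
print(round(p200_latency_ms))  # ~210
```

    A group difference like the one reported (later P200 peaks in AP possessors) would show up as a systematic shift of `p200_latency_ms` between samples.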

    Time course of EEG oscillations during repeated listening of a well-known aria

    While previous studies have analyzed mean neurophysiological responses to musical stimuli, the current study aimed to identify specific time courses of electroencephalography (EEG) oscillations, which are associated with dynamic changes in the acoustic features of the musical stimulus. In addition, we were interested in whether these time courses change during a repeated presentation of the same musical piece. A total of 16 subjects repeatedly listened to the well-known aria “Nessun dorma,” sung by Paul Potts, while continuous 128-channel EEG and heart rate, as well as electrodermal responses, were recorded. The time courses for the EEG oscillations were calculated using a time resolution of 1 second for several frequency bands, on the basis of individual alpha-peak frequencies (theta, low alpha-1, low alpha-2, upper alpha, and beta). For all frequency bands, we identified a more or less continuous increase in power relative to a baseline period, indicating strong event-related synchronization (ERS) during music listening. The ERS time courses, however, did not correlate strongly with the time courses of the acoustic features of the aria. In addition, we did not observe changes in EEG oscillations after repeated presentation of the same musical piece. Aside from this distinctive feature, we identified a remarkable variability in EEG oscillations, both within and between the repeated presentations of the aria. We interpret the continuous increase in ERS observed in all frequency bands during music listening as an indicator of a particular neurophysiological and psychological state evoked by music listening. 
We suggest that this state is characterized by increased internal attention (accompanied by reduced external attention), increased inhibition of brain networks not involved in the generation of this internal state, the maintenance of a particular level of general alertness, and a type of brain state that can be described as “mind wandering.” The overall state can be categorized as a psychological process that may be seen as a “drawing in” to the musical piece. However, this state is not stable and varies considerably throughout the music listening session and across subjects. Most important, however, is the finding that the neurophysiological activations occurring during music listening are dynamic and not stationary.
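
    The core quantity in this analysis, event-related synchronization (ERS), is band power expressed relative to a baseline period. A minimal sketch with synthetic data follows; the sampling rate, band limits, and signal construction are illustrative assumptions, not the study's individually adjusted frequency bands.

```python
import numpy as np
from scipy.signal import welch

fs = 250  # Hz, assumed sampling rate
rng = np.random.default_rng(1)

def band_power(x, lo, hi):
    """Mean power spectral density in a frequency band (Welch's method)."""
    f, pxx = welch(x, fs=fs, nperseg=fs)
    return pxx[(f >= lo) & (f <= hi)].mean()

# Baseline: noise only. Listening: noise plus a 10 Hz (alpha) oscillation.
t = np.arange(0, 10, 1 / fs)
baseline = rng.standard_normal(t.size)
listening = rng.standard_normal(t.size) + 0.8 * np.sin(2 * np.pi * 10 * t)

# ERS as percent power change relative to the baseline period.
p_base = band_power(baseline, 8, 12)
p_music = band_power(listening, 8, 12)
ers = 100 * (p_music - p_base) / p_base
print(ers > 0)  # positive: power increased, i.e. synchronization
```

    Repeating this per 1-second window during listening yields the ERS time courses described above; the study additionally anchored band limits to each subject's individual alpha-peak frequency.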

    Warum der Mensch Musik liebt

    Musicality is a fundamental trait of Homo sapiens. Was this capacity for musical passion even a survival advantage during evolution? Insights from the neuropsychological demystification of music, written by Lars Rogenmoser.

    The left dorsal stream causally mediates the tone labeling in absolute pitch

    Absolute pitch (AP) refers to the ability to effortlessly identify given pitches without any reference. Correlative evidence suggests that the left posterior dorsolateral prefrontal cortex (DLPFC) is responsible for the process underlying pitch labeling in AP. Here, we measured the sight-reading performance of right-handed AP possessors and matched controls under cathodal and sham transcranial direct current stimulation of the left DLPFC. The participants were instructed to report notations as accurately and as quickly as possible by playing with their right hand on a piano. The notations were simultaneously presented with distracting auditory stimuli that either matched or mismatched them in different semitone degrees. Unlike the controls, AP possessors revealed an interference effect in that they responded more slowly in the mismatching conditions than in the matching one. Under cathodal stimulation, this interference effect disappeared. These findings confirm that the pitch-labeling process underlying AP occurs automatically and is largely nonsuppressible when triggered by tone exposure. The improvement of the AP possessors' sight-reading performance in response to the suppression of the left DLPFC using cathodal stimulation confirms a causal relationship between this brain structure and pitch labeling.
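
    The interference effect reported here is simply the reaction-time difference between mismatching and matching conditions. A toy illustration with simulated reaction times follows; all numbers are invented for demonstration and do not reflect the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated sight-reading reaction times (ms) for one AP possessor:
# slower when the distracting tone mismatches the notation.
rt_match = rng.normal(600, 40, size=50)
rt_mismatch = rng.normal(700, 40, size=50)

# Interference effect: mean RT difference, mismatch minus match.
interference = rt_mismatch.mean() - rt_match.mean()
print(interference > 0)  # positive difference indicates interference
```

    The study's key contrast is that this difference vanishes under cathodal stimulation of the left DLPFC, i.e. `interference` drops toward zero in that condition.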

    Absolute pitch: evidence for early cognitive facilitation during passive listening as revealed by reduced P3a amplitudes

    Absolute pitch (AP) is the rare ability to identify or produce different pitches without using reference tones. At least two sequential processing stages are assumed to contribute to this phenomenon. The first recruits a pitch memory mechanism at an early stage of auditory processing, whereas the second is driven more by a later cognitive mechanism (pitch labeling). Several investigations have used active tasks, but it is unclear how these two mechanisms contribute to AP during passive listening. The present work investigated the temporal dynamics of tone processing in AP and non-AP (NAP) participants, using EEG. We applied a passive oddball paradigm with between- and within-tone category manipulations and analyzed the MMN reflecting the early stage of auditory processing and, for the first time, the P3a response reflecting the later cognitive mechanism during the second processing stage. Results did not reveal between-group differences in MMN waveforms. However, the P3a response was specifically associated with AP and sensitive to processing different pitch types. Specifically, AP participants exhibited smaller P3a amplitudes, especially in between-tone category conditions, and P3a responses correlated significantly with the age of commencement of musical training, suggesting an influence of early musical exposure on AP. Our results reinforce the current opinion that the representation of pitches at the processing level of the auditory-related cortex is similar among AP and NAP participants, whereas the later processing stage is critical for AP. Results are interpreted as reflecting cognitive facilitation in AP participants, possibly driven by the availability of multiple codes for tones.
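
    In an oddball paradigm, the MMN and P3a are conventionally isolated as a deviant-minus-standard difference wave, with the MMN taken as the negative deflection and the P3a as the subsequent positive one. A synthetic sketch follows; the component latencies, amplitudes, and search windows are illustrative assumptions.

```python
import numpy as np

fs = 500  # Hz, assumed sampling rate
t = np.arange(0, 0.6, 1 / fs)  # 0-600 ms post-stimulus

def bump(peak_ms, amp):
    """Synthetic ERP component: Gaussian deflection peaking at peak_ms."""
    return amp * np.exp(-((t - peak_ms / 1000) ** 2) / (2 * 0.03 ** 2))

# Standard vs deviant tone responses (amplitudes invented for illustration).
standard = bump(100, 1.0)
deviant = bump(100, 1.0) - bump(180, 1.5) + bump(300, 2.0)  # adds MMN + P3a

# Difference wave: deviant minus standard isolates MMN and P3a.
diff = deviant - standard

# MMN: most negative value in 150-250 ms; P3a: most positive in 250-400 ms.
mmn = diff[(t >= 0.15) & (t <= 0.25)].min()
p3a = diff[(t >= 0.25) & (t <= 0.40)].max()
print(mmn < 0, p3a > 0)
```

    The group effect reported above corresponds to a smaller `p3a` value in AP participants while `mmn` remains comparable across groups.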

    Bridging the gap between perceptual and cognitive perspectives on absolute pitch

    Absolute pitch (AP) refers to the rare ability to identify the chroma of a tone or to produce a specific pitch without reference to keyality (e.g., G or C). Previously, AP has been proposed to rely on the distinctive functional-anatomical architecture of the left auditory-related cortex (ARC), this specific trait possibly enabling an optimized early "categorical perception". In contrast, currently prevailing models of AP postulate that cognitive rather than perceptual processes, namely "pitch labeling" mechanisms, more likely constitute the bearing skeleton of AP. This associative memory component has previously been proposed to depend, among other mechanisms, on the recruitment of the left dorsolateral prefrontal cortex (DLPFC) as well as on the integrity of the left arcuate fasciculus, a fiber bundle linking the posterior supratemporal plane with the DLPFC. Here, we attempted to integrate these two apparently conflicting perspectives on AP, namely early "categorical perception" and "pitch labeling". We used electroencephalography and evaluated resting-state intracranial functional connectivity between the left ARC and DLPFC in a sample of musicians with and without AP. Results demonstrate significantly increased left-hemispheric theta phase synchronization in AP compared with non-AP musicians. Within the AP group, this specific electrophysiological marker was predictive of absolute-hearing behavior and explained ∼30% of variance. Thus, we propose that in AP subjects the tonal inputs and the corresponding mnemonic representations are so tightly coupled that the distinctive electrophysiological signature of AP can be saliently detected in only 3 min of resting-state measurements.
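
    Phase synchronization between two sources, as measured here between the left ARC and DLPFC, is commonly quantified with the phase-locking value (PLV). The abstract does not name the exact connectivity metric, so PLV and all signal parameters below are assumptions; the sketch shows the standard bandpass-Hilbert-PLV recipe on synthetic signals sharing a theta rhythm.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250  # Hz, assumed sampling rate
t = np.arange(0, 30, 1 / fs)  # 30 s of "resting-state" data
rng = np.random.default_rng(3)

# Two signals sharing a 6 Hz (theta) oscillation with a fixed phase lag,
# standing in for left ARC and left DLPFC source activity.
sig_a = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)
sig_b = np.sin(2 * np.pi * 6 * t + 0.8) + 0.5 * rng.standard_normal(t.size)

def plv(x, y, lo=4, hi=8):
    """Phase-locking value in a band: 1 = perfect locking, 0 = none."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

print(plv(sig_a, sig_b) > 0.8)  # high locking despite the constant lag
```

    Note that the PLV is insensitive to the size of the phase lag, only to its consistency, which is what makes it a measure of coupling rather than of simultaneity.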
