18 research outputs found

    Hidden sources of joy, fear, and sadness : Explicit versus implicit neural processing of musical emotions

    Music is often used to regulate emotions and mood. Typically, music conveys and induces emotions even when one does not attend to them. Studies on the neural substrates of musical emotions have, however, only examined brain activity when subjects have focused on the emotional content of the music. Here we address with functional magnetic resonance imaging (fMRI) the neural processing of happy, sad, and fearful music with a paradigm in which 56 subjects were instructed to either classify the emotions (explicit condition) or pay attention to the number of instruments playing (implicit condition) in 4-s music clips. In the implicit vs. explicit condition, stimuli activated bilaterally the inferior parietal lobule, premotor cortex, caudate, and ventromedial frontal areas. The cortical dorsomedial prefrontal and occipital areas activated during explicit processing were those previously shown to be associated with the cognitive processing of music and emotion recognition and regulation. Moreover, happiness in music was associated with activity in the bilateral auditory cortex, left parahippocampal gyrus, and supplementary motor area, whereas the negative emotions of sadness and fear corresponded with activation of the left anterior cingulate and middle frontal gyrus and down-regulation of the orbitofrontal cortex. Our study demonstrates for the first time in healthy subjects the neural underpinnings of the implicit processing of brief musical emotions, particularly in frontoparietal, dorsolateral prefrontal, and striatal areas of the brain. (C) 2016 Elsevier Ltd. All rights reserved. Peer reviewed.

    Action in Perception : Prominent Visuo-Motor Functional Symmetry in Musicians during Music Listening

    Musical training leads to sensory and motor neuroplastic changes in the human brain. Motivated by findings on enlarged corpus callosum in musicians and asymmetric somatomotor representation in string players, we investigated the relationship between musical training, callosal anatomy, and interhemispheric functional symmetry during music listening. Functional symmetry was increased in musicians compared to nonmusicians, and in keyboardists compared to string players. This increased functional symmetry was prominent in visual and motor brain networks. Callosal size did not significantly differ between groups except for the posterior callosum in musicians compared to nonmusicians. We conclude that the distinctive postural and kinematic symmetry in instrument playing cross-modally shapes information processing in sensory-motor cortical areas during music listening. This cross-modal plasticity suggests that motor training affects music perception. Peer reviewed.

    Brain integrative function driven by musical training during real-world music listening

    The present research investigated differences in the brain dynamics of continuous, real-world music listening between listeners with and without professional musical training, using functional magnetic resonance imaging (fMRI). A replication study aimed to validate the reliability of the naturalistic approach to studying brain responses to music, wherein the brain signal and the acoustic information extracted from the musical stimulus were correlated. After a successful replication, a series of three studies dealt with differences in integrative brain function during music listening between musicians and nonmusicians. Findings (a) emphasized the crucial influence of the distinctive postural and kinematic symmetry of instrument playing on the symmetry of brain responses to music listening, evidencing a crossmodal transfer of symmetry from sensorimotor to perceptual processing systems; (b) provided novel evidence for increased cerebello-hippocampal functional coupling in musicians as a function of musical predictability compared to nonmusicians, likely mediated by action simulation mechanisms; (c) highlighted differences in pulse clarity processing between groups and uncovered an associated action-perception network overlapping with areas previously observed to tightly interact in rhythm processing. In conclusion, the present research findings, obtained using a naturalistic auditory stimulation paradigm, will advance the understanding of brain integrative function during real-world music listening in listeners with and without musical expertise. In particular, this thesis has implications for a better understanding of training-induced crossmodal reorganization. The new evidence brought by the present findings will hopefully guide the generation and development of future testable hypotheses.

    Dynamics of brain activity underlying working memory for music in a naturalistic condition

    Working memory (WM) is at the core of any cognitive function, as it is necessary for the integration of information over time. Despite WM’s critical role in high-level cognitive functions, its implementation in neural tissue is poorly understood. Preliminary studies on auditory WM show differences between linguistic and musical memory, leading to speculation about specific neural networks encoding memory for music. Moreover, in neuroscience, WM has not been studied in naturalistic listening conditions but rather in artificial settings (e.g., n-back and Sternberg tasks). Western tonal music provides naturally occurring motivic repetition and variation, recognizable units serving as WM triggers, thus allowing us to study the phenomenon of motif-tracking in the context of real music. Adopting a modern tango as the stimulus, behavioural methods were used to identify the stimulus motifs and build a time-course predictor of WM neural responses. This predictor was then correlated with the participants’ functional magnetic resonance imaging (fMRI) signal obtained during a continuous listening condition. Neural correlates related to the sensory processing of a set of musical features were filtered out from the brain responses to music to aid in the exclusive recruitment of executive processes of music-related WM. Correlational analysis revealed a widely distributed network of cortical and subcortical areas, predominantly right-lateralized, responding to the WM condition, including ventral and dorsal areas in the prefrontal cortex, basal ganglia, and limbic areas. Significant subcortical processing areas, active in response to the WM condition, were pruned with the removal of the acoustic content, suggesting these music-related perceptual processing areas might aid in the encoding and retrieval of WM. The pattern of dispersed neural activity indicates that WM emerges coherently from the integration of distributed activity spread over different brain subsystems (motor-, cognitive-, and sensory-related areas of the brain).
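The correlational step described above, relating a behaviourally derived predictor time course to the fMRI signal, can be sketched as follows. This is a minimal illustration on synthetic data: the predictor, the "voxel" time series, and all variable names are stand-ins, not the study's actual pipeline.

```python
# Minimal sketch: voxel-wise Pearson correlation with a predictor time course.
import numpy as np

rng = np.random.default_rng(0)
n_scans, n_voxels = 200, 50

# Hypothetical WM predictor: 1 at scans where a known motif recurs, then z-scored.
predictor = np.zeros(n_scans)
predictor[np.arange(10, n_scans, 25)] = 1.0
predictor = (predictor - predictor.mean()) / predictor.std()

# Synthetic "fMRI" data: noise, with the predictor injected into the first 5 voxels.
data = rng.standard_normal((n_scans, n_voxels))
data[:, :5] += 2.0 * predictor[:, None]

# Correlate every voxel time series with the predictor.
data_z = (data - data.mean(axis=0)) / data.std(axis=0)
r = data_z.T @ predictor / n_scans

top = np.argsort(r)[-5:]  # voxels most strongly tracking the predictor
print(sorted(int(v) for v in top))
```

In a real analysis the predictor would first be convolved with a haemodynamic response function and nuisance signals regressed out; those steps are omitted here for brevity.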

    Coupling of Action-Perception Brain Networks during Musical Pulse Processing : Evidence from Region-of-Interest-Based Independent Component Analysis

    Our sense of rhythm relies on orchestrated activity of several cerebral and cerebellar structures. Although functional connectivity studies have advanced our understanding of rhythm perception, this phenomenon has not been sufficiently studied as a function of musical training and beyond the General Linear Model (GLM) approach. Here, we studied pulse clarity processing during naturalistic music listening using a data-driven approach (independent component analysis; ICA). Participants’ (18 musicians and 18 controls) functional magnetic resonance imaging (fMRI) responses were acquired while they listened to music. A targeted region of interest (ROI) related to pulse clarity processing was defined, comprising auditory, somatomotor, basal ganglia, and cerebellar areas. The ICA decomposition was performed under different model orders, i.e., under a varying number of assumed independent sources, to avoid relying on prior model order assumptions. The components best predicted by a measure of the pulse clarity of the music, extracted computationally from the musical stimulus, were identified. Their corresponding spatial maps uncovered a network of auditory (perception) and motor (action) areas in an excitatory-inhibitory relationship at lower model orders, while mainly constrained to the auditory areas at higher model orders. Results revealed (a) a strengthened functional integration of action-perception networks associated with pulse clarity perception hidden from GLM analyses, and (b) group differences between musicians and non-musicians in pulse clarity processing, suggesting lifelong musical training as an important factor that may influence beat processing. Peer reviewed.
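The model-order strategy above, decomposing the data at several assumed source counts and selecting the component best predicted by the stimulus regressor, can be sketched like this. Everything here is synthetic and illustrative: the "pulse clarity" series, the mixing, and the chosen orders are assumptions, not the study's setup.

```python
# Sketch: ICA at several model orders; pick the component best matching a regressor.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n_scans, n_regions = 300, 30

# Synthetic "pulse clarity" regressor and data mixing it with noise sources.
pulse_clarity = np.sin(np.linspace(0, 12 * np.pi, n_scans))
sources = rng.standard_normal((n_scans, n_regions))
sources[:, 0] = 5.0 * pulse_clarity                  # one strong stimulus-driven source
data = sources @ rng.standard_normal((n_regions, n_regions))

best = {}
for order in (5, 10, 20):                            # varying model orders, as above
    comps = FastICA(n_components=order, random_state=0, max_iter=500).fit_transform(data)
    r = [abs(np.corrcoef(comps[:, i], pulse_clarity)[0, 1]) for i in range(order)]
    best[order] = max(r)

for order, r_max in best.items():
    print(f"model order {order:2d}: best |r| = {r_max:.2f}")
```

Because ICA recovers sources only up to sign and order, the absolute correlation is used when matching components to the regressor.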

    Decoding Musical Training from Dynamic Processing of Musical Features in the Brain

    Pattern recognition on neural activations from naturalistic music listening has been successful at predicting neural responses of listeners from musical features, and vice versa. Inter-subject differences in the decoding accuracies have arisen partly from musical training, which has widely recognized structural and functional effects on the brain. We propose and evaluate a decoding approach aimed at predicting the musicianship class of an individual listener from dynamic neural processing of musical features. Whole-brain functional magnetic resonance imaging (fMRI) data were acquired from musicians and nonmusicians while they listened to three musical pieces from different genres. Six musical features, representing low-level (timbre) and high-level (rhythm and tonality) aspects of music perception, were computed from the acoustic signals, and classification into musicians and nonmusicians was performed on the musical feature and parcellated fMRI time series. Cross-validated classification accuracy reached 77% with nine regions, comprising frontal and temporal cortical regions, caudate nucleus, and cingulate gyrus. The processing of high-level musical features at the right superior temporal gyrus was most influenced by listeners’ musical training. The study demonstrates the feasibility of decoding musicianship from how individual brains listen to music, attaining accuracy comparable to current results from automated clinical diagnosis of neurological and psychological disorders. Peer reviewed.
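The cross-validated two-group classification described above can be sketched as follows. The subjects, feature summaries, group sizes, and classifier choice here are simulated assumptions for illustration; the study's actual pipeline operated on musical-feature and parcellated fMRI time series.

```python
# Sketch: cross-validated classification of two listener groups from features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_per_group, n_features = 18, 9   # 18 subjects per group, 9 candidate regions

# Simulated per-subject features: "musicians" shifted on three of nine dimensions.
nonmusicians = rng.standard_normal((n_per_group, n_features))
shift = np.concatenate([np.full(3, 2.0), np.zeros(6)])
musicians = rng.standard_normal((n_per_group, n_features)) + shift
X = np.vstack([nonmusicians, musicians])
y = np.array([0] * n_per_group + [1] * n_per_group)

# Stratified 6-fold cross-validation keeps both classes in every fold.
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=6).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```

With only 36 subjects, cross-validation is essential: a classifier fit and scored on the same data would report inflated accuracy.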

    Influence of Musical Expertise on the Processing of Musical Features in a Naturalistic Setting

    Musical training causes structural and functional changes in the brain due to its sensory-motor demands, but the modulatory effect of musical training on music feature processing in the brain in a continuous music listening paradigm has not been investigated thus far. In this work, we investigate the differences between musicians and non-musicians in the encoding of musical features encompassing musical timbre, rhythm, and tone. Eighteen musicians and 18 non-musicians were scanned using fMRI while listening to three varied stimuli. Acoustic features corresponding to timbre, rhythm, and tone were computationally extracted from the stimuli and correlated with brain responses, followed by t-tests on group-level maps to uncover encoding differences between the two groups. The musicians demonstrated greater involvement of limbic and reward regions, and of regions possessing adaptations to music processing due to training, indicating greater analytic processing. However, as a group, they did not exhibit large regions of consistent correlation patterns, especially in processing high-level features, due to differences in processing strategies arising from their varied training. The non-musicians exhibited broader regions of correlations, implying greater similarities in bottom-up sensory processing. Non-peer reviewed.

    The chronnectome of musical beat

    Keeping time is fundamental for our everyday existence. Various isochronous activities, such as locomotion, require us to use internal timekeeping. This phenomenon also comes into play in other human pursuits, such as dance and music. When listening to music, we spontaneously perceive and predict its beat. The process of beat perception comprises both beat inference and beat maintenance, their relative importance depending on the salience of the beat in the music. To study functional connectivity associated with these processes in a naturalistic situation, we used functional magnetic resonance imaging to measure brain responses of participants while they were listening to a piece of music containing strong contrasts in beat salience. Subsequently, we utilized dynamic graph analysis and psychophysiological interactions (PPI) analysis in connection with computational modelling of beat salience to investigate how functional connectivity manifests these processes. As the main effect, correlation analyses between the obtained dynamic graph measures and the beat salience measure revealed increased centrality in auditory-motor cortices, cerebellum, and extrastriate visual areas during low beat salience, whereas regions of the default mode and central executive networks displayed high centrality during high beat salience. PPI analyses revealed partial dissociation of functional networks belonging to this pathway, indicating complementary neural mechanisms crucial in beat inference and maintenance, processes pivotal for extracting and predicting temporal regularities in our environment.
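The core of a dynamic graph analysis, estimating connectivity in sliding windows and tracking a centrality measure per region over time, can be sketched as below. The time series are synthetic, and the window length, threshold, and simple degree centrality are arbitrary stand-ins for the study's actual graph measures.

```python
# Sketch: sliding-window connectivity and degree centrality over time.
import numpy as np

rng = np.random.default_rng(3)
n_scans, n_regions, win = 240, 12, 40

data = rng.standard_normal((n_scans, n_regions))
# Couple regions 0-3 in the second half of the run to mimic a network-state change.
shared = rng.standard_normal(n_scans)
data[n_scans // 2:, :4] += 2.0 * shared[n_scans // 2:, None]

centrality = []
for start in range(0, n_scans - win + 1, win):
    c = np.corrcoef(data[start:start + win].T)                 # windowed connectivity
    adj = (np.abs(c) > 0.5) & ~np.eye(n_regions, dtype=bool)   # binarize, no self-loops
    centrality.append(adj.sum(axis=1))                         # degree per region
centrality = np.array(centrality)

print(centrality)  # rows: windows; columns: regions
```

A per-window centrality matrix like this is what would then be correlated against a time-varying stimulus measure such as beat salience.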

    On application of kernel PCA for generating stimulus features for fMRI during continuous music listening

    Background: There has been growing interest in naturalistic neuroimaging experiments, which deepen our understanding of how the human brain processes and integrates incoming streams of multifaceted sensory information, as commonly occurs in the real world. Music is a good example of such a complex continuous phenomenon. In a few recent fMRI studies examining neural correlates of music in continuous listening settings, multiple perceptual attributes of the music stimulus were represented by a set of high-level features, produced as linear combinations of the acoustic descriptors computationally extracted from the stimulus audio.
    New method: fMRI data from a naturalistic music listening experiment were employed here. Kernel principal component analysis (KPCA) was applied to acoustic descriptors extracted from the stimulus audio to generate a set of nonlinear stimulus features. Subsequently, perceptual and neural correlates of the generated high-level features were examined.
    Results: The generated features captured musical percepts that were hidden from the linear PCA features, namely Rhythmic Complexity and Event Synchronicity. Neural correlates of the new features revealed activations associated with the processing of complex rhythms, including auditory, motor, and frontal areas.
    Comparison with existing method: Results were compared with the findings of a previously published study, which analyzed the same fMRI data but applied linear PCA for generating stimulus features. To enable comparison of the results, the methodology for finding stimulus-driven functional maps was adopted from the previous study.
    Conclusions: Exploiting nonlinear relationships among acoustic descriptors can lead to novel high-level stimulus features, which can in turn reveal new brain structures involved in music processing. Peer reviewed.
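The contrast between linear PCA features and KPCA features over an acoustic descriptor matrix can be sketched as follows. The descriptor matrix here is random, standing in for the computationally extracted acoustic descriptors; the component count and RBF kernel parameters are illustrative assumptions.

```python
# Sketch: linear PCA vs. kernel PCA features from an acoustic descriptor matrix.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(4)
n_frames, n_descriptors = 500, 8      # e.g., one descriptor vector per analysis frame
descriptors = rng.standard_normal((n_frames, n_descriptors))

# Linear features: orthogonal linear combinations of the descriptors.
linear_feats = PCA(n_components=3).fit_transform(descriptors)

# Nonlinear features: principal components in an RBF kernel feature space.
nonlinear_feats = KernelPCA(n_components=3, kernel="rbf", gamma=0.1).fit_transform(descriptors)

print(linear_feats.shape, nonlinear_feats.shape)
```

Either feature set yields one time series per component, which can then serve as a stimulus regressor for the fMRI analysis; the kernel variant can capture percepts that are nonlinear functions of the raw descriptors.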

    Music style not only modulates the auditory cortex, but also motor related areas

    The neuroscience of music has recently attracted significant attention, but the effect of music style on the activation of auditory-motor regions has not been explored. The aim of the present study is to analyze the differences in brain activity during passive listening to non-vocal excerpts of four different music genres (classical, reggaeton, electronic, and folk). A functional magnetic resonance imaging (fMRI) experiment was performed. Twenty-eight participants with no musical training were included in the study. They passively listened to music excerpts of the above genres during fMRI acquisition. Imaging analysis was performed at the whole-brain level and in auditory-motor regions of interest (ROIs). Furthermore, the musical competence of each participant was measured and its relationship with brain activity in the studied ROIs was analyzed. The whole-brain analysis showed higher brain activity in auditory-related areas during reggaeton listening than during the other music genres. The ROI analysis showed that reggaeton led to higher activity not only in auditory-related areas but also in some motor-related areas, mainly when compared with classical music. A positive relationship between the melodic-MET score and brain activity during reggaeton listening was identified in some auditory- and motor-related areas. The findings revealed that listening to different music styles elicits different brain activity in auditory- and motor-related areas in musically inexperienced subjects. Reggaeton was, among the studied music genres, the one that evoked the highest activity in the auditory-motor network. These findings are discussed in connection with acoustic analyses of the musical stimuli. Peer reviewed.