    Advances in the Neurocognition of Music and Language

    Neurocomparative music and language research has seen major advances over the past two decades. The goal of this Special Issue, "Advances in the Neurocognition of Music and Language", was to showcase the multiple neural analogies between musical and linguistic information processing and their entwined organization in human perception and cognition, and to infer the applicability of the combined knowledge to pedagogy and therapy. Here, we summarize the main insights provided by the contributions and integrate them into current frameworks of rhythm processing, neuronal entrainment, predictive coding and cognitive control.

    Hippocampal sclerosis affects fMR-adaptation of lyrics and melodies in songs

    Songs constitute a natural combination of lyrics and melodies, but it is unclear whether and how these two song components are integrated during the emergence of a memory trace. Network theories of memory suggest a prominent role of the hippocampus, together with unimodal sensory areas, in the build-up of conjunctive representations. The present study tested the modulatory influence of the hippocampus on neural adaptation to songs in lateral temporal areas. Patients with unilateral hippocampal sclerosis and healthy matched controls were presented with blocks of short songs in which lyrics and/or melodies were varied or repeated in a crossed factorial design. Neural adaptation effects were taken as correlates of incidentally emerging memory traces. We hypothesized that hippocampal lesions, particularly in the left hemisphere, would weaken adaptation effects, especially the integration of lyrics and melodies. Results revealed that lateral temporal lobe regions showed weaker adaptation to repeated lyrics as well as a reduced interaction of the adaptation effects for lyrics and melodies in patients with left hippocampal sclerosis. This suggests a deficient build-up of a sensory memory trace for lyrics and a reduced integration of lyrics with melodies, compared to healthy controls. Patients with right hippocampal sclerosis showed a similar profile of results, although the effects did not reach significance in this population. We highlight the finding that the integrated representation of lyrics and melodies typically shown in healthy participants is likely tied to the integrity of the left medial temporal lobe. This novel finding provides the first neuroimaging evidence for the role of the hippocampus during repetitive exposure to lyrics and melodies and their integration into a song.
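    A minimal sketch of how the 2 x 2 (lyrics x melody: repeated vs. varied) adaptation contrast described above could be computed from region-of-interest signals. This is not the authors' pipeline; the ROI data, group size and condition coding are illustrative assumptions.

    # Hypothetical 2 x 2 fMR-adaptation contrast (not the authors' analysis)
    import numpy as np
    from scipy import stats

    # roi_signal[subject, lyrics, melody]: mean BOLD in a lateral temporal ROI,
    # index 0 = repeated, 1 = varied (toy data standing in for real estimates)
    rng = np.random.default_rng(0)
    roi_signal = rng.normal(size=(20, 2, 2))

    # Adaptation effect = response to varied minus repeated stimuli
    lyrics_adaptation = roi_signal[:, 1, :].mean(axis=1) - roi_signal[:, 0, :].mean(axis=1)
    melody_adaptation = roi_signal[:, :, 1].mean(axis=1) - roi_signal[:, :, 0].mean(axis=1)

    # Interaction: does melody adaptation depend on whether lyrics repeat?
    interaction = (roi_signal[:, 1, 1] - roi_signal[:, 1, 0]) - (roi_signal[:, 0, 1] - roi_signal[:, 0, 0])

    for name, effect in [("lyrics", lyrics_adaptation),
                         ("melody", melody_adaptation),
                         ("lyrics x melody", interaction)]:
        t, p = stats.ttest_1samp(effect, 0.0)
        print(f"{name}: t = {t:.2f}, p = {p:.3f}")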

    Instantaneous Neural Processing of Communicative Functions Conveyed by Speech Prosody

    During conversations, speech prosody provides important clues about the speaker’s communicative intentions. In many languages, a rising vocal pitch at the end of a sentence typically expresses a question function, whereas a falling pitch suggests a statement. Here, the neurophysiological basis of intonation and speech act understanding was investigated with high-density electroencephalography (EEG) to determine whether prosodic features are reflected at the neurophysiological level. Already approximately 100 ms after the onset of the sentence-final word, questions and statements expressed with the same sentences but differing in prosody led to different neurophysiological activity recorded in the event-related potential. Interestingly, low-pass filtered sentences and acoustically matched nonvocal musical signals failed to show any neurophysiological dissociations, thus suggesting that the physical intonation alone cannot explain this modulation. Our results show rapid neurophysiological indexes of prosodic communicative information processing that emerge only when pragmatic and lexico-semantic information is fully expressed. The early enhancement of question-related activity compared with statements was due to sources in the articulatory-motor region, which may reflect the richer action knowledge immanent to questions, namely the expectation of the partner's action of answering the question. The present findings demonstrate a neurophysiological correlate of prosodic communicative information processing, which enables humans to rapidly detect and understand speaker intentions in linguistic interactions.
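    A minimal sketch of the kind of ERP contrast implied above: mean amplitude in an early (~100 ms) window after final-word onset, questions versus statements. The epoch arrays, sampling rate, channel count and analysis window are illustrative assumptions, not the authors' parameters.

    # Hypothetical early-window ERP comparison (question vs. statement)
    import numpy as np
    from scipy import stats

    sfreq = 500.0                              # assumed sampling rate in Hz
    times = np.arange(-0.1, 0.5, 1 / sfreq)    # epoch time axis in seconds

    # epochs[condition][trial, channel, time] relative to final-word onset (toy data)
    rng = np.random.default_rng(1)
    epochs = {"question": rng.normal(size=(60, 64, times.size)),
              "statement": rng.normal(size=(60, 64, times.size))}

    window = (times >= 0.08) & (times <= 0.12)   # window around ~100 ms
    mean_amp = {cond: data[:, :, window].mean(axis=(1, 2))
                for cond, data in epochs.items()}

    t, p = stats.ttest_ind(mean_amp["question"], mean_amp["statement"])
    print(f"question vs. statement, 80-120 ms: t = {t:.2f}, p = {p:.3f}")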

    Electrocorticographic Activation Patterns of Electroencephalographic Microstates

    Electroencephalography (EEG) microstates are short successive periods of stable scalp field potentials representing spontaneous activation of brain resting-state networks. EEG microstates are assumed to mediate local activity patterns. To test this hypothesis, we correlated momentary global EEG microstate dynamics with the local temporo-spectral evolution of electrocorticography (ECoG) and stereotactic EEG (SEEG) depth electrode recordings. We hypothesized that these correlations involve the gamma band. We also hypothesized that the anatomical locations of these correlations would converge with those of previous studies using either combined functional magnetic resonance imaging (fMRI)-EEG or EEG source localization. We analyzed resting-state data (5 min) of simultaneous noninvasive scalp EEG and invasive ECoG and SEEG recordings of two participants. Data were recorded during the presurgical evaluation of pharmacoresistant epilepsy using subdural and intracranial electrodes. After standard preprocessing, we fitted a set of normative microstate template maps to the scalp EEG data. Using covariance mapping with EEG microstate timelines and ECoG/SEEG temporo-spectral evolutions as inputs, we identified systematic changes in the activation of ECoG/SEEG local field potentials in different frequency bands (theta, alpha, beta, and high-gamma) based on the presence of particular microstate classes. We found significant covariation of ECoG/SEEG spectral amplitudes with microstate timelines in all four frequency bands (p = 0.001, permutation test). The covariance patterns of the ECoG/SEEG electrodes during the different microstates of both participants were similar. To our knowledge, this is the first study to demonstrate distinct activation/deactivation patterns of frequency-domain ECoG local field potentials associated with simultaneous EEG microstates.
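    A minimal sketch of the covariance-mapping idea described above: relate a scalp-EEG microstate timeline (class label per sample) to the band-limited amplitude envelope of each ECoG/SEEG channel, and assess the result with a permutation test that shifts the timeline in time. All data, sizes and the shift-based permutation scheme are toy assumptions, not the authors' pipeline.

    # Hypothetical microstate-to-ECoG covariance mapping with a permutation test
    import numpy as np

    rng = np.random.default_rng(2)
    n_samples, n_classes, n_channels = 30000, 4, 32

    microstate_labels = rng.integers(0, n_classes, n_samples)   # fitted class per sample
    band_envelope = rng.normal(size=(n_channels, n_samples))    # e.g., high-gamma amplitude

    def covariance_map(labels, envelope):
        """Mean envelope per channel for each microstate class, z-scored across classes."""
        means = np.stack([envelope[:, labels == k].mean(axis=1) for k in range(n_classes)])
        return (means - means.mean(axis=0)) / means.std(axis=0)

    observed = covariance_map(microstate_labels, band_envelope)

    # Circularly shift the microstate timeline to break the temporal alignment
    # while preserving its autocorrelation structure.
    n_perm = 1000
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        shift = rng.integers(1, n_samples)
        null_map = covariance_map(np.roll(microstate_labels, shift), band_envelope)
        max_null[i] = np.abs(null_map).max()

    p_value = (np.abs(observed).max() <= max_null).mean()
    print(f"max |covariance| = {np.abs(observed).max():.2f}, permutation p = {p_value:.3f}")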

    Right ventral stream damage underlies both poststroke aprosodia and amusia

    Background and purpose: This study was undertaken to determine and compare lesion patterns and structural dysconnectivity underlying poststroke aprosodia and amusia, using a data-driven multimodal neuroimaging approach. Methods: Thirty-nine patients with right or left hemisphere stroke were enrolled in a cohort study and tested for linguistic and affective prosody perception and musical pitch and rhythm perception at the subacute and 3-month poststroke stages. Participants listened to words spoken with different prosodic stress that changed their meaning, and to words spoken with six different emotions, and chose which meaning or emotion was expressed. In the music tasks, participants judged pairs of short melodies as the same or different in terms of pitch or rhythm. Structural magnetic resonance imaging data were acquired at both stages, and machine learning-based lesion-symptom mapping and deterministic tractography were used to identify the lesion patterns and damaged white matter pathways giving rise to aprosodia and amusia. Results: Aprosodia and amusia were strongly correlated behaviorally and were associated with similar lesion patterns in right frontoinsular and striatal areas. In multiple regression models, reduced fractional anisotropy and lower tract volume of the right inferior fronto-occipital fasciculus were the strongest predictors of both disorders over time. Conclusions: These results highlight a common origin of aprosodia and amusia, both arising from damage and disconnection of the right ventral auditory stream, which integrates rhythmic-melodic acoustic information in prosody and music. Comorbidity of these disabilities may worsen the prognosis and affect rehabilitation success.
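    A minimal sketch of the multiple-regression step reported above: predicting a behavioural score from fractional anisotropy (FA) and tract volume of the right inferior fronto-occipital fasciculus. The variable names, value ranges and data are illustrative assumptions, not the study's data or code.

    # Hypothetical regression of a prosody score on right-IFOF FA and tract volume
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n_patients = 39
    fa = rng.uniform(0.2, 0.6, n_patients)              # right IFOF fractional anisotropy
    tract_volume = rng.uniform(5.0, 20.0, n_patients)   # right IFOF tract volume (ml)
    prosody_score = 0.5 * fa + 0.02 * tract_volume + rng.normal(0, 0.1, n_patients)

    X = sm.add_constant(np.column_stack([fa, tract_volume]))
    model = sm.OLS(prosody_score, X).fit()
    print(model.summary())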

    Modulation of neural activity in frontopolar cortex drives reward-based motor learning

    The frontopolar cortex (FPC) contributes to tracking the reward of alternative choices during decision making, as well as their reliability. Whether this FPC function extends to reward gradients associated with continuous movements during motor learning remains unknown. We used anodal transcranial direct current stimulation (tDCS) over the right FPC to investigate its role in reward-based motor learning. Nineteen healthy human participants practiced novel sequences of finger movements on a digital piano with corresponding auditory feedback. Their aim was to use trialwise reward feedback to discover a hidden performance goal along a continuous dimension: timing. We additionally modulated activity in the contralateral motor cortex (left M1) and included a sham control stimulation. Right FPC-tDCS led to faster learning than lM1-tDCS and sham through the regulation of motor variability. Bayesian computational modelling revealed that, in all stimulation protocols, an increase in the trialwise expectation of reward was followed by greater exploitation, as shown previously. Yet this association was weaker with lM1-tDCS, suggesting a less efficient learning strategy. The effects of frontopolar stimulation were dissociated from those induced by lM1-tDCS and sham, as motor exploration was more sensitive to inferred changes in the reward tendency (volatility). The findings suggest that rFPC-tDCS increases the sensitivity of motor exploration to updates in reward volatility, accelerating reward-based motor learning.
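    A deliberately simplified stand-in for the learning loop described above: a delta-rule estimate of expected reward controls how widely the learner explores the timing dimension on the next trial. This is not the authors' Bayesian model; the hidden target, learning rate and exploration rule are illustrative assumptions.

    # Toy reward-based timing learner (simplified; not the authors' model)
    import numpy as np

    rng = np.random.default_rng(4)
    hidden_target = 0.6          # hidden "correct" timing in seconds (assumed)
    alpha = 0.3                  # delta-rule learning rate (assumed)
    expected_reward = 0.0
    best_timing, best_reward = 0.3, 0.0
    timing = best_timing

    for trial in range(100):
        # Reward increases as the produced timing approaches the hidden target
        reward = np.exp(-((timing - hidden_target) ** 2) / 0.01)
        expected_reward += alpha * (reward - expected_reward)
        if reward > best_reward:
            best_timing, best_reward = timing, reward

        # Exploit when reward expectation is high, explore more when it is low
        exploration_sd = 0.15 * (1.0 - expected_reward) + 0.01
        timing = best_timing + rng.normal(0.0, exploration_sd)

    print(f"final timing = {timing:.3f} s, expected reward = {expected_reward:.2f}")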

    Cognitive Components of Regularity Processing in the Auditory Domain

    BACKGROUND: Music-syntactic irregularities often co-occur with physical irregularities, so their cognitive processing is easily confounded with sensory processing. In this study we constructed chord sequences such that differences in the cognitive processing of regular and irregular chords could not be due to the sensory processing of acoustic factors like pitch repetition or pitch commonality (the major component of 'sensory dissonance'). METHODOLOGY/PRINCIPAL FINDINGS: Two groups of subjects (musicians and nonmusicians) were investigated with electroencephalography (EEG). Irregular chords elicited an early right anterior negativity (ERAN) in the event-related brain potentials (ERPs). The ERAN had a latency of around 180 ms after the onset of the music-syntactically irregular chords and had maximum amplitude values over right anterior electrode sites. CONCLUSIONS/SIGNIFICANCE: Because the irregular chords were hardly detectable on the basis of acoustical factors (such as pitch repetition and sensory dissonance), this ERAN effect reflects for the most part cognitive (not sensory) components of regularity-based, music-syntactic processing. Our study represents a methodological advance over previous ERP studies investigating the neural processing of music-syntactically irregular chords.
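    A minimal sketch of how an ERAN effect like the one described above could be quantified: a difference wave (irregular minus regular chords) averaged over right anterior electrodes in a window around 180 ms. The channel names, window boundaries and data are illustrative assumptions.

    # Hypothetical ERAN quantification from condition-average ERPs
    import numpy as np

    rng = np.random.default_rng(5)
    sfreq = 250.0
    times = np.arange(-0.1, 0.6, 1 / sfreq)
    channels = ["F8", "FC6", "F4", "AF8"]          # assumed right anterior sites

    # erp[condition][channel, time]: subject-average ERPs time-locked to chord onset
    erp = {"regular": rng.normal(size=(len(channels), times.size)),
           "irregular": rng.normal(size=(len(channels), times.size))}

    difference = erp["irregular"] - erp["regular"]
    window = (times >= 0.15) & (times <= 0.21)      # window centred on ~180 ms
    eran_amplitude = difference[:, window].mean()
    print(f"mean ERAN amplitude (150-210 ms, right anterior): {eran_amplitude:.2f} µV")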