    Infants segment words from songs - an EEG study

    Children’s songs are omnipresent and highly attractive stimuli in infants’ input. Previous work suggests that infants process linguistic–phonetic information from simplified sung melodies. The present study investigated whether infants learn words from ecologically valid children’s songs. Testing 40 Dutch-learning 10-month-olds in a familiarization-then-test electroencephalography (EEG) paradigm, this study asked whether infants can segment repeated target words embedded in songs during familiarization and subsequently recognize those words in continuous speech in the test phase. To replicate previous speech work and compare segmentation across modalities, infants participated in both song and speech sessions. Results showed a positive event-related potential (ERP) familiarity effect to the final compared to the first target occurrences during both song and speech familiarization. No evidence was found for word recognition in the test phase following either song or speech. Comparisons across the stimuli of the present study and a comparable previous study suggested that acoustic prominence and speech rate may have contributed to the polarity of the ERP familiarity effect and to its absence in the test phase. Overall, the present study provides evidence that 10-month-old infants can segment words embedded in songs, and it raises questions about the acoustic and other factors that enable or hinder infant word segmentation from songs and speech.

    Different theta connectivity patterns underlie pleasantness evoked by familiar and unfamiliar music

    Music-evoked pleasantness has been extensively reported to be modulated by familiarity. Nevertheless, while the brain temporal dynamics underlying the process of assigning value to music are beginning to be understood, little is known about how familiarity might modulate the oscillatory activity associated with music-evoked pleasantness. The goal of the present experiment was to study the influence of familiarity on the relation between theta phase synchronization and music-evoked pleasantness. EEG was recorded from 22 healthy participants while they listened to both familiar and unfamiliar music and rated the experienced degree of evoked pleasantness. By exploring interactions, we found that right fronto-temporal theta synchronization was positively associated with music-evoked pleasantness when listening to unfamiliar music. In contrast, inter-hemispheric temporo-parietal theta synchronization was positively associated with music-evoked pleasantness when listening to familiar music. These results shed some light on the possible oscillatory mechanisms underlying fronto-temporal and temporo-parietal connectivity and their relationship with music-evoked pleasantness and familiarity.

    Rapid Brain Responses to Familiar vs. Unfamiliar Music – an EEG and Pupillometry Study

    Human listeners exhibit marked sensitivity to familiar music, perhaps most readily revealed by popular “name that tune” games, in which listeners often succeed in recognizing a familiar song based on an extremely brief presentation. In this work, we used electroencephalography (EEG) and pupillometry to reveal the temporal signatures of the brain processes that allow differentiation between a familiar, well-liked piece of music and an unfamiliar one. In contrast to previous work, which has quantified gradual changes in pupil diameter (the so-called “pupil dilation response”), here we focus on the occurrence of pupil dilation events. This approach is substantially more sensitive in the temporal domain and allowed us to tap early activity within the putative salience network. Participants (N = 10) passively listened to snippets (750 ms) of a familiar, personally relevant song and an acoustically matched, unfamiliar song, presented in random order. A group of control participants (N = 12), who were unfamiliar with all of the songs, was also tested. We reveal a rapid differentiation between snippets from familiar and unfamiliar songs: pupil responses showed a greater dilation rate to familiar music from 100–300 ms post-stimulus-onset, consistent with a faster activation of the autonomic salience network. Brain responses measured with EEG showed a later differentiation between familiar and unfamiliar music from 350 ms post onset. Remarkably, the cluster pattern identified in the EEG response is very similar to that commonly found in classic old/new memory retrieval paradigms, suggesting that the recognition of brief, randomly presented music snippets draws on similar processes.