
    Detecting violations of temporal regularities in waking and sleeping two-month-old infants

    Correctly processing rapid sequences of sounds is essential for developmental milestones, such as language acquisition. We investigated the sensitivity of two-month-old infants to violations of a temporal regularity by recording event-related brain potentials (ERPs) in an auditory oddball paradigm from 36 waking and 40 sleeping infants. Standard tones were presented at a regular 300 ms inter-stimulus interval (ISI). One deviant, otherwise identical to the standard, was preceded by a 100 ms ISI. Two other deviants, presented with the standard ISI, differed from the standard in their spectral makeup. We found significant differences between ERP responses elicited by the standard and each of the deviant sounds. The results suggest that the ability to extract both temporal and spectral regularities from a sound sequence is already functional within the first few months of life. The scalp distribution of all three deviant-stimulus responses was influenced by the infants' state of alertness.
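
    The paradigm above is essentially a stimulus schedule: frequent standards at a fixed 300 ms ISI, one deviant that only shortens the preceding ISI to 100 ms, and two deviants that keep the standard ISI but change the spectral content of the tone. The Python sketch below shows how such a sequence could be generated. The tone frequencies, deviant probabilities, sequence length, the no-consecutive-deviants rule, and all function and variable names are illustrative assumptions, not details taken from the study.

import random

# Sketch of an oddball schedule like the one described above: standards at a
# regular 300 ms inter-stimulus interval (ISI), a temporal deviant preceded by
# a 100 ms ISI, and two spectral deviants presented at the standard ISI.
# Tone frequencies, deviant probabilities, sequence length, and the rule that
# deviants never follow each other are illustrative assumptions.

STANDARD_ISI_MS = 300          # reported in the abstract
TEMPORAL_DEVIANT_ISI_MS = 100  # reported in the abstract

STIMULI = {
    "standard":           {"freq_hz": 500,  "isi_ms": STANDARD_ISI_MS},
    "temporal_deviant":   {"freq_hz": 500,  "isi_ms": TEMPORAL_DEVIANT_ISI_MS},  # same tone, shorter ISI
    "spectral_deviant_1": {"freq_hz": 750,  "isi_ms": STANDARD_ISI_MS},          # assumed frequency
    "spectral_deviant_2": {"freq_hz": 1000, "isi_ms": STANDARD_ISI_MS},          # assumed frequency
}

def make_sequence(n_trials=600, p_deviant=0.30, seed=0):
    """Build a trial list in which a proportion of trials are deviants,
    split evenly across the three deviant types, and no deviant directly
    follows another deviant."""
    rng = random.Random(seed)
    deviants = ["temporal_deviant", "spectral_deviant_1", "spectral_deviant_2"]
    sequence, previous = [], "standard"
    for _ in range(n_trials):
        if previous == "standard" and rng.random() < p_deviant:
            label = rng.choice(deviants)
        else:
            label = "standard"
        sequence.append({"label": label, **STIMULI[label]})
        previous = label
    return sequence

if __name__ == "__main__":
    counts = {}
    for trial in make_sequence():
        counts[trial["label"]] = counts.get(trial["label"], 0) + 1
    print(counts)  # how many standards vs. each deviant type were scheduled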

    Multimodal processing of emotional information in 9-month-old infants II: prenatal exposure to maternal anxiety

    The ability to read emotional expressions from the human face and voice is an important skill in our day-to-day interactions with others. How this ability develops may be influenced by atypical experiences early in life. Here, we investigated multimodal processing of fearful and happy face/voice pairs in 9-month-olds prenatally exposed to maternal anxiety, using event-related potentials (ERPs). Infants were presented with emotional vocalisations (happy/fearful) preceded by emotional facial expressions (happy/fearful). The results revealed larger P350 amplitudes in response to fearful vocalisations when infants had been exposed to higher levels of anxiety, regardless of the type of visual prime, which may indicate increased attention to fearful vocalisations. A trend for a positive association between P150 amplitudes and maternal anxiety scores during pregnancy may suggest that these infants are also more easily aroused by, and extract features more thoroughly from, fearful vocalisations. These findings are compatible with the hypothesis that prenatal exposure to maternal anxiety is related to more extensive processing of fear-related stimuli.

    Multimodal processing of emotional information in 9-month-old infants I: emotional faces and voices

    Making sense of emotions manifesting in the human voice is an important social skill which is influenced by emotions in other modalities, such as those of the corresponding face. Although the simultaneous processing of emotional information from voices and faces has been studied in adults, little is known about the neural mechanisms underlying the development of this ability in infancy. Here we investigated multimodal processing of fearful and happy face/voice pairs using event-related potential (ERP) measures in a group of 84 9-month-olds. Infants were presented with emotional vocalisations (fearful/happy) preceded by the same or a different facial expression (fearful/happy). The ERP data revealed that the processing of emotional information appearing in the human voice was modulated by the emotional expression appearing on the corresponding face: infants responded with larger auditory ERPs after fearful compared to happy facial primes. This finding suggests that infants dedicate more processing capacities to potentially threatening than to non-threatening stimuli.
    Keywords: Infant, Multimodal processing, Audiovisual processing, Event-related potential (ERP), Emotion perception, Affective priming
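
    Both studies in this pair use the same audiovisual priming design: a facial-expression prime (happy or fearful) is followed by an emotional vocalisation (happy or fearful), crossing prime and voice emotion in a 2 x 2 layout with congruent and incongruent pairs. A minimal Python sketch of such a trial list is given below; the trial counts, timing values, and all field and function names are illustrative assumptions rather than parameters reported in the papers.

import itertools
import random

# Sketch of the face/voice priming design described above: each trial pairs a
# facial-expression prime (happy or fearful) with an emotional vocalisation
# (happy or fearful), giving congruent and incongruent combinations.
# Trial counts, timing values, and field names are illustrative assumptions.

EMOTIONS = ("happy", "fearful")

def make_trials(n_per_cell=20, seed=0):
    """Return a shuffled trial list with n_per_cell trials per prime/voice cell."""
    rng = random.Random(seed)
    trials = []
    for face, voice in itertools.product(EMOTIONS, EMOTIONS):
        for _ in range(n_per_cell):
            trials.append({
                "face_prime": face,
                "vocalisation": voice,
                "congruent": face == voice,
                "prime_duration_ms": 1000,  # assumed prime duration
                "soa_ms": 1000,             # assumed prime-to-voice onset asynchrony
            })
    rng.shuffle(trials)
    return trials

if __name__ == "__main__":
    trials = make_trials()
    # The key ERP comparison in both abstracts: auditory responses after
    # fearful versus happy facial primes.
    n_fearful = sum(t["face_prime"] == "fearful" for t in trials)
    n_happy = sum(t["face_prime"] == "happy" for t in trials)
    print(n_fearful, n_happy)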