
    A Melodic Contour Repeatedly Experienced by Human Near-Term Fetuses Elicits a Profound Cardiac Reaction One Month after Birth

    Human hearing develops progressively during the last trimester of gestation. Near-term fetuses can discriminate acoustic features, such as frequencies and spectra, and process complex auditory streams. Fetal and neonatal studies show that they can remember frequently recurring sounds. However, existing data can only show retention intervals of up to several days after birth. Here we show that auditory memories can last at least six weeks. Experimental fetuses were given precisely controlled exposure to a descending piano melody twice daily during the 35th, 36th, and 37th weeks of gestation. Six weeks later we assessed the cardiac responses of 25 exposed infants and 25 naive control infants, while in quiet sleep, to the descending melody and to an ascending control piano melody. The melodies had precisely inverse contours but similar spectra and identical duration, tempo, and rhythm, and thus almost identical amplitude envelopes. All infants displayed a significant heart rate change. In exposed infants, the descending melody evoked a cardiac deceleration that was twice as large as the decelerations elicited by the ascending melody and by both melodies in control infants. Thus, three weeks of prenatal exposure to a specific melodic contour affects infants' auditory processing or perception, i.e., impacts the autonomic nervous system, at least six weeks later, when infants are one month old. Our results extend the retention interval over which a prenatally acquired memory of a specific sound stream can be observed from 3-4 days to six weeks. The long-term memory for the descending melody is interpreted in terms of enduring neurophysiological tuning, and its significance for the developmental psychobiology of attention and perception, including early speech perception, is discussed.

    Action–effect anticipation in infant action control

    There is increasing evidence that action effects play a crucial role in action understanding and action control, not only in adults but also in infants. Most research in infants has focused on the learning of action–effect contingencies or on how action effects help infants infer the goals of other persons' actions. In contrast, the present research aimed to demonstrate that infants control their own actions via action–effect anticipation once they know about specific action–effect relations. Seven- and 9-month-olds observed an experimenter demonstrating two actions that differed in their action–effect assignment: either a red-button press, a blue-button press, or no button press elicited interesting acoustic and visual effects. The 9-month-olds produced the effect action first, with shorter latency and longer duration, supporting a direct impact of action–effect anticipation on action control. In 7-month-olds, the differences due to the action–effect manipulation were less pronounced, indicating developmental change at this age.

    Language experience impacts brain activation for spoken and signed language in infancy: Insights from unimodal and bimodal bilinguals

    Recent neuroimaging studies suggest that monolingual infants activate a left-lateralised fronto-temporal brain network in response to spoken language, similar to the network involved in processing spoken and signed language in adulthood. However, it is unclear how brain activation to language is influenced by early experience in infancy. To address this question, we present functional near-infrared spectroscopy (fNIRS) data from 60 hearing infants (4 to 8 months old): 19 monolingual infants exposed to English, 20 unimodal bilingual infants exposed to two spoken languages, and 21 bimodal bilingual infants exposed to English and British Sign Language (BSL). Across all infants, spoken language elicited activation in a bilateral brain network including the inferior frontal and posterior temporal areas, while sign language elicited activation in the right temporo-parietal area. A significant difference in brain lateralisation was observed between groups. Activation in the posterior temporal region was not lateralised in monolinguals and bimodal bilinguals, but was right-lateralised in response to both language modalities in unimodal bilinguals. This suggests that experience of two spoken languages influences brain activation for sign language when it is experienced for the first time. Multivariate pattern analyses (MVPA) could classify distributed patterns of activation within the left hemisphere for spoken and signed language in monolinguals (proportion correct = 0.68; p = 0.039) but not in unimodal or bimodal bilinguals. These results suggest that bilingual experience in infancy influences brain activation for language, and that unimodal bilingual experience has a greater impact on early brain lateralisation than bimodal bilingual experience.

    Neonates’ responses to repeated exposure to a still face

    The main aims of the study were to examine whether human neonates' responses to communication disturbance, modelled by the still-face paradigm, were stable, and whether their responses were affected by previous experience with the paradigm. The still-face procedure, as a laboratory model of interpersonal stress, was administered twice to 84 neonates (0- to 4-day-olds), with an average delay of 1.25 days between administrations. Frame-by-frame analysis of the frequency and duration of gaze, distressed facial expressions, crying, sleeping, and sucking behaviours showed that the procedure was stressful both times; that is, the still-face effect was stable after repeated administration, and newborns consistently responded to this nonverbal violation of communication. They averted their gaze, showed distress, and cried more during the still-face phase in both the first and the second administrations. They also showed a carry-over effect, in that they continued to avert their gaze and displayed increased distress and crying in the first reunion period; however, their gaze behaviour changed with experience in the second administration. Whereas in the first administration the babies continued averting their gaze even after the stressful still-face phase was over, this carry-over effect disappeared in the second administration, and the babies significantly increased their gaze following the still-face phase. After excluding explanations based on fatigue, habituation, and random effects, a self–other regulatory model is discussed as a possible explanation for this pattern.

    The evolution of language: a comparative review

    For many years the evolution of language has been seen as a disreputable topic, mired in fanciful "just so stories" about language origins. However, in the last decade a new synthesis of modern linguistics, cognitive neuroscience, and neo-Darwinian evolutionary theory has begun to make important contributions to our understanding of the biology and evolution of language. I review some of this recent progress, focusing on the value of the comparative method, which uses data from animal species to draw inferences about language evolution. Discussing speech first, I show how data concerning a wide variety of species, from monkeys to birds, can increase our understanding of the anatomical and neural mechanisms underlying human spoken language, and how bird and whale song provide insights into the ultimate evolutionary function of language. I discuss the 'descended larynx' of humans, a peculiar adaptation for speech that has received much attention in the past and which, despite earlier claims, is not uniquely human. I then turn to the neural mechanisms underlying spoken language, pointing out the difficulties animals apparently experience in perceiving hierarchical structure in sounds, and stressing the importance of vocal imitation in the evolution of a spoken language. Turning to ultimate function, I suggest that communication among kin (especially between parents and offspring) played a crucial but neglected role in driving language evolution. Finally, I briefly discuss phylogeny, considering hypotheses that offer plausible routes to human language from a non-linguistic chimp-like ancestor. I conclude that comparative data from living animals will be key to developing a richer, more interdisciplinary understanding of our most distinctively human trait: language.