111 research outputs found

    Multisensory perception of looming and receding objects in human newborns

    When newborns leave the enclosed spatial environment of the uterus and arrive in the outside world, they are faced with a new audiovisual environment of dynamic objects, actions and events, both close to themselves and further away. One particular challenge concerns matching and making sense of the visual and auditory cues specifying object motion [1-5]. Previous research shows that adults prioritise the integration of auditory and visual information indicating looming (for example [2]) and that rhesus monkeys can integrate multisensory looming, but not receding, audiovisual stimuli [4]. Despite the clear adaptive value of correctly perceiving motion towards or away from the self - for defence against and physical interaction with moving objects - such a perceptual ability would clearly be undermined if newborns were unable to correctly match the auditory and visual cues to such motion. This multisensory perceptual skill has scarcely been studied in human ontogeny. Here we report that newborns only a few hours old are sensitive to matches between changes in visual size and changes in auditory intensity. This early multisensory competence demonstrates that, rather than being entirely naïve to their new audiovisual environment, newborns can make sense of the multisensory cue combinations specifying motion with respect to themselves.

    Direct gaze modulates face recognition in young infants

    From birth, infants prefer to look at faces that engage them in direct eye contact. In adults, direct gaze is known to modulate the processing of faces, including the recognition of individuals. In the present study, we investigate whether direction of gaze has any effect on face recognition in four-month-old infants. Four-month-old infants were shown faces with both direct and averted gaze, and were subsequently given a preference test involving the same face and a novel one. A novelty preference during the test was found only following initial exposure to a face with direct gaze. Further, face recognition was generally enhanced for faces with both direct and averted gaze when infants started the task with the direct-gaze condition. Together, these results indicate that gaze direction modulates face recognition in early infancy.

    Perception of visual and audiovisual trajectories toward and away from the body in the first postnatal year

    Perceiving motion in depth is important in everyday life, especially motion in relation to the body. Visual and auditory cues inform us about motion in space when presented in isolation from each other, but the most comprehensive information is obtained through the combination of both of these cues. We traced the development of infants’ ability to discriminate between visual motion trajectories across peripersonal space and to match these with auditory cues specifying the same peripersonal motion. We measured 5-month-old (n = 20) and 9-month-old (n = 20) infants’ visual preferences for visual motion toward or away from their body (presented simultaneously and side by side) across three conditions: (a) visual displays presented alone, (b) paired with a sound increasing in intensity, and (c) paired with a sound decreasing in intensity. Both groups preferred approaching motion in the visual-only condition. When the visual displays were paired with a sound increasing in intensity, neither group showed a visual preference. When a sound decreasing in intensity was played instead, the 5-month-olds preferred the receding (spatiotemporally congruent) visual stimulus, whereas the 9-month-olds preferred the approaching (spatiotemporally incongruent) visual stimulus. We speculate that in the approaching-sound condition, the behavioral salience of the sound could have led infants to focus on the auditory information alone, in order to prepare a motor response, and to neglect the visual stimuli. In the receding-sound condition, by contrast, the difference in response patterns between the two groups may have been driven by infants’ emerging motor abilities and their developing predictive-processing mechanisms, which support and influence each other.

    Evidence for Referential Expectation in Four-Month-Old Infants

    Infants’ ability to selectively attend to human speech and to process it in a unique way has been widely reported in the past. However, in order to successfully acquire language, one should also understand that speech is a referential symbol system, and that words can stand for other entities in the world. While there has been some evidence showing that young infants can make inferences about the communicative intentions of a speaker, whether they would also appreciate the direct relation between a specific word and its referent is still unknown. In the present study we tested four-month-old infants to see whether they would expect to find a referent when they hear human speech. Our results showed that, compared to other auditory stimuli or to silence, when infants were listening to speech they were more prepared to find visual referents of the words, but only if the speaker also provided additional referential cues. Thus, our study is the first to report evidence that infants at a very young age already appreciate the symbolic nature of language and understand the referential relation between auditory words and physical objects, even if they do not yet have any knowledge of the meanings of words.

    Can you see what I am talking about? Human speech triggers referential expectation in four-month-old infants

    Infants’ ability to selectively attend to human speech and to process it in a unique way has been widely reported in the past. However, in order to successfully acquire language, one should also understand that speech is referential, and that words can stand for other entities in the world. While there has been some evidence showing that young infants can make inferences about the communicative intentions of a speaker, whether they would also appreciate the direct relationship between a specific word and its referent is still unknown. In the present study we tested four-month-old infants to see whether they would expect to find a referent when they hear human speech. Our results showed that, compared to other auditory stimuli or to silence, when infants were listening to speech they were more prepared to find visual referents of the words, as signalled by their faster orienting towards the visual objects. Hence, our study is the first to report evidence that infants at a very young age already understand the referential relationship between auditory words and physical objects, thus showing a precursor of the ability to appreciate the symbolic nature of language, even if they do not yet understand the meanings of words.

    Identifying peripersonal space boundaries in newborns

    Peripersonal space immediately surrounds the body and can be represented in the brain as a multisensory and sensorimotor interface mediating physical and social interactions between body and environment. Very little consideration has been given to the ontogeny of peripersonal spatial representations in early postnatal life, despite the crucial roles of peripersonal space and its adaptive relevance as the space where infants’ earliest interactions take place. Here, we investigated whether peripersonal space could be considered a delimited portion of space with defined boundaries soon after birth. Our findings showed for the first time that newborns’ saccadic reaction times to a tactile stimulus presented simultaneously with sounds of different intensities changed based on the sound intensity. In particular, reaction times were significantly faster when the sound was louder than a critical intensity, in a pattern that closely resembled that shown by adults. Given that sound intensity on its own can cue newborns’ perception of sound distance, this critical intensity plausibly corresponds to a critical distance, which we speculate could be considered the boundary of newborns’ rudimentary peripersonal space. Altogether, our findings suggest that soon after birth peripersonal space may already be considered a bounded portion of space, perhaps instrumental in driving newborns’ attention towards events and people within it.

    Exposure to linguistic labels during childhood modulates the neural architecture of race categorical perception

    Perceptually categorizing a face by its racial belonging may have important consequences for interacting with people. However, race categorical perception (CP), and its developmental pathway, have scarcely been investigated. In this study, we tested the neurolinguistic rewiring hypothesis, which states that language acquisition modulates the brain's processing of social perceptual categories. Accordingly, we investigated the electrophysiological correlates of race CP in a group of adults and a group of children between 3 and 5 years of age. For both groups we found a greater modulation of the N400 connected with the processing of between-category boundaries (i.e., faces belonging to different race groups) than within-category boundaries (i.e., different faces belonging to the same race group). This effect was the same in adults and children, as shown by the comparable between-group amplitude of the differential wave (DW) elicited by the between-category faces. Remarkably, in children this effect was positively correlated with the acquisition of racial labels, but not with age. Finally, brain source analysis revealed the activation of a more modularized cortical network in adults than in children, with unique activation of the left superior temporal gyrus (STG) and the inferior frontal gyrus (IFG), areas connected to language processing. These are the first results to account for an effect of language in rewiring brain connectedness during the processing of racial categories.

    Synchrony of Caresses: Does Affective Touch Help Infants to Detect Body-Related Visual–Tactile Synchrony?

    Bodily self-awareness, that is, the ability to sense and recognize our body as our own, involves the encoding and integration of a wide range of multisensory and motor signals. Infants’ abilities to detect synchrony and to bind together sensory information in time and space contribute critically to the gradual development of bodily self-awareness. In particular, early tactile experiences may have a crucial role in promoting self-other differentiation and developing bodily self-awareness. More specifically, affective touch, the slow and gentle touch linked to the neurophysiologically specialized system of C-tactile (CT) afferents, provides information about the body both from within (interoception) and from outside (exteroception), suggesting it may be a key component contributing to the experience of bodily self-awareness. The present study aimed to investigate the role of affective touch in the formation and modulation of body perception from the earliest stages of life. Using a preferential looking task, 5-month-old infants were presented with synchronous and asynchronous visual–tactile body-related stimuli. The socio-affective valence of the tactile stimuli was manipulated by means of velocity [CT-optimal (slow) touch vs. CT-suboptimal (fast) touch] and the source of touch (human hand vs. brush). For the first time, we show that only infants who were stroked with a brush at slow velocity displayed a preference for the visual–tactile synchronous video, suggesting that CT-optimal touch might help infants to detect body-related visual–tactile synchrony independently of the source of touch. Our results are in line with findings from adults and indicate that affective touch might have a critical role in the early development of bodily self-awareness.

    Sensorimotor Research Utilising Immersive Virtual Reality: A Pilot Study with Children and Adults with Autism Spectrum Disorders

    When learning and interacting with the world, people with Autism Spectrum Disorders (ASD) show compromised use of vision and enhanced reliance on body-based information. As this atypical profile is associated with motor and social difficulties, interventions could aim to reduce the potentially isolating reliance on the body and foster the use of visual information. To this end, head-mounted displays (HMDs) have unique features that enable the design of Immersive Virtual Realities (IVR) for manipulating and training sensorimotor processing. The present study assesses feasibility and offers some early insights from a new paradigm for exploring how children and adults with ASD interact with Reality and IVR when vision and proprioception are manipulated. Seven participants (five adults, two children) performed a self-turn task in two environments (Reality and IVR) for each of three sensory conditions (Only Proprioception, Only Vision, Vision + Proprioception) in a purpose-designed testing room and an HMD-simulated environment. The pilot indicates good feasibility of the paradigm. Preliminary data visualisation suggests the importance of considering inter-individual variability. The participants in this study who performed worse with Only Vision and better with Only Proprioception seemed to benefit from the use of IVR. Those who performed better with Only Vision and worse with Only Proprioception seemed to benefit from Reality. Therefore, we invite researchers and clinicians to consider that IVR may facilitate or impair individuals depending on their profiles.

    Newborns Are Sensitive to Impending Collision Within Their Peripersonal Space

    Immediately after birth, newborns are introduced into a highly stimulating environment, where many objects move close to them. It would therefore be adaptive for infants to pay more attention to objects that move towards them - on a colliding pathway - and that could therefore come into contact and interact with them. The present study aimed at understanding whether newborns are able to discriminate between colliding and non-colliding trajectories. To address this issue, we measured the looking behaviour of newborns who were presented with videos of different pairings of three events: approaching objects on a colliding course, approaching objects on a non-colliding trajectory, and receding objects. Results showed that newborns preferred looking at the approaching and colliding movement over both the receding and the approaching but non-colliding movements. Data also suggest the possible occurrence of a configural effect when two colliding events are displayed simultaneously. Furthermore, newborns appeared to look longer at movements directed towards their peripersonal space than at those directed away from it.