    Multisensory perception of looming and receding objects in human newborns

    When newborns leave the enclosed spatial environment of the uterus and arrive in the outside world, they are faced with a new audiovisual environment of dynamic objects, actions and events both close to themselves and further away. One particular challenge concerns matching and making sense of the visual and auditory cues specifying object motion [1-5]. Previous research shows that adults prioritise the integration of auditory and visual information indicating looming (for example [2]) and that rhesus monkeys can integrate multisensory looming, but not receding, audiovisual stimuli [4]. Despite the clear adaptive value of correctly perceiving motion towards or away from the self - for defence against and physical interaction with moving objects - such a perceptual ability would clearly be undermined if newborns were unable to correctly match the auditory and visual cues to such motion. This multisensory perceptual skill has scarcely been studied in human ontogeny. Here we report that newborns only a few hours old are sensitive to matches between changes in visual size and in auditory intensity. This early multisensory competence demonstrates that, rather than being entirely naïve to their new audiovisual environment, newborns can make sense of the multisensory cue combinations specifying motion with respect to themselves.

    Interpersonal representations of touch in somatosensory cortex are modulated by perspective

    Observing others being touched activates similar brain areas as those activated when one experiences a touch oneself. Event-related potential (ERP) studies have revealed that modulation of somatosensory components by observed touch occurs within 100 ms after stimulus onset, and such vicarious effects have been taken as evidence for empathy for others' tactile experiences. In previous studies, body parts have been presented from a first-person perspective. This raises the question of the extent to which somatosensory activation by observed touch to body parts depends on the perspective from which the body part is observed. In this study (N = 18), we examined the modulation of somatosensory ERPs by observed touch delivered to another person's hand when viewed as if from a first-person versus a third-person perspective. We found that vicarious touch effects primarily consist of two separable components in the early stages of somatosensory processing: an anatomical mapping for touch in first-person perspective at P45, and a specular (mirror-like) mapping for touch in third-person perspective at P100. This is consistent with suggestions that vicarious representations exist to support predictions for one's own bodily events, but also to enable predictions of a social or interpersonal kind, at distinct temporal stages.

    Cognitive control of sequential knowledge in 2-year-olds: evidence from an incidental sequence-learning and generation task

    Thirty-eight two-year-olds were trained under incidental instructions on a six-element deterministic sequence of spatial locations. Following training, participants were informed of the presence of a sequence and asked to either reproduce or suppress the learned material. Children's production of the trained sequence was modulated by these instructions. When asked to suppress the trained sequence, they were able to increase generation of paths that were not from the training sequence. Their performance was thus dependent on active suppression of knowledge rather than a random generation strategy. This degree of control in two-year-olds stands in stark contrast to 3-year-olds' failure to control explicitly instructed rule-based knowledge (as measured by the Dimensional Change Card Sort Task). We suggest that this is because the incidental nature of the learning enables the acquisition of a more procedural form of knowledge with which this age group has more experience prior to the onset of fluent language.

    Constructions of diagonal quartic and sextic surfaces with infinitely many rational points

    In this note we construct several infinite families of diagonal quartic surfaces \begin{equation*} ax^4+by^4+cz^4+dw^4=0, \end{equation*} where $a,b,c,d\in\mathbb{Z}\setminus\{0\}$, with infinitely many rational points and satisfying the condition $abcd\neq\square$. In particular, we present an infinite family of diagonal quartic surfaces defined over $\mathbb{Q}$ with Picard number equal to one and possessing infinitely many rational points. Further, we present some sextic surfaces of type $ax^6+by^6+cz^6+dw^i=0$, $i=2$, $3$, or $6$, with infinitely many rational points. Comment: revised version will appear in International Journal of Number Theory.

    Visual objects approaching the body modulate subsequent somatosensory processing at 4 months of age

    We asked whether, in the first year of life, the infant brain can support the dynamic crossmodal interactions between vision and somatosensation that are required to represent peripersonal space. Infants aged 4 (n = 20, 9 female) and 8 (n = 20, 10 female) months were presented with a visual object that moved towards their body or receded away from it. This was presented in the bottom half of the screen and not fixated upon by the infants, who were instead focusing on an attention getter at the top of the screen. The visual moving object then disappeared and was followed by a vibrotactile stimulus occurring later in time and in a different location in space (on their hands). The 4-month-olds’ somatosensory evoked potentials (SEPs) were enhanced when tactile stimuli were preceded by unattended approaching visual motion, demonstrating that the dynamic visual-somatosensory cortical interactions underpinning representations of the body and peripersonal space begin early in the first year of life. Within the 8-month-olds’ sample, SEPs were increasingly enhanced by (unexpected) tactile stimuli following receding visual motion as age in days increased, demonstrating changes in the neural underpinnings of the representations of peripersonal space across the first year of life.

    The role of hand size in body representation: a developmental investigation

    Knowledge of one’s own body size is a crucial facet of body representation, both for acting on the environment and perhaps also for constraining body ownership. However, representations of body size may be somewhat plastic, particularly to allow for physical growth in childhood. Here we report a developmental investigation into the role of hand size in body representation (the sense of body ownership, perception of hand position, and perception of own-hand size). Using the rubber hand illusion paradigm, this study used different fake hand sizes (60%, 80%, 100%, 120% or 140% of typical size) in three age groups (6- to 7-year-olds, 12- to 13-year-olds, and adults; N = 229). We found no evidence that hand size constrains ownership or position: participants embodied hands which were both larger and smaller than their own, and indeed judged their own hands to have changed size following the illusion. Children and adolescents embodied the fake hands more than adults, with a greater tendency to feel their own hand had changed size. Adolescents were particularly sensitive to multisensory information. In sum, we found substantial plasticity in the representation of own-body size, with partial support for the hypothesis that children have looser representations than adults.

    Sensitivity to Visual‐Tactile Colocation on the Body Prior to Skilled Reaching in Early Infancy

    Two experiments examined perceptual colocation of visual and tactile stimuli in young infants. Experiment 1 compared 4‐ (n = 15) and 6‐month‐old (n = 12) infants’ visual preferences for visual‐tactile stimulus pairs presented across the same or different feet. The 4‐ and 6‐month‐olds showed, respectively, preferences for colocated and noncolocated conditions, demonstrating sensitivity to visual‐tactile colocation on their feet. This extends previous findings of visual‐tactile perceptual colocation on the hands in older infants. Control conditions excluded the possibility that both 6‐ (Experiment 1) and 4‐month‐olds (Experiment 2, n = 12) perceived colocation on the basis of an undifferentiated supramodal coding of spatial distance between stimuli. Bimodal perception of visual‐tactile colocation is available by 4 months of age, that is, prior to the development of skilled reaching.

    The effect of spatial cues on infants’ responses in the AB task, with and without a hidden object.

    The errors made by infants in the AB task were taken by Piaget (1954) as an indication of an inability to update their representations of the spatial location of a hidden object. This paper presents an experiment designed to further investigate the role of spatial representations in the production of the error. The introduction of strong visual cues to spatial location was found to reduce the traditional A-not-B search error. However, it also increased perseveration when a ‘lids-only’ analogue of the AB task was used, in which infants are simply cued to pick up lids, rather than encouraged to search for a hidden object. These results present a challenge to the dynamic systems account of the error given by Smith, Thelen, Titzer, and McLin (1999), and indicate that the traditional A-not-B search error arises from a difficulty in updating representations of the spatial location of hidden objects. The relation of these results to Munakata’s (1998) PDP model, and to Thelen, Schöner, Scheier, and Smith’s (in press) most recent dynamic systems model of the A-not-B error, is also discussed.

    Cortical signatures of visual body representation develop in human infancy

    Human infants cannot report their experiences, limiting what we can learn about their bodily awareness. However, visual cortical responses to the body, linked to visual awareness and selective attention in adults, can be easily measured in infants and provide a promising marker of bodily awareness in early life. We presented 4- and 8-month-old infants with a flickering (7.5 Hz) video of a hand being stroked and recorded steady-state visual evoked potentials (SSVEPs). In half of the trials, the infants also received tactile stroking synchronously with visual stroking. The 8-month-old, but not the 4-month-old infants, showed a significant enhancement of SSVEP responses when they received tactile stimulation concurrent with the visually observed stroking. Follow-up experiments showed that this enhancement did not occur when the visual hand was presented in an incompatible posture with the infant's own body or when the visual stimulus was a body-irrelevant video. Our findings provide a novel insight into the development of bodily self-awareness in the first year of life.
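    The SSVEP measure described in the abstract above is typically quantified as spectral amplitude at the flicker frequency (here 7.5 Hz). The sketch below is purely illustrative and is not the authors' analysis pipeline; the sampling rate, trial duration, single-channel layout, and simulated data are all assumptions.

```python
# Illustrative sketch (not the authors' pipeline): quantify SSVEP amplitude
# at a 7.5 Hz flicker frequency from epoched single-channel EEG, comparing
# a visual-only condition against a visual+tactile condition.
# Assumed parameters: 500 Hz sampling rate, 2 s trials, 20 trials per condition.
import numpy as np

FS = 500.0          # assumed sampling rate (Hz)
FLICKER_HZ = 7.5    # flicker frequency reported in the abstract

def ssvep_amplitude(epochs, fs=FS, target_hz=FLICKER_HZ):
    """epochs: array of shape (n_trials, n_samples) for one channel.
    Returns the mean spectral amplitude at the flicker frequency."""
    n_samples = epochs.shape[1]
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    spectra = np.abs(np.fft.rfft(epochs, axis=1)) / n_samples
    bin_idx = np.argmin(np.abs(freqs - target_hz))   # nearest frequency bin
    return spectra[:, bin_idx].mean()

# Simulated example: trials containing a 7.5 Hz oscillation plus noise, with a
# larger oscillation standing in for the visual+tactile condition.
rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1.0 / FS)
visual_only = np.stack([np.sin(2 * np.pi * FLICKER_HZ * t) + rng.normal(0, 1, t.size)
                        for _ in range(20)])
visual_tactile = np.stack([1.5 * np.sin(2 * np.pi * FLICKER_HZ * t) + rng.normal(0, 1, t.size)
                           for _ in range(20)])
print(ssvep_amplitude(visual_only), ssvep_amplitude(visual_tactile))
```

    In practice, condition differences would be computed over occipital channels and tested statistically; this snippet only shows the frequency-domain quantification step.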