
    Head and eyes: Looking behavior in 12- to 24-month-old infants

    This study demonstrates evidence for a foundational process underlying active vision in older infants during object play. Using head-mounted eye-tracking and motion capture, looks to an object are shown to be tightly linked to and synchronous with a stilled head, regardless of the duration of gaze, for infants 12 to 24 months of age. Although this is a developmental period of rapid and marked changes in motor abilities, the dynamic coordination of head stabilization and sustained gaze to a visual target is developmentally invariant across the examined age range. The findings indicate that looking with an aligned head and eyes is a fundamental property of human vision and highlight the importance of studying looking behavior in freely moving perceivers in everyday contexts, opening new questions about the role of body movement in both typical and atypical development of visual attention.
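    The head-gaze synchrony described above can be sketched as a simple analysis. The code below is an illustrative assumption of one way to flag stilled-head intervals in a head angular-speed trace and test whether a look's onset falls inside one; the threshold and toy trace are hypothetical, not taken from the study.

```python
# Hypothetical sketch: flag "stilled head" samples from head angular speed
# and check whether a look's onset falls inside a stilled interval.
# The threshold and sample data are illustrative assumptions.

STILL_THRESHOLD = 5.0   # deg/s below which the head counts as stilled (assumed)

def stilled_intervals(angular_speed, threshold=STILL_THRESHOLD):
    """Return (start, end) index pairs where speed stays below threshold."""
    intervals, start = [], None
    for i, s in enumerate(angular_speed):
        if s < threshold and start is None:
            start = i
        elif s >= threshold and start is not None:
            intervals.append((start, i))
            start = None
    if start is not None:
        intervals.append((start, len(angular_speed)))
    return intervals

def look_is_synchronous(look_onset, intervals):
    """True if the look's onset index lies inside any stilled interval."""
    return any(a <= look_onset < b for a, b in intervals)

# Toy head-speed trace (deg/s): moving, then stilled, then moving again.
speed = [20, 18, 12, 4, 3, 2, 3, 15, 22]
ivals = stilled_intervals(speed)
print(ivals)                          # [(3, 7)]
print(look_is_synchronous(4, ivals))  # True
```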

    An open-source, wireless vest for measuring autonomic function in infants

    Infant behavior, like all behavior, is the aggregate product of many nested processes operating and interacting over multiple time scales; it is the result of a tangle of inter-related causes and effects. Identifying the mechanisms supporting infant behavior requires the development and advancement of new technologies that can accurately and densely capture behavior's multiple branches. The present study describes an open-source, wireless autonomic vest designed for use in infants 8-24 months of age to measure cardiac activity, respiration, and movement. The schematics of the vest, instructions for its construction, and a suite of software designed for its use are made freely available. While such autonomic measures have many applications across the field of developmental psychology, the present article presents evidence for the validity of the vest in three ways: (1) by demonstrating known clinical landmarks of a heartbeat, (2) by demonstrating an infant in a period of sustained attention, a well-documented behavior in the developmental psychology literature, and (3) by relating changes in accelerometer output to infant behavior.
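    A standard heartbeat landmark of the kind validated in point (1) is the R-peak of the ECG waveform. The sketch below is a minimal, assumed illustration of locating R-peaks and deriving heart rate from a cardiac trace like the one the vest records; the sampling rate, threshold, and synthetic signal are hypothetical, not the vest's actual pipeline.

```python
# Hypothetical sketch of locating R-peaks (a standard heartbeat landmark)
# in an ECG trace and computing heart rate from inter-beat intervals.
# Sampling rate, threshold, and refractory period are illustrative assumptions.

FS = 250          # samples per second (assumed)
REFRACTORY = 50   # min samples between peaks (~200 ms at 250 Hz)

def detect_r_peaks(ecg, threshold=0.5, refractory=REFRACTORY):
    """Return indices of local maxima above threshold,
    separated by at least `refractory` samples."""
    peaks = []
    for i in range(1, len(ecg) - 1):
        if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            if not peaks or i - peaks[-1] >= refractory:
                peaks.append(i)
    return peaks

def heart_rate_bpm(peaks, fs=FS):
    """Mean heart rate from inter-beat intervals, or None if too few beats."""
    if len(peaks) < 2:
        return None
    ibis = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(ibis) / len(ibis))

# Synthetic trace: flat baseline with three spikes 0.5 s apart.
ecg = [0.0] * 400
for p in (100, 225, 350):
    ecg[p] = 1.0
peaks = detect_r_peaks(ecg)
print(peaks, heart_rate_bpm(peaks))   # [100, 225, 350] 120.0
```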

    Developmentally Changing Attractor Dynamics of Manual Actions with Objects in Late Infancy

    Human infants interact with the environment through a growing and changing body, and their manual actions provide new opportunities for exploration and learning. In the current study, a dynamical systems approach was used to quantify and characterize the early motor development of limb effectors during bouts of manual activity. Many contemporary theories of motor development emphasize sources of order in movement over developmental time. However, little is known about the dynamics of manual actions during the first two years of life, a period of development with dramatic anatomical changes that create new opportunities for action. Here, we introduce a novel analytical protocol for estimating properties of attractor regions using motion capture. We apply this analysis to a longitudinal corpus of manual actions during sessions of toy play across the first two years of life. Our results suggest that the size of attractor regions for manual actions increases across development and that infants spend more time inside the attractor region of their movements during bouts of manual actions with objects. The sources of order in manual actions are discussed in terms of changing attractor dynamics across development.
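    One simplified way to make "attractor region" concrete from motion-capture data is sketched below: define the region as an axis-aligned box spanning the mean +/- k standard deviations of hand position, then measure its size and the fraction of samples falling inside it. This is an illustrative assumption for exposition, not the paper's actual estimation protocol.

```python
# Hypothetical sketch: summarize an "attractor region" from 2-D hand
# positions as a mean +/- k*SD box, its area, and time-inside fraction.
# This simplification is assumed for illustration, not taken from the study.

def mean_sd(values):
    """Population mean and standard deviation of a list of numbers."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return m, var ** 0.5

def attractor_region(xs, ys, k=2.0):
    """Axis-aligned box (x0, x1, y0, y1) spanning mean +/- k*SD per axis."""
    mx, sx = mean_sd(xs)
    my, sy = mean_sd(ys)
    return (mx - k * sx, mx + k * sx, my - k * sy, my + k * sy)

def region_area(box):
    """Area of the box as a proxy for attractor-region size."""
    x0, x1, y0, y1 = box
    return (x1 - x0) * (y1 - y0)

def fraction_inside(xs, ys, box):
    """Fraction of samples (time) spent inside the box."""
    x0, x1, y0, y1 = box
    inside = sum(1 for x, y in zip(xs, ys) if x0 <= x <= x1 and y0 <= y <= y1)
    return inside / len(xs)
```

    With such summaries computed per play session, region size and time-inside fraction could be compared across ages, as the abstract's developmental claims describe.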

    Eye-gaze and arrow cues influence elementary sound perception

    We report a novel effect in which the visual perception of eye-gaze and arrow cues changes the way we perceive sound. In our experiments, subjects first saw an arrow or a gazing face and then heard a brief sound originating from one of six locations. Perceived sound origins were shifted in the direction indicated by the arrows or eye-gaze. This perceptual shift was equivalent for arrows and gazing faces and was unaffected by facial expression, consistent with a generic, supramodal attentional influence by exogenous cues.

    Data from: Vocal and locomotor coordination develops in association with the autonomic nervous system

    In adult animals, movement and vocalizations are coordinated, sometimes facilitating and at other times inhibiting each other. What is missing is an account of how these different domains of motor control become coordinated over the course of development. We investigated how postural-locomotor behaviors may influence vocal development, and the role played by physiological arousal during their interactions. Using infant marmoset monkeys, we densely sampled vocal, postural, and locomotor behaviors and estimated arousal fluctuations from electrocardiographic measures of heart rate. We found that vocalizations matured sooner than postural and locomotor skills, and that vocal-locomotor coordination improved with age and during elevated arousal levels. These results suggest that postural-locomotor maturity is not required for vocal development, and that infants gradually improve coordination between vocalizations and body movement through a process that may be facilitated by changes in arousal level.

    A View of Their Own: Capturing the Egocentric View of Infants and Toddlers with Head-Mounted Cameras

    Infants and toddlers view the world, at a basic sensory level, in a fundamentally different way from their parents. This is largely due to biological constraints: infants have different body proportions than their parents, and their ability to control their own head movements is less developed. Such constraints limit the visual input available to them. This protocol aims to provide guiding principles for researchers using head-mounted cameras to understand the changing visual input experienced by the developing infant. Successful use of this protocol will allow researchers to design and execute studies of the developing child's visual environment in the home or laboratory. From this method, researchers can compile an aggregate view of all the possible items in a child's field of view, although the method does not directly measure exactly what the child is looking at. By combining this approach with machine learning, computer vision algorithms, and hand-coding, researchers can produce a high-density dataset that illustrates the changing visual ecology of the developing infant.
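    The "aggregate view" step can be sketched concretely: given per-frame object labels, whether from hand-coding or a computer-vision detector, tally how often each item appears in the child's field of view. The function and toy frames below are illustrative assumptions, not part of the published protocol.

```python
# Hypothetical sketch: build an aggregate view of a child's visual field
# by tallying, per object label, the fraction of frames containing it.
# The labels and frames are illustrative, not real head-camera data.

from collections import Counter

def aggregate_view(frames):
    """Map each label to the fraction of frames in which it appears."""
    counts = Counter()
    for labels in frames:
        for label in set(labels):   # count each item at most once per frame
            counts[label] += 1
    n = len(frames)
    return {label: c / n for label, c in counts.items()}

frames = [
    ["cup", "hand"],
    ["cup", "toy", "hand"],
    ["toy"],
    ["cup"],
]
print(aggregate_view(frames))
# {'cup': 0.75, 'hand': 0.5, 'toy': 0.5} (key order may vary)
```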