    Maternal Responses to Communication of Infants at Low and Heightened Risk for Autism

    The current study investigated maternal responses to infant communication among mothers of infants at heightened risk (HR) for autism spectrum disorder (ASD) and mothers of infants at low risk (LR) for the disorder at 13 and 18 months. Infants and mothers were observed during naturalistic in-home interactions and semi-structured play. By 18 months, HR infants demonstrated delays in developmentally advanced communicative behaviors (pointing, showing, words) compared to LR infants. Overall, mothers of HR infants responded as frequently as mothers of LR infants. However, from 13 to 18 months, mothers of LR infants increased their responsiveness to non-word vocalizations while mothers of HR infants did not. In addition, mothers of both HR and LR infants were more likely to label the referent of developmentally advanced gestures (pointing/showing) than of earlier-emerging gestures (giving/requesting). These findings suggest that mothers may provide richer responses to more developmentally advanced communication. Thus, delays in infant gesture and speech could alter the input infants receive, with potential cascading effects on language development.
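
    As a purely illustrative sketch of how responsiveness of this kind might be tallied, the snippet below counts maternal responses that follow infant communicative acts within a short time window, grouped by act type. The window length, event format, categories, and numbers are assumptions made for illustration; they are not the study's actual coding scheme.

        # Hypothetical sketch: proportion of infant communicative acts of each
        # type that receive a maternal response within a short window.
        # All values and categories are illustrative assumptions.
        from collections import defaultdict

        def response_rates(infant_acts, maternal_responses, window=2.0):
            """infant_acts: list of (time_in_seconds, act_type) tuples.
            maternal_responses: list of response onset times in seconds.
            Returns the proportion of acts of each type answered within `window` s."""
            counts = defaultdict(lambda: [0, 0])   # act_type -> [responded, total]
            for t, act_type in infant_acts:
                counts[act_type][1] += 1
                if any(t <= r <= t + window for r in maternal_responses):
                    counts[act_type][0] += 1
            return {k: responded / total for k, (responded, total) in counts.items()}

        acts = [(10.0, "point"), (25.0, "non-word vocalization"), (40.0, "show")]
        responses = [10.8, 41.1]
        print(response_rates(acts, responses))
        # -> {'point': 1.0, 'non-word vocalization': 0.0, 'show': 1.0}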

    Multimodal-first or pantomime-first?

    A persistent controversy in language evolution research has been whether language emerged in the gestural-visual or in the vocal-auditory modality. A “dialectic” solution to this age-old debate has been gaining ground: language was fully multimodal from the start, and remains so to this day. In this paper, we show this solution to be too simplistic and outline a more specific theoretical proposal, which we designate as pantomime-first. To decide between the multimodal-first and pantomime-first alternatives, we review several lines of interdisciplinary evidence and complement them with a cognitive-semiotic experiment. In the study, the participants saw – and then matched to hand-drawn images – recordings of short transitive events enacted by 4 actors in two conditions: visual (only body movement), and multimodal (body movement accompanied by nonlinguistic vocalization). Significantly, the matching accuracy was greater in the visual than in the multimodal condition, though a follow-up experiment revealed that the emotional profiles of the events enacted in the multimodal condition could be reliably detected from the sound alone. We see these results as supporting the proposed pantomime-first scenario.
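
    To make the accuracy comparison concrete, here is a minimal sketch of how matching accuracy in a two-condition design like this could be compared; the trial counts and the choice of a chi-square test are assumptions for illustration, not the authors' reported analysis.

        # Hypothetical sketch: comparing matching accuracy between the visual-only
        # and multimodal conditions. Counts and test choice are illustrative only.
        from scipy.stats import chi2_contingency

        visual = [142, 58]      # [correct, incorrect] matches, visual-only condition
        multimodal = [118, 82]  # [correct, incorrect] matches, multimodal condition

        chi2, p, dof, expected = chi2_contingency([visual, multimodal])

        acc_visual = visual[0] / sum(visual)
        acc_multimodal = multimodal[0] / sum(multimodal)
        print(f"visual accuracy = {acc_visual:.2f}, "
              f"multimodal accuracy = {acc_multimodal:.2f}, "
              f"chi2 = {chi2:.2f}, p = {p:.3f}")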

    The Development of Multimodal Social Communication in Infants at High Risk for Autism Spectrum Disorders

    In addition to impairments in gaze, facial expression, gesture, and sound, children with autism spectrum disorders (ASD) have difficulty producing these behaviors in coordination. Two studies were designed to evaluate the extent to which delayed and/or atypical development in the production or coordination of social communication behaviors can identify children eventually diagnosed with ASD. This research was grounded in Dynamic Systems Theory (DST), which proposes that changes in development depend on the interaction of multiple subsystems within the child, the environment, and the demands of the task, and that instability in one component can translate into varied developmental courses. A prospective longitudinal design was used to compare 9 infants at high familial risk for ASD (HR) later diagnosed with ASD, 13 HR infants with language delay, 28 HR infants with no diagnosis, and 30 low-risk (LR) infants. Participants were observed at home during naturalistic play with a primary caregiver at 8, 10, 12, 14, and 18 months. Frequencies of gestures, words, non-word vocalizations, eye contact, and smiles, as well as instances in which behaviors overlapped in time, were coded from videotape. Study 1 revealed that, while all infants demonstrated similar levels of communicative behavior at 8 months, infants later diagnosed with ASD exhibited significantly slower growth than all other infants, even those with language delays, in coordinations involving pre-speech vocalizations and those involving gestures used for joint attention. Study 2 demonstrated that information on social communication skills gathered in a natural setting improved prediction of diagnostic outcome when combined with standardized assessments and parent report, and that the setting, method of measurement, and frequency of assessment were important factors in determining risk. Across both studies, variability was detected both between and within infants. Results suggest that behavioral signs of ASD emerge over time in specific areas of communication. Disruption in the coordination of pre-speech vocalizations may produce negative cascading effects with important implications for later social and linguistic development. The findings emphasize the importance of repeatedly examining a wide range of communicative behavior in HR infants across contexts over time, and indicate that DST offers a valuable framework for understanding their development.
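
    One concrete coding step mentioned above is identifying instances in which behaviors overlapped in time. A minimal sketch of how such temporal overlap between two coded behavior streams could be detected is given below; the interval representation, overlap threshold, and example values are assumptions for illustration, not the dissertation's actual coding scheme.

        # Hypothetical sketch: counting temporal overlaps between two coded
        # behavior streams (e.g., gestures and non-word vocalizations).
        def count_overlaps(stream_a, stream_b, min_overlap=0.0):
            """Count pairs of (start, end) intervals, in seconds, that overlap
            in time by more than `min_overlap` seconds."""
            overlaps = 0
            for a_start, a_end in stream_a:
                for b_start, b_end in stream_b:
                    shared = min(a_end, b_end) - max(a_start, b_start)
                    if shared > min_overlap:
                        overlaps += 1
            return overlaps

        # Illustrative coded intervals from one play session
        gestures = [(12.0, 13.5), (40.2, 41.0), (75.3, 76.1)]
        vocalizations = [(12.8, 13.2), (55.0, 55.6), (75.9, 77.0)]
        print(count_overlaps(gestures, vocalizations))  # -> 2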

    Vocal-auditory feedback and the modality transition problem in language evolution

    This is a pre-print version. This article has been published in Reti Saperi Linguaggi, 1/2016 a. 5 (9), 157–178, DOI: 10.12832/83923. Copyright Società editrice il Mulino. The publisher should be contacted for permission to re-use or reprint the material in any form.
    The gestural theories, which see the origins of (proto)linguistic communication not in vocalization but rather in manual gesture, have come to take center stage in today’s academic reflection on the roots of language. The gestural theories, however, suffer from a near-fatal problem, the so-called «modality switch»: how and why language could have transferred from the mostly visual to the mostly vocal form that it now has, almost universally, in human societies. In our paper, we offer a potential and partial solution to this problem. We take as our starting point a gestural scenario in which emerging language-like communication involves orofacial gestures, and we complement such a scenario with the inclusion of vocal-auditory feedback, which aids signal production. The benefits of greater articulatory precision that accrue to signal producers might have constituted one reason for supplementing orofacial gestures with sound, and so for increasing the role of vocalization in the emerging (proto)language.

    Origins of vocal-entangled gesture

    Gestures during speaking are typically understood in a representational framework: they represent absent or distal states of affairs by means of pointing, resemblance, or symbolic replacement. However, humans also gesture along with the rhythm of speaking, which is amenable to a non-representational perspective. Such a perspective centers on the phenomenon of vocal-entangled gestures and builds on evidence showing that when an upper limb with a certain mass decelerates or accelerates sufficiently, it yields impulses on the body that cascade in various ways into the respiratory–vocal system. This entails a physical entanglement between body motions, respiration, and vocal activities. It is shown that vocal-entangled gestures are realized in infant vocal–motor babbling, before any representational use of gesture develops. Similarly, an overview is given of vocal-entangled processes in non-human animals; they are frequently found in rats, bats, birds, and a range of other species that arose even earlier in the phylogenetic tree. Thus, the origins of human gesture lie in biomechanics, emerging early in ontogeny and running deep in phylogeny.
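
    To give a rough sense of scale for the mechanism described above, the sketch below computes the impulse and average force produced when a moving arm is brought to a stop. The mass, velocity, and timing figures are illustrative assumptions only; how such an impulse then cascades into the respiratory–vocal system is not modeled here.

        # Back-of-the-envelope sketch: impulse from an arrested arm movement.
        # All numbers are illustrative assumptions.
        arm_mass = 3.5    # kg, rough mass of a forearm plus hand
        delta_v = 2.0     # m/s, change in velocity as a beat gesture stops
        stop_time = 0.05  # s, duration over which the deceleration occurs

        impulse = arm_mass * delta_v       # kg*m/s, J = m * delta_v
        avg_force = impulse / stop_time    # N, average force over the stop
        print(f"impulse = {impulse:.1f} kg*m/s, average force = {avg_force:.0f} N")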

    Visible movements of the orofacial area: evidence for gestural or multimodal theories of language evolution?

    The age-old debate between the proponents of the gesture-first and speech-first positions has returned to occupy a central place in current language evolution theorizing. The gestural scenarios, which suffer from the problem known as “modality transition” (why a gestural system would have changed into a predominantly spoken system), frequently appeal to the gestures of the orofacial area as a platform for this putative transition. Here, we review currently available evidence on the significance of the orofacial area in language evolution. While our review offers some support for orofacial movements as an evolutionary “bridge” between manual gesture and speech, we see the evidence as far more consistent with a multimodal approach. We also suggest that, more generally, the “gestural versus spoken” formulation is limiting and would be better expressed in terms of the relative input and interplay of the visual and vocal-auditory sensory modalities.

    Therapeutic and educational objectives in robot assisted play for children with autism

    “This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.” “Copyright IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.” DOI: 10.1109/ROMAN.2009.5326251
    This article is a methodological paper that describes the therapeutic and educational objectives that were identified during the design process of a robot aimed at robot assisted play. The work described in this paper is part of the IROMEC project (Interactive Robotic Social Mediators as Companions), which recognizes the important role of play in child development and targets children who are prevented from or inhibited in playing. The project investigates the role of an interactive, autonomous robotic toy in therapy and education for children with special needs. This paper specifically addresses the therapeutic and educational objectives related to children with autism. In recent years, robots have already been used to teach basic social interaction skills to children with autism. The added value of the IROMEC robot is that play scenarios have been developed taking children's specific strengths and needs into consideration and covering a wide range of objectives in children's developmental areas (sensory, communicational and interaction, motor, cognitive, and social and emotional). The paper describes children's developmental areas and illustrates how different experiences and interactions with the IROMEC robot are designed to target objectives in these areas.

    Physical mechanisms may be as important as brain mechanisms in evolution of speech [Commentary on Ackermann, Hage, & Ziegler. Brain mechanisms of acoustic communication in humans and nonhuman primates: an evolutionary perspective]

    We present two arguments for why physical adaptations for vocalization may be as important as neural adaptations. First, fine control over vocalization is not easy for physical reasons, and modern humans may be exceptional in this respect. Second, we present an example of a gorilla that shows rudimentary voluntary control over vocalization, indicating that some neural control is already shared with great apes.