27 research outputs found

    Neurophysiological evidence for rapid processing of verbal and gestural information in understanding communicative actions

    During everyday social interaction, gestures are a fundamental part of human communication. The communicative-pragmatic role of hand gestures and their interaction with spoken language has been documented at the earliest stage of language development, in which two types of indexical gestures are most prominent: the pointing gesture for directing attention to objects and the give-me gesture for making requests. Here we study, in adult human participants, the neurophysiological signatures of gestural-linguistic acts communicating the pragmatic intentions of naming and requesting, by simultaneously presenting written words and gestures. Already at ~150 ms, brain responses diverged between naming and request actions expressed by word-gesture combinations, whereas the same gestures presented in isolation elicited their earliest neurophysiological dissociations significantly later (at ~210 ms). Request-evoked brain activity was enhanced early relative to naming, due to sources in the frontocentral cortex, consistent with access to action knowledge in request understanding. In addition, an enhanced N400-like response indicated late semantic integration of gesture-language interaction. The present study demonstrates that word-gesture combinations used to express communicative pragmatic intentions speed up the brain correlates of comprehension processes, compared with gesture-only understanding, thereby calling into question current serial linguistic models that place pragmatic function decoding at the end of a language comprehension cascade. Instead, information about the social-interactive role of communicative acts is processed instantaneously.

    How What We See and What We Know Influence Iconic Gesture Production

    In face-to-face communication, speakers typically integrate information acquired through different sources, including what they see and what they know, into their communicative messages. In this study, we asked how these different input sources influence the frequency and type of iconic gestures produced by speakers during a communication task, under two degrees of task complexity. Specifically, we investigated whether speakers gestured differently when they had to describe an object presented to them as an image or as a written word (input modality) and, additionally, when they were or were not allowed to explicitly name the object (task complexity). Our results show that speakers produced more gestures when they attended to a picture. Further, speakers more often gesturally depicted shape information when they attended to an image, and more often demonstrated the function of an object when they attended to a word. However, when we increased the complexity of the task by forbidding speakers from naming the target objects, these patterns disappeared, suggesting that speakers may have strategically adapted their use of iconic strategies to better meet the task's goals. Our study also revealed (independent) effects of object manipulability on the type of gestures produced by speakers and, more generally, highlighted a predominance of molding and handling gestures. These gestures may reflect stronger motoric and haptic simulations, lending support to activation-based accounts of gesture production.

    A gestural repertoire of 1-2 year old human children: in search of the ape gestures

    This project was made possible with the generous financial help of the Baverstock Bequest to the Psychology and Neuroscience Department at the University of St Andrews. When we compare human gestures to those of other apes, it looks at first as if there is nothing much to compare at all. In adult humans, gestures are thought to be a window into the thought processes accompanying language, and sign languages are equal to spoken language in all of its features. While some research firmly emphasises the difference between human gestures and those of other apes, the question of whether there are any commonalities has rarely been investigated, and is mostly confined to pointing gestures. The gestural repertoires of nonhuman ape species have been carefully studied and described with regard to their form and function, but similar approaches are much rarer in the study of human gestures. This paper applies the methodology commonly used in the study of nonhuman ape gestures to the gestural communication of human children in their second year of life. We recorded the gestures of children (n=13) in a natural setting with peers and caregivers in Germany and Uganda. Children employed 52 distinct gestures, 46 (89%) of which are present in the chimpanzee repertoire. Like chimpanzees, they used these gestures both singly and in sequences, and employed individual gestures flexibly towards different goals.

    Do Parents Model Gestures Differently When Children's Gestures Differ?

    Children with autism spectrum disorder (ASD) or with Down syndrome (DS) show diagnosis-specific differences from typically developing (TD) children in gesture production. We asked whether these differences reflect differences in parental gesture input. Our systematic observations of 23 children with ASD and 23 with DS (mean ages = 2;6), compared to 23 TD children (mean age = 1;6) similar in expressive vocabulary, showed that across groups children and parents produced similar types of gestures and gesture-speech combinations. However, only children, but not their parents, showed diagnosis-specific variability in how often they produced each type of gesture and gesture-speech combination. These findings suggest that, even though parents model gestures similarly, how often children produce each type largely reflects diagnosis-specific abilities.