10 research outputs found

    Neurophysiological evidence for rapid processing of verbal and gestural information in understanding communicative actions

    During everyday social interaction, gestures are a fundamental part of human communication. The communicative-pragmatic role of hand gestures and their interaction with spoken language has been documented from the earliest stages of language development, in which two types of indexical gestures are most prominent: the pointing gesture for directing attention to objects and the give-me gesture for making requests. Here we study, in adult human participants, the neurophysiological signatures of gestural-linguistic acts communicating the pragmatic intentions of naming and requesting, by simultaneously presenting written words and gestures. Already at ~150 ms, brain responses diverged between naming and request actions expressed by word-gesture combinations, whereas the same gestures presented in isolation elicited their earliest neurophysiological dissociations significantly later (at ~210 ms). There was an early enhancement of request-evoked brain activity compared with naming, attributable to sources in frontocentral cortex and consistent with access to action knowledge in request understanding. In addition, an enhanced N400-like response indicated late semantic integration of gestural and linguistic information. The present study demonstrates that word-gesture combinations used to express communicative-pragmatic intentions speed up the brain correlates of comprehension processes – compared with gesture-only understanding – thereby calling into question current serial linguistic models that place pragmatic function decoding at the end of the language comprehension cascade. Instead, information about the social-interactive role of communicative acts is processed instantaneously.
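
    As a rough illustration of the kind of evoked-response contrast and divergence-onset estimate described above, the following is a minimal MNE-Python sketch. The file name and the event labels "naming" and "request" are hypothetical placeholders, and the point-wise, uncorrected t-test is purely illustrative rather than the authors' analysis.

    import numpy as np
    from scipy import stats
    import mne

    # Hypothetical preprocessed, condition-labelled epochs (not the study's data)
    epochs = mne.read_epochs("word_gesture-epo.fif")

    naming = epochs["naming"].get_data()      # (n_trials, n_channels, n_times)
    request = epochs["request"].get_data()

    # Collapse over sensors via global field power, then run a point-wise t-test
    gfp_naming = naming.std(axis=1)           # (n_trials, n_times)
    gfp_request = request.std(axis=1)
    t_vals, p_vals = stats.ttest_ind(gfp_naming, gfp_request, axis=0)

    # Earliest sample at which the two conditions differ (uncorrected)
    sig = p_vals < 0.05
    if sig.any():
        onset_ms = epochs.times[np.argmax(sig)] * 1e3
        print(f"First uncorrected divergence at {onset_ms:.0f} ms")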

    Oscillatory Dynamics Supporting Semantic Cognition: MEG Evidence for the Contribution of the Anterior Temporal Lobe Hub and Modality-Specific Spokes

    The "hub and spoke model" of semantic representation suggests that the multimodal features of objects are drawn together by an anterior temporal lobe (ATL) "hub", while modality-specific "spokes" capture perceptual/action features. However, relatively little is known about how these components are recruited over time to support object identification. We used magnetoencephalography to measure neural oscillations within left ATL, lateral fusiform cortex (FC) and central sulcus (CS) during word-picture matching at different levels of specificity (employing superordinate vs. specific labels) for different categories (manmade vs. animal). This allowed us to determine (i) when each site was sensitive to semantic category and (ii) whether this was modulated by task demands. In ATL, there were two phases of response: from around 100 ms post-stimulus, phasic bursts of low-gamma activity produced reductions in oscillatory power, relative to a baseline period, that were modulated by both category and specificity; this was followed by more sustained power decreases across frequency bands from 250 ms onwards. In the spokes, initial power increases were not stronger for specific identification, while later power decreases were stronger for specific-level identification in FC for animals and in CS for manmade objects (from around 150 ms and 200 ms, respectively). These data are inconsistent with a temporal sequence in which early sensory-motor activity is followed by later retrieval in ATL. Instead, knowledge emerges from the rapid recruitment of both hub and spokes, with early specificity and category effects in the ATL hub. The balance between these components depends on semantic category and task, with visual cortex playing a greater role in the fine-grained identification of animals and motor cortex contributing to the identification of tools.
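
    As a rough illustration of the baseline-normalised oscillatory power analysis this abstract describes, here is a minimal MNE-Python sketch using Morlet wavelets. The file name and the condition labels "specific" and "superordinate" are hypothetical, and the sketch stays at the sensor level rather than reproducing the authors' source-space pipeline.

    import numpy as np
    import mne
    from mne.time_frequency import tfr_morlet

    # Hypothetical preprocessed epochs from a word-picture matching task
    epochs = mne.read_epochs("word_picture-epo.fif")

    freqs = np.arange(4, 61, 2)      # theta through low gamma (Hz)
    n_cycles = freqs / 2.0           # frequency-dependent wavelet length

    power = {}
    for cond in ("specific", "superordinate"):
        tfr = tfr_morlet(epochs[cond], freqs=freqs, n_cycles=n_cycles,
                         return_itc=False, average=True)
        # Express power as log-ratio change relative to the pre-stimulus baseline
        tfr.apply_baseline(baseline=(-0.4, -0.1), mode="logratio")
        power[cond] = tfr

    # Specificity contrast, e.g. to inspect early low-gamma power decreases
    contrast = power["specific"].copy()
    contrast.data -= power["superordinate"].data
    contrast.plot(combine="mean")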
