
    Simultaneity as an emergent property of efficient communication in language: A comparison of silent gesture and sign language

    Sign languages use multiple articulators and iconicity in the visual modality, which allow linguistic units to be organized not only linearly but also simultaneously. Recent research has shown that users of an established sign language such as LIS (Italian Sign Language) use simultaneous and iconic constructions as a modality-specific resource to achieve communicative efficiency when they are required to encode informationally rich events. However, it remains to be explored whether such simultaneous and iconic constructions can be recruited for communicative efficiency even without a linguistic system (i.e., in silent gesture) or whether they are specific to linguistic patterning (i.e., in LIS). In the present study, we conducted the same experiment as in Slonimska et al. with 23 Italian speakers using silent gesture and compared the results of the two studies. The findings showed that while simultaneity was afforded by the visual modality to some extent, its use in silent gesture was nevertheless less frequent and qualitatively different from its use within a linguistic system. Thus, the use of simultaneous and iconic constructions for communicative efficiency constitutes an emergent property of sign languages. The present study highlights the importance of studying modality-specific resources and their use for linguistic expression in order to promote a more thorough understanding of the language faculty and its modality-specific adaptive capabilities.

    Cross-linguistic views of gesture usage

    People hold stereotypes about gesture usage. For instance, speakers in East Asia are thought not to gesticulate much, and Italians are believed to gesticulate more than the British. Despite the prevalence of such views, studies that investigate these stereotypes are scarce. The present study examined people's views on spontaneous gestures by collecting data from five different countries. A total of 363 undergraduate students from five countries (France, Italy, Japan, the Netherlands, and the USA) participated in this study. Data were collected through a two-part questionnaire. Part 1 asked participants to rate two characteristics of gesture, frequency and size, for 13 different languages. Part 2 asked them about their views on factors that might affect the production of gestures. The results showed that most participants in this study believe that Italian, Spanish, and American English speakers produce larger gestures more frequently than speakers of other languages. They also showed that each cultural group, even within Europe, puts weight on a slightly different aspect of gestures.

    Developmental language disorder: Early predictors, age for the diagnosis, and diagnostic tools. A scoping review

    Background. Developmental Language Disorder (DLD) is frequent in childhood and may have long-term sequelae. By employing an evidence-based approach, this scoping review aims to identify (a) early predictors of DLD; (b) the optimal age range for the use of screening and diagnostic tools; and (c) effective diagnostic tools in preschool children. Methods. We considered systematic reviews, meta-analyses, and primary observational studies with control groups on the predictive, sensitivity, and specificity values of screening and diagnostic tools and psycholinguistic measures for the assessment of DLD in preschool children. We identified 37 studies, consisting of 10 systematic reviews and 27 primary studies. Results. Delay in gesture production, receptive and/or expressive vocabulary, syntactic comprehension, or word combination up to 30 months emerged as an early predictor of DLD; a family history of DLD appeared to be a major risk factor; and low socioeconomic status and environmental input were reported as risk factors with lower predictive power. The optimal time for screening is suggested to be between ages 2 and 3, and for diagnosis around age 4. Because of the high variability of sensitivity and specificity values, joint use of standardized and psycholinguistic measures is suggested to increase diagnostic accuracy. Conclusions. Monitoring risk situations and employing caregivers' reports, clinical assessment, and multiple linguistic measures are fundamental for an early identification of DLD and timely interventions.
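    The review's reliance on sensitivity and specificity values can be made concrete with a small worked example. The Python sketch below uses purely hypothetical counts (none are taken from the reviewed studies) to show how the two values are computed from screening outcomes and how a joint criterion combining a standardized test with a psycholinguistic measure shifts the balance between them.

```python
# Illustrative sketch: sensitivity/specificity of a DLD screening decision.
# All counts are hypothetical and chosen only to show the arithmetic.

def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from a 2x2 screening table."""
    sensitivity = tp / (tp + fn)   # proportion of true DLD cases flagged
    specificity = tn / (tn + fp)   # proportion of typical children cleared
    return sensitivity, specificity

# Hypothetical single standardized test: 40 children with DLD, 160 without.
sens_single, spec_single = sensitivity_specificity(tp=30, fn=10, tn=140, fp=20)

# Hypothetical joint criterion (standardized test AND psycholinguistic measure
# must both be positive): fewer false positives, at the cost of missing some cases.
sens_joint, spec_joint = sensitivity_specificity(tp=27, fn=13, tn=152, fp=8)

print(f"single measure : sensitivity={sens_single:.2f}, specificity={spec_single:.2f}")
print(f"joint criterion: sensitivity={sens_joint:.2f}, specificity={spec_joint:.2f}")
```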

    Neurophysiological evidence for rapid processing of verbal and gestural information in understanding communicative actions

    During everyday social interaction, gestures are a fundamental part of human communication. The communicative-pragmatic role of hand gestures and their interaction with spoken language has been documented at the earliest stage of language development, in which two types of indexical gestures are most prominent: the pointing gesture for directing attention to objects and the give-me gesture for making requests. Here we study, in adult human participants, the neurophysiological signatures of gestural-linguistic acts that communicate the pragmatic intentions of naming and requesting by simultaneously presenting written words and gestures. Already at ~150 ms, brain responses diverged between naming and request actions expressed by word-gesture combinations, whereas the same gestures presented in isolation elicited their earliest neurophysiological dissociations significantly later (at ~210 ms). There was an early enhancement of request-evoked brain activity compared with naming, which was due to sources in the frontocentral cortex, consistent with access to action knowledge in request understanding. In addition, an enhanced N400-like response indicated late semantic integration of gesture-language interaction. The present study demonstrates that word-gesture combinations used to express communicative pragmatic intentions speed up the brain correlates of comprehension processes relative to gesture-only understanding, thereby calling into question current serial linguistic models that place pragmatic function decoding at the end of a language comprehension cascade. Instead, information about the social-interactive role of communicative acts is processed instantaneously.
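    The divergence latencies reported above (~150 ms for word-gesture combinations versus ~210 ms for gestures in isolation) come from comparing evoked responses across time. As a rough, hedged illustration of that kind of analysis rather than the authors' actual pipeline, the sketch below runs point-by-point paired t-tests on simulated ERP data and reports the earliest sustained divergence between two conditions; the array shapes, sampling rate, and sustained-run criterion are all assumptions.

```python
# Rough sketch of a divergence-onset estimate between two ERP conditions.
# Not the authors' analysis: data, sampling rate, and thresholds are assumed.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_subjects, n_times, sfreq = 24, 300, 500        # 300 samples at 500 Hz = 600 ms
times_ms = np.arange(n_times) / sfreq * 1000

# Simulated subject-averaged ERPs (subjects x time); condition B gains an
# effect from ~150 ms onward purely for demonstration.
cond_a = rng.normal(0, 1, (n_subjects, n_times))
cond_b = rng.normal(0, 1, (n_subjects, n_times))
cond_b[:, times_ms >= 150] += 1.0

# Point-by-point paired t-test across subjects.
t_vals, p_vals = ttest_rel(cond_a, cond_b, axis=0)

# Onset = first sample opening a run of >= 10 consecutive significant samples,
# a crude guard against isolated false positives; published studies typically
# use cluster-based permutation statistics instead.
sig = p_vals < 0.05
run_len = 10
onset = None
for i in range(n_times - run_len + 1):
    if sig[i:i + run_len].all():
        onset = times_ms[i]
        break

print(f"estimated divergence onset: {onset:.0f} ms" if onset is not None
      else "no sustained divergence found")
```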

    Language experience impacts brain activation for spoken and signed language in infancy: Insights from unimodal and bimodal bilinguals

    Recent neuroimaging studies suggest that monolingual infants activate a left-lateralised fronto-temporal brain network in response to spoken language, similar to the network involved in processing spoken and signed language in adulthood. However, it is unclear how brain activation to language is influenced by early experience in infancy. To address this question, we present functional near-infrared spectroscopy (fNIRS) data from 60 hearing infants (4 to 8 months): 19 monolingual infants exposed to English, 20 unimodal bilingual infants exposed to two spoken languages, and 21 bimodal bilingual infants exposed to English and British Sign Language (BSL). Across all infants, spoken language elicited activation in a bilateral brain network including the inferior frontal and posterior temporal areas, while sign language elicited activation in the right temporo-parietal area. A significant difference in brain lateralisation was observed between groups. Activation in the posterior temporal region was not lateralised in monolinguals and bimodal bilinguals, but was right-lateralised in response to both language modalities in unimodal bilinguals. This suggests that experience of two spoken languages influences brain activation for sign language when it is experienced for the first time. Multivariate pattern analyses (MVPA) could classify distributed patterns of activation within the left hemisphere for spoken and signed language in monolinguals (proportion correct = 0.68; p = 0.039) but not in unimodal or bimodal bilinguals. These results suggest that bilingual experience in infancy influences brain activation for language, and that unimodal bilingual experience has a greater impact on early brain lateralisation than bimodal bilingual experience.
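    The MVPA result above (proportion correct = 0.68, p = 0.039) reflects cross-validated classification of distributed activation patterns with a permutation test for significance. The scikit-learn sketch below illustrates that general approach on simulated data; the channel count, classifier, and cross-validation scheme are assumptions, not the study's actual pipeline.

```python
# Minimal MVPA sketch: can spoken vs. signed language trials be decoded from
# distributed patterns of channel activation? Simulated data only; classifier
# and cross-validation choices are illustrative, not the study's pipeline.
import numpy as np
from sklearn.model_selection import StratifiedKFold, permutation_test_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n_trials_per_class, n_channels = 30, 12          # hypothetical fNIRS channels

# Simulated activation patterns: a small, distributed difference between classes.
spoken = rng.normal(0.0, 1.0, (n_trials_per_class, n_channels))
signed = rng.normal(0.3, 1.0, (n_trials_per_class, n_channels))
X = np.vstack([spoken, signed])
y = np.array([0] * n_trials_per_class + [1] * n_trials_per_class)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

# Cross-validated accuracy plus a label-permutation null distribution.
score, perm_scores, p_value = permutation_test_score(
    clf, X, y, cv=cv, n_permutations=1000, random_state=0
)
print(f"proportion correct = {score:.2f}, permutation p = {p_value:.3f}")
```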

    A gestural repertoire of 1- to 2-year-old human children: in search of the ape gestures

    This project was made possible with the generous financial help of the Baverstock Bequest to the Psychology and Neuroscience Department at the University of St Andrews. When we compare human gestures to those of other apes, it looks at first as if there is nothing much to compare at all. In adult humans, gestures are thought to be a window into the thought processes accompanying language, and sign languages are equal to spoken languages with all of their features. While some research firmly emphasises the difference between human gestures and those of other apes, the question of whether there are any commonalities has rarely been investigated and is mostly confined to pointing gestures. The gestural repertoires of nonhuman ape species have been carefully studied and described with regard to their form and function, but similar approaches are much rarer in the study of human gestures. This paper applies the methodology commonly used in the study of nonhuman ape gestures to the gestural communication of human children in their second year of life. We recorded the gestures of children (n = 13) in a natural setting with peers and caregivers in Germany and Uganda. Children employed 52 distinct gestures, 46 (89%) of which are present in the chimpanzee repertoire. Like chimpanzees, they used these gestures both singly and in sequences, and employed individual gestures flexibly towards different goals.
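    The repertoire comparison reported above (52 distinct child gestures, 46 of which also appear in the chimpanzee repertoire) amounts to a simple set-overlap calculation. The sketch below shows how such an overlap might be computed; the gesture labels are invented placeholders, not the study's coding scheme.

```python
# Toy sketch of a repertoire-overlap calculation between two species.
# The gesture labels are invented placeholders, not the study's coding scheme.

child_repertoire = {"reach", "point", "clap", "wave", "stomp", "offer"}
chimp_repertoire = {"reach", "clap", "stomp", "offer", "ground-slap"}

shared = child_repertoire & chimp_repertoire
proportion_shared = len(shared) / len(child_repertoire)

print(f"shared gestures: {sorted(shared)}")
print(f"proportion of child repertoire shared: {proportion_shared:.0%}")
```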