    The Pragmatics of Person and Imperatives in Sign Language of the Netherlands

    We present new evidence against a grammatical distinction between second and third person in Sign Language of the Netherlands (NGT). More precisely, we show how pushing this distinction into the domain of pragmatics helps account for an otherwise puzzling fact about the NGT imperative: not only is it used to command the addressee, it can also express ‘non-addressee-oriented commands’.

    Natural interaction with a virtual guide in a virtual environment: A multimodal dialogue system

    This paper describes the Virtual Guide, a multimodal dialogue system represented by an embodied conversational agent that can help users find their way in a virtual environment, while adapting its affective linguistic style to that of the user. We discuss the modular architecture of the system and describe the entire loop from multimodal input analysis to multimodal output generation. We also describe how the Virtual Guide detects the level of politeness of the user’s utterances in real time during the dialogue and aligns its own language to that of the user, using different politeness strategies. Finally, we report on our first user tests and discuss some potential extensions to improve the system.

    Structuring information through gesture and intonation

    Face-to-face communication is multimodal. In unscripted spoken discourse we can observe the interaction of several “semiotic layers”, modalities of information such as syntax, discourse structure, gesture, and intonation. We explore the role of gesture and intonation in structuring and aligning information in spoken discourse through a study of the co-occurrence of pitch accents and gestural apices. Metaphorical spatialization through gesture also plays a role in conveying the contextual relationships between the speaker, the government, and other external forces in a naturally occurring political speech setting.

    Facilitating joint attention with salient pointing in interactions involving children with autism spectrum disorder

    Children with autism spectrum disorder (ASD) reportedly have difficulties in responding to bids for joint attention, notably in following pointing gestures. Previous studies have predominantly built on structured observation measures and predefined coding categories to measure children’s responsiveness to gestures. However, how these gestures are designed and what detailed interactional work they can accomplish have received less attention. In this paper, we use a multimodal approach to conversation analysis (CA) to investigate how educators design their use of pointing in interactions involving school-aged children with ASD or autistic features. The analysis shows that pointing had specific sequential implications for the children beyond mere attention sharing. Occasionally, the co-occurring talk and pointing led to ambiguities when a child was interpreting their interactional connotations, specifically when the pointing gesture lacked salience. The study demonstrates that the CA approach can increase understanding of how to facilitate the establishment of joint attention.

    How hand movements and speech tip the balance in cognitive development: A story about children, complexity, coordination, and affordances

    When someone asks us to explain something, such as how a lever or balance scale works, we spontaneously move our hands and gesture. This is also true for children. Furthermore, children use their hands to discover things and to find out how something works. Previous research has shown that children’s hand movements are thereby ahead of speech and play a leading role in cognitive development. Explanations for this have assumed that cognitive understanding takes place in one’s head, and that hand movements and speech (only) reflect this. However, cognitive understanding arises from and consists of the constant interplay between (hand) movements, speech, and one’s physical and social environment. The physical environment includes task properties, for example, and the social environment includes other people. Therefore, I focused on this constant interplay between hand movements, speech, and the environment to better understand the role of hand movements in cognitive development. Using science and technology tasks, we found that children’s speech affects hand movements more than the other way around. During difficult tasks, the coupling between hand movements and speech becomes even stronger than during easy tasks. Interim changes in task properties affect hand movements and speech differently. Collaborating children coordinate their hand movements, their speech, and even their head movements with each other. The coupling between hand movements and speech is related to age and (school) performance. It is important that teachers attend to children’s hand movements and speech, and arrange their lessons and classrooms such that there is room for both.

    A systematic investigation of gesture kinematics in evolving manual languages in the lab

    Silent gestures consist of complex multi-articulatory movements but are now primarily studied through categorical coding of the referential gesture content. The relation of categorical linguistic content to continuous kinematics is therefore poorly understood. Here, we reanalyzed the video data from a gestural evolution experiment (Motamedi, Schouwstra, Smith, Culbertson, & Kirby, 2019), which showed increases in the systematicity of gesture content over time. We applied computer vision techniques to quantify the kinematics of the original data. Our kinematic analyses demonstrated that gestures become more efficient and less complex in their kinematics over generations of learners. We further detect the systematicity of gesture form at the level of the gesture kinematic interrelations, which directly scales with the systematicity obtained from semantic coding of the gestures. Thus, from continuous kinematics alone, we can tap into linguistic aspects that were previously only approachable through categorical coding of meaning. Finally, going beyond issues of systematicity, we show how unique gesture kinematic dialects emerged over generations as isolated chains of participants gradually diverged over iterations from other chains. We thereby conclude that gestures can come to embody the linguistic system at the level of interrelationships between communicative tokens, which should calibrate our theories about form and linguistic content.
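    The abstract above mentions applying computer vision to quantify gesture kinematics from video. As a rough illustrative sketch only, and not the authors’ actual pipeline, the snippet below extracts per-frame wrist positions with MediaPipe Hands and reduces the trajectory to simple kinematic summaries; the library choice, the wrist landmark, and the features (path length, mean speed) are all assumptions made for illustration.

    ```python
    # Illustrative sketch: extract a hand trajectory from video and compute
    # simple kinematic features. Assumes `opencv-python` and `mediapipe`.
    import cv2
    import mediapipe as mp
    import numpy as np

    def hand_trajectory(video_path: str) -> np.ndarray:
        """Return an (n_frames, 2) array of wrist positions in normalized image coordinates."""
        hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1)
        cap = cv2.VideoCapture(video_path)
        points = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                wrist = result.multi_hand_landmarks[0].landmark[0]  # landmark 0 = wrist
                points.append((wrist.x, wrist.y))
        cap.release()
        hands.close()
        return np.array(points)

    def kinematic_features(traj: np.ndarray, fps: float = 25.0) -> dict:
        """Summarize a trajectory: total path length and mean speed (illustrative choices)."""
        if len(traj) < 2:
            return {"path_length": 0.0, "mean_speed": 0.0}
        step = np.linalg.norm(np.diff(traj, axis=0), axis=1)
        return {"path_length": float(step.sum()), "mean_speed": float(step.mean() * fps)}

    # Example usage (hypothetical file name):
    # features = kinematic_features(hand_trajectory("gesture_clip.mp4"))
    ```

    Features like these could then be compared across generations of learners, but the specific measures used in the study are not detailed in the abstract.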

    A Hierarchy of Phonetic Constraints on Palatality in Russian
