
    Gesture and speech integration: an exploratory study of a man with aphasia

    Background: In order to fully comprehend a speaker’s intention in everyday communication, we integrate information from multiple sources, including gesture and speech. No published studies have explored the impact of aphasia on the integration of iconic co-speech gesture and speech. Aims: To explore the impact of aphasia on co-speech gesture and speech integration in one participant with aphasia (SR) and 20 age-matched control participants. Methods & Procedures: SR and 20 control participants watched video vignettes of people producing 21 verb phrases in 3 different conditions: verbal only (V), gesture only (G), and verbal and gesture combined (VG). Participants were required to select the corresponding picture from four alternatives: an integration target, a verbal-only match, a gesture-only match, and an unrelated foil. The probability of choosing the integration target in the VG condition beyond what is expected from the probabilities of choosing it in V and G alone was referred to as the multi-modal gain (MMG). Outcomes & Results: SR obtained a significantly lower multi-modal gain score than the control participants (p < 0.05). Error analysis indicated that in the speech and gesture integration task, SR relied on gesture to decode the message, whereas the control participants relied on speech. Further analysis of the speech-only and gesture-only tasks indicated that SR had intact gesture comprehension but impaired spoken word comprehension. Conclusions & Implications: The results confirm findings by Records (1994), which reported that impaired verbal comprehension leads to a greater reliance on gesture to decode messages. Moreover, multi-modal integration of information from speech and iconic gesture can be impaired in aphasia. The findings highlight the need for further exploration of the impact of aphasia on gesture and speech integration.
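
    The abstract does not give the exact formula for the multi-modal gain, so the sketch below assumes one plausible formulation: the observed accuracy in the combined condition minus the accuracy predicted if the verbal and gestural cues contributed independently (probabilistic summation). The function name and the independence baseline are assumptions, not the study's published definition.

```python
# Hypothetical sketch of a multi-modal gain (MMG) computation. The abstract
# does not state the exact formula; here MMG is taken as the observed VG
# accuracy minus the accuracy predicted if the verbal and gestural cues were
# integrated independently (probabilistic summation).

def multi_modal_gain(p_v: float, p_g: float, p_vg: float) -> float:
    """p_v, p_g, p_vg: probabilities of choosing the integration target
    in the verbal-only, gesture-only, and combined conditions."""
    predicted = p_v + p_g - p_v * p_g  # independent-integration baseline
    return p_vg - predicted

# Example: a participant correct on 60% of V trials, 50% of G trials,
# and 90% of VG trials shows a positive multi-modal gain.
print(multi_modal_gain(0.60, 0.50, 0.90))  # 0.90 - 0.80 = 0.10
```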

    Eigengestures for natural human computer interface

    We present the application of Principal Component Analysis to data acquired during the design of a natural gesture interface. We investigate the concept of an eigengesture for motion-capture hand gesture data and present the visualisation of the principal components obtained in the course of the conducted experiments. We also show the influence of dimensionality reduction on the quality of the reconstructed gesture data.
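
    As an illustration of the eigengesture idea (not the authors' code), the sketch below applies PCA to a matrix of flattened motion-capture gesture vectors, keeps the leading principal components, and measures how reconstruction quality changes under dimensionality reduction. The data shape and component count are assumptions.

```python
# Illustrative sketch: PCA "eigengestures" and reconstruction quality.
# Random data stands in for real motion capture; each gesture is a
# recording flattened into one feature vector.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 200 gestures: 3 coordinates x 20 markers x 50 frames, flattened.
gestures = rng.normal(size=(200, 3 * 20 * 50))

pca = PCA(n_components=10)           # keep the 10 leading eigengestures
scores = pca.fit_transform(gestures)
reconstructed = pca.inverse_transform(scores)

# Reconstruction error grows as fewer components are retained.
rmse = np.sqrt(np.mean((gestures - reconstructed) ** 2))
print(f"explained variance: {pca.explained_variance_ratio_.sum():.2f}, "
      f"RMSE: {rmse:.3f}")
```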

    A color hand gesture database for evaluating and improving algorithms on hand gesture and posture recognition

    With the increase of research activity in vision-based hand posture and gesture recognition, new methods and algorithms are being developed; however, less attention has been paid to developing a standard platform for this purpose. Developing a database of hand gesture images is a necessary first step towards standardizing research on hand gesture recognition. To this end, we have developed an image database of hand posture and gesture images. The database contains hand images captured under different lighting conditions with a digital camera. Details of the automatic segmentation and clipping of the hands are also discussed in this paper.
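
    The abstract mentions automatic segmentation and clipping of the hands without giving the method; the sketch below shows one common color-based approach (YCrCb skin thresholding with OpenCV) that could serve the same purpose. The thresholds are illustrative and, as the abstract's emphasis on lighting conditions suggests, sensitive to illumination.

```python
# Hypothetical sketch of color-based hand segmentation and clipping; the
# thresholds and pipeline are illustrative, not the authors' published method.
import cv2
import numpy as np

def segment_and_clip_hand(image_bgr: np.ndarray) -> np.ndarray:
    """Return the bounding-box crop of the largest skin-colored region."""
    ycrcb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb)
    # Commonly used Cr/Cb skin range; sensitive to lighting, which is one
    # reason a database would vary illumination.
    lower = np.array((0, 133, 77), np.uint8)
    upper = np.array((255, 173, 127), np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return image_bgr  # no hand found; return the original image
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return image_bgr[y:y + h, x:x + w]
```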

    Prototyping a Capacitive Sensing Device for Gesture Recognition

    Capacitive sensing is a technology that can detect proximity and touch. It can also be used to measure the position and acceleration of gesture motions. The technology has many applications, such as replacing mechanical buttons in a gaming device interface, detecting respiration rate without direct contact with the skin, and providing gesture sensing capability for rehabilitation devices. In this thesis, an approach to prototyping a capacitive gesture sensing device using the Eagle PCB design software is demonstrated. The resulting prototype device was then tested and evaluated, validating the effectiveness of the approach.
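
    As a hypothetical illustration of gesture recognition from capacitive measurements (the thesis's actual hardware and firmware are not described in the abstract), the sketch below infers a swipe direction from which of two adjacent electrode channels crosses a proximity threshold first. The channel layout and threshold are assumptions.

```python
# Illustrative sketch, not the thesis's implementation: detect a swipe
# gesture from two adjacent capacitive electrode channels by comparing
# which channel first exceeds a proximity threshold.
from typing import Iterable, Optional

def detect_swipe(left: Iterable[float], right: Iterable[float],
                 threshold: float = 0.5) -> Optional[str]:
    """Return 'left-to-right', 'right-to-left', or None.

    left/right: normalized capacitance samples (0 = no proximity)."""
    t_left = next((i for i, v in enumerate(left) if v > threshold), None)
    t_right = next((i for i, v in enumerate(right) if v > threshold), None)
    if t_left is None or t_right is None:
        return None                      # finger never approached both pads
    if t_left < t_right:
        return "left-to-right"
    if t_right < t_left:
        return "right-to-left"
    return None                          # simultaneous crossing: ambiguous

# Example: the left pad responds before the right pad.
print(detect_swipe([0.1, 0.7, 0.9, 0.3], [0.0, 0.1, 0.8, 0.9]))  # left-to-right
```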