
    Spotting Agreement and Disagreement: A Survey of Nonverbal Audiovisual Cues and Tools

    While detecting and interpreting temporal patterns of non-verbal behavioural cues in a given context is a natural and often unconscious process for humans, it remains a rather difficult task for computer systems. Nevertheless, it is an important one to achieve if the goal is to realise naturalistic communication between humans and machines. Machines that are able to sense social attitudes like agreement and disagreement and respond to them in a meaningful way are likely to be welcomed by users due to the more natural, efficient and human-centered interaction they are bound to experience. This paper surveys the nonverbal cues that could be present during agreement and disagreement behavioural displays and lists a number of tools that could be useful in detecting them, as well as a few publicly available databases that could be used to train these tools for analysis of spontaneous, audiovisual instances of agreement and disagreement.

    A Real-Time Hand Gesture Interface Implemented on a Multi-Core Processor

    This paper describes a real-time hand gesture recognition system and its application to VCR remote control. Cascaded classifiers are used to detect a number of different hand poses. In order to detect a hand in real time, the detection algorithm is optimized for multi-core processors by distributing the operations to multiple cores and minimizing the data transmission between them. We have implemented a detection system on a processor with eight cores. Further, we have integrated the system into a prototype video recorder simulator to evaluate a gesture interface for consumer electronics. The operating speed increases by a factor of up to 13.5 compared to a standard PC with a single-core processor.
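    The parallelization strategy the abstract describes, distributing independent detection work across cores while keeping inter-core data transfer small, can be sketched as a minimal example using Python's multiprocessing pool. Note that `detect_in_region` here is a hypothetical stand-in for one cascaded-classifier evaluation (the paper's actual classifiers are not reproduced), and each worker receives a whole image region so that only the region and a boolean travel between processes.

    ```python
    from multiprocessing import Pool

    def detect_in_region(region):
        # Hypothetical stand-in for running a cascaded classifier on one
        # image region: "detect" if the mean pixel intensity exceeds a
        # threshold. The real system evaluates per-pose cascades here.
        return sum(region) / len(region) > 100

    def detect_parallel(regions, workers=4):
        # Distribute regions across cores. Each worker handles complete
        # regions, so data transmission between cores stays minimal:
        # one region in, one boolean out.
        with Pool(workers) as pool:
            return pool.map(detect_in_region, regions)

    if __name__ == "__main__":
        regions = [[120] * 16, [50] * 16, [200] * 16]
        print(detect_parallel(regions, workers=2))
    ```

    The design choice mirrors the abstract: speedup comes from partitioning the frame into independently scannable regions, not from parallelizing a single classifier evaluation.
    
    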

    Gesture Based Interface for Asynchronous Video Communication for Deaf People in South Africa

    The preferred method of communication amongst Deaf people is sign language. There are problems with the video quality when using the real-time video communication available on mobile phones. The alternative is to use text-based communication on mobile phones; however, findings from other research studies show that Deaf people prefer using sign language to communicate with each other rather than text. This dissertation looks at implementing a gesture-based interface for asynchronous video communication for Deaf people. The gesture interface was implemented on a store-and-forward video architecture, since this preserves the video quality even when there is low bandwidth. In this dissertation three gesture-based video communication prototypes were designed and implemented using a user-centred design approach. These prototypes were implemented on both the computer and mobile devices. The first prototype was computer-based, and its evaluation showed that the gesture-based interface improved the usability of sign language video communication. The second prototype was set up on the mobile device and tested on several mobile devices, but device limitations made it impossible to support all the features needed in the video communication. The different problems experienced on the dissimilar devices made the task of implementing the prototypes on the mobile platform challenging. The prototype was revised several times before it was tested on a different mobile phone. The final prototype used both the mobile phone and the computer. The computer served to simulate a mobile device with greater processing power. This approach simulated a more powerful future mobile device capable of running the gesture-based interface. The computer was used for video processing, but to the user it was as if the whole system was running on the mobile phone.
The evaluation process was conducted with ten Deaf users in order to determine the efficiency and usability of the prototype. The results showed that the majority of the users were satisfied with the quality of the video communication. The evaluation also revealed usability problems, but the benefits of communicating in sign language outweighed the usability difficulties. Furthermore, the users were more interested in the video communication on the mobile devices than on the computer, as this was a much more familiar technology and offered the convenience of mobility.
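The store-and-forward architecture the dissertation builds on can be sketched as a minimal (hypothetical) queue-based channel: clips are recorded and stored locally at full quality, then forwarded whole when a link is available, so video quality never depends on real-time bandwidth. The class and method names below are illustrative assumptions, not the dissertation's actual implementation.

```python
from collections import deque

class StoreAndForwardChannel:
    """Minimal sketch of a store-and-forward video channel.

    Clips are queued locally at full quality instead of being streamed,
    and are transmitted whole once a link is available, which is why the
    architecture preserves video quality under low bandwidth.
    """

    def __init__(self):
        self.outbox = deque()    # clips recorded but not yet sent
        self.delivered = []      # clips received by the other party

    def record(self, clip):
        # Store the full-quality clip locally rather than streaming it.
        self.outbox.append(clip)

    def forward(self, link_up):
        # Transmit queued clips only while the link is up; otherwise
        # clips simply wait in the outbox with no quality loss.
        while link_up and self.outbox:
            self.delivered.append(self.outbox.popleft())
```

A typical exchange: `record()` several signed clips while offline, then call `forward(link_up=True)` once connectivity returns; delivery order matches recording order.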