3 research outputs found

    Gesture Based Semantic Service Invocation for Human Environment Interaction

    No full text
    The assistance of users in their activities of daily life by a smart environment is the main goal of Ambient Assisted Living (AAL). Interaction is of particular interest here, since some users are very familiar with modern technology while others find it very challenging, so that poorly designed interaction metaphors lead to low acceptance. Additionally, AAL has to cope with the challenges of open systems, in which new devices and functionalities can appear at any time. This paper presents a gesture-based approach to controlling devices and their functionalities in a smart environment at a semantic level, either to issue a command or to set a level. Redundant functionalities are filtered out before the list of functions is presented to the user. The concept is validated by a demonstrator that uses the semantic AAL platform universAAL.

    Context-aware gestural interaction in the smart environments of the ubiquitous computing era

    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy.
    Technology is becoming pervasive, and current interfaces are not adequate for interaction with the smart environments of the ubiquitous computing era. Recently, researchers have begun to address this issue by introducing the concept of the natural user interface, which is mainly based on gestural interaction. Many issues remain open in this emerging domain; in particular, there is a lack of common guidelines for the coherent implementation of gestural interfaces. This research investigates gestural interaction between humans and smart environments. It proposes a novel framework for the high-level organisation of context information. The framework is conceived to support a novel approach that uses functional gestures to reduce gesture ambiguity, shrink the number of gestures in taxonomies, and improve usability. To validate this framework, a proof of concept has been developed: a prototype implementing a novel method for the view-invariant recognition of deictic and dynamic gestures. Tests have been conducted to assess the gesture recognition accuracy and the usability of the interfaces developed following the proposed framework. The results show that the method provides accurate gesture recognition from very different viewpoints, while the usability tests yielded high scores. Further investigation of the context information has tackled the problem of user status, understood here as human activity, and a technique based on an innovative application of electromyography is proposed. Tests show that the proposed technique achieves good activity recognition accuracy. The context is also treated as system status: in ubiquitous computing, the system can adopt different paradigms, namely wearable, environmental, and pervasive. A novel paradigm, called the synergistic paradigm, is presented, combining the advantages of the wearable and environmental paradigms. Moreover, it augments the interaction possibilities of the user and ensures better gesture recognition accuracy than the other paradigms.