
    Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies

    How can highly effective and intuitive gesture sets be determined for interactive systems, tailored to end users’ preferences? A substantial body of knowledge is available on this topic, among which gesture elicitation studies stand out distinctively. In these studies, end users are invited to propose gestures for specific referents, which are the functions to control in an interactive system. The vast majority of gesture elicitation studies conclude with a consensus gesture set identified through a process of consensus or agreement analysis. However, the information about specific gesture sets determined for specific applications is scattered across a wide landscape of disconnected scientific publications, which makes it challenging for researchers and practitioners to effectively harness this body of knowledge. To address this challenge, we conducted a systematic literature review and examined a corpus of N=267 studies encompassing a total of 187,265 gestures elicited from 6,659 participants for 4,106 referents. To understand similarities in users’ gesture preferences within this extensive dataset, we analyzed a sample of 2,304 gestures extracted from the studies identified in our literature review. Our approach consisted of (i) identifying the context of use represented by end users, devices, platforms, and gesture sensing technology, (ii) categorizing the referents, (iii) classifying the gestures elicited for those referents, and (iv) cataloging the gestures based on their representation and implementation modalities. Drawing from the findings of this review, we propose guidelines for conducting future end-user gesture elicitation studies.
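    The consensus or agreement analysis mentioned in this abstract is commonly quantified with an agreement rate computed per referent. The snippet below is a minimal, non-authoritative sketch assuming the agreement-rate formulation popularised by Vatavu and Wobbrock; individual studies in the reviewed corpus may use other measures. It computes the agreement rate for a single referent from hypothetical gesture proposals.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for one referent:
    AR(r) = sum_i |P_i|(|P_i| - 1) / (|P|(|P| - 1)),
    where P is the multiset of proposals and each P_i is a group of
    identical proposals (Vatavu & Wobbrock-style formulation, assumed here)."""
    n = len(proposals)
    if n < 2:
        return 1.0 if n == 1 else 0.0
    groups = Counter(proposals)  # group identical gesture proposals
    pairs_in_agreement = sum(k * (k - 1) for k in groups.values())
    return pairs_in_agreement / (n * (n - 1))

# Hypothetical data: gestures proposed by six participants for one referent.
print(agreement_rate(["swipe-up", "swipe-up", "swipe-up", "twist", "swipe-up", "twist"]))
```

    An agreement rate of 1.0 means every participant proposed the same gesture for the referent; values near 0 indicate little consensus.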

    An investigation into gaze-based interaction techniques for people with motor impairments

    The use of eye movements to interact with computers offers opportunities for people with impaired motor ability to overcome the difficulties they often face using hand-held input devices. Computer games have become a major form of entertainment, and also provide opportunities for social interaction in multi-player environments. Games are also being used increasingly in education to motivate and engage young people. It is important that young people with motor impairments are able to benefit from, and enjoy, them. This thesis describes a program of research conducted over a 20-year period starting in the early 1990s that has investigated interaction techniques based on gaze position, intended for use by people with motor impairments. The work investigates how to make standard software applications accessible by gaze, so that no particular modification to the application is needed. The work divides into three phases. In the first phase, ways of using gaze to interact with the graphical user interfaces of office applications were investigated, designed around the limitations of gaze interaction. Of these, overcoming the inherent inaccuracy of pointing by gaze at on-screen targets was particularly important. In the second phase, the focus shifted from office applications towards immersive games and on-line virtual worlds. Different means of using gaze position and patterns of eye movements, or gaze gestures, to issue commands were studied. Most of the testing and evaluation studies in this phase, like the first, used participants without motor impairments. The third phase of the work then studied the applicability of the research findings thus far to groups of people with motor impairments, and in particular the means of adapting the interaction techniques to individual abilities. In summary, the research has shown that collections of specialised gaze-based interaction techniques can be built as an effective means of completing the tasks in specific types of games, and how these techniques can be adapted to the differing abilities of individuals with motor impairments.
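    The abstract does not detail the specific selection mechanisms studied; purely as an assumed illustration of the kind of technique involved in overcoming the inherent inaccuracy of pointing by gaze, the sketch below implements dwell-time selection with a spatial tolerance around the current fixation, a standard way of turning noisy gaze positions into discrete commands. All names, parameters and values are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DwellSelector:
    """Illustrative dwell-time gaze selector: a selection fires when gaze
    stays within `radius_px` of an anchor point for at least `dwell_ms`.
    Field names and default values are hypothetical, not from the thesis."""
    dwell_ms: float = 800.0
    radius_px: float = 40.0
    _anchor: Optional[Tuple[float, float]] = None
    _elapsed_ms: float = 0.0

    def update(self, x: float, y: float, dt_ms: float) -> bool:
        """Feed one gaze sample (screen coordinates) plus the time since the
        previous sample; returns True when a dwell selection is triggered."""
        if self._anchor is None:
            self._anchor, self._elapsed_ms = (x, y), 0.0
            return False
        ax, ay = self._anchor
        if (x - ax) ** 2 + (y - ay) ** 2 <= self.radius_px ** 2:
            self._elapsed_ms += dt_ms
            if self._elapsed_ms >= self.dwell_ms:
                self._anchor, self._elapsed_ms = None, 0.0  # reset after firing
                return True
        else:
            # Gaze left the tolerance region: restart the dwell timer here.
            self._anchor, self._elapsed_ms = (x, y), 0.0
        return False

# Example: samples arriving every 100 ms near the same point trigger a selection.
selector = DwellSelector()
events = [selector.update(500 + i, 300, 100) for i in range(10)]
print(events.index(True))  # index of the sample at which the dwell fired
```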

    The development of a SmartAbility Framework to enhance multimodal interaction for people with reduced physical ability.

    Assistive technologies are an evolving market due to the number of people worldwide who have conditions resulting in reduced physical ability (also known as disability). Various classification schemes exist to categorise disabilities, as well as government legislation to ensure equal opportunities within the community. However, there is a notable absence of a process to map physical conditions to technologies in order to improve Quality of Life for this user group. This research sits primarily within the Human Computer Interaction (HCI) domain, although aspects of Systems of Systems (SoS) and Assistive Technologies have been applied. The thesis focuses on examples of multimodal interactions leading to the development of a SmartAbility Framework that aims to assist people with reduced physical ability by utilising their abilities to suggest interaction mediums and technologies. The framework was developed through a predominantly interpretivist methodological approach consisting of a variety of research methods, including state-of-the-art literature reviews, requirements elicitation, feasibility trials and controlled usability evaluations to compare multimodal interactions. The developed framework was subsequently validated through the involvement of the intended user community and domain experts, and supported by a concept demonstrator incorporating the SmartATRS case study. The aim and objectives of this research were achieved through the following key outputs and findings:
    - A comprehensive state-of-the-art literature review focussing on physical conditions and their classifications, HCI concepts relevant to multimodal interaction (Ergonomics of human-system interaction, Design For All and Universal Design), SoS definition and analysis techniques involving System of Interest (SoI), and currently available products with potential uses as assistive technologies.
    - A two-phased requirements elicitation process applying surveys and semi-structured interviews to elicit the daily challenges faced by people with reduced physical ability, their interests in technology and the requirements for assistive technologies, obtained through collaboration with a manufacturer.
    - Findings from feasibility trials involving monitoring brain activity using an electroencephalograph (EEG), tracking facial features through Tracking Learning Detection (TLD), applying iOS Switch Control to track head movements and investigating smartglasses.
    - Results of controlled usability evaluations comparing multimodal interactions with the technologies deemed to be feasible from the trials. The user community of people with reduced physical ability were involved during the process to maximise the usefulness of the data obtained.
    - An initial SmartDisability Framework developed from the results and observations ascertained through requirements elicitation, feasibility trials and controlled usability evaluations, which was validated through an approach of semi-structured interviews and a focus group.
    - An enhanced SmartAbility Framework that addressed the SmartDisability validation feedback by reducing the number of elements, using simplified and positive terminology and incorporating concepts from Quality Function Deployment (QFD).
    - A final consolidated version of the SmartAbility Framework that was validated through semi-structured interviews with additional domain experts and addressed all key suggestions.
    The results demonstrated that it is possible to map technologies to people with physical conditions by considering the abilities that they can perform independently, without external support or the exertion of significant physical effort. This led to the realisation that the term ‘disability’ has a negative connotation that can be avoided through the use of the phrase ‘reduced physical ability’. It is important to promote this rationale to the wider community through exploitation of the framework. This requires the development of a SmartAbility smartphone application that allows users to input their abilities in order to receive recommendations of interaction mediums and technologies.
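    The SmartAbility Framework itself is not specified as code in this abstract; the sketch below is only an assumed illustration of the kind of ability-to-technology recommendation step that the proposed SmartAbility smartphone application could perform. The ability labels and suggested mediums are placeholders drawn loosely from the trials mentioned above (TLD facial tracking, iOS Switch Control, smartglasses, gaze), not the framework's actual mapping.

```python
# Hypothetical ability-to-interaction-medium lookup, illustrating the kind of
# recommendation step the proposed SmartAbility smartphone application might perform.
# Ability names and technology suggestions are placeholders, not framework content.
ABILITY_TO_MEDIUMS = {
    "head movement": ["head tracking (e.g. iOS Switch Control)", "smartglasses"],
    "eye movement": ["gaze pointing", "gaze gestures"],
    "speech": ["voice control"],
    "facial movement": ["facial-feature tracking (e.g. TLD)"],
}

def recommend(abilities):
    """Return candidate interaction mediums for the abilities a user reports."""
    seen, recommendations = set(), []
    for ability in abilities:
        for medium in ABILITY_TO_MEDIUMS.get(ability, []):
            if medium not in seen:
                seen.add(medium)
                recommendations.append(medium)
    return recommendations

print(recommend(["head movement", "speech"]))
```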