    Eye-gaze interaction techniques for use in online games and environments for users with severe physical disabilities.

    Multi-User Virtual Environments (MUVEs) and Massively Multi-player Online Games (MMOGs) are a popular, immersive genre of computer game. For some disabled users, eye-gaze offers the only input modality with the potential for sufficiently high bandwidth to support the range of time-critical interaction tasks required to play. Although there has been much research into gaze interaction techniques over the past twenty years, much of it has focused on 2D desktop application control. Some work investigates the use of gaze as an additional input device for gaming, but very little addresses using gaze on its own. Further, configuring these techniques usually requires expert knowledge, often beyond the capabilities of a parent, carer or support worker. The work presented in this thesis addresses these issues by investigating novel gaze-only interaction techniques that enable at least a beginner level of game play, together with a means of adapting the techniques to suit an individual. To achieve this, a collection of novel gaze-based interaction techniques has been evaluated through empirical studies and encompassed within an extensible software architecture that has been made available for free download. Further, a metric of reliability is developed that, when used as a measure within a specially designed diagnostic test, allows the interaction technique to be adapted to suit an individual. Methods of selecting interaction techniques based on game task are also explored, and a novel methodology based on expert task analysis is developed to aid selection.
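
    The adaptation idea in this abstract is easy to picture around its basic building block: dwell-based selection tuned by a per-user reliability score. Below is a minimal Python sketch of that building block; the names (dwell_select, reliability) and threshold values are hypothetical, not taken from the thesis or its downloadable architecture.

    ```python
    import math

    # Hypothetical constants; a diagnostic test in the spirit of the thesis
    # would tune values like these per user.
    DWELL_TIME_S = 0.8      # how long gaze must rest on a target to select it
    TARGET_RADIUS_PX = 48   # tolerance around the target centre

    def dwell_select(samples, target_xy):
        """Return the timestamp at which a dwell selection fires, or None.
        `samples` is an iterable of (timestamp, x, y) gaze samples."""
        dwell_start = None
        for t, x, y in samples:
            if math.hypot(x - target_xy[0], y - target_xy[1]) <= TARGET_RADIUS_PX:
                if dwell_start is None:
                    dwell_start = t              # gaze entered the target
                elif t - dwell_start >= DWELL_TIME_S:
                    return t                     # dwell threshold reached
            else:
                dwell_start = None               # gaze left: reset the timer
        return None

    def reliability(trial_outcomes):
        """A stand-in reliability metric: the fraction of diagnostic trials
        completed correctly. The thesis develops its own, richer measure."""
        return sum(trial_outcomes) / len(trial_outcomes)
    ```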

    An investigation into gaze-based interaction techniques for people with motor impairments

    The use of eye movements to interact with computers offers opportunities for people with impaired motor ability to overcome the difficulties they often face using hand-held input devices. Computer games have become a major form of entertainment, and also provide opportunities for social interaction in multi-player environments. Games are also being used increasingly in education to motivate and engage young people. It is important that young people with motor impairments are able to benefit from, and enjoy, them. This thesis describes a program of research conducted over a 20-year period starting in the early 1990s that investigated interaction techniques based on gaze position intended for use by people with motor impairments. The work investigates how to make standard software applications accessible by gaze, so that no particular modification to the application is needed, and divides into three phases. In the first phase, ways of using gaze to interact with the graphical user interfaces of office applications were investigated, designed around the limitations of gaze interaction. Of these, overcoming the inherent inaccuracy of pointing by gaze at on-screen targets was particularly important. In the second phase, the focus shifted from office applications towards immersive games and on-line virtual worlds, and different means of using gaze position and patterns of eye movements, or gaze gestures, to issue commands were studied. Most of the testing and evaluation studies in this phase, like the first, used participants without motor impairments. The third phase then studied the applicability of the research findings to groups of people with motor impairments and, in particular, the means of adapting the interaction techniques to individual abilities. In summary, the research has shown that collections of specialised gaze-based interaction techniques can be built as an effective means of completing the tasks in specific types of games, and how these can be adapted to the differing abilities of individuals with motor impairments.
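
    A concrete example of designing around the inaccuracy of gaze pointing, as tackled in the first phase, is to snap the estimated gaze point to the nearest selectable target within a tolerance rather than demanding a direct hit. The sketch below is illustrative only; the function name and radius are assumptions, not the thesis's design.

    ```python
    import math

    SNAP_RADIUS_PX = 60  # assumed tolerance; real values depend on tracker accuracy

    def snap_to_target(gaze_xy, targets):
        """Return the name of the target nearest the gaze point within
        SNAP_RADIUS_PX, or None. `targets` maps name -> (x, y) centre."""
        best_name, best_dist = None, SNAP_RADIUS_PX
        for name, (tx, ty) in targets.items():
            d = math.hypot(gaze_xy[0] - tx, gaze_xy[1] - ty)
            if d <= best_dist:
                best_name, best_dist = name, d
        return best_name
    ```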

    Dwell-free input methods for people with motor impairments

    Millions of individuals affected by disorders or injuries that cause severe motor impairments have difficulty performing compound manipulations using traditional input devices. This thesis first explores how effective various assistive technologies are for people with motor impairments. The following questions are studied: (1) What activities are performed? (2) What tools are used to support these activities? (3) What are the advantages and limitations of these tools? (4) How do users learn about and choose assistive technologies? (5) Why do users adopt or abandon certain tools? A qualitative study of fifteen people with motor impairments indicates that users have strong needs for efficient text entry and communication tools that are not met by existing technologies. To address these needs, this thesis proposes three dwell-free input methods designed to improve the efficacy of target selection and text entry based on eye-tracking and head-tracking systems: (1) the Target Reverse Crossing selection mechanism, (2) the EyeSwipe eye-typing interface, and (3) the HGaze Typing interface. With Target Reverse Crossing, a user moves the cursor into a target and reverses over a goal to select it. This mechanism is significantly more efficient than dwell-time selection. Target Reverse Crossing is then adapted in EyeSwipe to delineate the start and end of a word that is eye-typed with a gaze path connecting the intermediate characters (as with traditional gesture typing). When compared with a dwell-based virtual keyboard, EyeSwipe affords higher text entry rates and a more comfortable interaction. Finally, HGaze Typing adds head gestures to gaze-path-based text entry to enable simple and explicit command activations. Results from a user study demonstrate that HGaze Typing has better performance and user satisfaction than a dwell-time method.
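
    As a rough illustration of the reverse-crossing idea, the sketch below treats a target as an axis-aligned rectangle and fires a selection once the cursor path has entered it and then crossed back out. The geometry is deliberately simplified relative to the mechanism's published design, and all names are hypothetical.

    ```python
    def reverse_crossing(points, target_rect):
        """Detect a reverse crossing of `target_rect` = (x0, y0, x1, y1).
        `points` is a sequence of (x, y) cursor positions; returns True
        once the path has entered the rectangle and then left it again."""
        x0, y0, x1, y1 = target_rect

        def inside(x, y):
            return x0 <= x <= x1 and y0 <= y <= y1

        entered = False
        for x, y in points:
            if inside(x, y):
                entered = True
            elif entered:
                return True   # crossed back out after entering: select
        return False
    ```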

    Selection strategies in gaze interaction

    This thesis deals with selection strategies in gaze interaction, specifically for a context where gaze is the sole input modality for users with severe motor impairments. The goal has been to contribute to the subfield of assistive technology where gaze interaction is necessary for the user to achieve autonomous communication and environmental control. From a theoretical point of view, research has been done on the physiology of the gaze and on eye tracking technology, and a taxonomy of existing selection strategies has been developed. Empirically, two overall approaches have been taken. Firstly, end-user research has been conducted through interviews and observation, exploring the capabilities, requirements, and wants of the end-user. Secondly, several applications have been developed to explore the selection strategy of single stroke gaze gestures (SSGG) and aspects of complex gaze gestures. The main finding is that single stroke gaze gestures can successfully be used as a selection strategy. Among the findings on SSGG: horizontal strokes are faster than vertical strokes; completion time differs significantly with gesture length; the gestures can be completed without visual feedback; the gaze tracking equipment has a significant effect on completion times and error rates; and the chance of making selection errors is not significantly greater than with dwell selection. The overall conclusion is that the future of gaze interaction should focus on developing multi-modal interactions for mono-modal input.
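
    A single stroke gaze gesture can be recognised from little more than the stroke's net displacement, which helps explain why such gestures work without visual feedback. The Python sketch below classifies a stroke by its dominant direction; the length threshold and direction labels are assumptions made for illustration, not values from the thesis.

    ```python
    MIN_STROKE_PX = 150  # assumed minimum stroke length; shorter movements are noise

    def classify_ssgg(path):
        """Classify a gaze path [(x, y), ...] as 'left', 'right', 'up' or
        'down', or return None if the net displacement is too short."""
        dx = path[-1][0] - path[0][0]
        dy = path[-1][1] - path[0][1]
        if max(abs(dx), abs(dy)) < MIN_STROKE_PX:
            return None
        if abs(dx) >= abs(dy):                  # horizontal stroke dominates
            return 'right' if dx > 0 else 'left'
        return 'down' if dy > 0 else 'up'       # screen y grows downwards
    ```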

    Pocket Transfers: Interaction techniques for transferring content from situated displays to mobile devices

    We present Pocket Transfers: interaction techniques that allow users to transfer content from situated displays to a personal mobile device while keeping the device in a pocket or bag. Existing content transfer solutions require direct manipulation of the mobile device, making interaction slower and less flexible. Our introduced techniques employ touch, mid-air gestures, gaze, and a multimodal combination of gaze and mid-air gestures. We evaluated the techniques in a novel user study (N=20) that considered dynamic scenarios in which the user approaches the display, completes the task, and leaves. We show that all pocket transfer techniques are fast and seen as highly convenient. Mid-air gestures are the most efficient touchless method for transferring a single item, while the multimodal method is the fastest touchless method when multiple items are transferred. We provide guidelines to help researchers and practitioners choose the most suitable content transfer techniques for their systems.
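
    The multimodal technique can be pictured as a small event loop: gaze continuously nominates an item on the situated display, and a mid-air gesture commits the transfer to the pocketed device. The event vocabulary and queue API below are invented for this sketch and are not the paper's implementation.

    ```python
    from queue import Queue

    def transfer_loop(events: Queue, send_to_device):
        """Consume ('gaze', item), ('gesture', kind) and ('quit', None)
        events; transfer the most recently gazed-at item on a 'grab'."""
        gazed_item = None
        while True:
            kind, payload = events.get()
            if kind == 'gaze':
                gazed_item = payload             # remember what the user looks at
            elif kind == 'gesture' and payload == 'grab' and gazed_item:
                send_to_device(gazed_item)       # content goes to the pocketed phone
                gazed_item = None
            elif kind == 'quit':
                break
    ```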

    Augmented touch interactions with finger contact shape and orientation

    Touchscreen interactions are far less expressive than the range of touch that human hands are capable of, even considering technologies such as multi-touch and force-sensitive surfaces. Recently, some touchscreens have added the capability to sense the actual contact area of a finger on the touch surface, which provides additional degrees of freedom: the size and shape of the touch, and the finger's orientation. These additional sensory capabilities hold promise for increasing the expressiveness of touch interactions, but little is known about whether users can successfully use the new degrees of freedom. To provide this baseline information, we carried out a study with a finger-contact-sensing touchscreen and asked participants to produce a range of touches and gestures with different shapes and orientations, with both one and two fingers. We found that people are able to reliably produce two touch shapes and three orientations across a wide range of touches and gestures, a result that was confirmed in another study that used the augmented touches for a screen lock application.
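
    Of the new degrees of freedom, orientation is the most straightforward to picture computationally: treat the contact area as a blob of touched pixels and take the principal axis of their covariance. The input format below (a list of pixel coordinates) is an assumption for illustration; real contact-sensing APIs expose this differently.

    ```python
    import math

    def contact_orientation(pixels):
        """Return the orientation in radians, in (-pi/2, pi/2], of the
        major axis of a contact blob given as [(x, y), ...] pixels."""
        n = len(pixels)
        mx = sum(x for x, _ in pixels) / n
        my = sum(y for _, y in pixels) / n
        sxx = sum((x - mx) ** 2 for x, _ in pixels) / n
        syy = sum((y - my) ** 2 for _, y in pixels) / n
        sxy = sum((x - mx) * (y - my) for x, y in pixels) / n
        # Standard image-moment formula for the principal-axis angle.
        return 0.5 * math.atan2(2 * sxy, sxx - syy)
    ```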

    Noise Challenges in Monomodal Gaze Interaction

    Modern graphical user interfaces (GUIs) are designed with able-bodied users in mind. Operating these interfaces can be impossible for some users who are unable to control the conventional mouse and keyboard. An eye tracking system offers possibilities for independent use and improved quality of life via dedicated interface tools especially tailored to the users' needs (e.g., interaction, communication, e-mailing, web browsing and entertainment). Much effort has been put towards the robustness, accuracy and precision of modern eye tracking systems, and many are available on the market. Even though gaze tracking technologies have undergone dramatic improvements over the past years, the systems are still very imprecise. This thesis deals with current challenges of mono-modal gaze interaction and aims at improving access to technology and interface control for users who are limited to the eyes only. Low-cost equipment in eye tracking contributes toward improved affordability, but potentially at the cost of introducing more noise in the system due to the lower quality of the hardware. This implies that methods of dealing with noise, and creative approaches towards getting the best out of the data stream, are much needed. The work in this thesis presents three contributions that may advance the use of low-cost mono-modal gaze tracking and research in the field:
    - An assessment of a low-cost open-source gaze tracker and two eye tracking systems through an accuracy and precision test and a performance evaluation.
    - Development and evaluation of a novel 3D typing system with high tolerance to noise, based on continuous panning and zooming.
    - Development and evaluation of novel selection tools that compensate for noisy input during small-target selections in modern GUIs.
    This thesis may be of particular interest for those working on the use of eye trackers for gaze interaction and how to deal with reduced data quality. The work in this thesis is accompanied by several software applications developed for the research projects that can be freely downloaded from the eyeInteract appstore.
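
    A first line of defence against the noise discussed above, before any specialised selection tool, is simply to smooth the gaze stream. The sketch below applies plain exponential smoothing; the factor is an assumed value, and the thesis's own contributions (the panning-and-zooming typing system, the small-target selection tools) go well beyond this.

    ```python
    def smooth(samples, alpha=0.3):
        """Yield exponentially smoothed (x, y) points from raw gaze samples.
        Smaller alpha damps jitter more but adds lag to the cursor."""
        sx = sy = None
        for x, y in samples:
            if sx is None:
                sx, sy = float(x), float(y)
            else:
                sx += alpha * (x - sx)
                sy += alpha * (y - sy)
            yield sx, sy
    ```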