
    Keeping an eye on the game: Eye gaze interaction with massively multiplayer online games and virtual communities for motor impaired users.

    Online virtual communities are becoming increasingly popular among both able-bodied and disabled users. These games assume a keyboard and mouse as standard input devices, which in some cases is not appropriate for users with a disability. This paper explores gaze-based interaction methods and highlights the problems associated with gaze control of online virtual worlds. The paper then presents a novel ‘Snap Clutch’ software tool that addresses these problems and enables gaze control. The tool is tested in an experiment showing that effective gaze control is possible, although task times are longer. Errors caused by gaze control are identified and potential methods for reducing them are discussed. Finally, the paper demonstrates that gaze-driven locomotion can potentially achieve parity with mouse- and keyboard-driven locomotion, and shows that gaze is a viable modality for game-based locomotion for able-bodied and disabled users alike.
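
    The abstract does not spell out the clutch mechanism, but the mode-switching idea can be illustrated with a short sketch. The Python snippet below shows one plausible 'clutch': glancing off a screen edge switches the active gaze-interaction mode, which then persists once gaze returns on-screen. The screen size, mode names, and edge-to-mode mapping are illustrative assumptions, not Snap Clutch's actual design.

```python
# Hypothetical sketch of mode switching via off-screen glances, in the
# spirit of a 'Snap Clutch' style gaze clutch. Screen size, mode names,
# and thresholds are illustrative assumptions, not the paper's values.

SCREEN_W, SCREEN_H = 1920, 1080

# One interaction mode per screen edge; glancing past that edge engages it.
EDGE_MODES = {
    "left": "dwell_click",   # fixate a target to left-click it
    "right": "drag",         # gaze moves the held object
    "top": "look_around",    # gaze is ignored by the UI (clutch open)
    "bottom": "menu",        # gaze opens a command menu
}

class SnapClutchSketch:
    def __init__(self):
        self.mode = "look_around"

    def _off_edge(self, x, y):
        if x < 0: return "left"
        if x > SCREEN_W: return "right"
        if y < 0: return "top"
        if y > SCREEN_H: return "bottom"
        return None

    def on_gaze(self, x, y):
        """Feed one gaze sample; returns the currently active mode."""
        edge = self._off_edge(x, y)
        if edge is not None:
            self.mode = EDGE_MODES[edge]  # 'snap' to the edge's mode
        return self.mode

clutch = SnapClutchSketch()
print(clutch.on_gaze(-10, 500))   # glance off the left edge -> dwell_click
print(clutch.on_gaze(960, 540))   # back on screen, the mode persists
```

    Making the mode sticky after gaze returns on-screen is what lets a single gaze channel act as both pointer and mode selector.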

    Gaze modulated disambiguation technique for gesture control in 3D virtual objects selection

    Inputs with multimodal information provide more natural ways to interact with virtual 3D environments. An emerging technique that integrates gaze-modulated pointing with mid-air gesture control enables fast target acquisition and rich control expressions. The performance of this technique relies on eye tracking accuracy, which is not yet comparable with that of traditional pointing techniques (e.g., the mouse). This causes trouble when fine-grained interactions are required, such as selection in a dense virtual scene where proximity and occlusion are prone to occur. This paper proposes a coarse-to-fine solution that compensates for the degradation introduced by eye tracking inaccuracy, using a gaze cone to detect ambiguity and then a gaze probe for decluttering. It is tested in a comparative experiment involving 12 participants with 3240 runs. The results show that the proposed technique enhanced selection accuracy and user experience, although there is still potential to improve its efficiency. This study contributes a robust multimodal interface design supported by both eye tracking and mid-air gesture control.
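
    As a rough illustration of the coarse phase, the sketch below detects ambiguity with a gaze cone: any object whose center falls within a small angular cone around the gaze ray is a candidate, and more than one candidate would trigger the fine-grained decluttering step. The vectors, cone half-angle, and scene contents are assumptions for illustration, not the paper's parameters.

```python
# A minimal sketch of cone-based ambiguity detection for gaze selection.
# The cone half-angle and the example scene are illustrative assumptions.
import math

def angle_between(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def candidates_in_gaze_cone(eye, gaze_dir, objects, half_angle_deg=3.0):
    """Objects whose center lies within the gaze cone are candidates."""
    hits = []
    for name, center in objects.items():
        to_obj = tuple(c - e for c, e in zip(center, eye))
        if math.degrees(angle_between(gaze_dir, to_obj)) <= half_angle_deg:
            hits.append(name)
    return hits

scene = {"cube": (0.0, 0.0, 5.0), "sphere": (0.15, 0.0, 5.0), "cone": (2.0, 0.0, 5.0)}
hits = candidates_in_gaze_cone(eye=(0, 0, 0), gaze_dir=(0, 0, 1), objects=scene)
if len(hits) > 1:
    print("ambiguous, declutter:", hits)  # e.g. fan candidates out for a fine pick
else:
    print("unambiguous:", hits)
```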

    BimodalGaze: Seamlessly Refined Pointing with Gaze and Filtered Gestural Head Movement

    Eye gaze is a fast and ergonomic modality for pointing, but limited in precision and accuracy. In this work, we introduce BimodalGaze, a novel technique for seamless head-based refinement of a gaze cursor. The technique leverages insights into eye-head coordination to separate natural from gestural head movement. This allows users to quickly shift their gaze to targets over larger fields of view with naturally combined eye-head movement, and to refine the cursor position with gestural head movement. In contrast to an existing baseline, head refinement is invoked automatically, and only if a target is not already acquired by the initial gaze shift. Study results show that users reliably achieve fine-grained target selection, but we observed a higher rate of initial selection errors affecting overall performance. An in-depth analysis of user performance provides insight into the classification of natural versus gestural head movement, for the improvement of BimodalGaze and other potential applications.
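
    The core of the technique is telling natural head movement apart from gestural refinement. A toy heuristic, not the paper's classifier, is sketched below: when the head rotates while the gaze point stays put (the vestibulo-ocular reflex keeps gaze anchored on a target), the movement is treated as gestural; when gaze and head move together, it is a natural shift. The thresholds are assumed values.

```python
# A toy heuristic for natural-versus-gestural head movement, under
# assumed per-window angular deltas. Not the paper's actual classifier.

GAZE_STABLE_DEG = 1.5    # gaze point moved less than this per window
HEAD_MOVING_DEG = 2.0    # head rotated more than this per window

def classify_head_movement(gaze_delta_deg, head_delta_deg):
    if head_delta_deg < HEAD_MOVING_DEG:
        return "still"
    if gaze_delta_deg < GAZE_STABLE_DEG:
        return "gestural"   # head turns, gaze holds -> refine the cursor
    return "natural"        # head and gaze shift together -> new target

print(classify_head_movement(gaze_delta_deg=0.4, head_delta_deg=5.0))  # gestural
print(classify_head_movement(gaze_delta_deg=6.0, head_delta_deg=5.0))  # natural
```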

    Haptic feedback in eye typing

    Proper feedback is essential in gaze-based interfaces, where the same modality is used for both perception and control. We measured how vibrotactile feedback, a form of haptic feedback, compares with the commonly used visual and auditory feedback in eye typing. Haptic feedback was found to produce results close to those of auditory feedback; both were easy to perceive, and participants liked both the auditory “click” and the tactile “tap” of the selected key. Implementation details (such as the placement of the haptic actuator) were also found to be important.
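
    To make the role of feedback concrete, the sketch below shows a minimal dwell-based eye-typing loop with a pluggable feedback channel, so a vibrotactile 'tap', an auditory 'click', or a visual flash can be swapped in. The dwell time and the stand-in actuator calls are assumptions, not the study's implementation.

```python
# A minimal dwell-typing loop with pluggable feedback channels, to
# illustrate where haptic feedback slots in. Timings and the feedback
# stand-ins are assumed; a real system would drive a vibrotactile driver.

DWELL_S = 0.6  # fixation time needed to select a key (assumed value)

def haptic_tap():    print("vibrotactile 'tap'")   # stand-in for actuator
def audio_click():   print("auditory 'click'")
def visual_flash():  print("key highlight flash")

def eye_typing(samples, feedback=haptic_tap):
    """samples: iterable of (timestamp_s, key_under_gaze or None)."""
    typed, dwell_key, dwell_start = [], None, 0.0
    for t, key in samples:
        if key != dwell_key:
            dwell_key, dwell_start = key, t       # gaze moved to a new key
        elif key is not None and t - dwell_start >= DWELL_S:
            typed.append(key)
            feedback()                            # confirm the selection
            dwell_key = None                      # require a re-fixation
    return "".join(typed)

stream = [(0.0, "h"), (0.3, "h"), (0.7, "h"), (0.9, "i"), (1.2, "i"), (1.6, "i")]
print(eye_typing(stream, feedback=haptic_tap))    # -> "hi"
```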

    Gaze+Hold: Eyes-only Direct Manipulation with Continuous Gaze Modulated by Closure of One Eye

    The eyes are coupled in their gaze function and are therefore usually treated as a single input channel, which limits the range of interactions. However, people are able to open and close one eye while still gazing with the other. We introduce Gaze+Hold, an eyes-only technique that builds on this ability to leverage the eyes as separate input channels, with one eye modulating the state of interaction while the other provides continuous input. Gaze+Hold enables direct manipulation beyond pointing, which we explore through the design of Gaze+Hold techniques for a range of user interface tasks. In a user study, we evaluated performance, usability, and users’ spontaneous choice of eye for modulating input. The results show that users are effective with Gaze+Hold. The choice of dominant versus non-dominant eye had no effect on performance, perceived usability, or workload. This is significant for the utility of Gaze+Hold, as it affords flexibility in mapping either eye in different configurations.
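
    A minimal sketch of the idea follows, assuming a tracker that reports per-eye openness: closing one eye 'holds' an interaction while the open eye's gaze supplies continuous input, here dragging a slider. The field names and openness threshold are illustrative assumptions, not a real tracker API or the paper's implementation.

```python
# Sketch of Gaze+Hold-style input under an assumed tracker interface:
# one eye's closure holds the interaction, the other eye's gaze drives it.

EYE_CLOSED = 0.2   # openness below this counts as a deliberate closure

class GazeHoldSlider:
    def __init__(self):
        self.value, self.held = 0.5, False

    def on_sample(self, left_open, right_open, gaze_y_norm):
        """left_open/right_open in [0,1]; gaze_y_norm in [0,1] from the open eye."""
        one_eye_closed = (left_open < EYE_CLOSED) != (right_open < EYE_CLOSED)
        if one_eye_closed:
            self.held = True
            self.value = gaze_y_norm        # continuous input while held
        else:
            self.held = False               # both eyes open -> release
        return self.value

slider = GazeHoldSlider()
slider.on_sample(1.0, 1.0, 0.5)           # both eyes open: just looking
print(slider.on_sample(0.1, 1.0, 0.8))    # left eye closed: grab, drag to 0.8
print(slider.on_sample(1.0, 1.0, 0.2))    # released: gaze no longer drags
```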

    ReType: Quick Text Editing with Keyboard and Gaze

    When a user needs to reposition the cursor during text editing, this is often done with the mouse. For experienced typists especially, switching between keyboard and mouse can slow down the editing workflow considerably. To address this, we propose ReType, a new gaze-assisted positioning technique that combines keyboard and gaze input based on a new ‘patching’ metaphor. ReType allows users to perform some common editing operations while keeping their hands on the keyboard. We present the results of two studies. A free-use study indicated that ReType enhances the user experience of text editing; it was liked by many participants, regardless of their typing skills. A comparative user study showed that ReType is able to match or even beat the speed of mouse-based interaction for small text edits. We conclude that a gaze-augmented user interface can make common interactions more fluent, especially for professional keyboard users.
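
    One way to picture a 'patching' style of editing is sketched below: the user looks at the text to change and retypes a short fragment together with its replacement, and the system patches the matching fragment nearest the gaze point. The matching scheme, search window, and function names here are assumptions for illustration, not ReType's algorithm.

```python
# Hypothetical sketch of gaze-anchored text patching: find the typed
# fragment near the gazed line and replace it in place. The window size
# and matching rule are illustrative assumptions.

def patch_near_gaze(lines, gaze_line, fragment, replacement, window=2):
    """Search lines within `window` of the gazed line for `fragment`."""
    lo = max(0, gaze_line - window)
    hi = min(len(lines), gaze_line + window + 1)
    for i in range(lo, hi):
        if fragment in lines[i]:
            lines[i] = lines[i].replace(fragment, replacement, 1)
            return i        # index of the patched line
    return None             # no match near the gaze point

doc = ["def totla(xs):", "    return sum(xs)"]
hit = patch_near_gaze(doc, gaze_line=0, fragment="totla", replacement="total")
print(hit, doc[0])          # -> 0 def total(xs):
```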

    Circling Interface: An Alternative Interaction Method for On-Screen Object Manipulation

    An alternative interaction method, called the circling interface, was developed and evaluated for individuals with disabilities who find it difficult or impossible to consistently and efficiently perform pointing operations involving the left and right mouse buttons. The circling interface is a gesture-based interaction technique: to specify a target of interest, the user makes a circling motion around it. To specify a desired pointing command, each edge of the screen is used; the user selects a command before circling the target. Empirical evaluations were conducted with human subjects from three groups (individuals without disability, individuals with spinal cord injury (SCI), and individuals with cerebral palsy (CP)), comparing each group's performance on pointing tasks with the circling interface to performance on the same tasks using a mouse button or dwell-clicking software. Across all three groups, the circling interface was faster than the dwelling interface, although the difference was not statistically significant. For the single-click operation, the circling interface was slower than dwell selection, but for both double-click and drag-and-drop operations it was faster. In terms of accuracy the results were mixed: for able-bodied subjects circling was more accurate than dwelling, for subjects with SCI dwelling was more accurate than circling, and for subjects with CP there was no difference. However, if the circling interface automatically corrected errors caused by circling an area containing no target, and ignored circles that are too small or drawn too quickly, its accuracy would significantly outperform dwell selection. This suggests that the circling interface can be used in conjunction with existing pointing techniques, and that this combined approach may provide more effective mouse use for people with pointing problems. Consequently, the circling interface can improve clinical practice by providing an alternative pointing method that does not require physically activating mouse buttons and is more efficient than dwell-clicking. It is also expected to be useful for both computer access and augmentative communication software.
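
    The circling gesture itself can be recognized with a simple geometric test, sketched below: sum the angle the pointer sweeps around each target's center and treat a sweep close to a full turn as a circle around that target. The winding-angle threshold and the example targets are illustrative assumptions, not the evaluated system's recognizer.

```python
# Sketch of circle-gesture target selection via winding angle: a pointer
# path that sweeps roughly 2*pi around a target's center has circled it.
import math

def winding_angle(path, center):
    total = 0.0
    angles = [math.atan2(y - center[1], x - center[0]) for x, y in path]
    for a, b in zip(angles, angles[1:]):
        d = b - a
        # unwrap to the shortest turn so crossings of +/-pi don't break the sum
        while d > math.pi:  d -= 2 * math.pi
        while d < -math.pi: d += 2 * math.pi
        total += d
    return total

def circled_targets(path, targets, min_turn=0.9 * 2 * math.pi):
    return [n for n, c in targets.items() if abs(winding_angle(path, c)) >= min_turn]

# A rough square loop around the icon at (100, 100), not around (300, 100).
loop = [(60, 60), (140, 60), (140, 140), (60, 140), (60, 60)]
print(circled_targets(loop, {"icon_a": (100, 100), "icon_b": (300, 100)}))
```

    A real implementation would also apply the correction rules mentioned above, discarding loops that enclose no target or that are too small or drawn too quickly.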

    Answering a Questionnaire Using Eyetracking

    Eye tracking research dates back a long way. As the cost of eye tracking has decreased over the past years, using an eye tracker for everyday matters, such as interacting with a personal device, has become increasingly attractive. The present work illustrates how a computer interface can be operated with the help of an eye tracker alone. The accompanying study examines the acceptance and usability of such a system. To this end, three different interaction methods were implemented. In the study, participants had to complete a questionnaire with these interaction methods using a Windows application and a low-cost eye tracking device. All in all, the results imply that the negative aspects of the system outweigh the positive ones. The biggest issue was the restriction of mobility while using the tracking device. In addition, using the system turned out to be rather exhausting for the eyes. Generally speaking, among the three implemented interaction methods, the one combining gaze with a second input modality (a keyboard) scored best in terms of interaction speed and usefulness for completing a questionnaire.
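
    As a sketch of the best-scoring method, gaze combined with a keyboard, the snippet below highlights the answer option under the gaze point and lets a key press confirm it. The questionnaire layout, row height, and confirmation key are assumptions for illustration, not the study's implementation.

```python
# Hypothetical gaze-plus-keyboard answering: gaze highlights an option,
# a key press confirms it. Layout values are illustrative assumptions.

OPTIONS = ["strongly agree", "agree", "neutral", "disagree", "strongly disagree"]
ROW_HEIGHT = 40  # pixels per option row (assumed layout)

def option_under_gaze(gaze_y):
    idx = int(gaze_y // ROW_HEIGHT)
    return OPTIONS[idx] if 0 <= idx < len(OPTIONS) else None

def answer(events):
    """events: (kind, value) pairs; 'gaze' carries a y pixel, 'key' a key name."""
    highlighted = None
    for kind, value in events:
        if kind == "gaze":
            highlighted = option_under_gaze(value)
        elif kind == "key" and value == "space" and highlighted:
            return highlighted          # keyboard confirms the gazed option
    return None

print(answer([("gaze", 95), ("key", "space")]))   # -> 'neutral'
```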