
    Mobile gaze interaction : gaze gestures with haptic feedback

    There has been an increasing need for alternative interaction techniques to support the mobile usage context. Gaze-tracking technology is anticipated to appear soon in commercial mobile devices. There are two important considerations when designing mobile gaze interactions. First, the interaction should be robust to accuracy problems. Second, user feedback should be instantaneous, meaningful, and appropriate, to ease the interaction. This thesis proposes gaze-gesture input with haptic feedback as an interaction technique in the mobile context. It presents the results of an experiment conducted to understand the effectiveness of vibrotactile feedback in two-stroke gaze-gesture-based mobile interaction and to find the best temporal point, in terms of gesture progression, at which to provide the feedback. Four feedback conditions were used: NO (no tactile feedback), OUT (tactile feedback at the end of the first stroke), FULL (tactile feedback at the end of the second stroke), and BOTH (tactile feedback at the end of both strokes). The results suggest that haptic feedback does help the interaction: participants completed the tasks with fewer errors when haptic feedback was provided. The OUT and BOTH conditions were equally effective in terms of task completion time, and participants also subjectively rated them as more comfortable and easier to use than the FULL and NO conditions.
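    The four feedback conditions amount to a simple timing rule for when a vibrotactile pulse fires during a two-stroke gesture. A minimal sketch (the function and constant names are illustrative, not code from the thesis):

    ```python
    # Hypothetical sketch of the four feedback conditions described in the
    # abstract: each condition fires a tactile pulse at different points in a
    # two-stroke gaze gesture.
    STROKE_END_FIRST, STROKE_END_SECOND = 1, 2

    def should_vibrate(condition: str, stroke_completed: int) -> bool:
        """Return True if a tactile pulse should fire after the given stroke."""
        fire_points = {
            "NO": set(),                                    # no tactile feedback
            "OUT": {STROKE_END_FIRST},                      # end of first stroke
            "FULL": {STROKE_END_SECOND},                    # end of second stroke
            "BOTH": {STROKE_END_FIRST, STROKE_END_SECOND},  # end of both strokes
        }
        return stroke_completed in fire_points[condition]
    ```

    Under this reading, OUT gives the user mid-gesture confirmation that the first stroke registered, which is consistent with the reported finding that OUT and BOTH were rated easier to use than feedback only at gesture completion (FULL).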

    Haptic gaze-tracking based perception of graphical user interfaces

    This paper presents a novel human-computer interface that enables the computer display to be perceived without any use of the eyes. Our system works by tracking the user's head position and orientation to obtain their 'gaze' point on a virtual screen, and by indicating to the user what object is present at the gaze location via haptic feedback to the fingers and synthetic speech or Braille text. This is achieved by using the haptic vibration frequency delivered to the fingers to indicate the type of screen object at the gaze position, and the vibration amplitude to indicate the screen object's window layer when the object is contained in overlapping windows. In addition, objects that are gazed at momentarily have their name output to the user via a Braille display or synthetic speech. Our experiments have shown that by browsing over the screen and receiving haptic and voice (or Braille) feedback in this manner, the user is able to acquire a mental two-dimensional representation of the virtual screen and its content without any use of the eyes. This form of blind screen perception can then be used to locate screen objects and controls and to manipulate them with the mouse or via gaze control. We provide experimental results demonstrating how this form of blind screen perception can be used effectively to exercise point-and-click and drag-and-drop control of desktop objects and open windows using the mouse, or the user's head pose, without any use of the eyes.
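    The haptic encoding the abstract describes is a two-channel mapping: vibration frequency carries the object's type, and amplitude carries its window layer. A minimal sketch of that mapping, with assumed frequency values and a hypothetical layer-to-amplitude rule (neither is specified in the paper):

    ```python
    # Illustrative sketch of the paper's two-channel haptic encoding.
    # The frequency table and the linear amplitude rule are assumptions
    # made for illustration, not values taken from the paper.
    OBJECT_FREQUENCIES_HZ = {
        "button": 50,
        "text_field": 100,
        "window_title": 150,
        "desktop_icon": 200,
    }

    def haptic_signal(object_type: str, window_layer: int, max_layers: int = 4):
        """Return (frequency_hz, amplitude) for the object under the gaze point.

        Frequency encodes the object's type; amplitude encodes how deep the
        object's window sits in a stack of overlapping windows (layer 0 = top).
        """
        frequency = OBJECT_FREQUENCIES_HZ[object_type]
        # Deeper layers get weaker vibration; the topmost layer is full strength.
        amplitude = max(0.0, 1.0 - window_layer / max_layers)
        return frequency, amplitude
    ```

    Keeping the two channels independent means a user can identify what an object is and where it sits in the window stack from a single vibration, without a separate query.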
