30 research outputs found

    Making bare hand input more accurate

    Get PDF

    Drifts, slips, and misses: input accuracy for touch surfaces

    Get PDF
    Touch screens allow users to interact with virtual objects directly below their fingertips. The proximity of input and output blurs the line between the physical and the virtual world, allowing the interactions to feel natural. However, direct finger input has several limitations. Compared to the tip of a graphics stylus, a fingertip is bigger and softer, making it more likely to occlude the screen and generate ambiguous touch signals. Furthermore, touch screens register any contact, making it more likely that they will respond to unintentional input, such as pressing a button when a finger just brushes past it. These two problems exemplify issues in two types of input accuracy: space accuracy (“where it is being touched”) and state accuracy (“whether it is being touched”).

    In this thesis, we investigate space and state accuracy in four usage scenarios.

    First, we focused on users with hand tremors, whose involuntary finger oscillations cause them to miss targets and create spurious touches and releases. To improve touch screen accessibility, we investigated how tremors influence touch input. We then designed and evaluated an alternative interaction technique that leverages the movement characteristics of tremors for more accurate input.

    Second, we addressed a state accuracy problem in indirect multi-touch systems, in which a horizontal multi-touch screen is used to control cursors on a vertical display for ergonomic reasons. We operationalized measures for state slips and compared four techniques for controlling the state of the cursors.

    Third, we augmented touch screens with near-surface interaction by sensing fingers hovering in a thin layer above the screen surface. We determined the layer thickness that minimizes the likelihood of fingers slipping out of the layer.

    Finally, we tackled the problem of touch contacts drifting away from buttons when users operate touch screens without looking at them. Here, we assessed how magnetic forces might substitute for vision by guiding the fingertips toward the button.

    While the findings contribute to the body of scientific knowledge in each specific usage scenario, the insights derived from all four scenarios in combination suggest strategies for designing touch interaction techniques that maximize space and state accuracy.

    A Comparison of a Transition-based and a Sequence-based Analysis of AOI Transition Sequences

    Full text link
    Several visual analytics (VA) systems are used for analyzing eye-tracking data because they synergize human-in-the-loop exploration with the speed and accuracy of the computer. In these VA systems, the choice of visualization techniques can afford discovering certain types of insights while hindering others. Understanding these affordances and hindrances is essential to designing effective VA systems. In this paper, we focus on two approaches for visualizing AOI (area of interest) transitions: the transition-based approach (exemplified by the radial transition graph, RTG) and the sequence-based approach (exemplified by the Alpscarf). We captured the insights generated by two analysts who individually used each visualization technique on the same dataset. Based on the results, we identify four phases of analytic activities and discuss opportunities for the two visualization approaches to complement each other. We point out design implications for VA systems that combine these visualization approaches.

    Understanding finger input above desktop devices

    Get PDF
    Using the space above desktop input devices adds a rich new input channel to desktop interaction. Input in this elevated layer has previously been used to modify the granularity of a 2D slider, navigate layers of a 3D body scan above a multitouch table, and access vertically stacked menus. However, designing these interactions is challenging because the lack of haptic and direct visual feedback easily leads to input errors. For bare finger input, the user’s fingers need to reliably enter and stay inside the interactive layer, and engagement techniques such as midair clicking have to be disambiguated from leaving the layer. These issues have been addressed for interactions in which users operate other devices in midair, but there is little guidance for the design of bare finger input in this space. In this paper, we present the results of two user studies that inform the design of finger input above desktop devices. Our studies show that 2 cm is the minimum thickness of the above-surface volume that users can reliably remain within. We found that when accessing midair layers, users do not automatically move to the same height. To address this, we introduce a technique that dynamically determines the height at which the layer is placed, depending on the velocity profile of the user’s initial finger movement into midair. Finally, we propose a technique that reliably distinguishes clicking from homing movements, based on the user’s hand shape. We structure the presentation of our findings using Buxton’s three-state input model, adding states and transitions for above-surface interactions.
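    As context for the state-model framing in this abstract, Buxton’s classic three-state input model distinguishes out-of-range, tracking, and dragging states. The sketch below extends it with a single hypothetical near-surface state; the function name, the boolean inputs, and the use of the reported 2 cm figure as a classification threshold are illustrative assumptions, not the paper’s actual sensing pipeline.

    ```python
    from enum import Enum

    class InputState(Enum):
        OUT_OF_RANGE = 0   # finger not sensed near the surface
        NEAR_SURFACE = 1   # hypothetical above-surface layer state
        TRACKING = 2       # finger touching, not engaged
        DRAGGING = 3       # finger touching and engaged

    # Minimum reliable layer thickness reported in the abstract (2 cm),
    # used here only as an illustrative classification threshold.
    LAYER_THICKNESS_CM = 2.0

    def classify(height_cm: float, touching: bool, engaged: bool) -> InputState:
        """Map a sensed finger to a state in the extended model (a sketch)."""
        if touching:
            return InputState.DRAGGING if engaged else InputState.TRACKING
        if 0.0 < height_cm <= LAYER_THICKNESS_CM:
            return InputState.NEAR_SURFACE
        return InputState.OUT_OF_RANGE
    ```

    In this reading, the paper’s contribution concerns the transitions of such a model: keeping fingers inside the NEAR_SURFACE band and disambiguating clicks from movements that leave it.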

    An Evaluation of State Switching Methods for Indirect Touch Systems

    No full text
    Indirect touch systems combine a horizontal touch input surface with a vertical display for output. While this division is ergonomically superior to simple direct-touch displays for many tasks, users are no longer looking at their hands when touching. This requires the system to support an intermediate Tracking state that lets users aim at objects without triggering a selection, similar to the hover state in mouse-based UIs. We present an empirical analysis of several interaction techniques for switching to this intermediate state in indirect touch systems, and derive design recommendations for incorporating it into such systems.
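    The intermediate Tracking state described in this abstract can be pictured as a small state machine for the on-screen cursor. The sketch below uses a second-finger tap as the switching gesture; this is one hypothetical technique chosen for illustration, and the class and method names are not from the paper.

    ```python
    from enum import Enum, auto

    class CursorState(Enum):
        IDLE = auto()       # no finger on the input surface
        TRACKING = auto()   # finger down: cursor moves, nothing is selected
        SELECTING = auto()  # switched state: input acts on the aimed object

    class IndirectCursor:
        """Cursor on the vertical display, driven by a horizontal surface."""

        def __init__(self) -> None:
            self.state = CursorState.IDLE

        def finger_down(self) -> None:
            # Touching the surface only aims; it must not select by itself.
            if self.state is CursorState.IDLE:
                self.state = CursorState.TRACKING

        def finger_up(self) -> None:
            self.state = CursorState.IDLE

        def switch_tap(self) -> None:
            # Hypothetical switching gesture (e.g. a second-finger tap)
            # toggling between aiming and selecting.
            if self.state is CursorState.TRACKING:
                self.state = CursorState.SELECTING
            elif self.state is CursorState.SELECTING:
                self.state = CursorState.TRACKING
    ```

    The design question the paper evaluates is, in these terms, which gesture should drive the TRACKING-to-SELECTING transition without causing accidental state slips.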