Improving Multi-Touch Interactions Using Hands as Landmarks
Efficient command selection is just as important for multi-touch devices as it is for traditional interfaces that follow the Windows-Icons-Menus-Pointers (WIMP) model, but rapid selection in touch interfaces can be difficult because these systems often lack the mechanisms that have been used for expert shortcuts in desktop systems (such as keyboard shortcuts). Although interaction techniques based on spatial memory can improve the situation by allowing fast revisitation from memory, the lack of landmarks often makes it hard to remember command locations in a large set. One potential landmark that could be used in touch interfaces, however, is people’s hands and fingers: these provide an external reference frame that is well known and always present when interacting with a touch display. To explore the use of hands as landmarks for improving command selection, we designed hand-centric techniques called HandMark menus. We implemented HandMark menus for two platforms – one version that allows bimanual operation on digital tables and another that uses single-handed serial operation on handheld tablets; in addition, we developed variants for both platforms that support different numbers of commands. We tested the new techniques against standard selection methods, including tabbed menus and popup toolbars. The results of the studies show that HandMark menus perform well (in several cases significantly faster than standard methods) and that they support the development of spatial memory. Overall, this thesis demonstrates that people’s intimate knowledge of their hands can be the basis for fast interaction techniques that improve the performance and usability of multi-touch systems.
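To make the hand-as-landmark idea concrete, the sketch below shows one simple way a hand-centric layout could be computed: command slots are anchored at the gaps between adjacent fingertips of the hand resting on the display, so each command's location is defined relative to the hand rather than to the screen. This is a minimal illustrative sketch in Python, assuming a hypothetical hand tracker that reports fingertip coordinates ordered from thumb to little finger; the `layout_handmark_slots` helper and its between-finger slot scheme are assumptions for illustration, not the implementation described in the thesis.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]


@dataclass
class CommandSlot:
    label: str
    position: Point  # screen coordinates, in pixels


def midpoint(a: Point, b: Point) -> Point:
    return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)


def layout_handmark_slots(fingertips: List[Point],
                          commands: List[str]) -> List[CommandSlot]:
    """Anchor command slots at the midpoints between adjacent fingertips.

    `fingertips` is assumed to be ordered thumb -> little finger, as a
    hypothetical hand tracker might report them; five touches yield four
    between-finger slots, so extra commands are simply ignored here.
    """
    gaps = [midpoint(fingertips[i], fingertips[i + 1])
            for i in range(len(fingertips) - 1)]
    return [CommandSlot(label, pos) for label, pos in zip(commands, gaps)]


if __name__ == "__main__":
    # Example fingertip positions (pixels) for a left hand resting on a table.
    touches = [(120, 420), (180, 300), (250, 280), (320, 300), (380, 360)]
    for slot in layout_handmark_slots(touches, ["Copy", "Paste", "Undo", "Cut"]):
        print(f"{slot.label:>5} @ ({slot.position[0]:.0f}, {slot.position[1]:.0f})")
```

Because the slots move with the hand, a user can build spatial memory of command locations relative to their own fingers, which is the property the HandMark techniques exploit; the second hand (on tables) or a subsequent tap (on tablets) would then select from these hand-relative positions.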