3 research outputs found

    Augmented Touch Interactions with Finger Contact Shape and Orientation

    Touchscreen interactions are far less expressive than the range of touch that human hands are capable of - even considering technologies such as multi-touch and force-sensitive surfaces. Recently, some touchscreens have added the capability to sense the actual contact area of a finger on the touch surface, which provides additional degrees of freedom - the size and shape of the touch, and the finger's orientation. These additional sensory capabilities hold promise for increasing the expressiveness of touch interactions - but little is known about whether users can successfully use the new degrees of freedom. To provide this baseline information, we carried out a study with a finger-contact-sensing touchscreen, and asked participants to produce a range of touches and gestures with different shapes and orientations, with both one and two fingers. We found that people are able to reliably produce two touch shapes and three orientations across a wide range of touches and gestures - a result that was confirmed in another study that used the augmented touches for a screen lock application.
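    The extra degrees of freedom described above - contact shape and finger orientation - can be pictured as discrete categories derived from a contact ellipse. The following is a minimal sketch of that idea; the field names, the elongation threshold, and the angle bins are illustrative assumptions, not the paper's apparatus or calibration.

```python
from dataclasses import dataclass

@dataclass
class ContactTouch:
    """A touch event augmented with contact-area information.
    Field names are hypothetical, not taken from the study."""
    major_axis: float  # length of contact ellipse's major axis (mm)
    minor_axis: float  # length of contact ellipse's minor axis (mm)
    angle_deg: float   # orientation of the major axis, in [0, 180)

def classify_shape(t: ContactTouch, elongation_threshold: float = 1.5) -> str:
    """Distinguish two coarse shapes: a round fingertip tap vs. an
    elongated, flat-finger contact (threshold is an assumption)."""
    return "elongated" if t.major_axis / t.minor_axis >= elongation_threshold else "round"

def classify_orientation(t: ContactTouch) -> str:
    """Bin the contact angle into three coarse orientations."""
    if t.angle_deg < 60:
        return "right-leaning"
    elif t.angle_deg < 120:
        return "vertical"
    return "left-leaning"

touch = ContactTouch(major_axis=14.0, minor_axis=7.0, angle_deg=90.0)
print(classify_shape(touch), classify_orientation(touch))  # elongated vertical
```

    Binning into a small number of categories mirrors the study's finding that two shapes and three orientations were reliably producible, which is what makes them usable as input, e.g. for the screen lock application mentioned above.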

    Barehand Mode Switching in Touch and Mid-Air Interfaces

    Raskin defines a mode as a distinct setting within an interface where the same user input will produce results different from those it would produce in other settings. Most interfaces have multiple modes in which input is mapped to different actions, and mode-switching is simply the transition from one mode to another. In touch interfaces, the current mode can change how a single touch is interpreted: for example, it could draw a line, pan the canvas, select a shape, or enter a command. In Virtual Reality (VR), a hand gesture-based 3D modelling application may have different modes for object creation, selection, and transformation. Depending on the mode, the movement of the hand is interpreted differently. However, one of the crucial factors determining the effectiveness of an interface is user productivity. The mode-switching time of different input techniques, whether in a touch interface or in a mid-air interface, affects user productivity. Moreover, when touch and mid-air interfaces like VR are combined, making informed decisions about mode assignment becomes even more complicated. This thesis provides an empirical investigation to characterize the mode-switching phenomenon in barehand touch-based and mid-air interfaces. It explores the potential of using these input spaces together for a productivity application in VR, and it concludes with a step towards defining and evaluating the multi-faceted mode concept, its characteristics, and its utility when designing user interfaces more generally.
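    The core idea in the abstract - identical input producing different results depending on the current mode - can be sketched as a simple dispatch table. The mode names and handler signatures below are illustrative, not taken from the thesis.

```python
# Each handler interprets the same touch position differently.
def draw_line(pos):    return f"line to {pos}"
def pan_canvas(pos):   return f"pan toward {pos}"
def select_shape(pos): return f"select at {pos}"

MODE_HANDLERS = {
    "draw":   draw_line,
    "pan":    pan_canvas,
    "select": select_shape,
}

class Interface:
    def __init__(self, mode="draw"):
        self.mode = mode

    def switch_mode(self, new_mode):
        # Mode switching: the transition from one mode to another.
        self.mode = new_mode

    def on_touch(self, pos):
        # Identical input, mode-dependent result (Raskin's definition).
        return MODE_HANDLERS[self.mode](pos)

ui = Interface()
print(ui.on_touch((3, 4)))  # line to (3, 4)
ui.switch_mode("pan")
print(ui.on_touch((3, 4)))  # pan toward (3, 4)
```

    The cost that the thesis measures is the time spent in `switch_mode` transitions - the input technique used to trigger that transition, rather than the dispatch itself, is what varies between touch and mid-air interfaces.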

    Mode Switching Techniques Through Pen and Device Profiles

    In pen-based interfaces, inking and gesturing are two central tasks, and switching from inking to gesturing is an important issue. Previous studies have focused on mode switching in pen-based desktop devices. However, because pen-based mobile devices are smaller and more mobile than pen-based desktop devices, the principles behind mode switching techniques for pen-based desktop devices may not apply to them. In this paper, we investigated five techniques for switching between ink and gesture modes on two pen-based mobile device form factors: PDA and Tablet PC. Two quantitative experiments were conducted to evaluate these mode switching techniques. Results showed that on the Tablet PC, pressure performed the fastest but resulted in the most errors. On the PDA, back tapping offered the fastest performance. Although pressing and holding was significantly slower than the other techniques, it resulted in the fewest errors on both the Tablet PC and the PDA. Pressing a button on the handheld device offered overall fast and accurate performance on both devices. Copyright 2012 ACM.
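    One of the technique families evaluated above, pressure-based switching, can be sketched as a simple threshold on normalized pen pressure. The threshold value and the normalization are assumptions for illustration, not the paper's calibration; the paper's results suggest this style of switching is fast but error-prone.

```python
INK, GESTURE = "ink", "gesture"
PRESSURE_THRESHOLD = 0.7  # assumed cutoff on normalized pressure in [0, 1]

def interpret_stroke(pressure: float) -> str:
    """Light strokes produce ink; firm strokes are treated as gesture
    commands. Misjudging pressure near the cutoff causes mode errors,
    consistent with the high error rate reported for this technique."""
    return GESTURE if pressure >= PRESSURE_THRESHOLD else INK

print(interpret_stroke(0.3))  # ink
print(interpret_stroke(0.9))  # gesture
```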