Get a grip: Analysis of muscle activity and perceived comfort in using stylus grips
The design of handwriting instruments has been based primarily on touch, feel, aesthetics, and muscle exertion. Previous studies make clear that different pen characteristics must be considered together with hand-instrument interaction in the design of writing instruments, including pens designed for touch screens and computer-based writing surfaces. Hence, this study focuses primarily on evaluating the impact of grip style on user comfort and the muscle activity associated with handgrip while using a stylus-pen.
Surface EMG measurements were taken over the adductor pollicis, flexor digitorum, and extensor indicis of eight participants while they performed writing, drawing, and point-and-click tasks on a tablet using a standard stylus and two grip options. Participants were also timed and surveyed on comfort level for each trial. Results of this study indicate that participants overall found using a grip more comfortable than using the stylus alone. The claw grip was the preferred choice for writing and drawing, and the crossover grip was preferred for pointing and clicking. Muscle activity of the extensor indicis was reduced with the claw or crossover grip for the drawing and point-and-click tasks. The reduced muscle activity and the perceived comfort show the claw grip to be a viable option for improving comfort when writing or drawing on a touchscreen device.
AUGMENTED TOUCH INTERACTIONS WITH FINGER CONTACT SHAPE AND ORIENTATION
Touchscreen interactions are far less expressive than the range of touch that human hands are capable of, even considering technologies such as multi-touch and force-sensitive surfaces. Recently, some touchscreens have added the capability to sense the actual contact area of a finger on the touch surface, which provides additional degrees of freedom: the size and shape of the touch, and the finger's orientation. These additional sensory capabilities hold promise for increasing the expressiveness of touch interactions, but little is known about whether users can successfully use the new degrees of freedom. To provide this baseline information, we carried out a study with a finger-contact-sensing touchscreen and asked participants to produce a range of touches and gestures with different shapes and orientations, with both one and two fingers. We found that people are able to reliably produce two touch shapes and three orientations across a wide range of touches and gestures, a result that was confirmed in another study that used the augmented touches for a screen-lock application.
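The shape and orientation degrees of freedom described above could, for illustration, be binned from a contact ellipse roughly as follows. This is a minimal sketch: the aspect-ratio threshold, the three orientation bins, and the class names are assumptions for illustration, not values from the study.

```python
def classify_touch(major_axis, minor_axis, angle_deg):
    """Bin a finger-contact ellipse into a shape class and an orientation class.

    Two shapes and three orientations, matching the counts the study found
    users could reliably produce; all thresholds here are illustrative.
    """
    # An elongated contact ellipse reads as an "oblong" touch (e.g., a flat
    # finger); a near-circular one reads as "round" (e.g., a fingertip).
    shape = "oblong" if major_axis / minor_axis > 1.5 else "round"
    # Fold the ellipse angle into 0-180 degrees and split it into three bins.
    orientation = ("left", "vertical", "right")[int(angle_deg % 180) // 60]
    return shape, orientation
```

A classifier like this would sit between the raw touch events and the application, so gestures can be remapped by shape and orientation without changing the touch hardware.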
Improving Multi-Touch Interactions Using Hands as Landmarks
Efficient command selection is just as important for multi-touch devices as it is for traditional interfaces that follow the Windows-Icons-Menus-Pointers (WIMP) model, but rapid selection in touch interfaces can be difficult because these systems often lack the mechanisms that have been used for expert shortcuts in desktop systems (such as keyboard shortcuts). Although interaction techniques based on spatial memory can improve the situation by allowing fast revisitation from memory, the lack of landmarks often makes it hard to remember command locations in a large set. One potential landmark in touch interfaces, however, is people's hands and fingers: these provide an external reference frame that is well known and always present when interacting with a touch display. To explore the use of hands as landmarks for improving command selection, we designed hand-centric techniques called HandMark menus. We implemented HandMark menus for two platforms – one version that allows bimanual operation for digital tables and another that uses single-handed serial operation for handheld tablets; in addition, we developed variants for both platforms that support different numbers of commands. We tested the new techniques against standard selection methods, including tabbed menus and popup toolbars. The results of the studies show that HandMark menus perform well (in several cases significantly faster than standard methods) and that they support the development of spatial memory. Overall, this thesis demonstrates that people's intimate knowledge of their hands can be the basis for fast interaction techniques that improve the performance and usability of multi-touch systems.
Tilt and Multitouch Input for Tablet Play of Real-Time Strategy Games
We are studying the use of tilt-enabled handheld touchscreen devices as an interface for top-down strategy games. We will explore how different input modes (tilt and touch) compare for certain tasks in terms of efficiency and comfort. Real-time and turn-based strategy games are a popular form of electronic gaming, though these games currently have only minor representation on tablets. This genre requires both a wide variety of input and the display of a wealth of information. We are exploring whether, with suitable interface developments, the genre can become as accessible on tablet devices as on traditional computers. These interface approaches may also prove useful for expanding the presence of other game genres in the mobile space.
The Mole: a pressure-sensitive mouse
The traditional mouse enables the positioning of a cursor in a 2D plane, as well as interaction with binary elements within that plane (e.g., buttons, links, icons). While this basic functionality is sufficient for interacting with every modern computing environment, it makes little use of the human hand's ability to perform complex multi-directional movements. Devices developed to capture these multi-directional capabilities typically lack the familiar form and function of the mouse. This thesis details the design and development of a pressure-sensitive device called The Mole. The Mole retains the familiar form and function of the mouse while passively measuring the magnitude of normal hand force (i.e., downward force normal to the 2D operating surface). The measurement of this force lends itself to the development of novel interactions, far beyond what is possible with a typical mouse. This thesis demonstrates two such interactions: the positioning of a cursor in 3D space, and the simultaneous manipulation of cursor position and graphic tool parameters.
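The 3D-cursor interaction enabled by the normal-force measurement can be sketched as a simple transfer function. This is a hypothetical linear mapping with an assumed sensor range; the thesis does not specify the actual transfer function or limits.

```python
def force_to_z(force_n, max_force_n=10.0, z_range=1.0):
    """Map downward hand force (in newtons) to a cursor depth coordinate.

    Illustrative only: the 10 N sensor ceiling and the linear scaling are
    assumptions, not parameters from The Mole.
    """
    # Clamp the reading to the sensor's assumed usable range, so resting
    # the hand maps to z = 0 and a full press maps to the far end of z.
    clamped = max(0.0, min(force_n, max_force_n))
    return (clamped / max_force_n) * z_range
```

Combined with the usual x/y motion of the mouse body, a mapping like this yields a third, pressure-controlled axis without changing how the device is held.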
Moving Away From the Traditional Desktop Computer Workstations: Identifying Opportunities to Improve Upper Extremity Biomechanics
Statement of Problem: Office computer workers have elevated risks of adverse health outcomes, such as musculoskeletal disorders (MSDs), associated with computer work. Although many alternatives to the traditional desktop workstation are now available, these modern computer workstations and associated technologies require new guidelines and recommendations for proper practice. We see this as an opportunity to improve current and future computer workstation designs through an ergonomics approach, improving users' upper extremity biomechanics while they interact with these modern technologies.
Method: The dissertation first utilized a psychophysical protocol to compare users' self-selected setups for sitting and standing computer workstations; users' biomechanics and perceived comfort across different computer tasks at the two workstations were then compared. Subsequently, a hand-mapping technique was developed to evaluate the effects of computer pointing devices on users' hand posture and associated forearm muscle effort, using 3-D motion analysis and surface electromyography. To improve mobile device ergonomics, we investigated tablet users' biomechanical load, comfort level, and performance while performing swipe actions at different tablet locations.
Results: Users selected different workstation setups for sitting and standing. Compared to sitting, users while standing kept workstation components closer to their sternum and adopted a more neutral shoulder posture while working. However, standing users had greater wrist extension and began reporting more low-back discomfort after 45 minutes. While investigating different computer pointing devices, we found that device affordance was associated with significantly different hand postures and forearm muscle loads: devices that required less holding and were centrally placed were associated with more neutral shoulder and hand postures and significantly lower forearm muscle load. For the tablet interface, swipe locations closer to the palm produced significantly lower forearm muscle load and more neutral postures across the wrist and thumb joints.
Conclusion: Through the empirical results described in the dissertation, we demonstrated how users' upper extremity biomechanics can provide insights into the complex interactions between users and modern computer workstations, both as a whole and with specific components. For technology innovation, ergonomics concepts and methodologies can be used to design future generations of technology that fit users' physical capabilities, reducing MSD risk while promoting performance.
Leveraging finger identification to integrate multi-touch command selection and parameter manipulation
Identifying which fingers are touching a multi-touch surface provides a very large input space. We describe FingerCuts, an interaction technique inspired by desktop keyboard shortcuts that exploits this potential. FingerCuts enables integrated command selection and parameter manipulation; it uses feed-forward and feedback to increase discoverability; it is backward compatible with current touch input techniques; and it is adaptable to different touch device form factors. We implemented three variations of FingerCuts, each tailored to a different device form factor: tabletop, tablet, and smartphone. Qualitative and quantitative studies conducted on the tabletop suggest that, with some practice, FingerCuts is expressive, easy to use, and increases the sense of continuous interaction flow, and that interaction with FingerCuts is as fast as, or faster than, using a graphical user interface. A theoretical analysis of FingerCuts using the Fingerstroke-Level Model (FLM) matches our quantitative study results, justifying our use of FLM to analyse and validate performance for the other device form factors.
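The core idea of finger identification as a shortcut mechanism can be sketched as a lookup from identified fingers to commands. The finger names and command bindings below are illustrative placeholders, not FingerCuts' actual mappings.

```python
# Hypothetical command table keyed by identified finger(s); like keyboard
# shortcuts, single fingers and multi-finger chords map to commands.
COMMANDS = {
    "index": "select",
    "middle": "copy",
    "ring": "paste",
    ("thumb", "index"): "undo",  # a two-finger chord
}

def finger_shortcut(touching_fingers):
    """Resolve the identified touching fingers to a command, if one is bound."""
    if len(touching_fingers) == 1:
        key = touching_fingers[0]
    else:
        key = tuple(touching_fingers)  # chords are keyed as tuples
    return COMMANDS.get(key)  # None means fall back to ordinary touch input
```

Returning `None` for unbound fingers is what makes a scheme like this backward compatible: unrecognized touches pass through to the existing touch-input handling.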
Barehand Mode Switching in Touch and Mid-Air Interfaces
Raskin defines a mode as a distinct setting within an interface where the same user input will produce results different from those it would produce in other settings. Most interfaces have multiple modes in which input is mapped to different actions, and mode switching is simply the transition from one mode to another. In touch interfaces, the current mode can change how a single touch is interpreted: for example, it could draw a line, pan the canvas, select a shape, or enter a command. In Virtual Reality (VR), a hand-gesture-based 3D modelling application may have different modes for object creation, selection, and transformation; depending on the mode, the movement of the hand is interpreted differently. However, one of the crucial factors determining the effectiveness of an interface is user productivity, and the mode-switching time of different input techniques, whether in a touch interface or a mid-air interface, affects user productivity. Moreover, when touch and mid-air interfaces like VR are combined, making informed decisions about mode assignment becomes even more complicated. This thesis provides an empirical investigation to characterize the mode-switching phenomenon in barehand touch-based and mid-air interfaces. It explores the potential of using these input spaces together for a productivity application in VR, and it concludes with a step towards defining and evaluating the multi-faceted mode concept, its characteristics, and its utility when designing user interfaces more generally.
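Raskin's definition, as used above, can be sketched as a dispatch on the current mode, where the same touch event yields different results. The mode names and actions below are illustrative, echoing the examples in the abstract rather than any system from the thesis.

```python
# The same touch point produces a different result in each mode
# (Raskin's definition of a mode, in miniature).
MODE_ACTIONS = {
    "draw":   lambda pt: f"line to {pt}",
    "pan":    lambda pt: f"pan canvas toward {pt}",
    "select": lambda pt: f"select shape at {pt}",
}

class Interface:
    def __init__(self, mode="draw"):
        self.mode = mode

    def switch_mode(self, mode):
        # The cost of this transition is the mode-switching time the
        # thesis measures across input techniques.
        self.mode = mode

    def handle_touch(self, pt):
        return MODE_ACTIONS[self.mode](pt)
```

Because every touch is routed through the current mode, the technique used to trigger `switch_mode` (a button, a gesture, a barehand posture) directly determines how quickly users can change what their input means.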
Thumb + Pen Interaction on Tablets
Modern tablets support simultaneous pen and touch input, but it remains unclear how best to leverage this capability for bimanual input when the nonpreferred hand holds the tablet. We explore Thumb + Pen interactions that support simultaneous pen and touch interaction, with both hands, in such situations. Our approach engages the thumb of the device-holding hand, such that the thumb interacts with the touch screen in an indirect manner, thereby complementing the direct input provided by the preferred hand. For instance, the thumb can determine how pen actions (articulated with the opposite hand) are interpreted. Alternatively, the pen can point at an object while the thumb manipulates one or more of its parameters through indirect touch. Our techniques integrate, in a novel way, concepts derived from marking menus, spring-loaded modes, indirect input, and multi-touch conventions. Our overall approach takes the form of a set of probes, each representing a meaningfully distinct class of application. They serve as an initial exploration of the design space at a level that will help determine the feasibility of supporting bimanual interaction in such contexts, and the viability of the Thumb + Pen techniques in doing so.
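The idea that the thumb determines how pen actions are interpreted can be sketched as a spring-loaded reinterpretation of pen events: while the thumb holds a mode, pen strokes mean something else. The mode and action names here are hypothetical, not taken from the paper's probes.

```python
def interpret_pen(pen_event, thumb_mode=None):
    """Interpret a pen event according to the thumb's spring-loaded mode.

    thumb_mode is the mode held down by the device-holding hand's thumb;
    None means the thumb is not engaged. Mode names are illustrative.
    """
    if thumb_mode == "lasso":
        return f"lasso-select along {pen_event}"
    if thumb_mode == "highlight":
        return f"highlight {pen_event}"
    # Default behaviour when the thumb is lifted: the pen simply inks.
    return f"ink {pen_event}"
```

The spring-loaded quality comes from passing the thumb's live state on every pen event: lifting the thumb instantly restores the default inking behaviour, with no explicit mode-exit step.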