44 research outputs found

    Getting back to basics: bimanual interaction on mobile touch screen devices

    The availability, and popularity, of touch screen tablets is drastically increasing, with over 30% of internet users now owning one. However, the lack of bimanual interaction on touch screen tablets presents product designers with serious challenges. Several attempts have been made to facilitate bimanual interaction in such products, but the results are not comparable to those of their non-mobile cousins, e.g. laptops. This paper presents the findings of a group collaboration aimed at prototyping a mobile touch screen device that supports bimanual interaction during internet browser navigation through rear-mounted inputs. The researchers found it problematic to add basic bimanual interactions for internet browser navigation to the rear of a prototype mobile touch screen device due to issues regarding grip type, finger movement and hand position. This paper concludes that in order to achieve bimanual interaction, researchers need to return to basics and consider how to free the hand and fingers from current constraints.

    Investigating how the hand interacts with different mobile phones

    In this paper we investigate the physical interaction between the hand and three types of mobile device input: touchscreen, physical keyboard and stylus. Through a controlled study using video observational analysis, we observed, firstly, how the participants gripped the three devices and how these grips were device dependent. Secondly, we looked closely at these grips to uncover how participants performed what we call micro-movements to facilitate a greater range of interaction, e.g. reaching across the keyboard. The results extend current knowledge by comparing three handheld device input methods and observing the movements the hand makes in five grips. The paper concludes by describing the development of a conceptual design, proposed as a provocation to open a dialogue on how we conceive of hand usage and how it might be optimized when designing for mobile devices.

    Smart Environments for Collaborative Design, Implementation, and Interpretation of Scientific Experiments

    Ambient intelligence promises to enable humans to smoothly interact with their environment, mediated by computer technology. In the literature on ambient intelligence, empirical scientists are not often mentioned, yet they form an interesting target group for this technology. In this position paper, we describe a project aimed at realising an ambient intelligence environment for face-to-face meetings of researchers with different academic backgrounds involved in molecular biology “omics” experiments. In particular, microarray experiments are a focus of attention because these experiments require multidisciplinary collaboration for their design, analysis, and interpretation. Such an environment is characterised by a high degree of complexity that has to be mitigated by ambient intelligence technology. By experimenting in a real-life setting, we will learn more about life scientists as a user group.

    An evaluation of asymmetric interfaces for bimanual virtual assembly with haptics

    Immersive computing technology provides a human–computer interface that supports natural human interaction with digital data and models. One application for this technology is product assembly methods planning and validation. This paper presents the results of a user study which explores the effectiveness of various bimanual interaction device configurations for virtual assembly tasks. Participants completed two assembly tasks with two device configurations in five randomized bimanual treatment conditions (within subjects). A Phantom Omni®, with and without haptics enabled, and a 5DT Data Glove were used. Participant performance, measured as time to assemble, was the evaluation metric. The results revealed no significant difference in performance between the five treatment conditions. However, half of the participants chose the 5DT Data Glove and the haptic-enabled Phantom Omni® as their preferred device configuration. In addition, qualitative comments both supported a preference for haptics during the assembly process and confirmed Guiard’s kinematic chain model.

    Understanding grip shifts: how form factors impact hand movements on mobile phones

    In this paper we present an investigation into how hand usage is affected by different mobile phone form factors. Our initial (qualitative) study explored how users interact with various mobile phone types (touchscreen, physical keyboard and stylus). The analysis of the videos revealed that each type of mobile phone affords specific hand grips, and that users shift these grips, and consequently the tilt and rotation of the phone, depending on the context of interaction. To further investigate these tilt and rotation effects, we conducted a controlled quantitative study in which we varied the size of the phone and the type of grip (symmetric bimanual, asymmetric bimanual with finger, asymmetric bimanual with thumb, and single handed) to better understand how they affect tilt and rotation during a dual pointing task. The results showed that the size of the phone does have an effect and that the distance needed to reach action items affects the phone’s tilt and rotation. Additionally, we found that the amount of tilt, rotation and reach required corresponded with participants’ grip preferences. We finish the paper by discussing design lessons for mobile UI and proposing design guidelines and applications for these insights.

    Partially-indirect Bimanual Input with Gaze, Pen, and Touch for Pan, Zoom, and Ink Interaction

    Bimanual pen and touch UIs are mainly based on the direct manipulation paradigm. Alternatively, we propose partially-indirect bimanual input, where direct pen input is used with the dominant hand and indirect-touch input with the non-dominant hand. As direct and indirect inputs do not overlap, users can interact in the same space without interference. We investigate two indirect-touch techniques combined with direct pen input: the first redirects touches to the user’s gaze position, and the second redirects touches to the pen position. In this paper, we present an empirical user study comparing both partially-indirect techniques to direct pen and touch input in bimanual pan, zoom, and ink tasks. Our experimental results show that users are comparably fast with the indirect techniques, but more accurate, as users can dynamically change the zoom target during indirect zoom gestures. Further, our studies reveal that direct and indirect zoom gestures have distinct characteristics regarding spatial use, gestural use, and bimanual parallelism.

    Gaze+RST: Integrating Gaze and Multitouch for Remote Rotate-Scale-Translate Tasks

    Our work investigates the use of gaze and multitouch to fluidly perform rotate-scale-translate (RST) tasks on large displays. The work specifically aims to understand whether gaze can provide benefit in such a task, how task complexity affects performance, and how gaze and multitouch can be combined to create an integral input structure suited to the task of RST. We present four techniques that individually strike a different balance between gaze-based and touch-based translation while maintaining concurrent rotation and scaling operations. A 16-participant empirical evaluation revealed that three of our four techniques present viable options for this scenario, and that larger distances and rotation/scaling operations can significantly affect a gaze-based translation configuration. Furthermore, we uncover new insights regarding multimodal integrality, finding that gaze and touch can be combined into configurations that pertain to integral or separable input structures.

    Multi-touch RST in 2D and 3D Spaces: Studying the Impact of Directness on User Performance

    The RST multi-touch technique allows one to simultaneously control rotations, scaling, and translations with multi-touch gestures. We conducted a user study to better understand the impact of directness on user performance in an RST docking task, for both 2D and 3D visualization conditions. This study showed that direct touch shortens completion times, but indirect interaction improves efficiency and precision, particularly for 3D visualizations. The study also showed that users’ trajectories are comparable across all conditions (2D/3D and direct/indirect). This tends to show that indirect RST control may be valuable for interactive visualization of 3D content. To illustrate this finding, we present a demo application that allows novice users to arrange 3D objects on a 2D virtual plane in an easy and efficient way.
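    The rotate-scale-translate decomposition that the two abstracts above study can be sketched for the canonical two-finger case. The following is a minimal illustration of the general technique, not code from either paper: given the positions of two touch points before and after a gesture frame, rotation is the change in the inter-finger angle, scale is the ratio of inter-finger distances, and translation is the displacement of the gesture centroid.

```python
import math

def rst_from_touches(p1, p2, q1, q2):
    """Derive rotation (radians), scale (ratio), and translation (dx, dy)
    from two touch points moving from (p1, p2) to (q1, q2).
    Points are (x, y) tuples in screen coordinates."""
    # Vector between the two fingers before and after the gesture frame.
    vx0, vy0 = p2[0] - p1[0], p2[1] - p1[1]
    vx1, vy1 = q2[0] - q1[0], q2[1] - q1[1]
    # Rotation: change in the angle of the inter-finger vector.
    rotation = math.atan2(vy1, vx1) - math.atan2(vy0, vx0)
    # Scale: change in the distance between the fingers.
    scale = math.hypot(vx1, vy1) / math.hypot(vx0, vy0)
    # Translation: displacement of the gesture centroid.
    c0 = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
    c1 = ((q1[0] + q2[0]) / 2, (q1[1] + q2[1]) / 2)
    translation = (c1[0] - c0[0], c1[1] - c0[1])
    return rotation, scale, translation
```

    In a direct condition these parameters are applied about the gesture centroid on the touched object itself; in an indirect condition the same decomposition drives a remote object or view, which is what makes the direct/indirect comparison in these studies possible.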

    Current status and outlook of MID user interface platforms

    1. ALTO and user interfaces 2. Limits of the ALTO-based paradigm 3. The comeback scenario targeted by ARM-based platforms 4. MID and the challenge of new user interfaces 5. Approaches to the new generation and new competition 6. Conclusion