
    Use of multi-touch gestures for capturing solution steps in arithmetic word problems

    Multi-touch interfaces are becoming popular, with tablet PCs and other multi-touch surfaces increasingly used in classrooms. Several studies have focused on developing the learning and collaboration potential of these tools. However, assessment and feedback processes have yet to leverage these technologies to capture problem-solving steps and strategies. This paper describes a computer-aided assessment prototype tool that uses an innovative approach based on multi-touch gestures to capture solution steps and strategies. It presents a preliminary effort to investigate the capture of solution steps for a two-step arithmetic word problem using this approach. The results suggest that it is possible to perform two-step arithmetic work with multi-touch gestures and simultaneously capture the solution process. The captured steps provided detailed information on the students' work, which was used to study the strategies they adopted in solving the problems. This research suggests practical implications for the development of automated feedback and assessment systems and could serve as a base for future studies on effective strategies in arithmetic problem solving.
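The abstract does not detail how gestures are logged, but the core idea of recording each gesture as an interpretable solution step can be sketched as follows. The gesture name ("pinch-merge") and the word problem are hypothetical, purely for illustration:

```python
import operator
from dataclasses import dataclass, field

# Arithmetic operations a gesture can express
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

@dataclass
class Step:
    gesture: str   # hypothetical gesture label, e.g. "pinch-merge" joining two number tokens
    a: float
    b: float
    op: str        # operator the gesture was mapped to

@dataclass
class SolutionTrace:
    """Ordered record of the steps a student performed, for later strategy analysis."""
    steps: list = field(default_factory=list)

    def record(self, step: Step) -> float:
        """Log the step and return its intermediate result."""
        self.steps.append(step)
        return OPS[step.op](step.a, step.b)

# Illustrative two-step word problem: "3 pens at $2 each, plus $5 shipping."
trace = SolutionTrace()
r1 = trace.record(Step("pinch-merge", 3, 2, "*"))   # first step: 3 * 2
r2 = trace.record(Step("pinch-merge", r1, 5, "+"))  # second step: 6 + 5
```

Because every intermediate step is logged rather than just the final answer, the trace can later be inspected to infer which solution strategy the student followed.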

    Knowledge-driven Biometric Authentication in Virtual Reality

    With the increasing adoption of virtual reality (VR) in public spaces, protecting users from observation attacks is becoming essential to prevent attackers from accessing context-sensitive data or performing malicious payment transactions in VR. In this work, we propose RubikBiom, a knowledge-driven behavioural biometric authentication scheme for authentication in VR. We show that hand movement patterns performed during interactions with a knowledge-based authentication scheme (e.g., when entering a PIN) can be leveraged to establish an additional security layer. Based on a dataset gathered in a lab study with 23 participants, we show that knowledge-driven behavioural biometric authentication increases security in an unobtrusive way. We achieve an accuracy of up to 98.91% by applying a Fully Convolutional Network (FCN) on 32 authentications per subject. Our results pave the way for further investigations towards knowledge-driven behavioural biometric authentication in VR.
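The abstract names a Fully Convolutional Network but not its architecture. As a rough, dependency-free sketch, an FCN for sequence classification typically has the shape conv → ReLU → global average pooling → linear head → softmax; the layer sizes, weights, and input sequence below are illustrative stand-ins, not the authors' trained model:

```python
import math

def conv1d_relu(seq, kernels, bias):
    """Valid 1-D convolution over a univariate sequence with ReLU, one channel per kernel."""
    k = len(kernels[0])
    out = []
    for w, b in zip(kernels, bias):
        out.append([max(0.0, sum(w[j] * seq[i + j] for j in range(k)) + b)
                    for i in range(len(seq) - k + 1)])
    return out

def global_avg_pool(channels):
    """Collapse each channel to its mean, making the net length-independent."""
    return [sum(c) / len(c) for c in channels]

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    return [v / sum(e) for v in e]

def fcn_scores(seq, kernels, bias, head):
    """Per-subject probabilities: conv -> ReLU -> global average pool -> linear head -> softmax."""
    pooled = global_avg_pool(conv1d_relu(seq, kernels, bias))
    logits = [sum(w * p for w, p in zip(row, pooled)) for row in head]
    return softmax(logits)

# Toy 1-D hand-movement feature and untrained illustrative weights (2 channels, 2 subjects)
seq = [0.1, 0.4, 0.35, 0.8, 0.6, 0.2, 0.5, 0.9]
kernels = [[0.5, -0.2, 0.1], [-0.3, 0.4, 0.2]]
bias = [0.0, 0.1]
head = [[1.0, -1.0], [-0.5, 0.8]]
scores = fcn_scores(seq, kernels, bias, head)
```

The global average pooling step is what makes an FCN attractive for behavioural biometrics: authentication attempts of varying duration all map to a fixed-size representation.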

    CIS: Evaluating Interaction Techniques in Context

    National audience. This article gives an overview and a usage example of the CIS model.

    Toolglasses, marking menus, and hotkeys: a comparison of one and two-handed command selection techniques

    This paper introduces a new input technique, bimanual marking menus, and compares its performance with five other techniques: static toolbars, hotkeys, grouped hotkeys, marking menus, and toolglasses. The study builds on previous work by setting the comparison in a commonly encountered task, shape drawing. In this context, grouped hotkeys and bimanual marking menus were found to be the fastest. Subjectively, the most preferred input method was bimanual marking menus. Toolglass performance was unexpectedly slow, which hints at the importance of low-level toolglass implementation choices. Key words: Bimanual interfaces, two-handed interfaces, toolglass, bimanual marking menus, command selection.
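Marking menus select a command from the direction of a quick stroke rather than from a fixed screen location. A minimal sketch of that angle-to-slice mapping follows; the eight item names are hypothetical, chosen to fit the paper's shape-drawing task:

```python
import math

# Hypothetical command set for a drawing task, one per 45-degree slice
ITEMS = ["rect", "ellipse", "line", "polygon", "text", "arrow", "freehand", "eraser"]

def marking_menu_select(start, end, items=ITEMS):
    """Map a stroke's direction to one of N equal pie-menu slices.

    start, end: (x, y) screen points; screen y grows downward, so dy is flipped.
    """
    dx = end[0] - start[0]
    dy = start[1] - end[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360
    slice_width = 360 / len(items)
    # Centre slice 0 on "rightward" by shifting half a slice before dividing
    index = int(((angle + slice_width / 2) % 360) // slice_width)
    return items[index]
```

Because selection depends only on direction, expert users can mark without waiting for the menu to appear, which is one reason marking menus compete well with toolbars in studies like this one.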

    The design of a GUI paradigm based on tablets, two-hands, and transparency

    An experimental GUI paradigm is presented which is based on the design goals of maximizing the amount of screen used for application data, reducing the amount that the UI diverts visual attention from the application data, and increasing the quality of input. In pursuit of these goals, we integrated the non-standard UI technologies of multi-sensor tablets, toolglass, transparent UI components, and marking menus. We describe a working prototype of our new paradigm, the rationale behind it, and our experiences introducing it into an existing application. Finally, we present some of the lessons learned: prototypes are useful to break the barriers imposed by conventional GUI design, and some of their ideas can still be retrofitted seamlessly into products. Furthermore, the added functionality is not measured only in terms of user performance, but also by the quality of interaction, which allows artists to create new graphic vocabularies and graphic styles.

    Context matters: Evaluating Interaction Techniques with the CIS Model

    International audience. This article introduces the Complexity of Interaction Sequences model (CIS). CIS describes the structure of interaction techniques and the SimCIS simulator uses these descriptions to predict their performance in the context of an interaction sequence. The model defines the complexity of an interaction technique as a measure of its effectiveness within a given context. We tested CIS to compare three interaction techniques: fixed unimanual palettes, fixed bimanual palettes and toolglasses. The model predicts that the complexity of both palettes depends on interaction sequences, while toolglasses are less context-dependent. CIS also predicts that fixed bimanual palettes outperform the other two techniques. Predictions were tested empirically with a controlled experiment and confirmed the hypotheses. We argue that, in order to be generalizable, experimental comparisons of interaction techniques should include the concept of context sensitivity. CIS is a step in this direction as it helps predict the performance of interaction techniques according to the context of use.

    BiTouch and BiPad: Designing Bimanual Interaction for Hand-held Tablets

    International audience. Despite the demonstrated benefits of bimanual interaction, most tablets use just one hand for interaction, to free the other for support. In a preliminary study, we identified five holds that permit simultaneous support and interaction, and noted that users frequently change position to combat fatigue. We then designed the BiTouch design space, which introduces a support function in the kinematic chain model for interacting with hand-held tablets, and developed BiPad, a toolkit for creating bimanual tablet interaction with the thumb or the fingers of the supporting hand. We ran a controlled experiment to explore how tablet orientation and hand position affect three novel techniques: bimanual taps, gestures and chords. Bimanual taps outperformed our one-handed control condition in both landscape and portrait orientations; bimanual chords and gestures in portrait mode only; and thumbs outperformed fingers, but were more tiring and less stable. Together, BiTouch and BiPad offer new opportunities for designing bimanual interaction on hand-held tablets.

    Tangible Interaction in Mixed Reality Systems

    Chapter 6, ISBN 978-1-84882-732-5. In this chapter, we discuss the design of tangible interaction techniques for Mixed Reality environments. We begin by recalling some conceptual models of tangible interaction. Then, we propose an engineering-oriented software/hardware co-design process, based on our experience in developing tangible user interfaces. We present three different tangible user interfaces for real-world applications, and analyse the feedback from the user studies that we conducted. In summary, we conclude that, since tangible user interfaces are part of the real world and provide a seamless interaction with virtual worlds, they are well adapted to mixing reality and virtuality. Hence, tangible interaction optimizes users' virtual tasks, especially in manipulating and controlling 3D digital data in 3D space.