4 research outputs found

    Design and implement chords and personal windows for multi-user collaboration on a large multi-touch vertical display

    Co-located collaboration on large vertical screens has become technically feasible, but users face increased effort or must wear intrusive personal identifiers. Previous research on co-located collaboration has assumed that all users perform exactly the same task (e.g., moving and resizing photos), or that they negotiate individual actions in turns. However, little user interface software supports the simultaneous performance of individual actions during shared tasks (Fig. 1a). As a remedy, we have introduced multi-touch chords (Fig. 1b) and personal action windows (Fig. 1c) for co-located collaboration on a large multi-touch vertical display. Instead of selecting an item in a fixed menu by reaching for it, users work simultaneously on shared tasks by means of personal action windows, which are triggered by multi-touch chords performed anywhere on the display. To evaluate the proposed technique with users, we introduced an experimental task that represents the group dynamics that emerge during shared tasks on a large display. A grounded theory analysis of users’ behaviour provided insights into established co-located collaboration topics, such as conflict resolution strategies and space negotiation. The main contribution of this work is the design and implementation of a novel, seamless identification and interaction technique that supports diverse multi-touch interactions by multiple users: multi-touch chord interaction along with personal action windows.
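    The chord-to-window mechanism described in the abstract can be sketched roughly as follows. This is a hypothetical illustration, not the paper's implementation: the `Touch` representation, the 150 ms grouping threshold, and the use of chord size (finger count) as a lightweight user identifier are all assumptions.

    ```python
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Touch:
        x: float
        y: float
        t: float  # timestamp in seconds

    @dataclass
    class PersonalWindow:
        owner_chord: int               # chord size, used here as a stand-in identifier
        position: Tuple[float, float]  # where the window opens on the display

    # Assumed threshold: touches landing within 150 ms count as one chord.
    CHORD_WINDOW = 0.15

    def detect_chord(touches: List[Touch]) -> List[Touch]:
        """Group touches that arrive within CHORD_WINDOW seconds of the first one."""
        if not touches:
            return []
        first = min(t.t for t in touches)
        return [t for t in touches if t.t - first <= CHORD_WINDOW]

    def open_personal_window(touches: List[Touch]) -> PersonalWindow:
        """Open a personal action window at the chord's centroid, so the menu
        appears wherever the user performed the chord rather than at a fixed spot."""
        chord = detect_chord(touches)
        cx = sum(t.x for t in chord) / len(chord)
        cy = sum(t.y for t in chord) / len(chord)
        return PersonalWindow(owner_chord=len(chord), position=(cx, cy))
    ```

    The key design point the abstract emphasises is that the window is anchored to the chord location, so users need not reach for a shared fixed menu.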

    Collaborative behavior, performance and engagement with visual analytics tasks using mobile devices

    Interactive visualizations are external tools that can support users’ exploratory activities, and collaboration can bring benefits to the exploration of visual representations or visualizations. This research investigates the use of co-located collaborative visualizations on mobile devices: how working with two different modes of interaction and view (Shared or Non-Shared), and being placed in various position arrangements (Corner-to-Corner, Face-to-Face, and Side-by-Side), affects users’ knowledge acquisition, engagement level, and learning efficiency. A user study was conducted with 60 participants divided into 6 groups (2 modes × 3 positions) using a tool that we developed to support the exploration of 3D visual structures in a collaborative manner. Our results show that the shared control and view version in the Side-by-Side position is the most favorable and can improve task efficiency. In this paper, we present the results and a set of recommendations derived from them.
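    The 2 × 3 between-subjects layout described above can be sketched as a quick check of the condition structure. The factor names come from the abstract; the even split of 10 participants per group and the `assign_groups` helper are assumptions for illustration only.

    ```python
    from itertools import product

    # Factors as named in the abstract.
    MODES = ["Shared", "Non-Shared"]
    POSITIONS = ["Corner-to-Corner", "Face-to-Face", "Side-by-Side"]

    def assign_groups(n_participants: int = 60):
        """Assumed even allocation of participants across the 6 mode x position cells."""
        conditions = list(product(MODES, POSITIONS))   # 2 x 3 = 6 conditions
        per_group = n_participants // len(conditions)  # 10 participants per cell
        return {cond: per_group for cond in conditions}
    ```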