    Rethinking 'multi-user': an in-the-wild study of how groups approach a walk-up-and-use tabletop interface

    Multi-touch tabletops have been much heralded as an innovative technology that can facilitate new ways of group working. However, there is little evidence of these materialising outside of research lab settings. We present the findings of a 5-week in-the-wild study examining how a shared planning application – designed to run on a walk-up-and-use tabletop – was used when placed in a tourist information centre. We describe how groups approached, congregated around and interacted with it, and the social interactions that took place – noting how they were quite different from research findings describing the ways groups work around a tabletop in lab settings. We discuss the implications of such situated group work for designing collaborative tabletop applications for use in public settings.

    Ambient Gestures

    We present Ambient Gestures, a novel gesture-based system designed to support ubiquitous ‘in the environment’ interactions with everyday computing technology. Hand gestures and audio feedback allow users to control computer applications without reliance on a graphical user interface, and without having to switch from the context of a non-computer task to the context of the computer. The Ambient Gestures system is composed of a vision recognition application, a scripting application that processes the recognised gestures, and a navigation and selection application that is controlled by those gestures. This system allows us to explore gestures as the primary means of interaction within a multimodal, multimedia environment. In this paper we describe the Ambient Gestures system, define the gestures and the interactions that can be achieved in this environment, and present a formative study of the system. We conclude with a discussion of our findings and future applications of Ambient Gestures in ubiquitous computing.
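    The abstract describes a three-part architecture: a vision recognizer, a scripting layer that maps gestures to actions, and a controlled application. As a rough illustration of the middle layer, the following Python sketch shows a gesture-to-command dispatcher with stand-in audio feedback. All names, types, and the confidence threshold are hypothetical; the paper does not publish its implementation.

```python
# Hypothetical sketch of a gesture-to-command dispatch layer; names and
# threshold are illustrative, not from the Ambient Gestures paper.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class GestureEvent:
    name: str          # e.g. "swipe_left", as produced by a vision recognizer
    confidence: float  # recognizer's confidence in [0, 1]


class GestureDispatcher:
    """Routes recognized gestures to application commands with audio feedback."""

    def __init__(self, min_confidence: float = 0.8):
        self.min_confidence = min_confidence
        self.bindings: Dict[str, Callable[[], None]] = {}

    def bind(self, gesture: str, command: Callable[[], None]) -> None:
        self.bindings[gesture] = command

    def dispatch(self, event: GestureEvent) -> None:
        # Ignore low-confidence recognitions to avoid spurious activations.
        if event.confidence < self.min_confidence:
            return
        command = self.bindings.get(event.name)
        if command is not None:
            command()
            self._play_feedback(event.name)

    def _play_feedback(self, gesture: str) -> None:
        # Stand-in for audio feedback; a real system would play a sound cue.
        print(f"[audio cue] {gesture} accepted")


# Usage: bind gestures to application commands, then feed recognizer output in.
dispatcher = GestureDispatcher()
dispatcher.bind("swipe_left", lambda: print("previous track"))
dispatcher.bind("swipe_right", lambda: print("next track"))
dispatcher.dispatch(GestureEvent("swipe_right", 0.93))
```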

    Adaptive User Perspective Rendering for Handheld Augmented Reality

    Handheld Augmented Reality commonly implements some variant of magic lens rendering, which turns only a fraction of the user's real environment into AR while the rest of the environment remains unaffected. Since handheld AR devices are commonly equipped with video see-through capabilities, AR magic lens applications often suffer from spatial distortions, because the AR environment is presented from the perspective of the mobile device's camera. Recent approaches counteract this distortion by estimating the user's head position and rendering the scene from the user's perspective. To this end, approaches usually apply face-tracking algorithms to the front camera of the mobile device. However, this demands high computational resources and therefore commonly degrades application performance beyond the already high computational load of AR applications. In this paper, we present a method that reduces the computational demands of user perspective rendering by applying lightweight optical flow tracking and an estimation of the user's motion before head tracking is started. We demonstrate the suitability of our approach for computationally limited mobile devices and compare it to device perspective rendering, head-tracked user perspective rendering, and fixed point-of-view user perspective rendering.
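    The key idea is to substitute cheap sparse optical flow for continuous face tracking, handing off to the expensive tracker only when motion warrants it. The sketch below illustrates that pattern with OpenCV's pyramidal Lucas-Kanade tracker; the threshold and hand-off logic are assumptions for illustration, not the authors' published code.

```python
# Minimal sketch of lightweight motion estimation via sparse optical flow,
# assuming consecutive grayscale front-camera frames. Illustrative only.
import cv2
import numpy as np


def estimate_motion(prev_gray, curr_gray):
    """Return the median 2D displacement between two grayscale frames."""
    # Tracking a few corners is far cheaper than per-frame face detection.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                  qualityLevel=0.01, minDistance=10)
    if pts is None:
        return np.zeros(2)
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   pts, None)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2)
    # Median flow is robust to a few mistracked corners.
    return np.median((next_pts[good] - pts[good]).reshape(-1, 2), axis=0)


# Illustrative hand-off rule: start the expensive head tracker only when the
# flow magnitude exceeds a threshold (value in pixels is an assumption).
MOTION_THRESHOLD = 3.0


def should_start_head_tracking(prev_gray, curr_gray):
    return np.linalg.norm(estimate_motion(prev_gray, curr_gray)) > MOTION_THRESHOLD
```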

    Mapping Tasks to Interactions for Graph Exploration and Graph Editing on Interactive Surfaces

    Graph exploration and editing are still mostly considered independently, and existing systems are not designed for today's interactive surfaces such as smartphones, tablets, or tabletops. When developing a system for these modern devices that supports both graph exploration and graph editing, it is necessary to 1) identify what basic tasks need to be supported, 2) determine what interactions can be used, and 3) map these tasks to interactions. This technical report provides a list of basic interaction tasks for graph exploration and editing, derived from an extensive system review. Moreover, different interaction modalities of interactive surfaces are reviewed according to their interaction vocabulary, and further degrees of freedom that can be used to make interactions distinguishable are discussed. Beyond the scope of graph exploration and editing, we provide a generally applicable approach for finding and evaluating a mapping from tasks to interactions. Thus, this work acts as a guideline for developing a system for graph exploration and editing that is specifically designed for interactive surfaces.
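    To make the task-to-interaction mapping concrete, the sketch below models such a mapping as a table and checks the distinguishability property the report motivates: no two tasks reachable in the same mode and modality may share an interaction. All task and interaction names are illustrative, not taken from the report.

```python
# Hypothetical task-to-interaction mapping for a graph tool, plus a
# distinguishability check. Entries are illustrative examples only.
from collections import defaultdict

mapping = {
    # task: (mode, modality, interaction)
    "pan view":    ("exploration", "touch", "one-finger drag on background"),
    "zoom":        ("exploration", "touch", "two-finger pinch"),
    "add node":    ("editing",     "touch", "double tap on background"),
    "add edge":    ("editing",     "touch", "drag from node to node"),
    "delete node": ("editing",     "pen",   "scratch-out over node"),
}


def check_distinguishable(mapping):
    """Flag tasks that share an interaction within one mode and modality."""
    seen = defaultdict(list)
    for task, key in mapping.items():
        seen[key].append(task)
    return {key: tasks for key, tasks in seen.items() if len(tasks) > 1}


conflicts = check_distinguishable(mapping)
print("conflicts:", conflicts or "none")
```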

    PickCells: A Physically Reconfigurable Cell-composed Touchscreen

    Touchscreens are the predominant medium for interaction with digital services; however, their current fixed form factor narrows the scope for rich physical interactions by limiting interaction possibilities to a single, planar surface. In this paper we introduce PickCells, a fully reconfigurable device concept composed of cells that breaks the mould of rigid screens and explores a modular system affording rich sets of tangible interactions and novel across-device relationships. Through a series of co-design activities – involving HCI experts and potential end-users of such systems – we synthesised a design space aimed at inspiring future research, giving researchers and designers a framework in which to explore modular screen interactions. The design space we propose unifies existing work on modular touch surfaces under a general framework and broadens horizons by opening up unexplored spaces that provide new interaction possibilities. In this paper, we present the PickCells concept, a design space of modular touch surfaces, and propose a toolkit for quick scenario prototyping.
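    As one way to picture a cell-composed device, the sketch below infers the composite grid layout from per-cell neighbour reports via a breadth-first traversal. This is entirely hypothetical: the paper contributes a concept and design space, not this implementation, and the cell IDs and reporting scheme are assumptions.

```python
# Hypothetical topology discovery for a cell-composed screen: each cell
# reports which neighbour is snapped to each side; BFS recovers coordinates.
from collections import deque

# Assumed per-cell neighbour reports (cell IDs are made up for illustration).
neighbours = {
    "A": {"right": "B"},
    "B": {"left": "A", "down": "C"},
    "C": {"up": "B"},
}

OFFSETS = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}


def layout(neighbours, root):
    """Assign grid coordinates to every cell reachable from `root`."""
    coords = {root: (0, 0)}
    queue = deque([root])
    while queue:
        cell = queue.popleft()
        x, y = coords[cell]
        for side, other in neighbours.get(cell, {}).items():
            if other not in coords:
                dx, dy = OFFSETS[side]
                coords[other] = (x + dx, y + dy)
                queue.append(other)
    return coords


print(layout(neighbours, "A"))  # -> {'A': (0, 0), 'B': (1, 0), 'C': (1, 1)}
```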