
    Read-It: A Multi-modal Tangible Interface for Children Who Learn to Read

    Multi-modal tabletop applications offer excellent opportunities for enriching the education of young children. Read-It is an interactive game with a multi-modal tangible interface, designed to combine the advantages of current physical games and computer exercises. It is a novel approach to supporting children who are learning to read. A first experimental evaluation has demonstrated that the Read-It approach is indeed promising and meets a priori expectations.

    Bridging Digital and Physical Worlds Using Tangible Drag-and-Drop Interfaces

    The last ten years have seen an explosion in the diversity of digital-life devices, e.g., music and video players. However, the interaction paradigm for using these devices has remained mostly unchanged: remote controls are still the most common way to manage a digital-life device. Moreover, interaction between the devices themselves is still very limited and rarely addressed by a remote control interface. In this paper we present a study of tangible drag-and-drop, a remote control interface based on the well-known paradigm from graphical user interfaces. This interaction technique aims to reduce the gap between the digital and physical worlds by enabling the transfer of digital data from one device to another. To validate the concept, we present two prototypes, along with user studies and a general discussion of the tangible drag-and-drop technique.
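    The following is a minimal sketch of the pick-up-then-drop idea behind tangible drag-and-drop between devices. The device names, the DragSession class, and the two-phase flow are illustrative assumptions for this example, not the interface described in the paper.

```python
# Sketch of a tangible drag-and-drop session: an item is "picked up" from one
# device and "dropped" onto another. Names and classes are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Device:
    name: str
    items: list = field(default_factory=list)


@dataclass
class DragSession:
    """Holds the item picked up by the tangible controller until it is dropped."""
    payload: object = None

    def pick_up(self, source: Device, item):
        if item not in source.items:
            raise ValueError(f"{item!r} is not on {source.name}")
        source.items.remove(item)
        self.payload = item

    def drop(self, target: Device):
        if self.payload is None:
            raise RuntimeError("Nothing was picked up")
        target.items.append(self.payload)
        self.payload = None


player = Device("music player", ["song.mp3"])
tv = Device("living-room TV")

session = DragSession()
session.pick_up(player, "song.mp3")   # point the controller at the source device
session.drop(tv)                      # then point it at the target device
print(tv.items)                       # ['song.mp3']
```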

    Handheld AR for Collaborative Edutainment

    Handheld Augmented Reality (AR) is expected to provide ergonomic, intuitive user interfaces for untrained users, yet no comparative study has evaluated these assumptions against more traditional user interfaces for an education task. In this paper we compare the suitability of a handheld AR arts-history learning game against more traditional variants. We present results from a user study that demonstrate not only the effectiveness of AR for untrained users but also its fun factor and its suitability for environments such as public museums. Based on these results, we provide guidelines that can inform the design of future collaborative handheld AR applications.

    Interacting with a Tabletop Display Using a Camera Equipped Mobile Phone


    Putting location-based services on the map

    Location-based services for users on the move provide a convenient means of filtering information based on current geographical position. However, users often also want to retrieve or capture information associated with past or future locations. We show how new technologies for interactive paper can be used to augment conventional paper maps with location-based services, using a combination of user tracking and pointing at the map to specify a location.
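    As a rough illustration of the pointing step, the sketch below converts a pen position on a printed map into geographic coordinates by linear interpolation between calibrated map corners. The function, the paper dimensions, and the geographic bounds are assumptions made for the example, not details taken from the paper.

```python
# Illustrative sketch: map a pointer position on an interactive paper map to
# latitude/longitude, assuming a simple linear calibration of the printed area.
def paper_to_geo(x, y, paper_size, geo_bounds):
    """Map (x, y) in paper units to (lat, lon).

    paper_size: (width, height) of the printed map.
    geo_bounds: ((lat_min, lon_min), (lat_max, lon_max)) covered by the map.
    """
    width, height = paper_size
    (lat_min, lon_min), (lat_max, lon_max) = geo_bounds
    lon = lon_min + (x / width) * (lon_max - lon_min)
    # Paper y grows downward while latitude grows upward, so invert the axis.
    lat = lat_max - (y / height) * (lat_max - lat_min)
    return lat, lon


# Example: a pointer event near the centre of an A4-sized map,
# with arbitrary geographic bounds chosen purely for illustration.
lat, lon = paper_to_geo(105, 148, paper_size=(210, 297),
                        geo_bounds=((47.00, 8.00), (47.10, 8.10)))
print(round(lat, 4), round(lon, 4))
```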

    A Middleware-Based Application Framework for Active Space Applications

    Ubiquitous computing challenges the conventional notion of a user logged into a personal computing device, whether it is a desktop, a laptop, or a digital assistant. When the physical environment of a user contains hundreds of networked computing devices, each of which may support one or more user applications, the notion of personal computing becomes inadequate. Further, when a group of users shares such a physical environment, new forms of sharing, cooperation, and collaboration are possible, and mobile users may constantly change the computers with which they interact. We refer to these digitally augmented physical spaces as Active Spaces. In this paper we present an application framework that provides mechanisms to construct new applications for ubiquitous computing environments, or to run and adapt existing ones. The framework binds applications to users rather than devices, lets applications use multiple devices simultaneously, and exploits resource management within the user's environment that reacts to context and mobility. Our research contributes to application mobility, partitioning, and adaptation within device-rich environments, and uses context awareness to focus the resources of ubiquitous computing environments on the needs of users.
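    A minimal sketch of the central idea, binding an application to a user rather than to a device so that its output follows the user through the space, is given below. The class and method names (ActiveSpace, best_display_for, SlideshowApp) are illustrative assumptions, not the framework's actual API.

```python
# Sketch of user-bound applications in a digitally augmented space:
# output is re-targeted to a nearby display when the user moves rooms.
class Display:
    def __init__(self, name, room):
        self.name, self.room = name, room

    def show(self, content):
        print(f"[{self.name}] {content}")


class ActiveSpace:
    """Tracks which room each user is in and which displays each room offers."""
    def __init__(self):
        self.displays = []
        self.user_location = {}

    def register(self, display):
        self.displays.append(display)

    def locate(self, user, room):
        self.user_location[user] = room

    def best_display_for(self, user):
        room = self.user_location.get(user)
        return next((d for d in self.displays if d.room == room), None)


class SlideshowApp:
    """Bound to a user, not a device: rendering follows the user."""
    def __init__(self, user, space):
        self.user, self.space = user, space

    def render(self, slide):
        display = self.space.best_display_for(self.user)
        if display:
            display.show(slide)


space = ActiveSpace()
space.register(Display("wall screen", room="office"))
space.register(Display("projector", room="meeting room"))

app = SlideshowApp("alice", space)
space.locate("alice", "office")
app.render("Slide 1")          # appears on the office wall screen
space.locate("alice", "meeting room")
app.render("Slide 2")          # follows the user to the meeting-room projector
```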