
    Tangible user interfaces : past, present and future directions

    In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy, and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from the cognitive sciences, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research.

    Collaboration in Augmented Reality: How to establish coordination and joint attention?

    Schnier C, Pitsch K, Dierker A, Hermann T. Collaboration in Augmented Reality: How to establish coordination and joint attention? In: Boedker S, Bouvin NO, Lutters W, Wulf V, Ciolfi L, eds. Proceedings of the 12th European Conference on Computer Supported Cooperative Work (ECSCW 2011). Springer-Verlag London; 2011: 405-416.

    We present an initial investigation from a semi-experimental setting in which an HMD-based AR system has been used for real-time collaboration in a task-oriented scenario (the design of a museum exhibition). The analysis points out the specific conditions of interacting in an AR environment and focuses on one particular practical problem for the participants in coordinating their interaction: how to establish joint attention towards the same object or referent. The analysis offers insights into how the pair of users begins to familiarize themselves with the environment and the limitations and opportunities of the setting, and how they establish new routines for, e.g., solving the 'joint attention' problem.

    An augmented reality interface for visualising and interacting with virtual content

    In this paper, a novel AR interface is proposed that provides generic solutions to the tasks involved in simultaneously augmenting different types of virtual information and processing tracking data for natural interaction. Participants within the system can experience a real-time mixture of 3D objects, static video, images, textual information, and 3D sound with the real environment. The user-friendly AR interface can achieve maximum interaction using simple but effective forms of collaboration based on combinations of human-computer interaction techniques. To prove the feasibility of the interface, indoor AR techniques are employed to construct innovative applications and demonstrate examples ranging from heritage to learning systems. Finally, an initial evaluation of the AR interface, including some initial results, is presented.

    Visualyzart Project – The role in education

    The VisualYzARt project intends to develop research on mobile platforms and web and social scenarios in order to bring augmented reality and natural interaction to the general public, aiming to study and validate the adequacy of the YVision platform in various fields of activity such as digital arts, design, education, culture, and leisure. The VisualYzARt project members analysed the components available in the YVision platform and are defining new ones that allow the creation of applications for a chosen activity, effectively adding a new language to the YVision domain. In this paper we present the role of the Instituto Politécnico de Santarém, which falls into the field of education. VisualYzARt is funded by QREN – Sistema de Incentivos à Investigação e Desenvolvimento Tecnológico (SI I&DT), Project n.º 23201 - VisualYzARt (from January 2013 to December 2014). Partners: YDreams Portugal; Instituto Politécnico de Santarém - Gabinete de e-Learning; Universidade de Coimbra - Centro de Informática e Sistemas; Instituto Politécnico de Leiria - Centro de Investigação em Informática e Comunicações; Universidade Católica do Porto - Centro de Investigação em Ciência e Tecnologia das Artes.

    ISAR: Ein Autorensystem für Interaktive Tische (ISAR: An Authoring System for Interactive Tables)

    Developing augmented reality systems involves several challenges that prevent end users and experts from non-technical domains, such as education, from experimenting with this technology. In this research we introduce ISAR, an authoring system for augmented reality tabletops targeting users from non-technical domains. ISAR allows non-technical users to create their own interactive tabletop applications and experiment with the use of this technology in domains such as education, industrial training, and medical rehabilitation.

    Using Haar-like feature classifiers for hand tracking in tabletop augmented reality

    In this paper we propose a hand interaction approach for augmented reality tabletop applications. We detect the user's hands using Haar-like feature classifiers and correlate their positions with the fixed markers on the table. This gives the user the possibility to move, rotate, and resize the virtual objects located over the table with their bare hands.
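    The core of the approach described above is mapping a hand position detected in camera pixels into table coordinates anchored by the fixed markers. A minimal sketch of that mapping is shown below; the two-marker layout, the function names, and the `hand_cascade.xml` cascade file mentioned in the comment are illustrative assumptions, not details taken from the paper.

    ```python
    def hand_to_table_coords(hand_box, marker_tl, marker_br):
        """Map the centre of a detected hand bounding box (in pixels) to
        normalised table coordinates, assuming two fixed markers are seen
        at the table's top-left and bottom-right corners."""
        x, y, w, h = hand_box
        cx, cy = x + w / 2.0, y + h / 2.0
        u = (cx - marker_tl[0]) / float(marker_br[0] - marker_tl[0])
        v = (cy - marker_tl[1]) / float(marker_br[1] - marker_tl[1])
        return u, v

    # The detection step itself could use OpenCV's cascade detector, e.g.:
    #   import cv2
    #   cascade = cv2.CascadeClassifier("hand_cascade.xml")  # hypothetical hand cascade
    #   hands = cascade.detectMultiScale(gray_frame, scaleFactor=1.1, minNeighbors=5)
    # Each detection in `hands` is an (x, y, w, h) box like `hand_box` above.

    if __name__ == "__main__":
        # Markers at pixels (100, 50) and (500, 350); hand box centred at (300, 200).
        print(hand_to_table_coords((280, 180, 40, 40), (100, 50), (500, 350)))
    ```

    A detection whose normalised coordinates fall inside a virtual object's footprint would then trigger the move, rotate, or resize interaction.
    
    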

    Design experiences of multimodal mixed reality interfaces
