3,997 research outputs found

    Mobile augmented reality based 3D snapshots

    We describe a mobile augmented reality application based on 3D snapshotting from multiple photographs. Optical square markers anchor the reconstructed virtual objects in the scene, and a novel approach based on pixel flow greatly improves tracking performance. This dual tracking approach also enables a new single-button user interface metaphor for moving virtual objects in the scene. The development of the AR viewer was accompanied by user studies confirming the chosen approach.
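As a rough illustration of the pixel-flow idea (the feature points and anchor below are hypothetical, not the authors' implementation), the displacement of tracked points between frames can be averaged and used to predict where the marker anchor moved when marker detection momentarily fails:

```python
# Minimal sketch: predict a marker anchor position from average pixel flow.
# Feature points are invented; a real system would obtain them from optical
# flow (e.g., Lucas-Kanade) on consecutive camera frames.

def mean_flow(prev_pts, curr_pts):
    """Average 2D displacement of matched feature points."""
    dx = sum(c[0] - p[0] for p, c in zip(prev_pts, curr_pts)) / len(prev_pts)
    dy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / len(prev_pts)
    return dx, dy

def predict_anchor(anchor, prev_pts, curr_pts):
    """Shift the last known marker anchor by the estimated pixel flow."""
    dx, dy = mean_flow(prev_pts, curr_pts)
    return anchor[0] + dx, anchor[1] + dy

prev_pts = [(10, 10), (20, 10), (15, 20)]
curr_pts = [(12, 11), (22, 11), (17, 21)]  # every point moved by (+2, +1)
print(predict_anchor((100, 100), prev_pts, curr_pts))  # (102.0, 101.0)
```

The sketch shows only the fallback path; in a dual-tracking design the marker detector would overwrite this prediction whenever the marker is visible again.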

    DualStream: Spatially Sharing Selves and Surroundings using Mobile Devices and Augmented Reality

    In-person human interaction relies on our spatial perception of each other and our surroundings. Current remote communication tools partially address each of these aspects. Video calls convey real user representations but without spatial interactions. Augmented and Virtual Reality (AR/VR) experiences are immersive and spatial but often use virtual environments and characters instead of real-life representations. Bridging these gaps, we introduce DualStream, a system for synchronous mobile AR remote communication that captures, streams, and displays spatial representations of users and their surroundings. DualStream supports transitions between user and environment representations with different levels of visuospatial fidelity, as well as the creation of persistent shared spaces using environment snapshots. We demonstrate how DualStream can enable spatial communication in real-world contexts, and support the creation of blended spaces for collaboration. A formative evaluation of DualStream revealed that users valued the ability to interact spatially and move between representations, and could see DualStream fitting into their own remote communication practices in the near future. Drawing from these findings, we discuss new opportunities for designing more widely accessible spatial communication tools, centered around the mobile phone.
    Comment: 10 pages, 4 figures, 1 table; To appear in the proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 202
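The idea of moving between representations at different levels of visuospatial fidelity can be pictured as stepping through an ordered list of modes; the three levels below are purely illustrative and not DualStream's actual modes:

```python
# Toy sketch: step a shared "self" representation up or down in fidelity.
# The level names are invented for illustration.
FIDELITY_LEVELS = ["audio_only", "2d_video_billboard", "3d_spatial_capture"]

def change_fidelity(current, direction):
    """Move one step up (+1) or down (-1), clamped to the available levels."""
    i = FIDELITY_LEVELS.index(current) + direction
    i = max(0, min(i, len(FIDELITY_LEVELS) - 1))
    return FIDELITY_LEVELS[i]

print(change_fidelity("2d_video_billboard", +1))  # 3d_spatial_capture
print(change_fidelity("audio_only", -1))          # audio_only (clamped)
```

Clamping at the ends keeps a user in a valid mode even if they keep requesting lower or higher fidelity than the system offers.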

    Piloting Multimodal Learning Analytics using Mobile Mixed Reality in Health Education

    © 2019 IEEE. Mobile mixed reality has been shown to increase achievement and lower cognitive load within spatial disciplines. However, traditional methods of assessment restrict examiners' ability to holistically assess spatial understanding. Multimodal learning analytics investigates how combinations of data types, such as spatial data and traditional assessment, can be combined to better understand both the learner and the learning environment. This paper explores the pedagogical possibilities of a smartphone-enabled mixed reality multimodal learning analytics case study for health education, focused on learning the anatomy of the heart. The context for this study is the first loop of a design-based research study exploring the acquisition and retention of knowledge by piloting the proposed system with practicing health experts. Outcomes from the pilot study showed engagement and enthusiasm for the method among the experts, but also revealed problems to overcome in the pedagogical method before deployment with learners.
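The multimodal combination described above can be imagined as fusing each learner's traditional assessment score with a spatial-interaction measure into one normalised record; the field names, sample data, and normalisation below are illustrative only, not the study's actual instrumentation:

```python
# Sketch: fuse two modalities per learner -- a quiz score and time spent
# interacting with the 3D heart model. All names and values are invented.

def fuse(records):
    """Normalise both modalities to [0, 1] so they can be compared side by side."""
    max_time = max(r["seconds_on_model"] for r in records)
    return [
        {
            "learner": r["learner"],
            "quiz": r["quiz_score"] / 100,
            "spatial": r["seconds_on_model"] / max_time,
        }
        for r in records
    ]

rows = fuse([
    {"learner": "A", "quiz_score": 80, "seconds_on_model": 120},
    {"learner": "B", "quiz_score": 60, "seconds_on_model": 240},
])
print(rows[0])  # {'learner': 'A', 'quiz': 0.8, 'spatial': 0.5}
```

Putting both modalities on the same scale is what lets an examiner see, for example, a learner who scores well on the quiz but barely explores the model spatially.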

    ConXsense - Automated Context Classification for Context-Aware Access Control

    We present ConXsense, the first framework for context-aware access control on mobile devices based on context classification. Previous context-aware access control systems often require users to laboriously specify detailed policies, or they rely on pre-defined policies that do not adequately reflect the true preferences of users. We present the design and implementation of a context-aware framework that uses a probabilistic approach to overcome these deficiencies. The framework utilizes context sensing and machine learning to automatically classify contexts according to their security- and privacy-related properties. We apply the framework to two important smartphone-related use cases: protection against device misuse using a dynamic device lock and protection against sensory malware. We ground our analysis on a sociological survey examining the perceptions and concerns of users related to contextual smartphone security, and analyze the effectiveness of our approach with real-world context data. We also demonstrate the integration of our framework with the FlaskDroid architecture for fine-grained access control enforcement on the Android platform.
    Comment: Recipient of the Best Paper Award
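A toy sketch of the context-classification idea: label a sensed context from a few features by comparing it to previously labelled contexts. The features, labels, and 1-nearest-neighbour rule here are invented for illustration; ConXsense itself uses real sensor data and trained probabilistic models:

```python
# Toy sketch: label a context from sensed features with 1-nearest-neighbour.
# Features are (familiar_devices_nearby, hours_spent_at_location); both the
# features and the training labels are made up for illustration.
import math

TRAINING = [
    ((5, 40.0), "private"),   # many familiar devices, long dwell time
    ((4, 35.0), "private"),
    ((0, 0.5), "public"),     # no familiar devices, transient location
    ((1, 1.0), "public"),
]

def classify(context):
    """Return the label of the nearest labelled training context."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(TRAINING, key=lambda t: dist(t[0], context))[1]

print(classify((4, 38.0)))  # private
print(classify((0, 0.2)))   # public
```

A policy layer could then map the predicted label to an action, e.g. relaxing the device lock in a "private" context and enabling it in a "public" one.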

    Augmented Reality in Astrophysics

    Augmented Reality consists of merging live images with virtual layers of information. The rapid growth in the popularity of smartphones and tablets over recent years has provided a large base of potential users of Augmented Reality technology, and virtual layers of information can now be attached to a wide variety of physical objects. In this article, we explore the potential of Augmented Reality for astrophysical research with two distinct experiments: (1) Augmented Posters and (2) Augmented Articles. We demonstrate that the emerging technology of Augmented Reality can already be used and implemented without expert knowledge using currently available apps. Our experiments highlight the potential of Augmented Reality to improve the communication of scientific results in the field of astrophysics. We also present feedback gathered from the Australian astrophysics community that reveals evidence of some interest in this technology by astronomers who experimented with Augmented Posters. In addition, we discuss possible future trends for Augmented Reality applications in astrophysics, and explore the current limitations associated with the technology. This Augmented Article, the first of its kind, is designed to allow the reader to directly experiment with this technology.
    Comment: 15 pages, 11 figures. Accepted for publication in Ap&SS. The final publication will be available at link.springer.co

    Mapping, sensing and visualising the digital co-presence in the public arena

    This paper reports on work carried out within the Cityware project using mobile technologies to map, visualise and project digital co-presence in the city. The paper focuses on two pilot studies exploring the Bluetooth landscape in the city of Bath. We apply adapted, ‘digitally augmented’ methods for spatial observation and analysis based on established methods used extensively in the space syntax approach to urban design. We map the physical and digital flows at the macro level and observe static space use at the micro level. In addition, we look at social and mobile behaviour from an individual’s point of view, applying a method based on intervention: ‘sensing and projecting’ Bluetooth names and digital identity in the public arena. We present early findings in terms of patterns of Bluetooth flow and presence, and outline initial observations about how people react to the projection of their Bluetooth names in public. In particular, we note the importance of constructing socially meaningful relations between people mediated by these technologies. We discuss initial results and outline the issues raised in detail before finally describing ongoing work.
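The macro-level mapping of Bluetooth co-presence can be imagined as counting unique discoverable devices per observation window; the scan log and window size below are fabricated for illustration, not the Cityware data:

```python
# Sketch: aggregate Bluetooth scan records into per-window co-presence counts.
# Each record is (timestamp_minute, device_id); the sample data is invented.
from collections import defaultdict

SCANS = [
    (0, "aa:01"), (0, "aa:02"), (3, "aa:01"),
    (12, "aa:03"), (14, "aa:03"), (14, "aa:04"), (17, "aa:05"),
]

def copresence(scans, window=10):
    """Count unique devices seen in each `window`-minute bucket."""
    buckets = defaultdict(set)
    for minute, device in scans:
        buckets[minute // window].add(device)
    return {w: len(devs) for w, devs in sorted(buckets.items())}

print(copresence(SCANS))  # {0: 2, 1: 3}
```

Using a set per bucket means a device rediscovered within the same window (like "aa:01" above) is counted once, which is what a presence measure, as opposed to a raw scan count, requires.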
