
    Mobile, collaborative augmented reality using cloudlets

    The evolution of mobile applications toward advanced interactivity and demanding multimedia features is still ongoing. Novel application concepts (e.g. mobile Augmented Reality (AR)) are however hindered by the inherently limited resources available on mobile platforms, notwithstanding the dramatic performance increases of mobile hardware. Offloading resource-intensive application components to the cloud, also known as "cyber foraging", has proven to be a valuable solution in a variety of scenarios. This offloading concept is also highly promising for collaborative scenarios, in which data and its processing are shared between multiple users. In this paper, we investigate the challenges posed by offloading collaborative mobile applications. We present a middleware platform capable of autonomously deploying software components to minimize average CPU load while guaranteeing smooth collaboration. As a use case, we present and evaluate a collaborative AR application offering interaction between users, the physical environment, and the virtual objects superimposed on this physical environment.
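    The cyber-foraging trade-off this abstract describes can be sketched as a simple break-even check: offload a component when remote execution plus network transfer beats local execution. The function and all numbers below are illustrative assumptions, not the paper's actual middleware model, which optimizes average CPU load across collaborating users.

```python
# Minimal sketch of a cyber-foraging offload decision.
# All parameters are hypothetical illustrations.

def should_offload(local_cycles, mobile_hz, server_hz,
                   payload_bytes, bandwidth_bps):
    """Offload when remote execution plus transfer beats local execution."""
    local_time = local_cycles / mobile_hz
    remote_time = local_cycles / server_hz + payload_bytes * 8 / bandwidth_bps
    return remote_time < local_time

# e.g. a feature-extraction task on one compressed camera frame
print(should_offload(
    local_cycles=2e9,       # ~2 GCycles of work
    mobile_hz=1e9,          # 1 GHz mobile CPU
    server_hz=3e9,          # 3 GHz cloudlet CPU
    payload_bytes=500_000,  # one compressed frame
    bandwidth_bps=50e6,     # 50 Mbit/s WLAN
))  # prints True: ~0.75 s remote vs 2 s local
```

    A real middleware would also weigh energy use and, as in the paper's collaborative setting, the benefit of sharing one offloaded component among several users.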

    Collaborative Augmented Reality

    Over the past number of years, augmented reality (AR) has become increasingly pervasive as a consumer-level technology. The principal drivers of its recent development have been the evolution of mobile and handheld devices, in conjunction with algorithms and techniques from fields such as 3D computer vision. Various commercial platforms and SDKs are now available that allow developers to quickly develop mobile AR apps with minimal understanding of the underlying technology. Much of the focus to date, in both the research and commercial environments, has been on single-user AR applications. Just as collaborative mobile applications have played a demonstrated role in the increasing popularity of mobile devices, we believe collaborative AR systems present a compelling use case for AR technology. The aim of this thesis is the development of a mobile collaborative augmented reality framework. We identify the elements required in the design and implementation stages of collaborative AR applications. Our solution enables developers to easily create multi-user mobile AR applications in which users can cooperatively interact with the real environment in real time. It increases the sense of collaborative spatial interaction without requiring complex infrastructure. Assuming the given low-level communication and AR libraries have modular structures, the proposed approach is modular and flexible enough to adapt to their requirements without requiring any major changes.

    Tell me, show me, involve me: Supercharging Collaborative Diagnosis with Augmented Reality

    Augmented reality has been broadly employed to help remote individuals communicate and coordinate. In this study, we develop and test a model that explains how augmented reality can facilitate collaborative diagnosis of an unexpected technical breakdown involving two complete strangers. Drawing on affordance theory, we integrate the dual-task interference literature to reveal frustration valence and arousal as the underlying mechanisms. We tested our hypotheses in a laboratory experiment involving a custom-built augmented reality environment and physiological measurements. Overall, this study contributes to the information systems literature, human-computer interaction literature, and dual-task interference research by unearthing the effects of augmented reality characteristics on enhancing collaborative diagnosis performance.

    A component-based approach towards mobile distributed and collaborative PTAM

    Having numerous sensors on board, smartphones have rapidly become a very attractive platform for augmented reality applications. Although the computational resources of mobile devices keep growing, they still cannot match commonly available desktop hardware, which results in downscaled versions of well-known computer vision techniques that sacrifice accuracy for speed. We propose a component-based approach to mobile augmented reality applications, in which components can be configured and distributed at runtime, yielding a performance increase by offloading CPU-intensive tasks to a server in the network. By sharing distributed components between multiple users, collaborative AR applications can easily be developed. In this poster, we present a component-based implementation of the Parallel Tracking And Mapping (PTAM) algorithm, enabling components to be distributed to achieve a mobile, distributed version of the original PTAM algorithm, as well as a collaborative scenario.
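    A runtime placement decision of the kind this poster describes, splitting PTAM-style components between device and server, might look like the following sketch. The component names, CPU loads, and greedy placement rule are hypothetical illustrations, not the poster's actual component model.

```python
# Hypothetical sketch: keep latency-critical AR components on the
# device, offload the rest while the mobile CPU budget allows.

COMPONENTS = {
    "camera":      {"cpu_load": 0.05, "latency_critical": True},
    "tracker":     {"cpu_load": 0.30, "latency_critical": True},
    "mapper":      {"cpu_load": 0.50, "latency_critical": False},
    "relocaliser": {"cpu_load": 0.25, "latency_critical": False},
}

def place_components(components, cpu_budget=0.5):
    """Pin latency-critical components to the device; greedily keep
    non-critical ones local only while they fit the CPU budget."""
    placement = {}
    local_load = 0.0
    # Latency-critical components must run on the device.
    for name, c in components.items():
        if c["latency_critical"]:
            placement[name] = "mobile"
            local_load += c["cpu_load"]
    # Consider the heaviest remaining components first.
    rest = sorted(
        (n for n, c in components.items() if not c["latency_critical"]),
        key=lambda n: components[n]["cpu_load"], reverse=True)
    for name in rest:
        if local_load + components[name]["cpu_load"] <= cpu_budget:
            placement[name] = "mobile"
            local_load += components[name]["cpu_load"]
        else:
            placement[name] = "server"
    return placement

print(place_components(COMPONENTS))
# e.g. {'camera': 'mobile', 'tracker': 'mobile',
#       'mapper': 'server', 'relocaliser': 'server'}
```

    Sharing the offloaded "mapper" component between several users is one way the collaborative scenario in the poster could be realized: multiple trackers feeding a single server-side map.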

    3D and Augmented Reality (AR) Learning Modules for Ancient Civilization Represented Visually in Rock-cut Buddhist Caves in Yungang, China Budget

    Budget for the MDC-FIU collaborative research project 3D and Augmented Reality (AR) Learning Module.

    Dimensions of Mobile Augmented Reality for Learning: A First Inventory

    Specht, M., Ternier, S., & Greller, W. (2011). Dimensions of Mobile Augmented Reality for Learning: A First Inventory. Journal of the Research Center for Educational Technology (RCET), 7(1), 117-127. Spring 2011.
    This article discusses technological developments and applications of mobile augmented reality (AR) and their application in learning. Augmented reality interaction design patterns are introduced, and educational patterns for supporting certain learning objectives with AR approaches are discussed. The article then identifies several dimensions of user context that can be sensed by mobile devices and used for the contextualization of learning experiences. Finally, an AR game concept, “Locatory”, is presented that combines game logic with collaborative game play and personalized mobile augmented reality visualization.

    Outcome Evaluation Metrics: 3D and Augmented Reality (AR) Learning Modules for Ancient Civilization Represented Visually in Rock-cut Buddhist Caves in Yungang, China

    Evaluation metrics for the MDC-FIU collaborative research project 3D and Augmented Reality (AR) Learning Module.

    Investigating Spatial Augmented Reality for Collaborative Design
