2 research outputs found

    An Event-Based Data Distribution Mechanism for Collaborative Mobile Augmented Reality and Virtual Environments

    The full power of mobile augmented and virtual reality systems is realized when these systems are connected to one another, to immersive virtual environments, and to remote information servers. Connections are usually made through wireless networks. However, wireless networks cannot guarantee connectivity and their bandwidth can be highly constrained. In this paper we present a robust event-based data distribution mechanism for mobile augmented reality and virtual environments. It is based on replicated databases, pluggable networking protocols, and communication channels. We demonstrate the mechanism in the Battlefield Augmented Reality System (BARS), a situation awareness system composed of several mobile augmented reality systems, immersive and desktop-based virtual reality systems, a 2D map-based multi-modal system, handheld PCs, and other sources of information.
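The abstract above describes distribution built on communication channels and replicated databases: each node holds a local copy of shared state, and events published on a channel are applied to every replica. A minimal sketch of that pattern (hypothetical class and channel names, not the BARS implementation) might look like:

```python
# Minimal sketch of event-based data distribution over a channel.
# Each subscriber holds a replicated database (here, a dict); publishing
# an event applies it to every replica, keeping the copies consistent.

class Channel:
    def __init__(self, name):
        self.name = name
        self.replicas = []          # replicated databases subscribed to this channel

    def subscribe(self, replica):
        self.replicas.append(replica)

    def publish(self, event):
        # distribute the event to every subscriber's local copy
        for replica in self.replicas:
            replica.apply(event)

class Replica:
    def __init__(self):
        self.state = {}             # local copy of the shared database

    def apply(self, event):
        key, value = event
        self.state[key] = value

channel = Channel("tracking")
a, b = Replica(), Replica()
channel.subscribe(a)
channel.subscribe(b)
channel.publish(("user_position", (42.0, -71.0)))
assert a.state == b.state        # both replicas saw the same event
```

In a real system the `publish` step would go over a pluggable network protocol rather than an in-process loop, which is where the paper's tolerance for constrained and intermittent wireless links would come in.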

    ARVISCOPE: Georeferenced Visualization of Dynamic Construction Processes in Three-Dimensional Outdoor Augmented Reality.

    Construction processes can be conceived as systems of discrete, interdependent activities. Discrete Event Simulation (DES) has thus evolved as an effective tool to model operations that compete over available resources (personnel, material, and equipment). A DES model has to be verified and validated to ensure that it reflects a modeler’s intentions, and faithfully represents a real operation. 3D visualization is an effective means of achieving this, and facilitating the process of communicating and accrediting simulation results. Visualization of simulated operations has traditionally been achieved in Virtual Reality (VR). In order to create convincing VR animations, detailed information about an operation and the environment has to be obtained. The data must describe the simulated processes, and provide 3D CAD models of project resources, the facility under construction, and the surrounding terrain (Model Engineering). As the size and complexity of an operation increase, such data collection becomes an arduous, impractical, and often impossible task. This directly translates into loss of financial and human resources that could otherwise be productively used. In an effort to remedy this situation, this dissertation proposes an alternate approach of visualizing simulated operations using Augmented Reality (AR) to create mixed views of real existing jobsite facilities and virtual CAD models of construction resources. The application of AR in animating simulated operations has significant potential in reducing the aforementioned Model Engineering and data collection tasks, and at the same time can help in creating visually convincing output that can be effectively communicated. This dissertation presents the design, methodology, and development of ARVISCOPE, a general purpose AR animation authoring language, and ROVER, a mobile computing hardware framework. 
When used together, ARVISCOPE and ROVER can create three-dimensional AR animations of any length and complexity from the results of running DES models of engineering operations. ARVISCOPE takes advantage of advanced Global Positioning System (GPS) and orientation tracking technologies to accurately track a user’s spatial context, and georeferences superimposed 3D graphics in an augmented environment. In achieving the research objectives, major technical challenges such as accurate registration, automated occlusion handling, and dynamic scene construction and manipulation have been successfully identified and addressed.
    Ph.D. Civil Engineering
    University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/60761/1/abehzada_1.pd
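The dissertation above frames construction processes as discrete, interdependent activities competing over shared resources, which is exactly what Discrete Event Simulation models. As a minimal illustration of the idea (a hypothetical toy model, not the ARVISCOPE implementation), two activities can contend for a single piece of equipment, with the second forced to wait until the first releases it:

```python
import heapq

# Toy discrete event simulation: activities compete for a limited pool
# of resources (here, one crane). Each activity starts at the earliest
# time a crane is free; finish events are drained in time order.

def simulate(activities, crane_count=1):
    """activities: list of (name, duration). Returns finish times and
    the chronological order in which activities complete."""
    finish = {}
    free_at = [0.0] * crane_count        # next free time of each crane
    events = []                          # priority queue of (end, seq, name)
    for seq, (name, duration) in enumerate(activities):
        start = min(free_at)             # wait for the earliest-available crane
        free_at[free_at.index(start)] = start + duration
        finish[name] = start + duration
        heapq.heappush(events, (start + duration, seq, name))
    # advance the simulation clock by popping finish events in order
    order = [heapq.heappop(events)[2] for _ in range(len(events))]
    return finish, order

finish, order = simulate([("excavate", 3.0), ("haul", 2.0)])
# with one crane, "haul" cannot start until "excavate" releases it,
# so the activities finish at t=3.0 and t=5.0 respectively
```

Verifying such a model against the real operation is where the dissertation's AR visualization comes in: animating the simulated activities over the actual jobsite removes the need to hand-build CAD models of the facility and terrain.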