
    An integrated task manager for virtual command and control

    The Task Manager is a desktop/tablet PC interface to the Battlespace research project that provides interactions and displays for supervisory control of unmanned aerial vehicles. Using a north-up map display, the Task Manager offers a direct-manipulation interface to the units involved in an engagement. It operates in two primary modes: a planning/review mode for generating mission scenarios, and a live-streaming mode that connects to a running Battlespace simulation over a network connection to edit and update path information on the fly. The goal of this research is to combine the precision of 2D mouse and pen-based interaction with the increased situational awareness provided by 3D battlefield visualizations such as the Battlespace application. Combined use of these interfaces, either by a single operator or by a small team of operators with task-specific roles, is proposed to produce a more favorable ratio of operators to units in field operations, with superior decision-making capability due to the specific nature of the interfaces.
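
    A minimal sketch of the live-streaming idea described above, assuming a hypothetical newline-delimited JSON protocol: an operator edits a unit's waypoint list and streams the change to a running simulation. The message format, unit identifier, host, and port are illustrative assumptions, not the actual Battlespace or Task Manager protocol.

```python
# Hypothetical sketch of streaming a path edit to a live simulation.
# All names, the JSON message layout, and the port are assumptions.
import json
import socket
from dataclasses import dataclass, field, asdict

@dataclass
class PathUpdate:
    unit_id: str
    waypoints: list = field(default_factory=list)  # (lat, lon, alt) tuples

def send_path_update(update: PathUpdate, host: str = "localhost", port: int = 9000) -> None:
    """Serialize a path edit and push it to the running simulation."""
    payload = json.dumps(asdict(update)).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload + b"\n")  # newline-delimited JSON message

if __name__ == "__main__":
    # Re-route a hypothetical UAV through two new waypoints while the simulation runs.
    send_path_update(PathUpdate("uav-07", [(42.03, -93.65, 500.0), (42.05, -93.60, 450.0)]))
```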

    An Introduction to 3D User Interface Design

    3D user interface design is a critical component of any virtual environment (VE) application. In this paper, we present a broad overview of three-dimensional (3D) interaction and user interfaces. We discuss the effect of common VE hardware devices on user interaction, as well as interaction techniques for generic 3D tasks and the use of traditional two-dimensional interaction styles in 3D environments. We divide most user interaction tasks into three categories: navigation, selection/manipulation, and system control. Throughout the paper, our focus is on presenting not only the available techniques but also practical guidelines for 3D interaction design and widely held myths. Finally, we briefly discuss two approaches to 3D interaction design and some example applications with complex 3D interaction requirements. We also present an annotated online bibliography as a reference companion to this article.
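
    As an illustration of the selection/manipulation category named above, the sketch below implements ray-cast selection, a canonical 3D selection technique. It is a generic example under assumed scene and naming conventions, not code from the paper.

```python
# Ray-cast selection sketch: pick the closest object hit by a pointing ray.
# Spheres as pickable proxies and all names are assumptions for brevity.
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    name: str
    center: tuple   # (x, y, z)
    radius: float

def ray_hit(origin, direction, sphere):
    """Return distance along the (normalized) ray to the sphere, or None on a miss."""
    oc = tuple(origin[i] - sphere.center[i] for i in range(3))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - sphere.radius ** 2
    disc = b * b - 4.0 * c          # quadratic with a == 1 (direction normalized)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None

def pick(origin, direction, objects):
    """Select the closest object intersected by the pointing ray, or None."""
    hits = [(t, obj) for obj in objects if (t := ray_hit(origin, direction, obj)) is not None]
    return min(hits, key=lambda h: h[0])[1] if hits else None

if __name__ == "__main__":
    scene = [Sphere("tank", (0.0, 0.0, -10.0), 1.0), Sphere("uav", (3.0, 0.0, -20.0), 1.0)]
    print(pick((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), scene))   # picks the "tank" sphere
```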

    Improving Situational Awareness in Military Operations using Augmented Reality

    During military operations, battlefields become fractured zones where confusion, noise, and ambiguity impact the achievement of tactical objectives. Situational Awareness (SA) becomes a challenge because an unstable perception of the situation leads to degraded understanding, which prevents the soldier from projecting the proper outcomes. To meet this challenge, various military projects have focused their efforts on designing integrated digital systems to support decision-making for military personnel in unknown environments. This paper presents the state of the art of military systems using Augmented Reality (AR) on the battlefield.

    Towards Live 3D Reconstruction from Wearable Video: An Evaluation of V-SLAM, NeRF, and Videogrammetry Techniques

    Mixed reality (MR) is a key technology which promises to change the future of warfare. An MR hybrid of physical outdoor environments and virtual military training will enable engagements with long-distance enemies, both real and simulated. To enable this technology, a large-scale 3D model of a physical environment must be maintained based on live sensor observations. 3D reconstruction algorithms should utilize the low cost and pervasiveness of video camera sensors, from both overhead and soldier-level perspectives. Mapping speed and 3D quality can be balanced to enable live MR training in dynamic environments. Given these requirements, we survey several 3D reconstruction algorithms for large-scale mapping for military applications given only live video. We measure 3D reconstruction performance from common structure-from-motion, visual-SLAM, and photogrammetry techniques, including the open-source algorithms COLMAP, ORB-SLAM3, and NeRF using Instant-NGP. We use the autonomous driving academic benchmark KITTI, which includes both dashboard camera video and lidar-produced 3D ground truth. With the KITTI data, our primary contribution is a quantitative evaluation of 3D reconstruction computational speed when considering live video.
    Comment: Accepted to the 2022 Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), 13 pages.
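
    A minimal sketch of the kind of live-video throughput measurement the abstract describes: feed frames to a reconstruction backend one at a time and report frames per second. The frame loader and per-frame API are hypothetical placeholders standing in for whichever pipeline (COLMAP, ORB-SLAM3, Instant-NGP, ...) is under test, not the authors' actual evaluation harness.

```python
# Timing sketch: average frames-per-second of a per-frame reconstruction backend.
# `process_frame` is a hypothetical wrapper API, assumed for illustration.
import time

def measure_throughput(frames: list, backend) -> float:
    """Feed frames to the backend one at a time and return frames per second."""
    start = time.perf_counter()
    for frame in frames:
        backend.process_frame(frame)        # hypothetical per-frame API of the pipeline under test
    elapsed = time.perf_counter() - start
    return len(frames) / elapsed if elapsed > 0 else float("inf")

class DummyBackend:
    """Stand-in for a real V-SLAM / SfM / NeRF pipeline wrapper."""
    def process_frame(self, frame) -> None:
        time.sleep(0.001)                   # simulate per-frame work

if __name__ == "__main__":
    fps = measure_throughput([object()] * 100, DummyBackend())
    print(f"{fps:.1f} frames/s")
```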

    LVC Interaction within a Mixed Reality Training System

    The United States military is increasingly pursuing advanced live, virtual, and constructive (LVC) training systems for reduced cost, greater training flexibility, and decreased training times. Combining the advantages of realistic training environments and virtual worlds, mixed reality LVC training systems can enable live and virtual trainees to interact as if co-located. However, LVC interaction in these systems often requires constructing immersive environments, developing hardware for live-virtual interaction, tracking in occluded environments, and an architecture that supports real-time transfer of entity information across many systems. This paper discusses a system that overcomes these challenges to enable LVC interaction in a reconfigurable, mixed reality environment. The system was developed and tested in an immersive, reconfigurable, mixed reality LVC training system for the dismounted warfighter at ISU, known as the Veldt, both to overcome LVC interaction challenges and to serve as a test bed for cutting-edge technology meeting future U.S. Army battlefield requirements. Trainees interact physically in the Veldt and virtually through commercial and custom-developed game engines. Evaluation involving military-trained personnel found the system to be effective, immersive, and useful for developing the critical decision-making skills necessary for the battlefield. Procedural terrain modeling, model-matching database techniques, and a central communication server process all live and virtual entity data from system components to create a cohesive virtual world across all distributed simulators and game engines in real time. The system achieves rare LVC interaction within multiple physical and virtual immersive environments for training in real time across many distributed systems.
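
    A hedged sketch of the "central communication server" role described above: accept entity-state updates from any connected simulator or game engine and rebroadcast them to every other client so all systems share one cohesive picture. The newline-delimited message framing and port are illustrative assumptions, not the Veldt's actual architecture.

```python
# Minimal entity-state relay server sketch: each received line is forwarded
# to all other connected clients. Protocol and port are assumptions.
import asyncio

clients: set[asyncio.StreamWriter] = set()

async def handle_client(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    clients.add(writer)
    try:
        while line := await reader.readline():      # one entity update per line
            for other in list(clients):
                if other is not writer:
                    other.write(line)                # relay to every other client
                    await other.drain()
    finally:
        clients.discard(writer)
        writer.close()

async def main() -> None:
    server = await asyncio.start_server(handle_client, "0.0.0.0", 9100)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```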