
    Urban Air Mobility System Testbed Using CAVE Virtual Reality Environment

    Urban Air Mobility (UAM) refers to a system of air passenger and small cargo transportation within an urban area. The UAM framework also includes other urban Unmanned Aerial Systems (UAS) services that will be supported by a mix of onboard, ground, piloted, and autonomous operations. Over the past few years, UAM research has gained wide interest from companies and federal agencies as an innovative, on-demand transportation option that can help reduce traffic congestion and pollution as well as increase mobility in metropolitan areas. The concepts of UAM/UAS operation in the National Airspace System (NAS) remain an active area of research to ensure safe and efficient operations. With new developments in smart vehicle design and infrastructure for air traffic management, there is a need for methods to integrate and test the various components of the UAM framework. In this work, we report on the development of a virtual reality (VR) testbed using Cave Automatic Virtual Environment (CAVE) technology for human-automation teaming and airspace operations research for UAM. Using a four-wall projection system with motion capture, the CAVE provides an immersive virtual environment with real-time full-body tracking capability. We created a virtual environment consisting of the city of San Francisco and a vertical take-off-and-landing passenger aircraft that can fly between a downtown location and San Francisco International Airport. The aircraft can be operated autonomously or manually by a single pilot who maneuvers the aircraft using a flight control joystick. The interior of the aircraft includes a virtual cockpit display with vehicle heading, location, and speed information. The system can record simulation events and flight data for post-processing. The system parameters are customizable for different flight scenarios; hence, the CAVE VR testbed provides a flexible method for the development and evaluation of the UAM framework.
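
    The abstract does not publish the testbed's data schema; purely as a hedged illustration of the kind of per-frame flight record such a system might log for post-processing (field names, units, and values below are assumptions, not the paper's format), a minimal Python sketch:

```python
# Minimal sketch of a per-frame flight log record for a UAM VR testbed.
# Field names and units are illustrative assumptions, not the paper's schema.
from dataclasses import dataclass, asdict
from enum import Enum
import json, time

class ControlMode(Enum):
    AUTONOMOUS = "autonomous"
    MANUAL = "manual"          # single pilot with a flight control joystick

@dataclass
class FlightStateRecord:
    t: float                   # simulation time [s]
    mode: ControlMode
    lat: float                 # vehicle location
    lon: float
    alt_m: float
    heading_deg: float         # shown on the virtual cockpit display
    speed_mps: float

def log_record(fp, rec: FlightStateRecord) -> None:
    """Append one record as a JSON line for later post-processing."""
    row = asdict(rec)
    row["mode"] = rec.mode.value
    fp.write(json.dumps(row) + "\n")

if __name__ == "__main__":
    with open("flight_log.jsonl", "w") as fp:
        rec = FlightStateRecord(t=time.monotonic(), mode=ControlMode.MANUAL,
                                lat=37.7749, lon=-122.4194, alt_m=300.0,
                                heading_deg=135.0, speed_mps=45.0)
        log_record(fp, rec)
```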

    Utilizing Immersive Technologies in the Air Traffic Control Domain

    The Federal Aviation Administration (FAA) holds a vital role in the United States, employing over 14,000 Air Traffic Control/Management (ATC/ATM) specialists responsible for managing roughly 43,000 flights each day. ATC education “wash-out” rates have shown that there is a disconnect between the training process and the implementation of cognitively demanding, safety-critical ATC duties. The purpose of this research was to investigate if, how, and where immersive technologies (i.e., augmented, virtual, and mixed reality) could be helpful within the ATC/ATM educational domain. To accomplish the overall research goal, subject matter expert (SME) interviews were conducted and a potential educational tool was developed and tested in two distinct research phases. Eighteen (N = 18) subjects volunteered to participate across both phases, and the tool was rated above average, meaning the tool is usable in its current form; however, further development is suggested and expected.

    Games and Brain-Computer Interfaces: The State of the Art

    BCI gaming is a very young field; most games are proof-of-concepts. Work that compares BCIs in game environments with traditional BCIs indicates no negative effects, or even a positive effect of the rich visual environments on performance. The low transfer rate of current BCIs poses a problem for the control of a game. This is often solved by changing the goal of the game. Multi-modal input with BCI forms a promising solution, as does assigning more meaningful functionality to BCI control.
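
    The survey above does not prescribe a specific fusion scheme; as one hedged illustration of multi-modal input, a slow, noisy BCI decision can select a discrete in-game goal while a conventional controller supplies continuous, high-rate movement. The function and field names below are placeholders, not from the survey:

```python
# Illustrative sketch (not from the survey): fusing a low-rate BCI output with a
# fast joystick channel. The BCI picks a discrete goal; the joystick steers.
import random

def bci_decision(eeg_window=None) -> str:
    """Placeholder classifier: a real system would decode an EEG window here.
    Returns one of a few discrete commands at a low rate (every few seconds)."""
    return random.choice(["select_spell", "open_door", "no_op"])

def joystick_axes() -> tuple[float, float]:
    """Placeholder for a continuous, high-rate control channel."""
    return 0.1, -0.3   # (x, y) in [-1, 1]

def game_step(state: dict) -> dict:
    x, y = joystick_axes()                 # continuous movement every frame
    state["pos"] = (state["pos"][0] + x, state["pos"][1] + y)
    state["frame"] += 1
    if state["frame"] % 180 == 0:          # BCI consulted only occasionally
        state["last_goal"] = bci_decision()
    return state

state = {"pos": (0.0, 0.0), "frame": 0, "last_goal": "no_op"}
for _ in range(360):
    state = game_step(state)
print(state["last_goal"], state["pos"])
```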

    Prefrontal cortex activation upon a demanding virtual hand-controlled task: A new frontier for neuroergonomics

    Functional near-infrared spectroscopy (fNIRS) is a non-invasive, vascular-based functional neuroimaging technology that can assess, simultaneously from multiple cortical areas, concentration changes in oxygenated-deoxygenated hemoglobin at the level of the cortical microcirculation blood vessels. fNIRS, with its high degree of ecological validity and its very limited requirement of physical constraints on subjects, could represent a valid tool for monitoring cortical responses in the research field of neuroergonomics. In virtual reality (VR), real situations can be replicated with greater control than is obtainable in the real world. Therefore, VR is the ideal setting in which studies of neuroergonomics applications can be performed. The aim of the present study was to investigate, by a 20-channel fNIRS system, the dorsolateral/ventrolateral prefrontal cortex (DLPFC/VLPFC) in subjects while performing a demanding VR hand-controlled task (HCT). Considering the complexity of the HCT, its execution should require the allocation of attentional resources and the integration of different executive functions. The HCT simulates the interaction with a real, remotely-driven system operating in a critical environment. The hand movements were captured by a high spatial and temporal resolution 3-dimensional (3D) hand-sensing device, the LEAP motion controller, a gesture-based control interface that could be used in VR for tele-operated applications. Fifteen university students were asked to guide, with their right hand/forearm, a virtual ball (VB) over a virtual route (VROU) reproducing a 42 m narrow road including some critical points. The subjects tried to travel as far as possible without making the VB fall. The distance traveled by the guided VB was 70.2 ± 37.2 m. The less skilled subjects failed several times in guiding the VB over the VROU. Nevertheless, a bilateral VLPFC activation, in response to the HCT execution, was observed in all the subjects. No correlation was found between the distance traveled by the guided VB and the corresponding cortical activation. These results confirm the suitability of fNIRS technology to objectively evaluate cortical hemodynamic changes occurring in VR environments. Future studies could contribute to a better understanding of the cognitive mechanisms underlying human performance in either expert or non-expert operators during the simulation of different demanding/fatiguing activities.
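
    The study's statistics are not reproduced in this abstract; as a minimal sketch of the distance-versus-activation correlation it reports, one could compute a Pearson correlation between each subject's traveled distance and a per-subject activation measure. The choice of mean task-related HbO2 change as that measure, and all numbers below, are assumptions for illustration:

```python
# Minimal sketch: Pearson correlation between distance traveled and a per-subject
# activation measure (e.g. mean task-related HbO2 increase in VLPFC).
# The arrays below are made-up placeholders, not the study's data.
import numpy as np

distance_m = np.array([42.0, 55.3, 70.1, 88.9, 120.4])   # hypothetical values
delta_hbo2 = np.array([0.21, 0.18, 0.25, 0.19, 0.23])    # hypothetical values

r = np.corrcoef(distance_m, delta_hbo2)[0, 1]
print(f"Pearson r = {r:.2f}")   # the paper reports no significant correlation
```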

    A Testing and Experimenting Environment for Microscopic Traffic Simulation Utilizing Virtual Reality and Augmented Reality

    Microscopic traffic simulation (MTS) is the emulation of real-world traffic movements in a virtual environment with various traffic entities. Typically, the movements of the vehicles in MTS follow predefined algorithms, e.g., car-following models and lane-changing models. Moreover, existing MTS models only provide a limited capability of two- and/or three-dimensional displays that often restrict the user’s viewpoint to a flat screen. Their downscaled scenes neither provide a realistic representation of the environment nor allow different users to simultaneously experience or interact with the simulation model from different perspectives. These limitations neither allow traffic engineers to effectively disseminate their ideas to various stakeholders of different backgrounds nor allow analysts to obtain realistic data about vehicle or pedestrian movements. This dissertation intends to alleviate those issues by creating a framework and a prototype for a testing environment where MTS can take inputs from user-controlled vehicles and pedestrians to improve its traffic entity movement algorithms, and where the simulation can be visualized in an immersive M3 (multi-mode, multi-perspective, multi-user) fashion using Virtual Reality (VR) and Augmented Reality (AR) technologies. The VR environments are created using highly realistic 3D models and scenes. With modern game engines and hardware available on the market, these VR applications can provide a highly realistic and immersive experience for a user. Different experiments performed by real users in this study show that utilizing VR technology for traffic-related experiments generated much more favorable results than traditional displays. Moreover, using AR technologies for pedestrian studies is a novel approach that allows a user to walk in the real world and the simulation world at a one-to-one scale. This capability opens a whole new avenue of user experiment possibilities. On top of that, the in-environment communication chat system will allow researchers to perform different Advanced Driver Assistance System (ADAS) studies without ever needing to leave the simulation environment. Last but not least, the distributed nature of the framework enables users to participate from different geographic locations with their choice of display device (desktop, smartphone, VR, or AR). The prototype developed for this dissertation is readily available on a test webpage, and a user can easily download the prototype application without needing to install anything. The user can also run the remote MTS server and then connect their client application to the server.
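
    The dissertation's own movement algorithms are not given in the abstract; as a hedged example of the kind of predefined car-following model it refers to, here is the standard Intelligent Driver Model (IDM) acceleration update. The parameter values are typical textbook choices, not the author's:

```python
# Standard Intelligent Driver Model (IDM) car-following update -- shown only as
# an example of the "predefined algorithms" the abstract mentions.
import math

def idm_acceleration(v, v_lead, gap,
                     v0=30.0,    # desired speed [m/s]
                     T=1.5,      # desired time headway [s]
                     a_max=1.0,  # maximum acceleration [m/s^2]
                     b=1.5,      # comfortable deceleration [m/s^2]
                     s0=2.0,     # minimum gap [m]
                     delta=4.0):
    """Acceleration of a follower at speed v behind a leader at speed v_lead,
    separated by bumper-to-bumper distance `gap`."""
    dv = v - v_lead
    s_star = s0 + v * T + v * dv / (2.0 * math.sqrt(a_max * b))
    return a_max * (1.0 - (v / v0) ** delta - (s_star / gap) ** 2)

# One explicit Euler step of a follower approaching a slower leader:
v, gap, dt = 25.0, 40.0, 0.1
a = idm_acceleration(v, v_lead=20.0, gap=gap)
v_next = max(0.0, v + a * dt)
print(f"a = {a:.2f} m/s^2, next speed = {v_next:.2f} m/s")
```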

    On the History and Prospects of Three-Dimensional Human-Computer Interfaces for the provision of Air Traffic Control Services

    This paper is an essay on the history and prospects of three-dimensional (3D) human-computer interfaces for the provision of air traffic control services. Over the past twenty-five years, many empirical studies have addressed this topic. However, the results have been deemed incoherent and self-contradictory, and no common conclusion has been reached. To escape from the deadlock of the experimental approach, this study takes a step back into the conceptual development of 3D interfaces, addressing the fundamental benefits and drawbacks of 3D rendering. In this light, many results in the literature start to make sense and some conclusions can be drawn. Also, with an emphasis on the future of air traffic control, this research identifies a set of tasks wherein the intrinsic weaknesses of 3D rendering can be minimized and its advantages can be exploited: those that do not require accurate estimates of distances or angles. For future developments in the field of 3D interfaces for air traffic control operators, we suggest focusing on those tasks only.

    The Design of an Electro-Mechanical Bicycle for an Immersive Virtual Environment

    Roughly 50,000 people are injured in bicycle collisions with motor vehicles each year; approximately 6,000 of these injuries involve children under 14 years old. To better understand which factors put bicycling children at risk for motor vehicle collisions, researchers at the University of Iowa built a virtual environment that simulates the experience of riding through a town and crossing roads with motor vehicle traffic. The stationary bicycle, the focus of this report, replicates the pedal forces experienced by a rider. The stationary bike also provides the simulator with the bicycle’s velocity and steering angle. This report describes the design of the system, which features a flywheel designed to represent the rider and bike inertia, the mechanical linkages between the rider and an electric motor, and a system to measure steering angles. The bicycle has been built and tested and is currently in use in the virtual environment.
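
    The report's actual design numbers are not in the abstract; as a hedged back-of-the-envelope check of the flywheel idea, a flywheel spun through a drive ratio can match the rider-plus-bike translational inertia by equating kinetic energies. The mass, wheel radius, and ratio below are assumptions chosen only for illustration:

```python
# Back-of-the-envelope flywheel sizing by kinetic-energy equivalence.
# Numbers are illustrative assumptions, not the Iowa design's actual parameters.
import math

m_rider_bike = 60.0      # combined rider + bike mass to emulate [kg]
r_wheel = 0.33           # driven wheel/roller radius [m]
G = 8.0                  # speed-up ratio: flywheel spins G times faster than wheel

# Equate 0.5*m*v^2 with 0.5*I*omega^2, where omega = G*v/r_wheel:
I_required = m_rider_bike * r_wheel**2 / G**2        # [kg m^2]

# A solid steel disc (rho ~ 7850 kg/m^3) of thickness t has I = 0.5*rho*pi*t*R^4;
# solve for the disc radius R given a chosen thickness.
rho, t = 7850.0, 0.025
R = (2.0 * I_required / (rho * math.pi * t)) ** 0.25
print(f"I_required = {I_required:.3f} kg*m^2, disc radius ~ {R*100:.1f} cm")
```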

    Visualizing bacteria-carrying particles in the operating room: exposing invisible risks

    Surgical site infections occur due to contamination of the wound area by bacteria-carrying particles during surgery. There are many surgery preparation conditions that might block the path of clean air in the operating room, consequently increasing the contamination level at the surgical zone. The main goal of the current study is to translate this knowledge into a perceivable tool for the medical staff by applying state-of-the-art simulation and visualization techniques. In this work, the results of numerical simulations are used to inform the visualization. These results predict the airflow fields in operating rooms equipped with mixing, laminar airflow, and temperature-controlled airflow ventilation systems. In this regard, the visualization uses a virtual reality interface to translate the computational fluid dynamics simulations into usable animations. The results of this study help the surgical and technical staff to update their procedures by using the provided virtual tools.
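
    The paper's CFD-to-VR pipeline is not detailed in the abstract; as a hedged sketch of the general technique, massless particle paths can be traced through a precomputed velocity field and handed to the visualization as polylines. The grid, the placeholder flow field, and the step sizes below are all assumptions:

```python
# Sketch: trace particle paths through a precomputed CFD velocity field on a
# regular grid (nearest-cell lookup, explicit Euler). Placeholder data only.
import numpy as np

nx, ny, nz, h = 32, 32, 16, 0.1            # grid size and spacing [m]
# Placeholder velocity field [m/s]; a real pipeline would load CFD output here.
U = np.zeros((nx, ny, nz, 3)); U[..., 2] = -0.25   # gentle downward flow

def velocity_at(p):
    i, j, k = np.clip((p / h).astype(int), 0, [nx - 1, ny - 1, nz - 1])
    return U[i, j, k]

def trace(p0, dt=0.05, steps=200):
    path, p = [p0.copy()], p0.copy()
    for _ in range(steps):
        p = p + velocity_at(p) * dt
        path.append(p.copy())
    return np.array(path)                   # polyline to hand to the VR renderer

seed = np.array([1.6, 1.6, 1.5])            # e.g. a point above the surgical zone
print(trace(seed).shape)
```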

    5G Visualization: The METIS-II Project Approach

    One of the main objectives of the METIS-II project was to enable 5G concepts to reach and convince a wide audience, from technology experts to decision makers from non-ICT industries. To achieve this objective, it was necessary to provide easy-to-understand and insightful visualization of 5G. This paper presents the visualization platform developed in the METIS-II project as a joint work of researchers and artists: a 3D visualization tool that allows viewers to interact with 5G-enabled scenarios while permitting simulation-driven data to be intuitively evaluated. The platform is a game-based, customizable tool that allows rapid integration of new concepts, allows real-time interaction with remote 5G simulators, and provides a virtual reality-based immersive user experience. As a result, the METIS-II visualization platform has successfully contributed to the dissemination of 5G in different fora, and its use will be continued after METIS-II.
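
    The platform's actual interface to remote simulators is not specified in the summary above; purely as a hedged sketch of the general pattern (simulation-driven data streamed in real time to a visualization client), here is a minimal receiver. The transport, port, and JSON field names are assumptions, not the METIS-II protocol:

```python
# Minimal sketch of a visualization client receiving simulation-driven data in
# real time. UDP transport, port, and JSON fields are assumptions made for
# illustration; they are not the METIS-II platform's actual protocol.
import json, socket

def run_client(host="0.0.0.0", port=9000):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    sock.settimeout(5.0)
    try:
        while True:
            data, _ = sock.recvfrom(65535)
            frame = json.loads(data.decode("utf-8"))
            # Hand the decoded key performance indicators to the 3D scene,
            # e.g. per-cell throughput or latency used to color the scenario.
            print(frame.get("timestamp"), frame.get("kpis"))
    except socket.timeout:
        print("no simulator data received; stopping")
    finally:
        sock.close()

if __name__ == "__main__":
    run_client()
```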

    Classic Driver VR

    A VR car-driving simulator for evaluating the user experience of new drivers and helping them learn driving rules and regulations. The Classic VR Driver helps new drivers learn driving rules and regulations through various audio and visual feedback, and helps them get acquainted with the risks and mistakes associated with real-life driving. In addition, users play the game in an immersive environment using a virtual reality system. This project attempts to fulfill two important goals. The major goal is to evaluate whether the user can learn the driving rules and regulations of the road. The game allows users to take a road test; the road test determines the types of mistakes the user makes and whether they passed or failed. I conducted A/B testing and had the testers participate in user interviews and a user survey. The testing procedure allowed me to analyze the effectiveness of learning driving rules from the simulator compared to learning rules from the RMV (Registry of Motor Vehicles) manual. Secondly, the user experience was evaluated by allowing users to participate in user interviews and user surveys, which helped me understand the positives and drawbacks of the game. This feedback is taken into consideration for future improvement. All these factors were considered to make the game as enjoyable and useful as possible in terms of skill training.
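
    The thesis's statistics are not reported in the abstract above; purely as an illustration of how the A/B road-test pass rates (simulator group versus RMV-manual group) could be compared, here is a two-proportion z-test in plain Python. The counts are invented, and with samples this small an exact test would be more appropriate:

```python
# Illustrative two-proportion z-test for A/B road-test pass rates.
# Counts are invented placeholders; small samples really call for an exact test.
import math

def two_proportion_z(pass_a, n_a, pass_b, n_b):
    p_a, p_b = pass_a / n_a, pass_b / n_b
    p_pool = (pass_a + pass_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

z = two_proportion_z(pass_a=7, n_a=9, pass_b=4, n_b=9)   # hypothetical counts
print(f"z = {z:.2f}")
```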