5 research outputs found

    Augmented Reality for Massive Particle Distribution

    Understanding the behavior of aerosol particles remains a key concern, especially during the current coronavirus pandemic. In this paper, we present a method for visualizing the distribution of aerosol particles in augmented reality (AR) using the Microsoft HoloLens. We use this technology to obtain a better spatial perception of particles in the real world that are invisible to the naked eye. As a case study, we show the flow field of exhaled aerosols with and without a mask. To do this, we first measure the particle flow under laboratory conditions. We then trace a certain number of exhaled particles. Using the particle system component of the Unity game engine, our AR application also takes each particle's 3D position into account. Furthermore, three different particle visualization approaches are evaluated to determine how many particles can be displayed on the Microsoft HoloLens without compromising visual quality. Finally, we were able to show virtual particles in the real world: without a mask they propagate forward, and with a mask they ascend. With an optimized implementation, we achieved a simultaneous display of nearly 80,000 moving particles at an average rate of 35 frames per second.
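    As a rough illustration of the data flow described in this abstract, the sketch below advances a flat buffer of per-particle 3D positions from pre-traced trajectories, capped to a display budget. It is a minimal, language-agnostic sketch under assumed array shapes; the paper's actual implementation uses Unity's particle system, and the function and parameter names here are hypothetical.

```python
# Minimal sketch (not the paper's Unity/C# code): per-frame lookup of particle
# positions from pre-traced trajectories, capped to a display budget so the
# renderer can hold its target frame rate. Shapes and names are assumptions.
import numpy as np

PARTICLE_BUDGET = 80_000  # upper bound reported in the abstract

def frame_positions(trajectories: np.ndarray, frame: int) -> np.ndarray:
    """Return the 3D positions of all displayed particles for one rendered frame.

    trajectories: array of shape (num_particles, num_frames, 3) holding the
    measured/traced particle paths.
    """
    num_particles, num_frames, _ = trajectories.shape
    # Clamp to the display budget and to the last traced frame.
    n = min(num_particles, PARTICLE_BUDGET)
    f = min(frame, num_frames - 1)
    return trajectories[:n, f, :]

# Usage: feed the returned (n, 3) buffer to the particle system each frame.
positions = frame_positions(np.zeros((1_000, 240, 3)), frame=42)
```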

    iUSIM - Cross-Institute, Modular Urban Mobility Simulation Infrastructure

    Generic simulation models are essential for evaluating new concepts, procedures, and methods in aviation and transport, and thus form a core component of our research. The Institutes of Flight Guidance (FL), Transportation Systems (TS), and Software Technology (SC) each have in-depth knowledge and expertise in different simulation technologies and capabilities. These were compiled and exchanged within the project. Using a rescue scenario, this work focuses in particular on connecting the individual simulators of FL, TS, and SC.
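    To make the idea of connecting the individual simulators concrete, the following sketch shows one plausible lock-step coupling loop in which each simulator advances by a common time step and then exchanges its shared state. The Simulator protocol, the step() signature, and run_coupled() are assumptions for illustration, not the project's actual interface.

```python
# Conceptual co-simulation sketch (hypothetical interface, not iUSIM's code):
# each institute's simulator exposes a step() that advances to a common
# simulation time and returns the state it shares with the other simulators.
from typing import Any, Dict, List, Protocol

class Simulator(Protocol):
    name: str
    def step(self, t: float, inputs: Dict[str, Any]) -> Dict[str, Any]: ...

def run_coupled(simulators: List[Simulator], t_end: float, dt: float) -> None:
    """Lock-step coupling: advance every simulator by dt, then exchange state."""
    shared: Dict[str, Any] = {}
    t = 0.0
    while t < t_end:
        # Every simulator sees the state published in the previous step.
        outputs = {sim.name: sim.step(t, shared) for sim in simulators}
        shared = outputs  # this step's outputs become the next step's inputs
        t += dt
```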

    Evaluation of Interaction Techniques for Early Phase Satellite Design in Immersive Augmented Reality

    The development of a satellite is a complex endeavor, requiring expertise in numerous domains such as mathematics, physics, and engineering. Scientists and engineers face the challenge of explaining their specialist knowledge to experts from other subject areas. In order to communicate the specific constraints of their respective subsystems appropriately, creative communication methods such as 3D visualizations are necessary. Misinterpretation of requirements during this communication makes the construction of a satellite a time-consuming process. Recent display technologies developed for immersive augmented reality (AR) offer new ways of presenting and interacting with computer-generated content. The aim is to ensure fast and intuitive interaction with the virtual objects in order to enable a quick visualization of the placed objects. The integrated hand gesture technique is compared with an external controller-based interaction technique using an optical see-through head-mounted display. For this purpose, a virtual satellite with manipulable components is designed and the interaction controls are implemented. Finally, a user study is conducted to evaluate the usability of, and differences between, the developed interaction techniques. Advantages and disadvantages of the manipulation methods for the satellite assembly task in AR, as well as potential enhancements of the proposed user interface, are revealed.
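    One way to make the comparison between the two techniques tangible is to place both behind a common manipulation interface, as in the sketch below. All class and field names are hypothetical and only illustrate the general structure; the study's actual HoloLens implementation is not shown here.

```python
# Conceptual sketch (hypothetical names, not the paper's code): both interaction
# techniques sit behind one manipulation interface, so translation and rotation
# of a satellite component can be driven by either input device.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class Pose:
    position: Tuple[float, float, float]        # (x, y, z)
    rotation: Tuple[float, float, float, float]  # quaternion (x, y, z, w)

class ManipulationTechnique(ABC):
    @abstractmethod
    def apply(self, component: Pose, frame_input: Dict) -> Pose:
        """Map one frame of device input to a new component pose."""

class HandGestureTechnique(ManipulationTechnique):
    def apply(self, component: Pose, frame_input: Dict) -> Pose:
        # e.g. pinch-and-drag pose delivered by the HMD's hand tracking
        return Pose(frame_input["hand_position"], frame_input["hand_rotation"])

class ControllerTechnique(ManipulationTechnique):
    def apply(self, component: Pose, frame_input: Dict) -> Pose:
        # e.g. 6-DoF pose streamed from an external controller
        return Pose(frame_input["controller_position"], frame_input["controller_rotation"])
```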

    Evaluation of Interaction Techniques for Early Phase Satellite Design in Immersive AR

    In this paper, we present a new controller-based interaction technique on the Microsoft HoloLens to support communication during early-phase satellite design at the Concurrent Engineering Facility (CEF). We design a virtual satellite with movable virtual objects utilizing two different interaction methods: the default hand gesture-based interaction method and a novel controller-based interaction method for rotating and translating satellite components in immersive augmented reality. In order to evaluate our method, we conduct a perceptual study with 12 participants. We record multiple performance metrics for each user with both methods. Additionally, we measure user preference and ease of use. Our results show that our controller-based method is significantly more precise for placing objects (in terms of both position and orientation). Furthermore, it is less time-consuming than the hand gesture-based method and preferred by the participants.
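    The abstract reports placement precision over both position and orientation; the sketch below shows one plausible way such errors could be quantified. The study's exact metrics are not given here, so the formulas and function names are assumptions for illustration.

```python
# Sketch of one plausible placement-precision metric (an assumption, not the
# paper's definition): positional error as Euclidean distance to the target,
# orientational error as the angle between placed and target quaternions.
import numpy as np

def position_error(placed: np.ndarray, target: np.ndarray) -> float:
    """Euclidean distance between placed and target positions."""
    return float(np.linalg.norm(placed - target))

def orientation_error(q_placed: np.ndarray, q_target: np.ndarray) -> float:
    """Angle in degrees between two unit quaternions (x, y, z, w)."""
    dot = abs(float(np.dot(q_placed, q_target)))  # abs: q and -q are the same rotation
    dot = min(1.0, dot)                           # guard against round-off > 1
    return float(np.degrees(2.0 * np.arccos(dot)))
```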