846 research outputs found

    Real walking in virtual environments for factory planning and evaluation

    Nowadays, buildings and production facilities are designed with specialized design software, and building information modeling tools help to evaluate the resulting virtual mock-up. However, with current, primarily desktop-based tools it is hard to evaluate the human factors of such a design, for instance spatial constraints for workers. This paper presents a new tool for factory planning and evaluation based on virtual reality that allows designers, planning experts, and workers to walk naturally and freely within a virtual factory. Designs can therefore be checked as if they were real, before anything is built.
    ISSN: 2212-827

    Measuring user experience for virtual reality

    In recent years, Virtual Reality (VR) and 3D User Interfaces (3DUI) have seen a drastic increase in popularity, especially in terms of consumer-ready hardware and software. These technologies have the potential to create new experiences that combine the advantages of reality and virtuality. While the technology for both input and output devices is market-ready, only a few solutions for everyday VR - online shopping, games, or movies - exist, and empirical knowledge about performance and user preferences is lacking. All of this makes the development and design of human-centered user interfaces for VR a great challenge. This thesis investigates the evaluation and design of interactive VR experiences. We introduce the Virtual Reality User Experience (VRUX) model based on VR-specific external factors and evaluation metrics such as task performance and user preference. Building on this UX evaluation approach, we explore the following directions: shopping in virtual environments, as well as text entry and menu control in the context of everyday VR. We summarize our findings in design spaces and guidelines for choosing optimal interfaces and controls in VR.
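
    As an illustration of the kind of evaluation metrics this abstract mentions, the sketch below shows one hypothetical way per-condition task performance and user preference data could be recorded and summarized. The class name, field names, and rating scale are assumptions for illustration only and are not taken from the VRUX model itself.

    from dataclasses import dataclass, field
    from statistics import mean

    @dataclass
    class ConditionResult:
        # One evaluated interface condition, e.g. a particular menu or text-entry technique
        # (label is an assumed example, not a condition from the thesis).
        interface: str
        task_times_s: list = field(default_factory=list)       # task performance: completion times
        error_counts: list = field(default_factory=list)       # task performance: errors per trial
        preference_scores: list = field(default_factory=list)  # user preference: e.g. 1-7 ratings (assumed scale)

        def summary(self):
            # Aggregate the raw per-trial measurements into simple means.
            return {
                "mean_time_s": mean(self.task_times_s),
                "mean_errors": mean(self.error_counts),
                "mean_preference": mean(self.preference_scores),
            }

    # Example usage with made-up numbers for a single hypothetical condition.
    result = ConditionResult(interface="hand-held controller menu")
    result.task_times_s += [12.4, 10.9, 11.7]
    result.error_counts += [1, 0, 2]
    result.preference_scores += [5, 6, 6]
    print(result.summary())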

    3D interaction with scientific data: an experimental and perceptual approach


    Natural locomotion based on a reduced set of inertial sensors: decoupling body and head directions indoors

    Inertial sensors offer the potential for integration into wireless virtual reality systems that allow users to walk freely through virtual environments. However, owing to drift errors, inertial sensors cannot accurately estimate head and body orientations in the long run, and when walking indoors this error cannot be corrected by magnetometers, because of the magnetic field distortion created by the ferromagnetic materials present in buildings. This paper proposes a technique, called EHBD (Equalization of Head and Body Directions), to address this problem using two magnetometers, one located on the head and one on the shoulder. Because of their proximity, their distortions are assumed to be similar, and the magnetometer measurements are used to detect when the user is looking straight forward. At that moment, the system corrects the discrepancy between the estimated directions of the head and the shoulder, which are provided by gyroscopes and are consequently affected by drift errors. An experiment is conducted to evaluate the performance of this technique in two tasks (navigation, and navigation plus exploration) with two locomotion techniques: (1) gaze-directed mode (GD), in which the walking direction is forced to be the same as the head direction, and (2) decoupled direction mode (DD), in which the walking direction can differ from the viewing direction. The results show that both locomotion modes match the target path similarly well in the navigation task, while DD matches the target path more closely than GD in the navigation plus exploration task. As expected, these results validate the EHBD technique especially when walking and viewing directions are allowed to differ, as in the navigation plus exploration task. While the proposed method does not reach the accuracy of optical tracking (the ideal case), it is an acceptable and satisfactory solution for users and is much more compact, portable, and economical.
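
    The equalization step described in this abstract can be sketched in a few lines of code. The snippet below is a minimal, yaw-only illustration of the idea under assumed names and an assumed agreement threshold; it is not the authors' implementation, which works with full inertial measurements.

    STRAIGHT_AHEAD_THRESHOLD_DEG = 5.0  # assumed tolerance for "looking straight forward"

    def angle_diff_deg(a, b):
        """Smallest signed difference a - b, wrapped to [-180, 180) degrees."""
        return (a - b + 180.0) % 360.0 - 180.0

    class EHBDTracker:
        """Keeps gyro-integrated head and body yaw and equalizes them when the
        head- and shoulder-mounted magnetometers indicate the user is looking
        straight ahead (names and structure are illustrative assumptions)."""

        def __init__(self):
            self.head_yaw = 0.0  # head direction from integrated gyro; drifts over time
            self.body_yaw = 0.0  # body (shoulder) direction from integrated gyro; also drifts

        def update(self, head_gyro_delta_deg, body_gyro_delta_deg,
                   head_mag_heading_deg, body_mag_heading_deg):
            # Integrate the gyroscope yaw increments for this frame.
            self.head_yaw = (self.head_yaw + head_gyro_delta_deg) % 360.0
            self.body_yaw = (self.body_yaw + body_gyro_delta_deg) % 360.0

            # The two magnetometers sit close together, so indoor field distortion
            # affects them similarly and their relative heading remains usable.
            # If they roughly agree, assume the user is looking straight forward.
            if abs(angle_diff_deg(head_mag_heading_deg, body_mag_heading_deg)) < STRAIGHT_AHEAD_THRESHOLD_DEG:
                # Equalize head and body directions, removing the drift-induced
                # discrepancy accumulated by the gyroscopes.
                self.head_yaw = self.body_yaw

            return self.head_yaw, self.body_yaw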