
    Convex Interaction: Extending Spatial Interaction through Action Compression Using VR


    A motion control method for a differential drive robot based on human walking for immersive telepresence

    Abstract. This thesis introduces an interface for controlling Differential Drive Robots (DDRs) for telepresence applications. Our goal is to enhance the immersive experience while reducing user discomfort when using Head Mounted Displays (HMDs) and body trackers. The robot is equipped with a 360° camera that captures the Robot Environment (RE). Users wear an HMD and use body trackers to navigate within a Local Environment (LE). Through a live video stream from the robot-mounted camera, users perceive the RE within a virtual sphere known as the Virtual Environment (VE). A proportional controller facilitates control of the robot, enabling it to replicate the user's movements. The proposed method uses a chest tracker to control the telepresence robot and focuses on minimizing vection and rotations induced by the robot's motion by modifying the VE, for example by rotating and translating it. Experimental results demonstrate the accuracy of the robot in reaching target positions when controlled through the body-tracker interface. They also reveal an optimal VE size that effectively reduces VR sickness and enhances the sense of presence.
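The proportional controller described above can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the gains `k_lin` and `k_ang`, the wheel base, and the unicycle-to-wheel conversion are all assumed for the example.

```python
import math

def p_control_step(pose, target, k_lin=0.5, k_ang=1.5, wheel_base=0.3):
    """One proportional-control step driving a differential drive robot
    toward a target position (e.g. derived from the user's chest tracker).
    pose = (x, y, theta) in the robot frame; target = (x, y).
    Returns (left, right) wheel speeds. Gains are illustrative."""
    x, y, theta = pose
    tx, ty = target
    # Distance and heading errors toward the target
    dist = math.hypot(tx - x, ty - y)
    heading = math.atan2(ty - y, tx - x)
    # Wrap the heading error into (-pi, pi]
    ang_err = math.atan2(math.sin(heading - theta), math.cos(heading - theta))
    # Proportional linear and angular velocity commands
    v = k_lin * dist
    w = k_ang * ang_err
    # Convert unicycle commands (v, w) to per-wheel speeds
    left = v - w * wheel_base / 2
    right = v + w * wheel_base / 2
    return left, right
```

Called once per tracking update, this drives the robot toward the tracked position while turning to face it.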

    Virtual reality and body rotation: 2 flight experiences in comparison

    Embodied interfaces, represented by devices that incorporate bodily motion and proprioceptive stimulation, are promising for Virtual Reality (VR) because they can improve immersion and user experience while reducing simulator sickness compared to more traditional handheld interfaces (e.g., gamepads). The aim of this study is to evaluate a novel embodied interface called VitruvianVR. The machine is composed of two separate rings that allow users to rotate their bodies about three different axes. The suitability of the VitruvianVR was tested in a Virtual Reality flight scenario. To reach this goal, we compared the VitruvianVR to a gamepad using performance measures (i.e., accuracy, failures), head movements, and body position. In addition, data from questionnaires on sense of presence, user experience, cognitive load, usability, and cybersickness were collected.

    An Overview of Self-Adaptive Technologies Within Virtual Reality Training

    This overview presents the current state-of-the-art of self-adaptive technologies within virtual reality (VR) training. Virtual reality training and assessment is increasingly used in five key areas: medical, industrial & commercial training, serious games, rehabilitation and remote training such as Massive Open Online Courses (MOOCs). Adaptation can be applied to five core technologies of VR including haptic devices, stereo graphics, adaptive content, assessment and autonomous agents. Automation of VR training can contribute to automation of actual procedures, including remote and robot-assisted surgery, which reduces injury and improves the accuracy of the procedure. Automated haptic interaction can enable tele-presence and tactile interaction with virtual artefacts from either remote or simulated environments. Automation, machine learning and data-driven features play an important role in providing trainee-specific, individually adaptive training content. Data from trainee assessment can form an input to autonomous systems for customised training and automated difficulty levels that match individual requirements. Self-adaptive technology has previously been developed within individual technologies of VR training. One conclusion of this research is that an enhanced portable framework does not yet exist; it would be beneficial to combine automation of the core technologies into a reusable automation framework for VR training.
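The idea of feeding trainee assessment back into automated difficulty levels can be sketched with a simple adaptation rule. This is a hypothetical illustration: the target score, step size, and level bounds are assumptions, not values from the overview.

```python
def adapt_difficulty(level, score, target=0.75, step=1, lo=1, hi=10):
    """Self-adaptive difficulty sketch: raise the level when the trainee's
    assessment score exceeds a target, lower it when it falls short.
    All thresholds are illustrative assumptions."""
    if score > target and level < hi:
        return level + step   # trainee is comfortable: increase challenge
    if score < target and level > lo:
        return level - step   # trainee is struggling: ease off
    return level              # on target, or already at a bound
```

Run after each assessed exercise, the rule keeps the trainee near the target performance band.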

    Mitigation Methods

    The section describes mitigation methods.

    Software techniques for improving head mounted displays to create comfortable user experiences in virtual reality

    Head Mounted Displays (HMDs) allow users to experience Virtual Reality (VR) with a great level of immersion. Advancements in hardware technologies have reduced the cost of producing good-quality VR HMDs, bringing them out of research labs and into consumer markets. However, the current generation of HMDs suffers from a few fundamental problems that can deter their widespread adoption. For this thesis, we explored two techniques to overcome some of the challenges of experiencing VR when using HMDs. With an HMD strapped to the head, even simple physical tasks like drinking a beverage can be difficult and awkward. We explored mixed reality renderings that selectively incorporate the physical world into the virtual world for interactions with physical objects. We conducted a user study comparing four rendering techniques that balance immersion in the virtual world with ease of interaction with the physical world. Users of VR systems often experience vection, the perception of self-motion in the absence of any physical movement. While vection helps to improve presence in VR, it often leads to a form of motion sickness called cybersickness. Prior work has discovered that changing vection (changing the perceived speed or moving direction) causes more severe cybersickness than steady vection (walking at a constant speed or in a constant direction). Based on this idea, we tried to reduce cybersickness caused by character movements in a First Person Shooter (FPS) game in VR. We propose Rotation Blurring (RB), uniformly blurring the screen during rotational movements to reduce cybersickness. We performed a user study to evaluate the impact of RB in reducing cybersickness and found that RB led to an overall reduction in sickness levels of the participants and delayed its onset. Participants who experienced acute levels of cybersickness benefited significantly from this technique.
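The Rotation Blurring idea — blur only while the view is rotating, leave steady vection untouched — can be sketched as a per-frame blur amount driven by yaw rate. The threshold and gain values below are illustrative assumptions, not the parameters used in the thesis.

```python
def blur_strength(yaw_rate, threshold=0.3, k=0.8, max_blur=1.0):
    """Rotation Blurring sketch: map the camera's yaw rate (rad/s) to a
    normalized blur amount in [0, 1] applied uniformly to the screen.
    Below `threshold` (steady or no rotation) no blur is applied; above it,
    blur grows with rotation speed. Parameter values are illustrative."""
    excess = abs(yaw_rate) - threshold
    if excess <= 0:
        return 0.0                       # steady vection: leave image sharp
    return min(max_blur, k * excess)     # stronger rotation, stronger blur
```

The returned value would feed a full-screen blur shader; clamping at `max_blur` keeps fast flick turns from blacking out the view entirely.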

    Measuring user experience for virtual reality

    In recent years, Virtual Reality (VR) and 3D User Interfaces (3DUI) have seen a drastic increase in popularity, especially in terms of consumer-ready hardware and software. These technologies have the potential to create new experiences that combine the advantages of reality and virtuality. While the technology for input as well as output devices is market ready, only a few solutions for everyday VR - online shopping, games, or movies - exist, and empirical knowledge about performance and user preferences is lacking. All this makes the development and design of human-centered user interfaces for VR a great challenge. This thesis investigates the evaluation and design of interactive VR experiences. We introduce the Virtual Reality User Experience (VRUX) model based on VR-specific external factors and evaluation metrics such as task performance and user preference. Based on our novel UX evaluation approach, we contribute by exploring the following directions: shopping in virtual environments, as well as text entry and menu control in the context of everyday VR. Along with this, we summarize our findings by design spaces and guidelines for choosing optimal interfaces and controls in VR.

    Towards Understanding and Expanding Locomotion in Physical and Virtual Realities

    Among many virtual reality interactions, the locomotion dilemma remains a significant impediment to achieving an ideal immersive experience. The physical limitations of tracked space make it impossible to naturally explore theoretically boundless virtual environments with a one-to-one mapping. Synthetic techniques like teleportation and flying often induce simulator sickness and break the sense of presence. Therefore, natural walking is the most favored form of locomotion. Redirected walking offers a more natural and intuitive way for users to navigate vast virtual spaces efficiently. However, existing techniques either lead to simulator sickness due to visual and vestibular mismatch or detract from the immersive experience that virtual reality aims to provide. This research presents innovative techniques and applications to enhance the user experience by expanding walkable physical space in Virtual Reality. The thesis includes three main contributions. The first contribution proposes a mobile application that uses markerless Augmented Reality to allow users to explore a life-sized virtual library through a divide-and-rule approach. The second contribution presents a subtle redirected walking technique based on inattentional blindness, using dynamic foveated rendering and natural visual suppressions like blinks and saccades. Finally, the third contribution introduces a novel redirected walking solution that leverages a deep neural network to predict saccades in real time and eliminate the hardware requirements for eye tracking. Overall, this thesis offers valuable contributions to human-computer interaction, investigating novel approaches to solving the locomotion dilemma. The proposed solutions were evaluated through extensive user studies, demonstrating their effectiveness and applicability in real-world scenarios like training simulations and entertainment.
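The core mechanism of saccade-based redirected walking — injecting a small extra camera rotation only while saccadic suppression hides it — can be sketched as a simple gating rule. The gaze-speed threshold and the injected gain below are illustrative assumptions; the thesis's actual detector and gains differ (one contribution replaces eye tracking with a neural predictor entirely).

```python
def redirect_gain(gaze_speed_deg_s, saccade_threshold=180.0, gain_deg=0.5):
    """Redirected-walking sketch: during a saccade (very fast gaze rotation),
    saccadic suppression makes a small extra yaw rotation of the virtual
    camera imperceptible, so inject it; during fixations, inject nothing.
    Returns the extra camera yaw (degrees) to apply this frame.
    Threshold and gain values are illustrative, not from the thesis."""
    if abs(gaze_speed_deg_s) >= saccade_threshold:
        return gain_deg   # suppression window open: steer the user subtly
    return 0.0            # gaze stable: any injected rotation would be seen
```

Accumulated over many saccades, these sub-degree injections steer the user's physical path away from the tracked-space boundary while the virtual path stays straight.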