5 research outputs found

    Inattentional Blindness for Redirected Walking Using Dynamic Foveated Rendering

    Redirected walking is a Virtual Reality (VR) locomotion technique that enables users to navigate virtual environments (VEs) that are spatially larger than the available physical tracked space. In this work, we present a novel technique for redirected walking in VR based on the psychological phenomenon of inattentional blindness. Based on the user's visual fixation points, we divide the user's view into zones. Spatially-varying rotations are applied according to each zone's importance and are rendered using foveated rendering. Our technique runs in real time and is applicable to both small and large physical spaces. Furthermore, the proposed technique does not require stimulated saccades but instead takes advantage of naturally occurring saccades and blinks for a complete refresh of the framebuffer. We performed extensive testing and present the analysis of the results of three user studies conducted to evaluate the proposed technique.
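
    The abstract above describes dividing the view into zones around the user's fixation point and applying a different rotation to each zone. The Python sketch below illustrates only that idea; the zone radii, per-zone gains, and function names are assumptions for illustration and are not taken from the paper, and the rendering side (foveated compositing, framebuffer refresh during blinks and saccades) is omitted entirely.

        # Hypothetical angular zone boundaries around the fixation point (degrees).
        FOVEAL_RADIUS_DEG = 10.0      # inner zone: no redirection applied
        PERIPHERAL_RADIUS_DEG = 30.0  # middle zone: partial gain; beyond it: full gain

        def zone_for_eccentricity(view_yaw_deg, fixation_yaw_deg):
            # Classify a view direction by its angular offset from the fixation direction.
            eccentricity = abs(view_yaw_deg - fixation_yaw_deg)
            if eccentricity <= FOVEAL_RADIUS_DEG:
                return "foveal"
            if eccentricity <= PERIPHERAL_RADIUS_DEG:
                return "near_periphery"
            return "far_periphery"

        def rotation_gain(zone):
            # Illustrative per-zone gains: redirection is hidden in the periphery,
            # none where the user is currently fixating.
            return {"foveal": 0.0, "near_periphery": 0.5, "far_periphery": 1.0}[zone]

        def redirected_yaw(view_yaw_deg, fixation_yaw_deg, frame_offset_deg):
            # Rotate a view-direction sample toward the redirection target by an
            # amount that depends on its zone relative to the fixation point.
            gain = rotation_gain(zone_for_eccentricity(view_yaw_deg, fixation_yaw_deg))
            return view_yaw_deg + gain * frame_offset_deg

        if __name__ == "__main__":
            fixation = 0.0   # user fixates straight ahead
            offset = 2.0     # redirection requested for this frame, in degrees
            for sample in (0.0, 15.0, 45.0):
                print(sample, "->", redirected_yaw(sample, fixation, offset))

    In this toy run, the sample at the fixation point is left untouched while samples deeper in the periphery receive progressively more of the requested rotation.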

    Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection

    Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing approaches to gaze pointing are based on eye tracking in abstraction from head motion. We propose to leverage the synergetic movement of eye and head and identify design principles for Eye&Head gaze interaction. We introduce three novel techniques that build on the distinction between head-supported and eyes-only gaze to enable dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and iterative and fast confirmation of targets. We demonstrate Eye&Head interaction on applications in virtual reality and evaluate our techniques against baselines in pointing and confirmation studies. Our results show that Eye&Head techniques enable novel gaze behaviours that provide users with more control and flexibility in fast gaze pointing and selection.
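
    A minimal sketch of the head-supported versus eyes-only distinction described above, assuming a simple head-velocity threshold: the pointer follows gaze during head-supported shifts and stays put during eyes-only glances, which is what enables hover and refinement around a pre-selection. The threshold value, data fields, and function names are illustrative assumptions, not taken from the paper.

        from dataclasses import dataclass

        HEAD_VELOCITY_THRESHOLD = 10.0  # deg/s; assumed cutoff for "head-supported"

        @dataclass
        class GazeSample:
            eye_yaw: float        # eye-in-head direction (degrees)
            head_yaw: float       # head direction (degrees)
            head_velocity: float  # angular head speed (deg/s)

        def is_head_supported(sample: GazeSample) -> bool:
            # Eyes-only gaze shifts leave the head nearly still; head-supported
            # shifts move the head above a small velocity threshold.
            return sample.head_velocity > HEAD_VELOCITY_THRESHOLD

        def update_pointer(pointer_yaw: float, sample: GazeSample) -> float:
            # Couple the pointer to gaze during head-supported shifts; keep it
            # stationary during eyes-only exploration around it.
            if is_head_supported(sample):
                return sample.head_yaw + sample.eye_yaw  # gaze direction in the world
            return pointer_yaw  # eyes-only: pointer holds, enabling hover/refinement

        if __name__ == "__main__":
            pointer = 0.0
            stream = [GazeSample(5.0, 0.0, 2.0),    # eyes-only glance: pointer holds
                      GazeSample(2.0, 20.0, 45.0)]  # head-supported shift: pointer moves
            for s in stream:
                pointer = update_pointer(pointer, s)
                print(round(pointer, 1))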

    Towards Understanding and Expanding Locomotion in Physical and Virtual Realities

    Among the many interactions in virtual reality, the locomotion dilemma remains a significant impediment to achieving an ideal immersive experience. The physical limitations of tracked space make it impossible to naturally explore theoretically boundless virtual environments with a one-to-one mapping. Synthetic techniques like teleportation and flying often induce simulator sickness and break the sense of presence, so natural walking is the most favored form of locomotion. Redirected walking offers a more natural and intuitive way for users to navigate vast virtual spaces efficiently. However, existing techniques either lead to simulator sickness due to visual-vestibular mismatch or distract users from the immersive experience that virtual reality aims to provide. This research presents innovative techniques and applications that enhance the user experience by expanding the walkable physical space in virtual reality. The thesis makes three main contributions. The first proposes a mobile application that uses markerless Augmented Reality to let users explore a life-sized virtual library through a divide-and-rule approach. The second presents a subtle redirected walking technique based on inattentional blindness, using dynamic foveated rendering and natural visual suppressions such as blinks and saccades. The third introduces a novel redirected walking solution that leverages a deep neural network to predict saccades in real time, eliminating the hardware requirement for eye tracking. Overall, this thesis contributes to human-computer interaction by investigating novel approaches to the locomotion dilemma. The proposed solutions were evaluated through extensive user studies, demonstrating their effectiveness and applicability in real-world scenarios such as training simulations and entertainment.
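
    As a rough illustration of the gating idea behind the third contribution, the sketch below injects a small camera rotation only when a saccade is predicted to be likely, so the visual change is suppressed and goes unnoticed. The predictor here is a stand-in heuristic based on head angular speed, not the deep neural network described in the thesis, and the thresholds and rotation budget are assumptions for illustration.

        import math

        SACCADE_PROBABILITY_THRESHOLD = 0.8  # assumed confidence cutoff
        MAX_REDIRECTION_DEG_PER_EVENT = 1.5  # assumed imperceptible rotation budget

        def predict_saccade_probability(head_angular_speed: float) -> float:
            # Placeholder for the learned predictor: fast head rotations tend to be
            # accompanied by saccades, so map speed to a pseudo-probability.
            return 1.0 - math.exp(-head_angular_speed / 50.0)

        def redirection_step(head_angular_speed: float, remaining_offset_deg: float) -> float:
            # Apply a small rotation only when a saccade is likely; otherwise do nothing.
            if predict_saccade_probability(head_angular_speed) < SACCADE_PROBABILITY_THRESHOLD:
                return 0.0
            return math.copysign(min(abs(remaining_offset_deg),
                                     MAX_REDIRECTION_DEG_PER_EVENT), remaining_offset_deg)

        if __name__ == "__main__":
            remaining = 10.0  # degrees of redirection still needed
            for speed in (5.0, 120.0, 200.0):
                step = redirection_step(speed, remaining)
                remaining -= step
                print(f"head speed {speed:>5.1f} deg/s -> rotate {step:.2f} deg, "
                      f"remaining {remaining:.2f}")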

    Eye tracking for locomotion prediction in redirected walking

    No full text