25 research outputs found

    Modulating the performance of VR navigation tasks using different methods of presenting visual information

    Spatial navigation is an essential ability that we use in daily life to move between locations. In Virtual Reality (VR), the environments that users navigate may be large and similar to real-world places. It is usually desirable to guide users, both to prevent them from getting lost and to make it easier for them to reach the goal or discover important spots in the environment. However, providing guidance that is neither intrusive, breaking immersion and the sense of presence, nor too subtle to notice, and therefore useless, can be a challenge. In this work we conducted an experiment in which we adapted a probabilistic learning paradigm, the Weather Prediction task, to spatial navigation in VR. Subjects navigated one of two versions of procedurally generated T-junction mazes in Virtual Reality. In one version, the environment contained visual cues in the form of street signs whose presence predicted the correct turning direction. In the other version the cues were present but not predictive. Results showed that subjects made fewer mistakes when navigating the mazes with predictive cues, so the cues helped them navigate the environments. A comparison with previous Neuroscience literature revealed that the strategies subjects used to solve the task differed from those in the original 2D experiment. This work is intended as a basis for further improving spatial navigation in VR with more immersive and implicit methods, and as another example of how the Cognitive Neuroscience and Virtual Reality research fields can greatly benefit each other.
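
    To make the paradigm concrete, the sketch below shows one possible way to generate junctions for such a task. The sign names, the 0.5 presence rate, and the probability table are illustrative assumptions, not the study's actual design; the sketch only illustrates the contrast between predictive and non-predictive cue conditions.

```python
import random

# Illustrative sketch of a probabilistic cue paradigm for T-junction mazes,
# loosely modelled on the Weather Prediction task. All parameters here are
# assumptions for illustration, not the study's values.

SIGN_PROBS = {        # P(correct turn == "left" | this street sign is present)
    "sign_a": 0.8,
    "sign_b": 0.6,
    "sign_c": 0.4,
    "sign_d": 0.2,
}

def generate_junction(predictive=True, rng=random):
    # Each junction displays a random, non-empty subset of the street signs.
    signs = [s for s in SIGN_PROBS if rng.random() < 0.5] or ["sign_a"]
    if predictive:
        # Predictive condition: the correct turn is drawn from the combined
        # cue evidence (here simply the mean of the per-sign probabilities).
        p_left = sum(SIGN_PROBS[s] for s in signs) / len(signs)
    else:
        # Control condition: the signs are shown but carry no information.
        p_left = 0.5
    correct = "left" if rng.random() < p_left else "right"
    return {"signs": signs, "correct_turn": correct}

if __name__ == "__main__":
    rng = random.Random(42)
    maze = [generate_junction(predictive=True, rng=rng) for _ in range(5)]
    for junction in maze:
        print(junction)
```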

    Navigating Immersive and Interactive VR Environments With Connected 360° Panoramas

    Emerging research is expanding the use of 360-degree spherical panoramas of real-world environments in 360 VR experiences beyond video and image viewing. However, most of these experiences are strictly guided, with few opportunities for interaction or exploration. There is a desire to develop cohesive virtual environments created with 360 VR that allow for choice in navigation, rather than scripted experiences with limited interaction. Unlike standard VR, with its freedom of synthetic graphics, 360 VR poses challenges in designing appropriate user interfaces (UIs) for navigation within the limitations of fixed assets. To address this gap, we designed RealNodes, a software system that presents an interactive and explorable 360 VR environment. We also developed four visual guidance UIs for 360 VR navigation. The results of a pilot study showed that choice of UI had a significant effect on task completion times, with one of our methods, Arrow, performing best. Arrow also exhibited positive but non-significant trends in average measures of preference, user engagement, and simulator sickness. RealNodes, the UI designs, and the pilot study results contribute preliminary information that inspires future investigation of how to design effective explorable scenarios in 360 VR and visual guidance metaphors for navigation in applications using 360 VR environments.
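
    The "connected 360° panoramas" idea can be pictured as a graph of panorama nodes linked by navigable directions. The sketch below is a minimal, hypothetical data structure for such a scene graph; the class names and fields are assumptions and do not reflect RealNodes' actual implementation.

```python
from dataclasses import dataclass, field

# Hypothetical scene graph for connected 360° panoramas: each node holds a
# panorama asset and directional links to neighbouring nodes.

@dataclass
class PanoramaNode:
    node_id: str
    image_path: str                             # 360° equirectangular image or video
    links: dict = field(default_factory=dict)   # heading in degrees -> neighbour id

class PanoramaGraph:
    def __init__(self):
        self.nodes = {}

    def add_node(self, node: PanoramaNode):
        self.nodes[node.node_id] = node

    def connect(self, a: str, b: str, heading_a_to_b: float):
        """Link two nodes both ways; the return heading is the opposite bearing."""
        self.nodes[a].links[heading_a_to_b] = b
        self.nodes[b].links[(heading_a_to_b + 180.0) % 360.0] = a

if __name__ == "__main__":
    g = PanoramaGraph()
    g.add_node(PanoramaNode("lobby", "lobby.jpg"))
    g.add_node(PanoramaNode("hallway", "hallway.jpg"))
    g.connect("lobby", "hallway", heading_a_to_b=90.0)
    print(g.nodes["lobby"].links)   # {90.0: 'hallway'}
```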

    LoCoMoTe – a framework for classification of natural locomotion in VR by task, technique and modality

    Virtual reality (VR) research has provided overviews of locomotion techniques, how they work, their strengths, and the overall user experience. Considerable research has investigated new methodologies, particularly machine learning, to develop redirection algorithms. To best support the development of redirection algorithms through machine learning, we must understand how to replicate human navigation and behaviour in VR, which can be supported by the accumulation of results produced through live-user experiments. However, in an ever-growing research field it can be difficult to identify, select and compare relevant research without a pre-existing framework. Therefore, this work aimed to facilitate the ongoing structuring and comparison of the VR-based natural walking literature by providing a standardised framework for researchers to utilise. We applied thematic analysis to the methodology descriptions of 140 VR-based papers that contained live-user experiments. From this analysis, we developed the LoCoMoTe framework with three themes: navigational decisions, technique implementation, and modalities. The LoCoMoTe framework provides a standardised approach to structuring and comparing experimental conditions. The framework should be continually updated to categorise and systematise knowledge, and to aid in identifying research gaps and framing discussions.
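
    As a rough illustration of how a classification framework with these three themes might be applied, the sketch below tags a hypothetical live-user study under navigational decisions, technique implementation, and modalities. The concrete tag values are placeholders, not the published LoCoMoTe code book.

```python
from dataclasses import dataclass, field

# Hedged sketch of tagging a locomotion study under the three LoCoMoTe themes.
# Tag values are illustrative placeholders only.

@dataclass
class LocomotionStudyRecord:
    citation: str
    navigational_decisions: list = field(default_factory=list)
    technique_implementation: list = field(default_factory=list)
    modalities: list = field(default_factory=list)

if __name__ == "__main__":
    record = LocomotionStudyRecord(
        citation="Example et al. (2020)",                      # hypothetical study
        navigational_decisions=["free exploration"],
        technique_implementation=["redirected walking", "rotation gain"],
        modalities=["visual", "vestibular"],
    )
    print(record)
```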

    Multimodality in VR: A survey

    Virtual reality (VR) is rapidly growing, with the potential to change the way we create and consume content. In VR, users integrate the multimodal sensory information they receive to create a unified perception of the virtual world. In this survey, we review the body of work addressing multimodality in VR and its role and benefits for user experience, together with different applications that leverage multimodality across many disciplines. These works encompass several fields of research and demonstrate that multimodality plays a fundamental role in VR: enhancing the experience, improving overall performance, and yielding unprecedented abilities in skill and knowledge transfer.

    Advancing proxy-based haptic feedback in virtual reality

    This thesis advances haptic feedback for Virtual Reality (VR). Our work is guided by Sutherland's 1965 vision of the ultimate display, which calls for VR systems to control the existence of matter. To push towards this vision, we build upon proxy-based haptic feedback, a technique characterized by the use of passive tangible props. The goal of this thesis is to tackle the central drawback of this approach, namely its inflexibility, which still prevents it from fulfilling the vision of the ultimate display. Guided by four research questions, we first showcase the applicability of proxy-based VR haptics by employing the technique for data exploration. We then extend the VR system's control over users' haptic impressions in three steps. First, we contribute the class of Dynamic Passive Haptic Feedback (DPHF) alongside two novel concepts for conveying kinesthetic properties, such as virtual weight and shape, through weight-shifting and drag-changing proxies. Conceptually orthogonal to this, we study how visual-haptic illusions can be leveraged to unnoticeably redirect the user's hand when reaching towards props. Here, we contribute a novel perception-inspired algorithm for Body Warping-based Hand Redirection (HR), an open-source framework for HR, and psychophysical insights. The thesis concludes by showing that the combination of DPHF and HR can outperform the individual techniques in terms of the achievable flexibility of proxy-based haptic feedback.
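
    For readers unfamiliar with body warping, the sketch below shows the general idea behind such hand redirection: the virtual hand is progressively offset from the real hand so that, as the real hand reaches the physical prop, the virtual hand reaches the (differently placed) virtual target. This is a generic formulation under simple assumptions, not the thesis's perception-inspired algorithm.

```python
import numpy as np

# Generic body-warping hand redirection sketch: the virtual hand is offset from
# the real hand by a fraction of the prop-to-target offset, and that fraction
# grows with reach progress, so the real hand lands on the physical prop exactly
# when the virtual hand reaches the virtual target.

def warped_hand_position(real_hand, physical_prop, virtual_target, start_pos):
    real_hand = np.asarray(real_hand, dtype=float)
    physical_prop = np.asarray(physical_prop, dtype=float)
    virtual_target = np.asarray(virtual_target, dtype=float)
    start_pos = np.asarray(start_pos, dtype=float)

    total = np.linalg.norm(physical_prop - start_pos)
    remaining = np.linalg.norm(physical_prop - real_hand)
    # Reach progress in [0, 1]: 0 at the start pose, 1 when the prop is touched.
    progress = 1.0 if total == 0 else float(np.clip(1.0 - remaining / total, 0.0, 1.0))

    offset = virtual_target - physical_prop
    return real_hand + progress * offset

if __name__ == "__main__":
    start = [0.0, 0.0, 0.0]
    prop = [0.3, 0.0, 0.5]     # where the tangible proxy actually sits
    target = [0.4, 0.0, 0.5]   # where the virtual object is rendered
    for t in (0.0, 0.5, 1.0):  # simulate the real hand moving toward the prop
        hand = np.array(start) + t * (np.array(prop) - np.array(start))
        print(t, warped_hand_position(hand, prop, target, start))
```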

    Virtually walking: factors influencing walking and perception of walking in treadmill-mediated virtual reality to support rehabilitation

    Psychomotor slowing, and in particular slow walking, is a common correlate of illness or injury, and often persists long after the precipitating condition has improved. Since slow walking has implications for long-term physical and social wellbeing, it is important to find ways to address this issue. However, whilst it is well established that exercise programmes are good approaches to increasing movement speed, adherence to therapy remains poor. The main reasons for this appear to be pain and a lack of interest and enjoyment in the exercise. Virtual Rehabilitation, which combines physical therapy with Virtual Reality (VR), is a rapidly growing area of health care that seems to offer a potential solution to these issues through increased patient engagement and decreased perception of pain. However, the question of how to encourage patients to increase their walking speed whilst interacting with VR has remained unanswered. Moreover, to maximise the benefits of this type of therapy, there needs to be a greater understanding of how different factors in treadmill-mediated VR can facilitate (or hinder) optimal walking. Therefore, this thesis investigated the factors influencing walking and the perception of walking in treadmill-mediated VR, using a series of empirical investigations to determine the effect of a variety of factors in VR that can then be applied in a clinical setting. A review of the literature identified that high-contrast stereoscopic virtual environments, calibrated to real-world dimensions, with a wide field of view and peripheral visual cues, are likely to facilitate accurate self-motion perception. Empirical studies demonstrated that decreasing the visual gain (the ratio of optic flow to walk speed) in VR can lead to a sustained increase in walk speed. However, these lower rates of visual gain are likely to be perceived as unrealistic and may decrease immersion. Further investigation demonstrated that there is a range of visual gain which is perceived as acceptably normal, although even the lower bound of this acceptable range is still higher than the optimum gain for facilitating faster movements. Thus there is a trade-off between visual gain for realistic perception and visual gain for improved walking speeds. Therefore, other components that can improve walking speed need to be identified, particularly for applications where reduction of the visual gain is undesirable. Further empirical studies demonstrated that fast audio cues (125% of baseline cadence), in the form of a footstep sound, can increase walk speed without disrupting the natural walk ratio. This effect was demonstrated in healthy populations and was also evident in a group of patients with chronic musculoskeletal pain. It was noted that in all the studies comparing a pain and a non-pain group, the pain group walked more slowly across all conditions. Additional empirical studies showed that the use of self-paced treadmills for interfacing with VR was associated with somewhat lower baseline walk speeds than normal overground walking, although the self-paced treadmills preserved the normal walk ratio. This slowing of walking and preservation of walk ratio was seen both in healthy participants and in participants with chronic musculoskeletal pain. Therefore, whilst self-paced treadmills can support natural walking, additional factors need to be considered if treadmill-mediated VR is to be used to facilitate the increase in walking speeds desirable for rehabilitation. Thus, designing VR for rehabilitation is likely to involve consideration of a number of factors and making individualised design decisions based on specific therapeutic goals.
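
    The visual gain manipulation described above can be summarised as optic-flow speed = gain × measured walking speed. The sketch below shows a minimal per-frame update under that assumption; the belt speed, gain value, and frame rate are illustrative, not the thesis's experimental settings.

```python
# Minimal sketch of a visual gain manipulation: the optic flow shown in the
# virtual environment moves at gain * measured walking speed.

def update_virtual_position(position_m, belt_speed_mps, visual_gain, dt_s):
    """Advance the virtual camera along the walking direction for one frame."""
    optic_flow_speed = visual_gain * belt_speed_mps
    return position_m + optic_flow_speed * dt_s

if __name__ == "__main__":
    pos = 0.0
    belt_speed = 1.2        # m/s, e.g. read from a self-paced treadmill
    gain = 0.8              # reduced gain; lower gains increased walk speed above
    for _ in range(90):     # one second of updates at 90 Hz
        pos = update_virtual_position(pos, belt_speed, gain, dt_s=1 / 90)
    print(f"virtual distance covered in 1 s: {pos:.2f} m")  # 0.8 * 1.2 = 0.96 m
```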

    Sonic Interactions in Virtual Environments

    This open access book tackles the design of 3D spatial interactions from an audio-centered, audio-first perspective, providing the fundamental notions related to the creation and evaluation of immersive sonic experiences. The key elements that enhance the sensation of place in a virtual environment (VE) are: immersive audio, the computational aspects of the acoustical-space properties of Virtual Reality (VR) technologies; sonic interaction, the human-computer interplay through auditory feedback in VEs; and VR systems, which naturally support multimodal integration and impact different application domains. Sonic Interactions in Virtual Environments features state-of-the-art research on real-time auralization, sonic interaction design in VR, quality of experience in multimodal scenarios, and applications. Contributors and editors include interdisciplinary experts from the fields of computer science, engineering, acoustics, psychology, design, humanities, and beyond. Their mission is to shape an emerging field of study at the intersection of sonic interaction design and immersive media, embracing an archipelago of existing research spread across different audio communities, and to raise awareness among VR researchers and practitioners of the importance of sonic elements when designing immersive environments.
