
    An experimental comparison of perceived egocentric distance in real, image-based, and traditional virtual environment using direct walking tasks

    Technical report. In virtual environments, perceived egocentric distances are often underestimated compared to the same distance judgments in the real world. The research presented in this paper explores two possible causes of this reduced distance perception in virtual environments: (1) real-time computer graphics rendering, and (2) immersive display technology. Our experiment compared egocentric distance judgments in three complex indoor environments: a real hallway under full-cue conditions, a virtual stereoscopic photographic panorama, and a virtual stereoscopic computer model. Perceived egocentric distance was measured with a directed walking task in which subjects walked blindfolded to the target. Our results show a significant difference in distance judgments between real and virtual environments. However, the differences between distance judgments in the virtual photographic panorama and the traditionally rendered virtual environment are small, suggesting that the display device itself affects distance judgments in virtual environments.

    Audio, visual, and audio-visual egocentric distance perception in virtual environments.

    Previous studies have shown that in real environments, distances are visually estimated correctly. In visual (V) virtual environments (VEs), distances are systematically underestimated. In audio (A) real and virtual environments, near distances (2 m) are underestimated. However, little is known about combined A and V interactions in egocentric distance perception in VEs. In this paper we present a study of A, V, and AV egocentric distance perception in VEs. AV rendering is provided by the SMART-I2 platform using tracked passive visual stereoscopy and acoustic wave field synthesis (WFS). Distances are estimated using triangulated blind walking under A, V, and AV conditions. Distance compressions similar to those found in previous studies are observed under each rendering condition. The audio and visual modalities appear to be of similar precision for distance estimation in virtual environments. This casts doubt on the commonly accepted visual capture theory in distance perception.

    Distance Perception in Virtual Reality

    Distance perception in virtual reality is radically different from distance perception in the real world. For example, distances tend to be underperceived in virtual reality, and the anisotropic effects of distance perception seen in the real world are exacerbated. Three experiments were conducted to examine distance perception in virtual reality. The first two examined the speed of improvement resulting from interaction, and the transfer of improvement across different scales of space. The third examined the anisotropic effect of distance orientation on perceived distance within the virtual environment. The first experiment found that five interaction trials produced a large improvement in perceived distance, and that subsequent interactions continued to improve it, though with strongly diminishing returns. In the second experiment, interaction with near (1-2 m) objects improved distance perception for near but not far (4-5 m) objects, while interaction with far objects improved distance perception for both near and far objects. In the third experiment, the orientation of the distances between the objects significantly affected distance perception between all but two of the conditions. These results help generate strategies to reduce underperception in virtual reality and help distinguish between theories of how interacting with environments influences perceived distances.

    Depth and Distance Perceptions within Virtual Reality Environments. A Comparison between HMDs and CAVEs in Architectural Design

    The perceptions of depth and distance are considered two of the most important factors in virtual reality environments, as these environments inevitably affect how the virtual content is perceived compared with the real world. Many studies on depth and distance perception in virtual environments exist. Most of them were conducted using Head-Mounted Displays (HMDs), and fewer with large-screen displays such as those of Cave Automatic Virtual Environments (CAVEs). In this paper, we compare the different aspects of perception of an architectural environment between CAVE systems and HMDs. The paper clarifies the Virtual Object as an entity in a VE and explains the pros and cons of using CAVEs and HMDs. Finally, only a first survey of the planned case study, the artificial port of the emperor Trajan near Fiumicino, has been completed, as on-field experimentation could not be performed due to COVID-19.

    The Effect of Graphic Quality in Virtual Environments on the Perception of Egocentric and Exocentric Distances

    Virtual realities (VRs), also known as virtual environments, have been used to simulate physical presence in real environments (e.g., simulations for pilot training) as well as imaginary places (e.g., video games). Mostly constructed as visual experiences, innovations in VR technologies now include additional sensory information, such as sound and touch, and have allowed for collaborations across diverse fields, including skills training, ergonomics, therapeutic programs, perception, and cognitive psychology. In a therapeutic role, virtual realities have been applied to numerous forms of exposure therapy addressing phobias such as claustrophobia, agoraphobia, and acrophobia (fear of heights), as well as post-traumatic stress disorder (PTSD) and anxiety disorders. Virtual reality methodology has also been used in physical therapy, occupational therapy, and physical rehabilitation. Moreover, research has comprehensively addressed participants' perceptual reactions to VR environments, including the effect of the quality of the environment's graphics on judgments of egocentric distances (i.e., distances between the participant's virtual self and objects in the VR environment) and exocentric distances (i.e., distances between various objects in the VR environment). For example, participants in head-mounted-display-(HMD-)based immersive VR environments consistently underestimated egocentric distances walked to previously viewed targets in both low- and high-quality VR environments compared to estimates made in real-world environments. Interestingly, participants were more accurate when verbally reporting distances in high-quality VR environments (Kunz et al., 2009).
This dissociation between magnitude estimates of target distance and action-based indicators of perceived distance (i.e., walking to previously viewed objects) is further explored in the present research using other kinds of distance estimates and judgments of egocentric as well as exocentric distances. This research has implications for the use of distance perception strategies in the context of VR environments.

    The influence of restricted viewing conditions on egocentric distance perception: implications for real and virtual environments

    Technical report. Three experiments examined the influence of field of view and binocular viewing restrictions on absolute distance perception in the real world. Previous work has found that visually directed walking tasks reveal accurate distance estimation in full-cue, real-world environments at distances of up to about 20 meters. In contrast, the same tasks in virtual environments using head-mounted displays (HMDs) show large compression of distance. Field of view and binocular viewing are common limitations in research with HMDs, yet they have rarely been studied under full pictorial-cue conditions in the context of real-world distance perception. Experiment 1 determined that a view of one's body and feet on the floor was not necessary for accurate distance perception. Experiment 2 manipulated horizontal field of view and head rotation, finding that a restricted field of view did not affect the accuracy of distance estimation when head movement was allowed. Experiment 3 found that performance with monocular viewing was equal to that with binocular viewing. These results have implications for the information needed to scale egocentric distance in the real world and suggest that field of view and binocular viewing restrictions do not contribute substantially to the underestimation seen with HMDs.

    Efficient Distance Accuracy Estimation Of Real-World Environments In Virtual Reality Head-Mounted Displays

    Virtual reality (VR) is a very promising technology with many compelling industrial applications. While many recent advancements allow VR technology to be deployed and used with virtual environments, it is still less mature for rendering real environments. Current VR system settings, which were developed for rendering virtual environments, do not adequately address the challenges of capturing and displaying real-world virtual reality. Before these systems can be used in real-life settings, their performance needs to be investigated, specifically depth perception and how distances to objects in the rendered scenes are estimated. Perceived depth is influenced by Head-Mounted Displays (HMDs), which inevitably reduce the perceived depth of the virtual content. Distances are consistently underestimated in virtual environments (VEs) compared to the real world, and the reason for this underestimation is still not understood. This thesis investigates a version of such a system that, to the best of the author's knowledge, has not been explored by any previous research: whereas previous research used computer-generated scenes, this work examines distance estimation in real environments rendered to head-mounted displays, where distance estimation is among the most challenging issues still under investigation and not fully understood. This thesis introduces a dual-camera video feed system viewed through a virtual reality head-mounted display, with two models: a video-based model and a static photo-based model. The purpose is to explore whether the misjudgment of distances in HMDs could be due to a lack of realism, by using a real-world scene rendering system. Distance judgment performance in the real world and in these two VE models was compared using protocols already proven to accurately measure real-world distance estimation.
An improved model was then developed that enhances the field of view (FOV) of the displayed scenes to improve distance judgments when displaying real-world VR content on HMDs. It mitigates the limited FOV, one of the leading potential causes of distance underestimation, especially the mismatch between the camera's and the HMD's fields of view. The proposed model uses a set of two cameras to generate the video, instead of the hundreds of input cameras, or tens of cameras mounted on a circular rig, used in previous work. Results from the first implementation of this system found that the static photo-based rendering produced less underestimation than the live video feed rendering: the video-based (real + HMD) model and the static photo-based (real + photo + HMD) model averaged 80.2% and 81.4% of the actual distance, respectively, compared to real-world estimations, which averaged 92.4%. The improved approach (real + HMD + FOV) was compared with these two models and showed an improvement of 11%, increasing estimation accuracy from 80% to 91% and reducing the estimation error from 1.29% to 0.56%. These results present strong evidence of the need for novel methods to improve distance estimation in real-world VR content systems and provide effective initial work toward this goal.
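As an illustration of the accuracy metric reported in this abstract (mean judged distance expressed as a percentage of actual distance), here is a minimal sketch; the function name and the sample trial values are hypothetical, not data from the thesis:

```python
def estimation_accuracy(judged, actual):
    """Mean judged/actual ratio across trials, as a percentage."""
    ratios = [j / a for j, a in zip(judged, actual)]
    return 100.0 * sum(ratios) / len(ratios)

# Hypothetical trials (meters): a uniform 20% underestimation.
actual = [2.0, 3.0, 4.0, 5.0]
judged = [1.6, 2.4, 3.2, 4.0]
print(round(estimation_accuracy(judged, actual), 1))  # 80.0
```

Under this metric, the video-based model's 80.2% means participants walked, on average, about four-fifths of the actual distance to the target.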

    The Influence of the Occlusion-Contradiction Problem on Perception in Artificial Reality

    A binocular stereo display is an effective device for virtual reality. However, it can produce confusing situations when both the real environment and the virtual one are visible and contradict each other, since the occlusion relationship between the virtual and real environments is always determined by the architecture of the device used. In this paper we present the results of psychophysical experiments under contradictory conditions in which the perceived distances given by occlusion and by binocular disparity conflict. We find that this can produce unstable perception and binocular rivalry when the depth of the vertical edge of the superposed area cannot be determined unambiguously. We therefore suggest that the problem can be avoided by adding a specific cue that enables a unique percept.

    Audio, visual, and audio-visual egocentric distance perception by moving participants in virtual environments

    A study of audio, visual, and audio-visual egocentric distance perception by moving participants in virtual environments is presented. Audio-visual rendering is provided using tracked passive visual stereoscopy and acoustic wave field synthesis (WFS). Distances are estimated using indirect blind walking (triangulation) under each rendering condition. Experimental results show that distances perceived in the virtual environment are accurately estimated or overestimated for rendered distances closer than the position of the audio-visual rendering system, and underestimated for distances farther away. Interestingly, participants perceived each virtual object at a modality-independent distance whether using the audio modality, the visual modality, or the combination of both. The results show that WFS can synthesize perceptually meaningful sound fields in terms of distance. Participants used dynamic audio-visual cues when estimating distances in the virtual world, and moving may have given them better visual perception of close distances than if they had been static. No correlation between the feeling of presence and the visual distance underestimation was found. To explain the observed perceptual distance compression, it is proposed that, due to conflicting distance cues, the audio-visual rendering system physically anchors the virtual world to the real world; virtual objects are thus attracted toward the physical audio-visual rendering system.
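The indirect blind-walking (triangulation) protocol mentioned in these abstracts can be sketched geometrically: the participant views the target, walks blindfolded along a path offset from the viewing direction, then turns to face the remembered target, and the perceived distance is recovered from the turning angle. A minimal sketch assuming a perpendicular walking path; the function name and sample values are illustrative, not the studies' exact procedure:

```python
import math

def triangulated_distance(offset, turn_angle_deg):
    """Perceived egocentric distance recovered by triangulation.

    offset: meters walked blindfolded, perpendicular to the original
            viewing direction.
    turn_angle_deg: angle between the original viewing direction and
            the direction the participant faces when asked to turn
            toward the remembered target.
    """
    return offset / math.tan(math.radians(turn_angle_deg))

# Illustrative trial: a 45-degree turn after a 1.5 m perpendicular walk
# implies the target was perceived 1.5 m from the starting position.
print(round(triangulated_distance(1.5, 45.0), 2))  # 1.5
```

A recovered distance smaller than the rendered one corresponds to the distance compression these studies report.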

    Near-Field Depth Perception in Optical See-Through Augmented Reality

    Augmented reality (AR) is a very promising display technology with many compelling industrial applications. However, before it can be used in actual settings, its fidelity needs to be investigated from a user-centric viewpoint. More specifically, how distance to virtual objects is perceived in augmented reality is still an open question. To the best of our knowledge, only four previous studies have specifically examined distance perception in AR within reaching distance, so it remains a largely understudied phenomenon. This document presents research on depth perception in augmented reality in the near visual field. The specific goal of this research is to empirically study various measurement techniques for depth perception, and to study various factors that affect depth perception in augmented reality, specifically eye accommodation, brightness, and participant age. This document discusses five experiments that have already been conducted. Experiment I aimed to determine whether there are inherent differences between the perception of virtual and real objects by comparing depth judgments using two complementary distance judgment protocols: perceptual matching and blind reaching. It found that real objects are perceived more accurately than virtual objects and that matching is a relatively more accurate distance measure than reaching. Experiment II compared the two protocols in real-world and augmented reality environments with improved proprioceptive and visual feedback, and found that reaching responses in the AR environment became more accurate with the improved feedback. Experiment III studied the effect of different levels of accommodative demand (collimated, consistent, and midpoint) on distance judgments, finding nearly accurate responses in the consistent and midpoint conditions and a linear increase in error in the collimated condition. Experiment IV studied the effect of the brightness of the target object on depth judgments, finding that distance responses shifted toward the background for the dim AR target. Lastly, Experiment V studied the effect of participant age on depth judgments and found that older participants judged distance more accurately than younger participants. Taken together, these five experiments will help us understand how depth perception operates in augmented reality.