
    The Effect of Graphic Quality in Virtual Environments on the Perception of Egocentric and Exocentric Distances

    Virtual realities (VRs), also known as virtual environments, have been used to simulate physical presence in real environments (e.g., simulations for pilot training) as well as imaginary places (e.g., video games). Mostly constructed as visual experiences, innovations in VR technology now incorporate additional sensory information, such as sound and touch, and have enabled collaborations across diverse fields, including skills training, ergonomics, therapeutic programs, and perceptual and cognitive psychology. In a therapeutic role, virtual realities have been applied to numerous forms of exposure therapy to address phobias such as claustrophobia, agoraphobia, and acrophobia (fear of heights), as well as post-traumatic stress disorder (PTSD) and anxiety disorders. Virtual reality methodology has also been used in physical therapy, occupational therapy, and physical rehabilitation. Moreover, research has comprehensively addressed participants' perceptual reactions to VR environments, including the effect of the graphic quality of the VR environment on judgments of egocentric distances (i.e., distances between the participant's virtual self and objects in the VR environment) and exocentric distances (i.e., distances between various objects in the VR environment). For example, participants in head-mounted-display-(HMD-)based immersive VR environments consistently underestimated egocentric distances walked to previously viewed targets in both low- and high-quality VR environments, compared to estimates made in real-world environments. Interestingly, participants were more accurate in verbally reporting the distances in high-quality VR environments (Kunz et al., 2009).
This dissociation between magnitude estimates of target distance and action-based indicators of perceived distance (i.e., walking to previously viewed objects) will be further explored in the present research using other kinds of distance estimates and judgments of egocentric as well as exocentric distances. This research has implications for the use of distance perception strategies in the context of VR environments.
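The egocentric/exocentric distinction above reduces to where the distance is measured from. A minimal sketch, with illustrative coordinates chosen here (not taken from the study):

```python
import math

def euclidean(p, q):
    """Straight-line distance between two 3D points."""
    return math.dist(p, q)

viewer = (0.0, 0.0, 0.0)    # participant's virtual viewpoint
target_a = (0.0, 0.0, 4.0)  # object 4 m straight ahead
target_b = (3.0, 0.0, 4.0)  # object offset 3 m to the side

# Egocentric distance: from the viewer to an object in the scene
ego = euclidean(viewer, target_a)

# Exocentric distance: between two objects in the scene
exo = euclidean(target_a, target_b)

print(ego, exo)
```

The walking tasks in the studies above probe the egocentric quantity; inter-object judgments probe the exocentric one.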

    Phenomenal regression to the real object in physical and virtual worlds

    © 2014, Springer-Verlag London. In this paper, we investigate a new approach to comparing physical and virtual size and depth percepts that captures the involuntary responses of participants to different stimuli in their field of view, rather than relying on their skill at judging size, reaching, or directed walking. We show, via an effect first observed in the 1930s, that participants asked to equate the perspective projections of disc objects at different distances make a systematic error that is both individual in its extent and comparable in the particular physical and virtual settings we have tested. Prior work has shown that this systematic error is difficult to correct, even when participants know it is likely to occur. In fact, in the real world, the error only reduces as the available cues to depth are artificially reduced. This makes the effect we describe a potentially powerful, intrinsic measure of VE quality that ultimately may contribute to our understanding of VE depth compression phenomena.

    An Immersive Telepresence System using RGB-D Sensors and Head Mounted Display

    We present a tele-immersive system that enables people to interact with each other in a virtual world using body gestures in addition to verbal communication. Beyond the obvious applications, including general online conversations and gaming, we hypothesize that our proposed system would be particularly beneficial to education by offering rich visual content and interactivity. One distinct feature is the integration of egocentric pose recognition, which allows participants to use their gestures to demonstrate and manipulate virtual objects simultaneously. This functionality enables the instructor to effectively and efficiently explain and illustrate complex concepts or sophisticated problems in an intuitive manner. The highly interactive and flexible environment can capture and sustain more student attention than the traditional classroom setting and thus delivers a compelling experience to the students. Our main focus here is to investigate possible solutions for the system design and implementation, and to devise strategies for fast, efficient computation suitable for visual data processing and network transmission. We describe the technique and experiments in detail and provide quantitative performance results, demonstrating that our system can be run comfortably and reliably for different application scenarios. Our preliminary results are promising and demonstrate the potential for more compelling directions in cyberlearning. Comment: IEEE International Symposium on Multimedia 201
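The "efficient computation suitable for visual data processing and network transmission" that the abstract mentions typically involves packing and compressing per-frame depth data before sending it. The paper does not specify its wire format; the following is a hypothetical sketch using stdlib tools only, with an invented header layout, to show the general idea:

```python
import struct
import zlib

def pack_depth_frame(depth_mm, width, height):
    """Pack a depth frame (flat list of millimetre values) as 16-bit
    little-endian samples and deflate-compress it for transmission.
    The 4-byte width/height header is illustrative, not a real protocol."""
    header = struct.pack("<HH", width, height)
    payload = struct.pack(f"<{len(depth_mm)}H", *depth_mm)
    return header + zlib.compress(payload)

def unpack_depth_frame(blob):
    """Invert pack_depth_frame: read the header, decompress the samples."""
    width, height = struct.unpack_from("<HH", blob)
    payload = zlib.decompress(blob[4:])
    depth = list(struct.unpack(f"<{len(payload) // 2}H", payload))
    return width, height, depth

frame = [1500] * (4 * 4)  # toy 4x4 frame: flat wall 1.5 m away
blob = pack_depth_frame(frame, 4, 4)
w, h, out = unpack_depth_frame(blob)
print(w, h, out == frame, len(blob))
```

Real RGB-D pipelines (e.g., ones built on Kinect-class sensors) add temporal prediction and lossy depth quantization on top of this kind of lossless baseline.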

    On The Anisotropy Of Perceived Ground Extents And The Interpretation Of Walked Distance As A Measure Of Perception

    Three experiments are reported concerning the perception of ground extent, to discover whether prior reports of anisotropy between frontal extents and extents in depth were consistent across different measures (visual matching and pantomime walking) and test environments (outdoor environments and virtual environments). In Experiment 1 it was found that depth extents of up to 7 m are indeed perceptually compressed relative to frontal extents in an outdoor environment, and that perceptual matching provided more precise estimates than did pantomime walking. In Experiment 2, similar anisotropies were found using similar tasks in a similar (but virtual) environment. In both experiments, pantomime walking measures seemed to additionally compress the range of responses. Experiment 3 supported the hypothesis that range compression in walking measures of perceived distance might be due to proactive interference (memory contamination). It is concluded that walking measures are calibrated for perceived egocentric distance, but that pantomime walking measures may suffer range compression. Depth extents along the ground are perceptually compressed relative to frontal ground extents in a manner consistent with the angular scale expansion hypothesis.

    The influence of restricted viewing conditions on egocentric distance perception: implications for real and virtual environments

    Three experiments examined the influence of field-of-view and binocular viewing restrictions on absolute distance perception in the real world. Previous work has found that visually directed walking tasks reveal accurate distance estimations in full-cue, real-world environments to distances of about 20 meters. In contrast, the same tasks in virtual environments using head-mounted displays (HMDs) show large compression of distance. Field of view and binocular viewing are common limitations in research with HMDs and have rarely been studied under full pictorial-cue conditions in the context of distance perception in the real world. Experiment 1 determined that the view of one's body and feet on the floor was not necessary for accurate distance perception. Experiment 2 manipulated horizontal field of view and head rotation, finding that a restricted field of view did not affect the accuracy of distance estimations when head movement was allowed. Experiment 3 found that performance with monocular viewing was equal to that with binocular viewing. These results have implications for the information needed to scale egocentric distance in the real world and suggest that field-of-view and binocular viewing restrictions do not largely contribute to the underestimation seen with HMDs.

    The Underestimation Of Egocentric Distance: Evidence From Frontal Matching Tasks

    There is controversy over the existence, nature, and cause of error in egocentric distance judgments. One proposal is that the systematic biases often found in explicit judgments of egocentric distance along the ground may be related to recently observed biases in the perceived declination of gaze (Durgin & Li, Attention, Perception, & Psychophysics, in press). To measure perceived egocentric distance nonverbally, observers in a field were asked to position themselves so that their distance from one of two experimenters was equal to the frontal distance between the experimenters. Observers placed themselves too far away, consistent with egocentric distance underestimation. A similar experiment was conducted with vertical frontal extents. Both experiments were replicated in panoramic virtual reality. Perceived egocentric distance was quantitatively consistent with angular bias in perceived gaze declination (a gain of 1.5). Finally, an exocentric distance-matching task was contrasted with a variant of the egocentric matching task. The egocentric matching data approximate a constant compression of perceived egocentric distance, with a power-function exponent of nearly 1; exocentric matches had an exponent of about 0.67. The divergent pattern between egocentric and exocentric matches suggests that they depend on different visual cues.
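The gaze-declination account in this abstract makes a concrete quantitative prediction: if the perceived angle of gaze below the horizon is exaggerated by a constant gain, ground-plane geometry yields an underestimated distance. A minimal sketch of that prediction, assuming an illustrative eye height (the 1.5 gain is from the abstract; the eye height and test distances are not):

```python
import math

EYE_HEIGHT = 1.6  # assumed eye height in metres (illustrative)
GAZE_GAIN = 1.5   # exaggeration of perceived gaze declination (from the abstract)

def perceived_ground_distance(d, h=EYE_HEIGHT, gain=GAZE_GAIN):
    """Predicted perceived distance to a ground target at physical distance d,
    assuming the perceived declination of gaze is the true declination
    scaled by a constant gain (angular expansion account)."""
    declination = math.atan2(h, d)   # true angle of the target below eye level
    perceived = gain * declination   # exaggerated perceived declination
    return h / math.tan(perceived)   # distance implied by the biased angle

for d in (2.0, 5.0, 10.0):
    print(d, round(perceived_ground_distance(d), 2))
```

Under these assumptions the model predicts systematic underestimation at every distance, in line with the too-far self-placements reported above.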

    An experimental comparison of perceived egocentric distance in real, image-based, and traditional virtual environment using direct walking tasks

    In virtual environments, perceived egocentric distances are often underestimated compared to the same distance judgments in the real world. The research presented in this paper explores two possible causes for this reduced distance perception in virtual environments: (1) real-time computer graphics rendering, and (2) immersive display technology. Our experiment compared egocentric distance judgments in three complex, indoor environments: a real hallway with full-cue conditions; a virtual, stereoscopic, photographic panorama; and a virtual, stereoscopic computer model. Perceived egocentric distance was determined by a directed walking task in which subjects walk blindfolded to the target. Our results show that there is a significant difference in distance judgments between real and virtual environments. However, the differences between distance judgments in virtual photographic panorama environments and traditionally rendered virtual environments are small, suggesting that the display device is affecting distance judgments in virtual environments.

    Efficient Distance Accuracy Estimation Of Real-World Environments In Virtual Reality Head-Mounted Displays

    Virtual reality (VR) is a promising technology with many compelling industrial applications. Although many recent advances have made it practical to deploy VR technology for virtual environments, these systems are still less mature when used to render real environments. Current VR system settings, which were developed for rendering virtual environments, fail to adequately address the challenges of capturing and displaying real-world virtual reality. Before these systems can be used in real-life settings, their performance needs to be investigated; more specifically, depth perception and how distances to objects in the rendered scenes are estimated. Perceived depth is influenced by head-mounted displays (HMDs), which inevitably decrease the virtual content's depth perception. Distances are consistently underestimated in virtual environments (VEs) compared to the real world, and the reason behind this underestimation is still not understood. This thesis investigates a version of this kind of system that, to the best of the author's knowledge, has not been explored by previous research, which used computer-generated scenes: distance estimation in real environments rendered to head-mounted displays, where distance estimation is among the most challenging issues still under investigation and not fully understood. This thesis introduces a dual-camera video feed system through a virtual reality head-mounted display with two models, a video-based model and a static photo-based model, whose purpose is to explore whether the misjudgment of distances in HMDs could be due to a lack of realism, using a real-world scene-rendering system. Distance judgment performance in the real world and in these two evaluated VE models was compared using protocols already proven to accurately measure real-world distance estimation.
An improved model was then developed that enhances the field of view (FOV) of the displayed scenes to improve distance judgments when displaying real-world VR content on HMDs. It mitigates the limited FOV, which is among the leading candidate causes of distance underestimation, especially the mismatch between the camera's and the HMD's fields of view. The proposed model uses a set of two cameras to generate the video, instead of the hundreds of input cameras, or the tens of cameras mounted on a circular rig, used in previous work in the literature. Results from the first implementation of this system found that the underestimation was smaller when the model was rendered as a static photo than with the live video feed: the video-based (real + HMD) model and the static photo-based (real + photo + HMD) model averaged 80.2% and 81.4% of the actual distance, respectively, compared to real-world estimations, which averaged 92.4%. The improved approach (real + HMD + FOV) was compared to these two models and showed an improvement of 11%, increasing the estimation accuracy from 80% to 91% and reducing the estimation error from 1.29% to 0.56%. These results present strong evidence of the need for novel methods to improve distance estimation in real-world VR content systems, and provide effective initial work toward this goal.
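The accuracy figures in this abstract can be organized as fractions of the actual distance, which makes the reported 11% improvement a simple subtraction. A small sketch using the values reported above (the dictionary keys are labels chosen here, not the thesis's terminology):

```python
# Mean accuracy as a fraction of the actual distance, as reported in the abstract
accuracy = {
    "real_world": 0.924,        # real-world baseline
    "video_based_hmd": 0.802,   # live dual-camera feed (real + HMD)
    "photo_based_hmd": 0.814,   # static photo model (real + photo + HMD)
    "improved_fov_hmd": 0.91,   # FOV-matched improved model (real + HMD + FOV)
}

def underestimation(ratio):
    """Fraction by which distances are underestimated."""
    return 1.0 - ratio

for name, r in accuracy.items():
    print(f"{name}: {r:.1%} of actual, {underestimation(r):.1%} underestimated")

# Improvement of the FOV-matched model over the video-based baseline
gain = accuracy["improved_fov_hmd"] - accuracy["video_based_hmd"]
print(f"improvement: {gain:.1%}")
```

The computed gain of roughly 11 percentage points matches the improvement the abstract reports, and the improved model closes most of the gap to the real-world baseline.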