
    Effects of virtual acoustics on dynamic auditory distance perception

    Sound propagation encompasses various acoustic phenomena, including reverberation. Current virtual acoustic methods, ranging from parametric filters to physically accurate solvers, can simulate reverberation with varying degrees of fidelity. We investigate the effects of reverberant sounds generated using different propagation algorithms on acoustic distance perception, i.e., how far away humans perceive a sound source to be. In particular, we evaluate two classes of methods for real-time sound propagation in dynamic scenes, based on parametric filters and ray tracing. Our study shows that the more accurate method exhibits less distance compression than the approximate, filter-based method. This suggests that accurate reverberation in VR results in a better reproduction of acoustic distances. We also quantify the levels of distance compression introduced by different propagation methods in a virtual environment. Comment: 8 pages, 7 figures
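    Distance compression of the kind quantified above is commonly summarized by fitting a power function, perceived = k · actual^a, where an exponent a below 1 indicates compression. A minimal sketch of that fit, using invented example data (the arrays below are illustrative, not the study's measurements):

```python
import numpy as np

# Hypothetical data: actual source distances (m) and mean judged distances (m).
actual = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
judged = np.array([0.9, 1.6, 2.9, 5.0, 8.5])

# Fit judged = k * actual**a as a line in log-log space;
# the slope is the compression exponent a (a < 1 means compression).
a, log_k = np.polyfit(np.log(actual), np.log(judged), 1)
k = np.exp(log_k)
print(f"exponent a = {a:.2f}, gain k = {k:.2f}")
```

    Comparing the fitted exponent across propagation methods gives a single number per condition, which is one common way such compression levels are reported.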

    Effects of Clutter on Egocentric Distance Perception in Virtual Reality

    To assess the impact of clutter on egocentric distance perception, we performed a mixed-design study with 60 participants in four different virtual environments (VEs) with three levels of clutter. Additionally, we compared indoor/outdoor VE characteristics and the HMD's FOV. The participants wore a backpack computer and a wide-FOV head-mounted display (HMD) as they blind-walked toward three distinct targets at distances of 3 m, 4.5 m, and 6 m. The HMD's field of view (FOV) was programmatically limited to 165°×110°, 110°×110°, or 45°×35°. The results showed that increased clutter in the environment led to more precise distance judgments and less underestimation, independent of the FOV. Indoor VEs yielded more accurate distance judgments than outdoor VEs. Additionally, participants made more accurate judgments when viewing the VEs through wider FOVs. Comment: 10 pages, 10 figures

    Perceived Space in the HTC Vive

    Underperception of egocentric distance in virtual reality has been a persistent concern for almost 20 years. Modern head-mounted displays (HMDs) appear to have begun to ameliorate underperception. The current study examined several aspects of perceived space in the HTC Vive. Blind-walking distance judgments, verbal distance judgments, and size judgments were measured in two distinct virtual environments (VEs)—a high-quality replica of a real classroom and an empty grass field—as well as in the real classroom upon which the classroom VE was modeled. A brief walking interaction was also examined as an intervention for improving anticipated underperception in the VEs. Results from the Vive were compared to existing data from two older HMDs (nVisor SX111 and ST50). Blind-walking judgments were more accurate in the Vive than in the older displays, and did not differ substantially from the real world or across VEs. Size judgments were more accurate in the classroom VE than in the grass VE, and in the Vive compared to the older displays. Verbal judgments were significantly smaller in the classroom VE compared to the real classroom and did not significantly differ across VEs. Blind-walking and size judgments were more accurate after the walking interaction, but verbal judgments were unaffected. The results indicate that underperception of distance in the HTC Vive is less than in older displays but has not been completely resolved. With more accurate space perception afforded by modern HMDs, alternative methods for improving judgments of perceived space—such as walking interaction—may no longer be necessary.
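    Accuracy in blind-walking studies like those above is typically summarized as the ratio of walked distance to actual target distance, with values below 1 indicating underestimation. A minimal sketch with invented per-trial numbers (not data from any of the studies listed here):

```python
import numpy as np

# Hypothetical per-trial data: target distance (m) and distance walked (m).
targets = np.array([3.0, 4.5, 6.0, 3.0, 4.5, 6.0])
walked = np.array([2.8, 4.1, 5.3, 3.1, 4.4, 5.6])

ratios = walked / targets                   # per-trial accuracy; 1.0 = perfect
mean_ratio = ratios.mean()                  # < 1.0 means underestimation
signed_error = (walked - targets).mean()    # negative = walked short of target

print(f"mean ratio {mean_ratio:.2f}, mean signed error {signed_error:.2f} m")
```

    Reporting both the ratio and the signed error makes underestimation comparable across displays and target distances.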

    Action And Motivation: Measuring Perception Or Strategies?

    It has been suggested that when judging the distance to a desirable object, motivated distortions of perceived distance occur, and that these distortions can be measured by actions such as throwing a beanbag. The results of two new experiments suggest that reported variations in beanbag performance may instead depend on instructional effects, such as instructions that emphasize proximity rather than accuracy. When the goal was to land closest to the target, underthrowing was observed whether or not the target was intrinsically valuable. When the goal was to hit the target, however, throwing performance was unbiased.

    Effects of Interaction with an Immersive Virtual Environment on Near-field Distance Estimates

    Distances are regularly underestimated in immersive virtual environments (IVEs) (Witmer & Kline, 1998; Loomis & Knapp, 2003). Few experiments, however, have examined the ability of calibration to overcome distortions of depth perception in IVEs. This experiment was designed to examine the effect of calibration via haptic and visual feedback on distance estimates in an IVE. Participants provided verbal and reaching distance estimates during three sessions: a baseline measure without feedback, a calibration session with visual and haptic feedback, and finally a post-calibration session without feedback. Feedback was shown to calibrate distance estimates within an IVE. The discussion focuses on the possibility that costly solutions and research endeavors seeking to remedy the compression of distances may become less necessary if users are simply given the opportunity to use manual activity to calibrate to the IVE.

    The visual perception of distance in action space.

    This work examines our perception of distance within action space (about 2 m to 30 m), an ability that is important for various actions. Two general problems are addressed: what information can be used to judge distance accurately, and how is it processed? The dissertation is in two parts. The first part considers the "what" question. Subjects' distance judgments were examined in real, altered, and virtual environments, using perceptual tasks or actions to assess the role of a variety of intrinsic and environmental depth cues. The findings show that the perception of angular declination, or height in the visual field, is largely veridical, and that a target is visually located on the projection line from the observer's eyes to it. It is also shown that a continuous ground texture is essential for veridical space perception. Of multiple textural cues, linear perspective is a strong cue for representing the ground and hence judging distance, but compression is a relatively ineffective cue. In the second part, the sequential surface integration process (SSIP) hypothesis is proposed to explain the processing of depth information. The hypothesis asserts that an accurate representation of the ground surface is critical for veridical space perception, and that a global ground representation is formed by an integrative process that samples and combines local information over space and time. Confirming this, the experiments found that information from an extended ground area is necessary for judging distance accurately, and that distance was underestimated when an observer's view was restricted to the local ground area around the target. The SSIP hypothesis also suggests that, to build an accurate ground representation, the integrative process might start from near space, where rich depth cues can provide a reliable initial representation, and then progressively extend to distant areas. This is confirmed by the finding that subjects could judge distance accurately by scanning local patches of the ground surface from near to far, but not in the reverse direction.

    Representing 3D Space in Working Memory: Spatial Images from Vision, Hearing, Touch, and Language

    The chapter deals with a form of transient spatial representation referred to as a spatial image. Like a percept, it is externalized, scaled to the environment, and can appear in any direction about the observer. It transcends the concept of modality, as it can be based on inputs from the three spatial senses, from language, and from long-term memory. Evidence is presented that supports each of the claimed properties of the spatial image, showing that it is quite different from a visual image. Much of the evidence presented is based on spatial updating. A major concern is whether spatial images from different input modalities are functionally equivalent: whether, once instantiated in working memory, spatial images from different modalities have the same functional characteristics with respect to subsequent processing, such as that involved in spatial updating. Going further, the research provides some evidence that spatial images are amodal (i.e., they do not retain modality-specific features).

    The Impact of Pre-Experiment Walking on Distance Perception in VR

    While individuals can accurately estimate distances in the real world, this ability is often diminished in virtual reality (VR) simulations, hampering performance in training, entertainment, prototyping, and education domains. To assess distance judgments, the direct blind walking method—having participants walk blindfolded to targets—is frequently used. Typically, direct blind walking measurements are performed after an initial practice phase in which people become comfortable with walking while blindfolded. Surprisingly, little research has explored how such pre-experiment walking impacts subsequent VR distance judgments. Our initial investigation revealed that increased pre-experiment blind walking reduced distance underestimation, underscoring the importance of detailing these preparatory procedures in research—details that are often overlooked. In a follow-up study, we found that eyes-open walking prior to pre-experiment blind walking did not influence results, while extensive pre-experiment blind walking led to overestimation. Additionally, see-through walking produced a slightly larger effect and less underestimation than a single loop of pre-experiment blind walking. This research deepens our understanding of how pre-experiment methodologies influence distance judgments in VR, guides future research protocols, and elucidates the mechanics of distance estimation within virtual reality.

    Distance Perception Through Head-Mounted Displays

    It has been shown in numerous research studies that people tend to underestimate distances while wearing head-mounted displays (HMDs). We investigated various possible factors affecting the perception of distance in HMDs through multiple studies. Many contributing factors have been identified by researchers in the past decades; however, further investigation is required to provide a better understanding of this problem. To establish a baseline for distance underestimation, we performed a study comparing distance perception in the real world, with a fake headset, and with a see-through HMD. Users underestimated distances while wearing the fake headset or the see-through HMD. The fake headset and the see-through HMD produced similar results, while both differed significantly from the real-world results. Since the fake headset and the HMD yielded similar underestimation, we focused on the FOV of the headset, a factor common to both conditions. To understand the effects of FOV on distance perception in a virtual environment, we performed a study using a blind-throwing task. FOVs at three diagonal angles (60°, 110°, and 200°) were compared. The results showed that people underestimate distances more under restricted FOVs. As this study used static 360° images of a single environment, we then examined whether the results extend to various 3D environments. We performed a mixed-design study comparing the effects of horizontal FOV and vertical FOV on egocentric distance perception in four different realistic VEs. The results indicated more accurate distance judgments with a larger horizontal FOV, with no significant effect of vertical FOV. More accurate distance judgments were observed in indoor VEs than in outdoor VEs. Participants also judged distances more accurately in cluttered environments than in uncluttered ones. These results highlight the importance of the environment in distance-critical VR applications and show that a wider horizontal FOV should be considered for improved distance judgment.
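    Since the abstract reports diagonal FOVs in one study and separate horizontal/vertical FOVs in another, it is worth noting how the two relate. For an ideal rectilinear (flat-projection) frustum, the diagonal FOV follows from the horizontal and vertical half-angle tangents; this is only an approximation for wide HMD optics, and it does not apply at all to FOVs of 180° or more (such as the 200° condition). A sketch under that rectilinear assumption:

```python
import math

def diagonal_fov(h_deg: float, v_deg: float) -> float:
    """Diagonal FOV (degrees) of a rectilinear frustum, given
    horizontal and vertical FOV; valid only below 180 degrees."""
    th = math.tan(math.radians(h_deg) / 2)  # horizontal half-angle tangent
    tv = math.tan(math.radians(v_deg) / 2)  # vertical half-angle tangent
    return math.degrees(2 * math.atan(math.hypot(th, tv)))

# e.g. a 110x110 degree display has a diagonal of roughly 127 degrees
print(f"{diagonal_fov(110, 110):.1f} deg")
```

    This makes clear that a "110° FOV" quoted diagonally is noticeably narrower horizontally, which matters when comparing FOV conditions across studies.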

    The Effects of Head-Centric Rest Frames on Egocentric Distance Perception in Virtual Reality

    Several research investigations have shown that users tend to underestimate distances in virtual reality (VR). Virtual objects that appear close to users wearing a head-mounted display (HMD) might be located at a farther distance in reality. This discrepancy between the actual distance and the distance observed by users in VR has been found to hinder users from benefiting from the full in-VR immersive experience, and several efforts have been directed toward finding the causes and developing tools that mitigate this phenomenon. One hypothesis that stands out in the field of spatial perception is the rest frame hypothesis (RFH), which states that visual frames of reference (RFs), defined as fixed reference points of view in a virtual environment (VE), contribute to minimizing sensory mismatch. RFs have been shown to promote better eye-gaze stability and focus, reduce VR sickness, and improve visual search, along with other benefits. However, their effect on distance perception in VEs has not been evaluated. To explore and better understand the potential effects that RFs can have on distance perception in VR, we used a blind walking task to examine the effect of three head-centric RFs (a mesh mask, a nose, and a hat) on egocentric distance estimation. We performed a mixed-design study in which we compared the effect of each of our chosen RFs across different environmental conditions and target distances in different 3D environments. We found that at near and mid-field distances, certain RFs can improve the user's distance estimation accuracy and reduce distance underestimation. Additionally, we found that participants judged distance more accurately in cluttered environments compared to uncluttered environments. Our findings show that the characteristics of the 3D environment are important in distance-estimation-dependent tasks in VR, and that the addition of head-centric RFs, a simple avatar augmentation method, can lead to meaningful improvements in distance judgments, user experience, and task performance in VR.