
    Effects of virtual acoustics on dynamic auditory distance perception

    Sound propagation encompasses various acoustic phenomena, including reverberation. Current virtual acoustic methods, ranging from parametric filters to physically accurate solvers, can simulate reverberation with varying degrees of fidelity. We investigate the effects of reverberant sounds generated using different propagation algorithms on acoustic distance perception, i.e., how far away humans perceive a sound source to be. In particular, we evaluate two classes of methods for real-time sound propagation in dynamic scenes, based on parametric filters and on ray tracing. Our study shows that the more accurate, ray-tracing-based method produces less distance compression than the approximate, filter-based method. This suggests that accurate reverberation in VR results in a better reproduction of acoustic distances. We also quantify the levels of distance compression introduced by different propagation methods in a virtual environment.
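    As a rough sketch of the physics behind the reverberation-based distance cue this study manipulates, assuming a point source in a simple diffuse-field room model (the symbols and the power-function summary below are illustrative, not taken from the paper): the direct sound level falls by about 6 dB per doubling of distance while the reverberant level stays roughly constant, so the direct-to-reverberant ratio decreases with distance, and the degree of distance compression is commonly summarized with a compressive power fit.

```latex
% Direct sound level under the inverse-distance law, with reference level L_1 at 1 m:
L_{\mathrm{direct}}(d) \approx L_{1} - 20 \log_{10}(d)

% Beyond the critical distance the reverberant level is roughly independent of d, so the
% direct-to-reverberant ratio (a primary auditory distance cue) falls ~6 dB per doubling:
\mathrm{DRR}(d) \approx L_{\mathrm{direct}}(d) - L_{\mathrm{reverb}}

% Perceived distance is often summarized with a compressive power fit (a < 1 indicates compression):
\hat{d} = k \, d^{\,a}
```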

    More than just perception-action recalibration: walking through a virtual environment causes rescaling of perceived space.

    Egocentric distances in virtual environments are commonly underperceived by up to 50% of the intended distance. However, a brief period of interaction in which participants walk through the virtual environment while receiving visual feedback can dramatically improve distance judgments. Two experiments were designed to explore whether the increase in postinteraction distance judgments is due to perception–action recalibration or the rescaling of perceived space. Perception–action recalibration as a result of walking interaction should only affect action-specific distance judgments, whereas rescaling of perceived space should affect all distance judgments based on the rescaled percept. Participants made blind-walking distance judgments and verbal size judgments in response to objects in a virtual environment before and after interacting with the environment through either walking (Experiment 1) or reaching (Experiment 2). Size judgments were used to infer perceived distance under the assumption of size–distance invariance, and these served as an implicit measure of perceived distance. Preinteraction walking and size-based distance judgments indicated an underperception of egocentric distance, whereas postinteraction walking and size-based distance judgments both increased as a result of the walking interaction, indicating that walking through the virtual environment with continuous visual feedback caused rescaling of the perceived space. However, interaction with the virtual environment through reaching had no effect on either type of distance judgment, indicating that physical translation through the virtual environment may be necessary for a rescaling of perceived space. Furthermore, the size-based distance and walking distance judgments were highly correlated, even across changes in perceived distance, providing support for the size–distance invariance hypothesis.
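    The size-based inference works roughly as follows under the size-distance invariance hypothesis; this is a minimal sketch in illustrative notation (theta is the object's visual angle), not the authors' own formulation.

```latex
% Size-distance invariance: perceived size S' and perceived distance D' are linked
% through the object's visual angle \theta:
S' = D' \tan\theta

% A verbal size judgment S' therefore implies a perceived distance of
D' = \frac{S'}{\tan\theta}
```

    If walking feedback rescales perceived space, both the blind-walking judgments and the size-derived estimates of D' should shift together, which is the pattern the experiments report.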

    Effects of Interaction with an Immersive Virtual Environment on Near-field Distance Estimates

    Distances are regularly underestimated in immersive virtual environments (IVEs) (Witmer & Kline, 1998; Loomis & Knapp, 2003). Few experiments, however, have examined the ability of calibration to overcome distortions of depth perception in IVEs. This experiment is designed to examine the effect of calibration via haptic and visual feedback on distance estimates in an IVE. Participants provided verbal and reaching distance estimates during three sessions: a baseline measure without feedback, a calibration session with visual and haptic feedback, and finally a post-calibration session without feedback. Feedback was shown to calibrate distance estimates within an IVE. Discussion focused on the possibility that costly solutions and research endeavors seeking to remedy the compression of distances may become less necessary if users are simply given the opportunity to use manual activity to calibrate to the IVE.
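    Calibration effects of this kind are typically quantified by comparing the judged/actual distance ratio before and after the feedback session. The sketch below uses hypothetical names and numbers, not values from the study.

```python
# Minimal sketch (hypothetical data): quantify how feedback shifts
# distance estimates toward the actual target distances.
import numpy as np

actual = np.array([30.0, 40.0, 50.0])   # target distances (cm), illustrative
pre    = np.array([24.0, 31.0, 38.0])   # baseline estimates, no feedback
post   = np.array([29.0, 38.5, 48.0])   # post-calibration estimates

def accuracy_ratio(estimates, targets):
    """Mean judged/actual ratio: 1.0 = calibrated, < 1.0 = underestimation."""
    return float(np.mean(estimates / targets))

print(f"pre-feedback ratio:  {accuracy_ratio(pre, actual):.2f}")
print(f"post-feedback ratio: {accuracy_ratio(post, actual):.2f}")
```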

    The Underestimation Of Egocentric Distance: Evidence From Frontal Matching Tasks

    There is controversy over the existence, nature, and cause of error in egocentric distance judgments. One proposal is that the systematic biases often found in explicit judgments of egocentric distance along the ground may be related to recently observed biases in the perceived declination of gaze (Durgin & Li, Attention, Perception, & Psychophysics, in press). To measure perceived egocentric distance nonverbally, observers in a field were asked to position themselves so that their distance from one of two experimenters was equal to the frontal distance between the experimenters. Observers placed themselves too far away, consistent with egocentric distance underestimation. A similar experiment was conducted with vertical frontal extents. Both experiments were replicated in panoramic virtual reality. Perceived egocentric distance was quantitatively consistent with angular bias in perceived gaze declination (a gain of 1.5). Finally, an exocentric distance-matching task was contrasted with a variant of the egocentric matching task. The egocentric matching data approximate a constant compression of perceived egocentric distance with a power function exponent of nearly 1; exocentric matches had an exponent of about 0.67. The divergent pattern between egocentric and exocentric matches suggests that they depend on different visual cues.
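    A worked sketch of how a 1.5 gain on perceived gaze declination would compress perceived egocentric distance along the ground, assuming eye height h, a target on the ground plane, and distance recovered from a veridical eye height; the notation is illustrative rather than the authors' own.

```latex
% Actual declination of gaze to a ground target at distance d, with eye height h:
\gamma = \arctan\!\left(\frac{h}{d}\right)

% If perceived declination is expanded by a gain of 1.5,
\gamma' = 1.5\,\gamma,

% then the distance recovered from eye height is compressed relative to d:
d' = \frac{h}{\tan(1.5\,\gamma)} < d
```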

    Effects of Clutter on Egocentric Distance Perception in Virtual Reality

    To assess the impact of clutter on egocentric distance perception, we performed a mixed-design study with 60 participants in four different virtual environments (VEs) with three levels of clutter. Additionally, we compared indoor and outdoor VE characteristics and the field of view (FOV) of the head-mounted display (HMD). The participants wore a backpack computer and a wide-FOV HMD as they blind-walked towards three distinct targets at distances of 3 m, 4.5 m, and 6 m. The HMD's FOV was programmatically limited to 165°×110°, 110°×110°, or 45°×35°. The results showed that increased clutter in the environment led to more precise distance judgments and less underestimation, independent of the FOV. Indoor VEs yielded more accurate distance judgments than outdoor VEs. Additionally, participants made more accurate judgments when viewing the VEs through wider FOVs.
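    For context, blind-walking accuracy in studies like this is commonly summarized as the ratio of walked distance to target distance per condition. The sketch below uses hypothetical column names and made-up values, not the study's data.

```python
# Minimal sketch (hypothetical data): summarize blind-walking accuracy
# as walked/target distance per FOV and clutter condition.
import pandas as pd

df = pd.DataFrame({
    "fov":      ["165x110", "165x110", "45x35", "45x35"],
    "clutter":  ["high", "low", "high", "low"],
    "target_m": [3.0, 4.5, 6.0, 4.5],
    "walked_m": [2.8, 3.7, 5.1, 3.3],
})

df["ratio"] = df["walked_m"] / df["target_m"]  # 1.0 = accurate, < 1.0 = underestimation
print(df.groupby(["fov", "clutter"])["ratio"].mean())
```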

    Viewing medium affects arm motor performance in 3D virtual environments

    Background: 2D and 3D virtual reality platforms are used for designing individualized training environments for post-stroke rehabilitation. Virtual environments (VEs) are viewed using media such as head-mounted displays (HMDs) and large-screen projection systems (SPS), which can influence the quality of perception of the environment. We examined whether there were differences in arm pointing kinematics when subjects with and without stroke viewed a 3D VE through two different media: HMD and SPS. Methods: Two groups of subjects participated (healthy control, n = 10, aged 53.6 ± 17.2 yrs; stroke, n = 20, 66.2 ± 11.3 yrs). Arm motor impairment and spasticity were assessed in the stroke group, which was divided into mild (n = 10) and moderate-to-severe (n = 10) sub-groups based on Fugl-Meyer scores. Subjects pointed (8 times each) to 6 randomly presented targets located at two heights in the ipsilateral, middle, and contralateral arm workspaces. Movements were repeated in the same VE viewed using the HMD (Kaiser XL50) and the SPS. Movement kinematics were recorded using an Optotrak system (Certus, 6 markers, 100 Hz). Upper limb motor performance (precision, velocity, trajectory straightness) and movement pattern (elbow and shoulder ranges, trunk displacement) outcomes were analyzed using repeated-measures ANOVAs. Results: For all groups, there were no differences between the two media in endpoint trajectory straightness, shoulder flexion and shoulder horizontal adduction ranges, or sagittal trunk displacement. All subjects, however, made larger errors in the vertical direction using the HMD compared to the SPS. Healthy subjects also made larger errors in the sagittal direction, made slower movements overall, and used less range of elbow extension for the lower central target using the HMD compared to the SPS. The mild and moderate-to-severe sub-groups made larger RMS errors with the HMD. The only advantage of the HMD was that movements in the moderate-to-severe stroke sub-group were 22% faster than with the SPS. Conclusions: Despite the similarity in the majority of the movement kinematics, differences in movement speed and larger errors were observed for movements using the HMD. Use of the SPS may be a more comfortable and effective option for viewing VEs for upper limb rehabilitation post-stroke. This has implications for the use of VR applications to enhance upper limb recovery.
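    Two of the kinematic outcomes named above, endpoint error and trajectory straightness, can be computed from marker data along the following lines. This is a hedged sketch with hypothetical inputs and function names, not the authors' analysis code.

```python
# Minimal sketch (hypothetical marker data): endpoint error and a
# straightness index (path length divided by straight-line distance).
import numpy as np

def endpoint_error(endpoint_xyz, target_xyz):
    """Euclidean distance between final fingertip position and the target (same units)."""
    return float(np.linalg.norm(np.asarray(endpoint_xyz) - np.asarray(target_xyz)))

def straightness_index(trajectory_xyz):
    """Path length / chord length; 1.0 means a perfectly straight reach."""
    traj = np.asarray(trajectory_xyz)
    path_len = np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1))
    chord = np.linalg.norm(traj[-1] - traj[0])
    return float(path_len / chord)

# Illustrative fingertip trajectory samples (metres)
traj = np.array([[0.00, 0.00, 0.00],
                 [0.10, 0.02, 0.01],
                 [0.22, 0.03, 0.01],
                 [0.30, 0.00, 0.00]])
print(endpoint_error(traj[-1], [0.31, 0.0, 0.0]), straightness_index(traj))
```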

    Action And Motivation: Measuring Perception Or Strategies?

    It has been suggested that when judging the distance to a desirable object, motivated distortions of perceived distance occur, and that these distortions can be measured by actions, such as throwing a beanbag. The results of two new experiments suggest that reported variations in beanbag performance may instead depend on instructional effects, such as instructions that emphasize proximity rather than accuracy. When the goal was to be closest to the target, underthrowing was observed, whether the target was intrinsically valuable or not. When the goal was to hit the target, however, throwing performance was unbiased.

    The impact of background and context on car distance estimation

    It is well established that people underestimate the distance to objects depicted in virtual environments and two-dimensional (2D) displays. The reasons for the underestimation are still not fully understood. It is becoming more common to use virtual environment displays for driver training and testing, so understanding the distortion of perceived space that occurs in these displays is vital. We need to know which aspects of the display cause the observer to misperceive the distance to objects in the simulated environments. The research reported in this thesis investigated how people estimate the distance between themselves and a car in front of them, within a number of differing environmental contexts. Four experiments were run using virtual environment displays of various kinds, and a fifth experiment was run in a real-world setting. It was found that distance underestimation when viewing 2D displays is very common, even when familiar objects such as cars are used as the targets. The experiments also verified that people underestimate distance more in a virtual environment than in a real-world setting. A surprising and somewhat counterintuitive result was that people underestimate distance more when the scene depicts forward motion of the observer compared to a static view. The research also identified which visual features in the display (e.g., texture information) and which aspects of the display (e.g., field of view) affected the perception of distance and which had no effect. The findings should help the designers of driver-training simulators and testing equipment to better understand the types of errors that can occur when humans view two-dimensional virtual environment displays.

    Perceived Space in the HTC Vive

    Underperception of egocentric distance in virtual reality has been a persistent concern for almost 20 years. Modern head-mounted displays (HMDs) appear to have begun to ameliorate underperception. The current study examined several aspects of perceived space in the HTC Vive. Blind-walking distance judgments, verbal distance judgments, and size judgments were measured in two distinct virtual environments (VEs), a high-quality replica of a real classroom and an empty grass field, as well as in the real classroom upon which the classroom VE was modeled. A brief walking interaction was also examined as an intervention for improving anticipated underperception in the VEs. Results from the Vive were compared to existing data from two older HMDs (nVisor SX111 and ST50). Blind-walking judgments were more accurate in the Vive than in the older displays, and did not differ substantially from the real world or across VEs. Size judgments were more accurate in the classroom VE than in the grass VE, and in the Vive compared to the older displays. Verbal judgments were significantly smaller in the classroom VE than in the real classroom and did not significantly differ across VEs. Blind-walking and size judgments were more accurate after walking interaction, but verbal judgments were unaffected. The results indicate that underperception of distance in the HTC Vive is less than in older displays but has not yet been completely resolved. With more accurate space perception afforded by modern HMDs, alternative methods for improving judgments of perceived space, such as walking interaction, may no longer be necessary.

    Blind Direct Walking Distance Judgment Research: A Best Practices Guide

    Over the last 30 years, Virtual Reality (VR) research has shown that distance perception in VR is compressed compared to the real world. The full reason for this is still unknown. Though many experiments have been run to study the underlying reasons for this compression, often with similar procedures, the experimental details either show significant variation between experiments or go unreported. This makes it difficult to accurately repeat or compare experiments, and it negatively impacts new researchers trying to learn and follow current best practices. In this paper, we present a review of past research and of the details that typically go unreported. Using this review and the practices of our advisor as evidence, we suggest a standard to assist researchers in performing quality research pertaining to blind direct walking distance judgments in VR.