4 research outputs found

    Substitutional Reality System: A Novel Experimental Platform for Experiencing Alternative Reality

    We have developed a novel experimental platform, referred to as a substitutional reality (SR) system, for studying people's conviction that they are perceiving live reality, and related metacognitive functions. The SR system was designed to manipulate people's reality by letting them experience live scenes (in which they were physically present) and recorded scenes (recorded and edited in advance) in alternation, without their noticing the gap between the two. All of the naïve participants (n = 21) believed they were experiencing live scenes even when recorded scenes were being presented. Additional psychophysical experiments suggest that the depth of visual objects does not affect the perceptual discriminability between scenes, and that switching scenes during head movement enhances substitutional performance. The SR system, with its reality manipulation, is a novel and affordable method for studying metacognitive functions and psychiatric disorders.
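    The abstract reports that switching scenes while the head is moving improves substitution, but it does not spell out the switching logic. The Python sketch below is only a minimal illustration of that idea, assuming a head tracker that reports yaw rate and two frame sources; the threshold value, function name, and state fields are hypothetical rather than details taken from the paper.

        YAW_RATE_THRESHOLD_DEG_S = 30.0  # assumed value; the paper does not specify one

        def next_display_frame(live_frame, recorded_frame, head_yaw_rate_deg_s, state):
            """Return the frame to display, swapping between the live and recorded
            sources only while the head is turning fast enough that the cut is
            less likely to be noticed (the idea described in the abstract)."""
            if state["switch_requested"] and abs(head_yaw_rate_deg_s) > YAW_RATE_THRESHOLD_DEG_S:
                state["show_recorded"] = not state["show_recorded"]
                state["switch_requested"] = False
            return recorded_frame if state["show_recorded"] else live_frame

        # Example: request a substitution, then call once per rendered frame with tracker data.
        state = {"show_recorded": False, "switch_requested": True}
        frame = next_display_frame("live frame", "recorded frame", head_yaw_rate_deg_s=45.0, state=state)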

    Scene-motion thresholds during head yaw for immersive virtual environments

    In order to better understand how scene motion is perceived in immersive virtual environments, we measured scene-motion thresholds under different conditions across three experiments. Thresholds were measured during quasi-sinusoidal head yaw, single left-to-right or right-to-left head yaw, different phases of head yaw, slow to fast head yaw, scene motion relative to head yaw, and two scene illumination levels. We found that, across these conditions, 1) thresholds are greater when the scene moves with head yaw (corresponding to a gain of 1.0), and 2) thresholds increase as head motion increases.
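    The findings above are stated in terms of a gain relating scene motion to head yaw, which the abstract does not define. A common convention in this literature, given here only as background and as an assumption rather than a definition quoted from the paper, expresses the angular velocity of the scene as a gain times the angular velocity of the head:

        \omega_{\text{scene}}(t) = g\,\omega_{\text{head}}(t), \qquad g > 0 \ \text{(scene moves with head yaw)}, \quad g < 0 \ \text{(scene moves against head yaw)}

    Under this convention, a gain of 1.0 means the scene rotates at exactly the head's rate.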

    Scene-Motion Thresholds Correlate with Angular Head Motions for Immersive Virtual Environments

    No full text

    Scene-motion- and latency-perception thresholds for head-mounted displays

    A fundamental task of an immersive virtual environment (IVE) system is to present images of the virtual world that change appropriately as the user's head moves. Current IVE systems, especially those using head-mounted displays (HMDs), often produce spatially unstable scenes, resulting in simulator sickness, degraded task performance, degraded visual acuity, and breaks in presence. In HMDs, instability resulting from latency is greater than all other causes of instability combined. The primary way users perceive latency in an HMD is through improper motion of scenes that should be stationary in the world. Whereas latency-induced scene motion is well defined mathematically, less is understood about how much scene motion and/or latency can occur without subjects noticing, and how this varies under different conditions. I built a simulated HMD system with zero effective latency, so that no scene motion occurs due to latency, and then artificially inserted scene motion into the virtual environment in order to determine how much scene motion and/or latency can occur without subjects noticing. I measured perceptual thresholds for scene motion and latency under different conditions across five experiments. Based on the study of latency, head motion, scene motion, and perceptual thresholds, I developed a mathematical model of latency thresholds as an inverse function of peak head-yaw acceleration. Psychophysical studies showed that measured latency thresholds correlate with this inverse function better than with a linear function. The work reported here enables scientists and engineers to measure latency thresholds as a function of head motion, under their particular conditions, using an off-the-shelf projector system. Latency requirements can thus be determined before designing HMD systems.
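    The abstract describes latency-induced scene motion and the threshold model only qualitatively. One way to write the two relationships, offered here as a hedged reading of the abstract rather than as the dissertation's exact equations, is that a latency of \Delta t makes a world-stationary scene appear displaced by roughly the head's yaw velocity times the latency, and that the latency threshold falls off inversely with peak head-yaw acceleration:

        \theta_{\text{error}}(t) \approx \dot{\psi}_{\text{head}}(t)\,\Delta t, \qquad \Delta t_{\text{threshold}} \approx \frac{k}{\ddot{\psi}_{\text{peak}}}

    Here \psi_{\text{head}} is head yaw and k is a fitted constant; the constant and the exact functional form are assumptions, since the abstract states only that thresholds follow an inverse function of peak head-yaw acceleration.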