
    Towards transparent telepresence

    It is proposed that the concept of transparent telepresence can be closely approached through high-fidelity technological mediation. It is argued that matching the system's capabilities to those of the human user will yield a strong sense of immersion and presence at a remote site. Some applications of such a system are noted. The concept is explained, and critical system elements are described together with an overview of some of the necessary system specifications.

    I Am The Passenger: How Visual Motion Cues Can Influence Sickness For In-Car VR

    This paper explores the use of VR head-mounted displays (HMDs) in-car and in-motion for the first time. Immersive HMDs are becoming everyday consumer items and, as they offer new possibilities for entertainment and productivity, people will want to use them during travel in, for example, autonomous cars. However, their use is hindered by motion sickness, caused in part by restricted visual perception of motion conflicting with physically perceived vehicle motion (accelerations/rotations detected by the vestibular system). Whilst VR HMDs restrict visual perception of motion, they could also render it virtually, potentially alleviating sensory conflict. To study this problem, we conducted the first on-road, in-motion study to systematically investigate the effects of various visual presentations of a car's real-world motion on the sickness and immersion of HMD-wearing passengers. We established new baselines for in-car VR motion sickness and found that there is no single best presentation with respect to balancing sickness and immersion. Instead, user preferences suggest that differently susceptible users require different solutions for usable in-car VR. This work provides formative insights for VR designers and an entry point for further research into enabling the use of VR HMDs, and the rich experiences they offer, while travelling.
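
    The core idea in this abstract, rendering the vehicle's physically felt motion inside the headset so the visual and vestibular senses agree, can be illustrated with a minimal sketch. This is not the paper's system; the function name, the IMU fields, and the simple Euler integration are all illustrative assumptions.

```python
# Hypothetical sketch: drive the virtual camera from the car's sensed motion so that
# what the passenger sees matches what their vestibular system feels.

def step_virtual_camera(cam, imu, dt):
    """Advance the virtual camera from one IMU sample (yaw_rate rad/s, accel m/s^2)."""
    yaw = cam["yaw"] + imu["yaw_rate"] * dt      # turn the view with the real car
    speed = cam["speed"] + imu["accel"] * dt     # speed up / slow down with the car
    pos = cam["pos"] + speed * dt                # move forward along the virtual path
    return {"yaw": yaw, "speed": speed, "pos": pos}

# One second of samples at 100 Hz while the car turns gently and accelerates:
cam = {"yaw": 0.0, "speed": 0.0, "pos": 0.0}
for _ in range(100):
    cam = step_virtual_camera(cam, {"yaw_rate": 0.1, "accel": 2.0}, dt=0.01)
```

    A real system would map full 6-DoF motion and handle sensor noise; the sketch only shows the coupling between sensed and rendered motion that the paper's visual presentations varied.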

    Perception Of Visual Speed While Moving

    During self-motion, the world normally appears stationary. In part, this may be due to reductions in visual motion signals during self-motion. In 8 experiments, the authors used magnitude estimation to characterize changes in visual speed perception as a result of biomechanical self-motion alone (treadmill walking), physical translation alone (passive transport), and both together (walking). Their results show that each factor alone produces a subtractive reduction in visual speed, but that subtraction is greatest with both factors together, approximating the sum of the two separately. The similarity of results for biomechanical and passive self-motion supports H. B. Barlow's (1990) inhibition theory of sensory correlation as a mechanism for implementing H. Wallach's (1987) compensation for self-motion.
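
    The subtractive model described above can be written down directly: each self-motion factor subtracts a roughly fixed amount from perceived visual speed, and the two subtractions combine approximately additively. This is a sketch of the model, not the authors' code, and the numeric values are illustrative assumptions rather than data from the paper.

```python
# Hypothetical sketch of the subtractive-reduction model; values are illustrative.

def perceived_visual_speed(retinal_speed, biomech=0.0, physical=0.0):
    """Perceived speed after subtractive compensation, floored at zero."""
    return max(0.0, retinal_speed - biomech - physical)

stationary = perceived_visual_speed(10.0)               # no self-motion: 10.0
treadmill  = perceived_visual_speed(10.0, biomech=3.0)  # biomechanical only: 7.0
transport  = perceived_visual_speed(10.0, physical=2.0) # passive transport only: 8.0
walking    = perceived_visual_speed(10.0, 3.0, 2.0)     # both: 5.0, i.e. the sum of
                                                        # the two separate reductions
```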

    Visuo-vestibular interaction in the reconstruction of travelled trajectories

    We recently published a study of the reconstruction of passively travelled trajectories from optic flow. Perception was prone to illusions in a number of conditions, and not always veridical in the others. Part of the illusory reconstructed trajectories could be explained by assuming that subjects base their reconstruction on the ego-motion percept built during the stimulus' initial moments. In the current paper, we test this hypothesis using a novel paradigm: if the final reconstruction is governed by the initial percept, providing additional, extra-retinal information that modifies the initial percept should predictably alter the final reconstruction. The extra-retinal stimulus was tuned to supplement the information that was under-represented or ambiguous in the optic flow: the subjects were physically displaced or rotated at the onset of the visual stimulus. A highly asymmetric velocity profile (high acceleration, very low deceleration) was used. Subjects were required to guide an input device (in the form of a model vehicle; we measured position and orientation) along the perceived trajectory. We show for the first time that a vestibular stimulus of short duration can influence the perception of a much longer-lasting visual stimulus. Perception of the ego-motion translation component in the visual stimulus was improved by a linear physical displacement; perception of the ego-motion rotation component, by a physical rotation. This led to a more veridical reconstruction in some conditions, but to a less veridical reconstruction in others.

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system in which multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and lets users feel by employing passive haptics: when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience in which interactions with virtual objects are mediated by hand-held input devices or hand gestures, and users are only shown a representation of their hands floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
    Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
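
    Because the virtual world is built on a 3D scan of the real room, mapping a tracked skeleton joint into VR can reduce to a single rigid transform: a rotation about the vertical axis plus a translation. The sketch below illustrates that idea; it is not the MetaSpace II code, and the function name and calibration values are assumptions.

```python
import math

# Hypothetical sketch: one rigid transform aligns room (tracker) coordinates with
# the coordinates of the 3D scan the virtual world was built on.

def physical_to_virtual(p, yaw=0.0, offset=(0.0, 0.0, 0.0)):
    """Map a tracked point (x, y, z) from room coordinates into scan coordinates."""
    x, y, z = p
    c, s = math.cos(yaw), math.sin(yaw)
    xr, zr = c * x + s * z, -s * x + c * z   # rotate about the vertical (y) axis
    return (xr + offset[0], y + offset[1], zr + offset[2])

# With an identity calibration, physical and virtual positions coincide, which is
# what lets users walk freely and avoid the real walls they see rendered in VR.
head = physical_to_virtual((1.2, 1.7, -0.5))
```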

    Latency Requirements for Head-Worn Display S/EVS Applications

    NASA's Aviation Safety Program, Synthetic Vision Systems Project is conducting research in advanced flight deck concepts, such as Synthetic/Enhanced Vision Systems (S/EVS), for commercial and business aircraft. An emerging thrust in this activity is the development of spatially integrated, large field-of-regard information display systems. Head-worn or helmet-mounted display systems are being proposed as one method by which to meet this objective. System delays or latencies inherent to spatially integrated, head-worn displays critically influence the display's utility, usability, and acceptability. Research results from three different yet related technical areas (flight control, flight simulation, and virtual reality) are collectively assembled in this paper to create a global perspective on delay or latency effects in head-worn or helmet-mounted display systems. Consistent definitions and measurement techniques are proposed for universal application, and latency requirements for head-worn display S/EVS applications are drafted. Future research areas are defined.
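
    A common way to reason about the end-to-end delay the abstract is concerned with is as a budget summed over pipeline stages, from head motion to photons on the display. The stage names and millisecond values below are illustrative assumptions, not the requirements drafted in the paper.

```python
# Hypothetical latency budget for a head-worn display pipeline; values illustrative.

def motion_to_photon_ms(stages):
    """Total head-motion-to-display latency as the sum of stage delays (ms)."""
    return sum(stages.values())

budget = {
    "head_tracker": 4.0,     # sensing and pose estimation
    "data_transport": 2.0,   # link from tracker to the render host
    "rendering": 11.0,       # scene generation for one frame
    "display_scanout": 8.0,  # panel refresh and persistence
}
total = motion_to_photon_ms(budget)   # 25.0 ms for this illustrative budget
```

    Framing latency as a budget makes trade-offs explicit: shaving the rendering stage, for example, buys headroom for a slower tracker while keeping the same total.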

    An Introduction to 3D User Interface Design

    3D user interface design is a critical component of any virtual environment (VE) application. In this paper, we present a broad overview of three-dimensional (3D) interaction and user interfaces. We discuss the effect of common VE hardware devices on user interaction, as well as interaction techniques for generic 3D tasks and the use of traditional two-dimensional interaction styles in 3D environments. We divide most user interaction tasks into three categories: navigation, selection/manipulation, and system control. Throughout the paper, our focus is on presenting not only the available techniques, but also practical guidelines for 3D interaction design, as well as widely held myths. Finally, we briefly discuss two approaches to 3D interaction design and some example applications with complex 3D interaction requirements. We also present an annotated online bibliography as a reference companion to this article.
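
    The three-way task taxonomy described above lends itself to a simple lookup table when structuring a VE application's input handling. The categories are the paper's; the concrete task names below are illustrative assumptions.

```python
# Hypothetical sketch of the navigation / selection-manipulation / system-control
# taxonomy as a dispatch table; task names are illustrative.

NAVIGATION = "navigation"
SELECTION_MANIPULATION = "selection/manipulation"
SYSTEM_CONTROL = "system control"

TASK_CATEGORY = {
    "travel": NAVIGATION,
    "wayfinding": NAVIGATION,
    "pick_object": SELECTION_MANIPULATION,
    "rotate_object": SELECTION_MANIPULATION,
    "change_tool_mode": SYSTEM_CONTROL,
    "issue_menu_command": SYSTEM_CONTROL,
}

def category_of(task):
    """Look up which of the three interaction categories a task falls into."""
    return TASK_CATEGORY[task]
```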