
    A Virtual Testbed for Fish-Tank Virtual Reality: Improving Calibration with a Virtual-in-Virtual Display

    With the development of novel calibration techniques for multimedia projectors and curved projection surfaces, volumetric 3D displays are becoming easier and more affordable to build. The basic requirements include a display shape that defines the volume (e.g. a sphere, cylinder, or cuboid) and a tracking system that provides each user's location for perspective-corrected rendering. When coupled with modern graphics cards, these displays are capable of high resolution, low latency, high frame rate, and even stereoscopic rendering; however, as many previous studies have shown, every component must be precisely calibrated for a compelling 3D effect. While human perceptual requirements have been extensively studied for head-tracked displays, most studies featured seated users in front of a flat display. It remains unclear whether results from these flat-display studies apply to newer, walk-around displays with enclosed or curved shapes. To investigate these issues, we developed a virtual testbed for volumetric head-tracked displays that can measure the calibration accuracy of the entire system in real time. We used this testbed to investigate visual distortions of prototype curved displays, improve existing calibration techniques, study the importance of stereo to performance and perception, and validate perceptual calibration with novice users. Our experiments show that stereo is important for task performance but requires more accurate calibration, and that novice users can make effective use of perceptual calibration tools. We also propose a novel, real-time calibration method that can be used to fine-tune an existing calibration using perceptual feedback. The findings from this work can be used to build better head-tracked volumetric displays with an unprecedented amount of 3D realism and intuitive calibration tools for novice users.
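Perspective-corrected rendering of this kind is usually built on an off-axis ("generalized") projection computed from the tracked eye position and the physical screen corners. A minimal sketch for the simplest case of a flat rectangular screen, using OpenGL conventions (the testbed's actual renderer is not described at this level of detail, so the function and its parameters are illustrative assumptions):

```python
import numpy as np

def off_axis_frustum(eye, screen_ll, screen_lr, screen_ul, near, far):
    """Off-axis projection matrix for a tracked eye position.

    eye       -- 3D eye position from the head tracker (world units)
    screen_ll -- lower-left corner of the physical screen
    screen_lr -- lower-right corner
    screen_ul -- upper-left corner
    Returns a 4x4 projection matrix (OpenGL column conventions).
    """
    eye, ll, lr, ul = map(np.asarray, (eye, screen_ll, screen_lr, screen_ul))
    # Orthonormal screen basis: right, up, and normal toward the viewer.
    vr = lr - ll; vr = vr / np.linalg.norm(vr)
    vu = ul - ll; vu = vu / np.linalg.norm(vu)
    vn = np.cross(vr, vu); vn = vn / np.linalg.norm(vn)
    # Perpendicular distance from the eye to the screen plane.
    d = -np.dot(vn, ll - eye)
    # Frustum extents at the near plane, scaled by near/d.
    l = np.dot(vr, ll - eye) * near / d
    r = np.dot(vr, lr - eye) * near / d
    b = np.dot(vu, ll - eye) * near / d
    t = np.dot(vu, ul - eye) * near / d
    return np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0]])
```

For an eye centred in front of the screen this reduces to an ordinary symmetric frustum; as the tracked head moves off-axis, the (r+l)/(r-l) and (t+b)/(t-b) terms shear the frustum so the imagery stays registered to the physical display surface.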

    DEPTH PERCEPTION IN VIRTUAL PERIPERSONAL SPACE: AN INVESTIGATION OF MOTION PARALLAX ON PERCEPTION- VS ACTION-ESTIMATIONS

    The goal of the current experiment was to investigate whether the addition of motion parallax would allow participants to make more accurate distance estimations, in both the real and virtual worlds, and to determine whether perception- and action-estimations were affected similarly. Due to the rising number of COVID-19 cases in 2020, all in-person testing had to cease, with only one participant having been tested with the full set of conditions in the final experimental configuration and one other participant having completed the motion parallax conditions only. As a result, the data from the two participants were combined and only the motion parallax conditions were analyzed. Due to low statistical power, no significant main effects or interactions were found. Once the COVID-19 pandemic has subsided, I intend to collect data from all twenty-four participants with the full array of conditions in order to complete the current project. An increase in distance-estimation accuracy, especially in the virtual reality conditions, is still expected.

    Adaptive User Perspective Rendering for Handheld Augmented Reality

    Handheld Augmented Reality commonly implements some variant of magic-lens rendering, which turns only a fraction of the user's real environment into AR while the rest of the environment remains unaffected. Since handheld AR devices are commonly equipped with video see-through capabilities, AR magic-lens applications often suffer from spatial distortions, because the AR environment is presented from the perspective of the camera of the mobile device. Recent approaches counteract this distortion based on estimates of the user's head position, rendering the scene from the user's perspective. To this end, approaches usually apply face-tracking algorithms to the front camera of the mobile device. However, this demands high computational resources and therefore commonly affects the performance of the application beyond the already high computational load of AR applications. In this paper, we present a method to reduce the computational demands of user perspective rendering by applying lightweight optical flow tracking and an estimation of the user's motion before head tracking is started. We demonstrate the suitability of our approach for computationally limited mobile devices and compare it to device perspective rendering, to head-tracked user perspective rendering, and to fixed-point-of-view user perspective rendering.
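The core idea of estimating apparent motion between successive camera frames can be illustrated with a single-step, global-translation Lucas-Kanade solve. This is a deliberately simplified stand-in for the paper's (unspecified) optical flow pipeline; a production mobile tracker would typically use a pyramidal point tracker such as OpenCV's calcOpticalFlowPyrLK:

```python
import numpy as np

def estimate_translation(prev, curr):
    """Single-step Lucas-Kanade estimate of a global 2D translation
    between two grayscale frames of equal shape.

    Solves the least-squares system over all pixels:
        [sum(Ix*Ix)  sum(Ix*Iy)] [u]   [sum(Ix*It)]
        [sum(Ix*Iy)  sum(Iy*Iy)] [v] = -[sum(Iy*It)]
    Assumes small motion between frames.
    """
    prev = prev.astype(float)
    curr = curr.astype(float)
    Iy, Ix = np.gradient(prev)          # spatial image gradients
    It = curr - prev                    # temporal difference
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)        # (dx, dy) in pixels
```

Such a cheap frame-to-frame motion estimate can bridge the gap until the heavier face-tracking stage delivers an actual head pose, which is the paper's stated motivation for combining the two.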

    Visual discomfort whilst viewing 3D stereoscopic stimuli

    3D stereoscopic technology intensifies and heightens the viewer's experience by adding an extra dimension to the viewing of visual content. However, with the expansion of this technology into the commercial market, concerns have been expressed about potential negative effects on the visual system, producing viewer discomfort. The visual stimulus provided by a 3D stereoscopic display differs from that of the real world, and so it is important to understand whether these differences may pose a health hazard. The aim of this thesis is to investigate the effect of 3D stereoscopic stimulation on visual discomfort. To that end, four experimental studies were conducted. In the first study, two hypotheses were tested. The first hypothesis was that the viewing of 3D stereoscopic stimuli, which are located geometrically beyond the screen on which the images are displayed, would induce adaptation changes in the resting position of the eyes (exophoric heterophoria changes). The second hypothesis was that participants whose heterophoria changed as a consequence of adaptation during the viewing of the stereoscopic stimuli would experience less visual discomfort than those whose heterophoria did not adapt. In the experiment, an increase in visual discomfort in the 3D condition in comparison with the 2D condition was found. There were also statistically significant changes in heterophoria under 3D conditions as compared with 2D conditions. However, there was appreciable variability in the magnitude of this adaptation among individuals, and no correlation between the amount of heterophoria change and visual discomfort change was observed. In the second experiment, the two hypotheses tested were based on the vergence-accommodation mismatch theory and the visual-vestibular mismatch theory.
The vergence-accommodation mismatch theory predicts that a greater mismatch between the stimuli to accommodation and to vergence would produce greater visual discomfort symptoms when viewing in 3D conditions than when viewing in 2D conditions. An increase in visual discomfort in the 3D condition in comparison with the 2D condition was indeed found; however, the magnitude of visual discomfort reported did not correlate with the mismatch present while watching the 3D stereoscopic stimuli. The visual-vestibular mismatch theory predicts that viewing a stimulus stereoscopically will produce a greater sense of vection than viewing it in 2D. This will increase the conflict between the signals from the visual and vestibular systems, producing greater VIMS (Visually-Induced Motion Sickness) symptoms. Participants did indeed report an increase in motion sickness symptoms in the 3D condition. Furthermore, participants with closer seating positions reported more VIMS than participants sitting farther away whilst viewing 3D stimuli. This suggests that the amount of visual field stimulated during 3D presentation affects VIMS and is an important factor in terms of viewing comfort. In the study, younger viewers (21 to 39 years old) were more likely than older viewers (40 years old and older) to report a greater change in visual discomfort in the 3D condition than in the 2D condition. This suggests that the visual system's response to a stimulus, rather than the stimulus itself, is a reason for discomfort. No influence of gender on viewing comfort was found. In the next experiment, participants' fusion capability, as measured by their fusional reserves, was examined to determine whether this component has an impact on reported discomfort while watching movies in the 3D condition versus the 2D condition. It was hypothesised that participants with a limited fusional range would experience more visual discomfort than participants with a wide fusional range.
The hypothesis was confirmed, but only in the case of convergent and not divergent eye movement. This observation illustrates that participants' capability to converge has a significant impact on visual comfort. The aim of the last experiment was to examine responses of the accommodation system to changes in 3D stimulus position and to determine whether discrepancies in these responses (i.e. accommodation overshoot, accommodation undershoot) could account for the visual discomfort experienced during 3D stereoscopic viewing. It was found that the accommodation discrepancy was larger for perceived forwards movement than for perceived backwards movement. The discrepancy was slightly higher in the group susceptible to visual discomfort than in the group not susceptible to visual discomfort, but this difference was not statistically significant. When considering the research findings as a whole, it was apparent that not all participants experienced more discomfort whilst watching 3D stereoscopic stimuli than whilst watching 2D stimuli. More visual discomfort in the 3D condition than in the 2D condition was reported by 35% of the participants, whilst 24% of the participants reported more headaches and 17% reported more VIMS. The research indicates that multiple causative factors have an impact on the reported symptoms. The analysis of the data suggests that discomfort experienced by people during 3D stereoscopic stimulation may reveal binocular vision problems. This observation suggests that 3D technology could be used as a screening method to diagnose untreated binocular vision disorders. Additionally, this work shows that 3D stereoscopic technology can easily be adapted for binocular vision measurement.
The conclusion of this thesis is that many people do not suffer adverse symptoms when viewing 3D stereoscopic displays, but that if adverse symptoms are present they can be caused either by the conflict in the stimulus, or by the heightened experience of self-motion which leads to Visually-Induced Motion Sickness (VIMS).
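The vergence-accommodation mismatch invoked in the second experiment has a simple standard quantification: accommodation demand stays at the screen plane while vergence demand follows the virtual object, both expressed in diopters (inverse metres), and the conflict is their difference. A minimal sketch of that standard formulation (the thesis excerpt does not state the exact formula it used):

```python
def va_mismatch_diopters(screen_dist_m, virtual_dist_m):
    """Vergence-accommodation conflict for a stereoscopic display.

    Accommodation demand is fixed at the screen plane (1/screen_dist),
    while vergence demand follows the virtual object (1/virtual_dist).
    The conflict is the absolute difference, in diopters.
    """
    return abs(1.0 / screen_dist_m - 1.0 / virtual_dist_m)

# A screen at 2 m with an object rendered at 1 m gives 0.5 D of conflict;
# an object rendered at the screen plane gives no conflict at all.
conflict = va_mismatch_diopters(2.0, 1.0)
```

Expressing the conflict in diopters rather than metres is what makes mismatches comparable across near and far viewing distances, which is why discomfort studies typically report it this way.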

    Novel haptic interface for viewing 3D images

    In recent years there has been an explosion of devices and systems capable of displaying stereoscopic 3D images. While these systems provide an improved experience over traditional two-dimensional displays, they often fall short on user immersion; usually they improve depth perception only by relying on the stereopsis phenomenon. We propose a system that improves the user experience and immersion by providing a position-dependent rendering of the scene and the ability to touch the scene. This system uses depth maps to represent the geometry of the scene. Depth maps can easily be obtained during the rendering process, or can be derived from binocular-stereo images by calculating their horizontal disparity. This geometry is then used as an input to be rendered on a 3D display, to perform the haptic rendering calculations, and to produce the position-dependent rendering of the scene. The author presents two main contributions. First, since haptic devices have a finite workspace and limited resolution, we used what we call detail mapping algorithms. These algorithms compress the geometry information contained in a depth map, by reducing the contrast among pixels, in such a way that it can be rendered on a limited-resolution display medium without losing detail. Second, the unique combination of a depth camera as a motion-capturing system, a 3D display, and a haptic device to enhance the user experience. While developing this system we paid special attention to the cost and availability of the hardware. We decided to use only off-the-shelf, mass-consumer-oriented hardware so our experiments can be easily implemented and replicated. As an additional benefit, the total cost of the hardware did not exceed the one-thousand-dollar mark, making it affordable for many individuals and institutions.
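Two of the steps mentioned above have standard textbook forms: recovering depth from horizontal disparity in a rectified stereo pair (Z = f * B / d), and remapping the resulting depth range into a device's limited workspace. The sketch below is illustrative only; the function names are assumptions, and the linear remap is a simplification of the abstract's contrast-reducing detail mapping algorithms:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m, eps=1e-6):
    """Recover depth from a horizontal-disparity map (rectified stereo).

    Z = f * B / d, with focal length f in pixels, baseline B in metres,
    and disparity d in pixels. Disparities at or below eps are treated
    as invalid (infinitely far) and returned as +inf.
    """
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > eps
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

def compress_depth_range(depth, out_min, out_max):
    """Linearly rescale the finite depth values into a haptic device's
    limited workspace [out_min, out_max], nearest-first. The actual
    detail mapping algorithms reduce per-pixel contrast non-linearly;
    this linear remap only illustrates the idea of workspace fitting.
    """
    finite = np.isfinite(depth)
    lo, hi = depth[finite].min(), depth[finite].max()
    out = np.full_like(depth, out_max)  # invalid pixels pushed to the far limit
    out[finite] = out_min + (depth[finite] - lo) * (out_max - out_min) / (hi - lo)
    return out
```

The division by disparity is the reason small disparity errors at long range translate into large depth errors, which is one motivation for compressing the far field before handing the geometry to a low-resolution haptic device.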

    Multi-touch 3D Exploratory Analysis of Ocean Flow Models

    Modern ocean flow simulations are generating increasingly complex, multi-layer 3D ocean flow models. However, most researchers still use traditional 2D visualizations to view these models one slice at a time. Properly designed 3D visualization tools can be highly effective at revealing the complex, dynamic flow patterns and structures present in these models. However, the transition from visualizing ocean flow patterns in 2D to 3D presents many challenges, including occlusion and depth ambiguity. Further complications arise from the interaction methods required to navigate, explore, and interact with these 3D datasets. We present a system that employs a combination of stereoscopic rendering, to best reveal and illustrate 3D structures and patterns, and multi-touch interaction, to allow for natural and efficient navigation and manipulation within the 3D environment. Exploratory visual analysis is facilitated through a highly interactive toolset that leverages a smart particle system. Multi-touch gestures allow users to quickly position dye-emitting tools within the 3D model. Finally, we illustrate the potential applications of our system through examples of real-world significance.
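Dye-emitting particle tools of this kind reduce to advecting seed points through the model's velocity field. A minimal sketch with forward-Euler integration and a caller-supplied velocity function (the system's actual integrator and data sampling are not specified in the abstract; a real implementation would sample the gridded ocean velocities and likely use RK4 for accuracy):

```python
import numpy as np

def advect_particles(positions, velocity_fn, dt, steps):
    """Advect 'dye' particles through a steady 3D flow field.

    positions   -- (N, 3) array of seed points
    velocity_fn -- maps an (N, 3) array of positions to (N, 3) velocities
    dt, steps   -- forward-Euler step size and number of steps
    Returns a (steps + 1, N, 3) array of particle paths, suitable for
    rendering as dye streaklines.
    """
    p = np.array(positions, dtype=float)
    trail = [p.copy()]
    for _ in range(steps):
        p = p + dt * velocity_fn(p)     # one explicit Euler step
        trail.append(p.copy())
    return np.stack(trail)
```

Keeping the velocity field behind a function makes the same advection loop work for analytic test flows and for interpolated simulation output alike, which is handy when validating the visualization against known patterns.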

    Development of Immersive and Interactive Virtual Reality Environment for Two-Player Table Tennis

    Although the history of Virtual Reality (VR) is only about half a century old, all kinds of technologies in the VR field are developing rapidly. VR is a computer-generated simulation that replaces or augments the real world through various media. In a VR environment, participants have a perception of “presence”, which can be described by the sense of immersion and intuitive interaction. One of the major VR applications is in the field of sports, in which a life-like sports environment is simulated and the body actions of players can be tracked and represented using VR tracking and visualisation technology. In the entertainment field, exergaming, which merges video games with physical exercise by employing tracking or even 3D display technology, can be considered a small-scale form of VR. For the research presented in this thesis, a novel, realistic, real-time table tennis game combining immersive, interactive and competitive features is developed. The implemented system integrates the InterSense tracking system, a SwissRanger 3D camera and a three-wall rear-projection stereoscopic screen. The InterSense tracking system is based on ultrasonic and inertial sensing techniques, which provide fast and accurate 6-DOF (i.e. six degrees of freedom) tracking of four trackers. Two trackers are placed on the two players' heads to provide the players' viewing positions. The other two trackers are held by the players as racquets. The SwissRanger 3D camera is mounted on top of the screen to capture the player’