AR in VR: Simulating augmented reality glass for image fusion
Developing an augmented reality (AR) system involves a multitude of interconnected algorithms, such as image fusion, camera synchronization and calibration, and brightness control, each with diverse parameters. This abundance of features, while beneficial for its applicability to different tasks, is detrimental to developers as they navigate different combinations to pick the most suitable configuration for their application. Additionally, the temporally inconsistent nature of the real world hinders the development of reproducible and reliable testing methods for AR systems. To help address these issues, we develop and test a virtual reality (VR) environment that allows the simulation of variable AR configurations for image fusion. In this work, we improve our system with a more realistic AR glass model adhering to physical light and glass properties. Our implementation combines the incoming real-world background light and the AR projector light at the level of the AR glass.
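The abstract describes combining real-world background light and AR projector light at the level of the glass. A minimal per-pixel sketch of such an optical combination, assuming a simple transmittance model; the function name, the default transmittance value, and the clamping to display range are illustrative assumptions, not the authors' implementation:

```python
def combine_at_glass(background, projector, transmittance=0.7):
    """Additive light combination at a simulated AR glass.

    background:    real-world radiance reaching the glass (0..1)
    projector:     projector radiance redirected toward the eye (0..1)
    transmittance: fraction of background light passing through the glass

    The eye receives the attenuated background plus the projected light;
    the result is clamped to the representable display range.
    """
    combined = transmittance * background + projector
    return min(combined, 1.0)
```

In a full simulator this combination would run per pixel (e.g. in a shader), but the physical idea is the same: the glass attenuates the scene behind it while adding the projector's contribution on top.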
The role of multisensory feedback in the objective and subjective evaluations of fidelity in virtual reality environments.
The use of virtual reality in academic and industrial research has been rapidly expanding in recent years; consequently, evaluations of the quality and effectiveness of virtual environments are required. The assessment process is usually done through user evaluation measured while the user engages with the system. The limitations of this method, in terms of its variability and user bias across pre- and post-experience measures, have been recognised in the research literature. There is therefore a need to design more objective measures of system effectiveness that could complement subjective measures and provide a conceptual framework for fidelity assessment in VR. Many technological and perceptual factors can influence the overall experience in virtual environments. The focus of this thesis was to investigate how multisensory feedback, provided during VR exposure, can modulate a user's qualitative and quantitative experience in the virtual environment. In a series of experimental studies, the role of visual, audio, haptic and motion cues in objective and subjective evaluations of fidelity in VR was investigated. In all studies, objective measures of performance were collected and compared to subjective measures of user perception. The results showed that the explicit evaluation of environmental and perceptual factors available within VR environments modulated user experience. In particular, the results showed that a user's postural responses can be used as a basis for an objective measure of fidelity. Additionally, the role of augmented sensory cues was investigated during a manual assembly task. By recording and analysing the objective and subjective measures, it was shown that augmented multisensory feedback modulated the user's acceptance of the virtual environment in a positive manner and increased overall task performance.
Furthermore, the presence of augmented cues mitigated the negative effects of inaccurate motion tracking and simulation sickness. In a follow-up study, the beneficial effects of virtual training with augmented sensory cues were observed in the transfer of learning when the same task was performed in a real environment. Similarly, when the effects of six-degrees-of-freedom motion cuing on user experience were investigated in a high-fidelity flight simulator, consistent findings between objective and subjective data were recorded. By measuring the pilot's accuracy in following the desired path during a slalom manoeuvre while perceived task demand was increased, it was shown that motion cuing is related to effective task performance and modulates levels of workload, sickness and presence. The overall findings revealed that multisensory feedback plays an important role in the overall perception and fidelity evaluations of VR systems, and as such user experience needs to be included when investigating the effectiveness of sensory feedback signals. Throughout this thesis it was consistently shown that subjective measures of user perception in VR are directly comparable to objective measures of performance, and therefore both should be used in order to obtain robust results when investigating the effectiveness of VR systems. This conceptual framework can provide an effective method to study human perception, which can in turn provide a deeper understanding of the environmental and cognitive factors that influence the overall user experience, in terms of fidelity requirements, in virtual reality environments.
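The thesis repeatedly compares objective performance measures with subjective user ratings. One common way to quantify such agreement is a Pearson correlation between the two score series; the sketch below is a generic illustration of that comparison, not the statistical analysis actually used in the thesis:

```python
def pearson_r(objective_scores, subjective_ratings):
    """Pearson correlation between paired objective and subjective measures.

    Returns a value in [-1, 1]; values near +1 indicate that objective
    performance and subjective perception move together across participants.
    """
    n = len(objective_scores)
    mx = sum(objective_scores) / n
    my = sum(subjective_ratings) / n
    # covariance numerator and the two standard-deviation terms
    cov = sum((x - mx) * (y - my)
              for x, y in zip(objective_scores, subjective_ratings))
    sx = sum((x - mx) ** 2 for x in objective_scores) ** 0.5
    sy = sum((y - my) ** 2 for y in subjective_ratings) ** 0.5
    return cov / (sx * sy)
```

A high correlation across conditions is the kind of evidence that would support the thesis's claim that subjective measures are directly comparable to objective ones.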
Visualization Authoring for Data-driven Storytelling
Data-driven storytelling is the process of communicating insights and findings that are supported by data, forming a visualization-based narrative. However, most current visualization creation tools either only support fixed sets of designs or require an in-depth understanding of programming concepts. To enable non-programmers to create custom visualizations for data-driven storytelling, we design interactions and implement user interfaces for visualization authoring. In the first part of this dissertation, we introduce and evaluate a series of three visualization authoring tools using traditional user interfaces: (1) iVisDesigner, which uses a data-flow model and enables users to author visualizations by specifying mappings from data to graphics interactively; (2) ChartAccent, a tool for annotating a given visualization; and (3) Charticulator, which allows users to design custom layouts interactively. We then reflect on the evaluation of visualization authoring user interfaces. In the second part of the dissertation, we extend our approach to multiple presentation media or display environments, including traditional 2-dimensional screens, large projection-based virtual-reality (VR) systems, and head-mounted virtual/augmented reality displays (HMDs). To leverage such immersive visualization environments, we ported and extended the iVisDesigner authoring approach to projection-based virtual reality. To facilitate the development of immersive visualizations, we built a visualization library called Stardust, which provides a familiar API to utilize GPU processing power in a cross-platform way. Finally, we present Idyll-MR, a system for authoring data-driven stories in virtual and augmented reality. We evaluated these authoring tools and libraries, and demonstrated high expressiveness, usability, and performance, as well as portability across platforms. 
In summary, our contributions enable larger audiences to create visual data-driven stories using different presentation media, leading to an overall enriched diversity of visualization designs.
The Role of Latency in the Validity of AR Simulation
It is extremely challenging to run controlled studies comparing multiple Augmented Reality (AR) systems. We use an AR simulation approach, in which a Virtual Reality (VR) system is used to simulate multiple AR systems. To investigate the validity of this approach, in our first experiment we carefully replicated a well-known study by Ellis et al. using our simulator, obtaining comparable results; we also discuss general issues we encountered in replicating a prior study. In our second experiment, further exploring the validity of AR simulation, we investigated the effects of simulator latency on the results of experiments conducted in an AR simulator. We found that simulator latency had a significant effect on 3D tracing; however, there was no interaction between simulator latency and artificial latency. Based on the results of these two experiments, we conclude that simulator latency is not inconsequential in determining task performance: simulating visual registration alone is not sufficient to reproduce the overall perception of registration errors in an AR system, and simulator latency must also be kept to a minimum. We discuss the impact of these results on the use of the AR simulation approach.
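The latency manipulation described above, in which artificial latency is layered on top of the simulator's own baseline latency, is often implemented by delaying tracked pose samples by a fixed number of frames. The sketch below is a hypothetical illustration of that idea; the class name and buffering scheme are assumptions, not the authors' implementation:

```python
from collections import deque

class ArtificialLatency:
    """Delay pose samples by a fixed number of frames to add artificial
    latency on top of the simulator's own (unavoidable) baseline latency.
    """
    def __init__(self, delay_frames):
        self.delay_frames = delay_frames
        self.buffer = deque()

    def push(self, pose):
        """Submit the newest tracked pose; return the pose to render.

        Once the buffer holds more than `delay_frames` samples, the
        returned pose is the one captured `delay_frames` frames ago.
        """
        self.buffer.append(pose)
        if len(self.buffer) > self.delay_frames:
            return self.buffer.popleft()
        return self.buffer[0]  # warm-up: not enough history yet
```

With a scheme like this, the total latency a participant experiences is the simulator's baseline plus the configured frame delay, which is why the experiment can vary artificial latency while baseline simulator latency stays constant.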