416 research outputs found

    Cybersickness in Virtual Reality Questionnaire (CSQ-VR): A validation and comparison against SSQ and VRSQ

    Cybersickness is a drawback of virtual reality (VR) that also affects users' cognitive and motor skills. The Simulator Sickness Questionnaire (SSQ) and its variant, the Virtual Reality Sickness Questionnaire (VRSQ), are two tools that measure cybersickness. However, both suffer from important limitations, which raises concerns about their suitability. Two versions of the Cybersickness in VR Questionnaire (CSQ-VR), a paper-and-pencil version and a 3D-VR version, were developed. Validation and comparison of the CSQ-VR against the SSQ and VRSQ were performed. Thirty-nine participants were exposed to three rides with linear and angular accelerations in VR. Assessments of cognitive and psychomotor skills were performed at baseline and after each ride. The validity of both versions of the CSQ-VR was confirmed. Notably, the CSQ-VR demonstrated substantially better internal consistency than both the SSQ and the VRSQ. CSQ-VR scores also had significantly better psychometric properties in detecting a temporary decline in performance due to cybersickness. Pupil size was a significant predictor of cybersickness intensity. In conclusion, the CSQ-VR is a valid assessment of cybersickness, with psychometric properties superior to those of the SSQ and VRSQ. The CSQ-VR enables the assessment of cybersickness during VR exposure, and it benefits from examining pupil size, a biomarker of cybersickness.
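The internal-consistency comparison reported above is conventionally quantified with Cronbach's alpha; the abstract does not name the statistic, so treating alpha as the measure is an assumption. A minimal numpy sketch of the computation:

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                         # number of items
    item_vars = responses.var(axis=0, ddof=1)      # per-item variances
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical data: perfectly consistent items give alpha = 1.0
perfect = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])
print(round(cronbach_alpha(perfect), 3))  # → 1.0
```

Values above roughly 0.8 are usually read as good internal consistency; a questionnaire that beats another on alpha is the kind of result the abstract describes.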

    Interactive Feedforward in High Intensity VR Exergaming


    Extraction and Investigation of Biosignal Features for the Evaluation of Visual Discomfort

    Comfortable stereoscopic perception continues to be an essential area of research. Growing interest in virtual reality content and the expanding market for head-mounted displays (HMDs) still raise the problem of balancing depth perception against comfortable viewing. Stereoscopic views stimulate binocular cues, one of several human visual depth cues, which become conflicting cues when stereoscopic displays are used. Depth perception from binocular cues is based on matching image features on one retina with corresponding features on the other. Our eyes can tolerate small amounts of retinal defocus, a tolerance known as the depth of focus; when the magnitudes are larger, visual discomfort arises. The research object of the doctoral dissertation is the level of visual discomfort. The work aimed at the objective evaluation of visual discomfort based on physiological signals. Different levels of disparity and numbers of details in stereoscopic views can make it difficult to quickly find the focus point for comfortable depth perception. During this investigation, a tendency for differences in single-sensor electroencephalographic (EEG) signal activity at specific frequencies was found, along with changes in gaze signals collected by an eye tracker. A dataset of EEG and gaze signal records from 28 control subjects was collected and used for further evaluation. The dissertation consists of an introduction, three chapters and general conclusions. The first chapter reviews ways of measuring visual discomfort with objective and subjective methods. The second chapter presents theoretical research that investigated methods using physiological signals to detect changes in the level of the sense of presence. The third chapter presents the results of the experimental research, which aimed to find differences in the collected physiological signals when the level of visual discomfort changes; an experiment with 28 control subjects was conducted to collect these signals. The results of the thesis were published in six scientific publications: three in peer-reviewed scientific journals and three in conference proceedings. Additionally, the results of the research were presented at eight conferences.
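The "EEG activity at specific frequencies" examined in the dissertation is typically extracted as spectral band power. The dissertation's actual feature pipeline is not given in the abstract, so the following is only a generic sketch: band power from a periodogram, shown on a synthetic 10 Hz "alpha" oscillation (sampling rate and band edges are illustrative assumptions).

```python
import numpy as np

def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Mean power of `signal` within the [lo, hi] Hz band of its FFT periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

fs = 256.0                          # illustrative EEG sampling rate
t = np.arange(0, 4.0, 1.0 / fs)    # 4 s of signal
eeg = np.sin(2 * np.pi * 10 * t)   # pure 10 Hz oscillation (alpha band)
alpha = band_power(eeg, fs, 8, 12)   # band containing the 10 Hz component
beta = band_power(eeg, fs, 13, 30)   # band with (almost) no energy
print(alpha > 100 * beta)  # → True
```

Comparing such band powers between comfortable and uncomfortable viewing conditions is one common way differences "at specific frequencies" are tested.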

    DeepMetricEye: Metric Depth Estimation in Periocular VR Imagery

    Despite the enhanced realism and immersion provided by VR headsets, users frequently encounter adverse effects such as digital eye strain (DES), dry eye, and potential long-term visual impairment due to excessive eye stimulation from VR displays and pressure from the mask. Recent VR headsets are increasingly equipped with eye-oriented monocular cameras to segment ocular feature maps. Yet, to compute the incident light stimulus and observe periocular condition alterations, these relative measurements must be transformed into metric dimensions. To bridge this gap, we propose a lightweight framework, derived from a re-optimised U-Net 3+ deep-learning backbone, that estimates measurable periocular depth maps. Compatible with any VR headset equipped with an eye-oriented monocular camera, our method reconstructs three-dimensional periocular regions, providing a metric basis for related light-stimulus calculation protocols and medical guidelines. To navigate the complexities of data collection, we introduce a Dynamic Periocular Data Generation (DPDG) environment based on UE MetaHuman, which synthesises thousands of training images from a small quantity of human facial scan data. Evaluated on a sample of 36 participants, our method exhibited notable efficacy in the periocular global precision evaluation experiment, and the pupil diameter measurement
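The core problem the abstract names, converting relative depth estimates into metric dimensions, is commonly handled by a least-squares scale-and-shift alignment against sparse metric references. This is not necessarily the paper's own protocol, just a minimal illustration of the relative-to-metric conversion; the values are invented:

```python
import numpy as np

def align_scale_shift(rel: np.ndarray, metric: np.ndarray) -> tuple[float, float]:
    """Least-squares scale s and shift t such that s * rel + t ≈ metric."""
    A = np.stack([rel, np.ones_like(rel)], axis=1)   # design matrix [d_rel, 1]
    (s, t), *_ = np.linalg.lstsq(A, metric, rcond=None)
    return float(s), float(t)

# Relative depths in arbitrary units; metric references in millimetres
rel = np.array([0.1, 0.4, 0.7, 1.0])
metric = 2.0 * rel + 30.0          # ground truth: scale 2.0, shift 30.0 mm
s, t = align_scale_shift(rel, metric)
print(round(s, 3), round(t, 3))    # → 2.0 30.0
```

Once s and t are known, the whole relative depth map can be mapped into millimetres, giving the metric basis the abstract refers to.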

    RCEA-360VR: Real-time, continuous emotion annotation in 360° VR videos for collecting precise viewport-dependent ground truth labels

    Precise emotion ground-truth labels for 360° virtual reality (VR) video watching are essential for fine-grained predictions under varying viewing behavior. However, current annotation techniques either rely on post-stimulus discrete self-reports or on real-time, continuous emotion annotation (RCEA), the latter only for desktop/mobile settings. We present RCEA for 360° VR videos (RCEA-360VR), where we evaluate in a controlled study (N=32) the usability of two peripheral visualization techniques: HaloLight and DotSize. We furthermore develop a method that considers head movements when fusing labels. Using physiological, behavioral, and subjective measures, we show that (1) both techniques do not increase users' workload or sickness, nor break presence; (2) our continuous valence and arousal annotations are consistent with discrete within-VR and original stimuli ratings; and (3) users exhibit high similarity in viewing behavior, where fused ratings perfectly align with intended labels. Our work contributes usable and effective techniques for collecting fine-grained viewport-dependent emotion labels in 360° VR.

    Stress Reduction Using Bilateral Stimulation in Virtual Reality

    The goal of this research is to integrate virtual reality (VR) with the bilateral stimulation used in EMDR as a tool to relieve stress. We created a 15-minute relaxation training program for adults in a virtual, relaxing environment in the form of a walk in the woods. The target platform for the tool is the HTC Vive; however, it can easily be ported to other VR platforms. An integral part of this tool is a set of sensors that provides the physiological measures used to evaluate the effectiveness of the system. Moreover, the system integrates visual (a passing sphere), auditory (surround sound) and tactile (controller vibration) signals. A pilot treatment programme incorporating the above-mentioned VR system was carried out. The experimental group, consisting of 28 healthy adult volunteers (office workers), participated in three different sessions of relaxation training. Before starting, baseline features such as subjectively perceived stress, mood, heart rate, galvanic skin response and muscle response were registered. Monitoring of physiological indicators continued during the training session and for one minute after its completion. Before and after each session, volunteers were asked to complete questionnaires regarding their current stress level and mood. The obtained results were analyzed in terms of variability over time: before, during and after the session.

    Watching Androids Dream of Electric Sheep: Immersive Technology, Biometric Psychography, and the Law

    Virtual reality and augmented reality present exceedingly complex privacy issues because of the enhanced user experience and reality-based models. Unlike the issues presented by traditional gaming and social media, immersive technology poses inherent risks, which our legal understanding of biometrics and online harassment is simply not prepared to address. This Article offers five important contributions to this emerging space. It begins by introducing a new area of legal and policy inquiry raised by immersive technology, called "biometric psychography." Second, it explains how immersive technology works to a legal audience and defines concepts that are essential to understanding the risks that the technology poses. Third, it analyzes the gaps in privacy law's ability to address biometric psychography and other emerging challenges raised by immersive technology that most regulators and consumers incorrectly assume will be governed by existing law. Fourth, this Article draws on firsthand interviews with early innovators and leading thinkers to highlight harassment and user-experience risks posed by immersive technology. Finally, this Article compiles insights from each of these discussions to propose a framework that integrates privacy and human rights into the development of future immersive-tech applications. It applies that framework to three specific scenarios and demonstrates how it can help navigate challenges, both old and new.

    Beyond the screen – The potential of smartphone apps and immersive technologies in exposure-based interventions for phobias

    Specific phobias are extremely common among adults. They are characterized by strong emotional reactions and avoidance behavior upon exposure to the feared stimuli. Fears concerning heights or animals such as spiders are especially prevalent, followed by fears of social situations such as public speaking. The gold standard for treating specific phobias is exposure-based therapy. However, exposure-based therapy is limited in its practicability in clinical routine and poses a high hurdle for affected individuals. Virtual and augmented reality (VR/AR) smartphone apps offer attractive platforms for simulating exposure situations and thereby increasing the accessibility of mental health services in general. Thus, novel smartphone-based treatments hold the potential to facilitate the dissemination of exposure-based treatments for specific phobias. The studies presented as part of this thesis aimed at investigating three newly developed interventions, for fear of heights, fear of public speaking and fear of spiders, using currently available advanced technologies. In the first study (Bentz et al., 2021), a stand-alone, automated and gamified VR exposure app, Easyheights, was developed using 360° images. The app's effectiveness in reducing fear of heights and avoidance behavior was investigated in a randomized controlled trial in an adult population with clinical and subclinical fear of heights. Repeated use of the app led to reduced fear and avoidance behavior in a real-life situation on a tower. The second study (Müller, Fehlmann et al., 2022) developed a stand-alone, automated and gamified VR exposure app, Fearless Speech, aimed at reducing public speaking anxiety (PSA) and avoidance of eye contact. A virtual audience presented in 360° videos was used for the exposure, and gaze control was used for the eye-contact training. The app was investigated in a randomized controlled trial in healthy adults with subclinical PSA. After repeated use of the app, participants showed reduced fear and improved eye contact in a real-life speech situation. The third study (Zimmer et al., 2021) examined Phobys, a stand-alone, automated and gamified AR exposure app; in comparison to VR, AR has only recently been introduced to clinical research. The app was designed to reduce fear, disgust and avoidance behavior in adults with clinical and subclinical fear of spiders. The results of the randomized controlled trial showed that repeated use of the app led to reduced fear, disgust and avoidance behavior in a real-life situation with a real spider. The results of these studies support the potential of stand-alone, automated VR and AR interventions delivered through smartphone apps. The developed apps allow for a high-quality user experience with a highly realistic environment, gaze control for easy navigation, and the possibility of interaction. In addition, gamification elements foster engagement with the apps. All three investigated apps offer low-threshold, low-cost treatment for individuals affected by specific phobias. Testing the effectiveness of these newly developed apps in real-life settings sets them apart from previous studies. Hence, this thesis highlights the potential of smartphone apps with immersive technologies to advance and disseminate exposure-based treatments for specific phobias.

    Fear behavior in virtual reality
