401 research outputs found

    Assessing the Zone of Comfort in Stereoscopic Displays using EEG

    The conflict between vergence (eye movement) and accommodation (crystalline-lens deformation) occurs in every stereoscopic display. It can cause significant visual stress outside the "zone of comfort", when the stereoscopic effect is too strong. This conflict has previously been studied using questionnaires administered over viewing sessions of several minutes. The present pilot study describes an experimental protocol that compares two comfort conditions using electroencephalography (EEG) over short viewing sequences. Analyses showed significant differences both in event-related potentials (ERPs) and in frequency-band power. Uncomfortable stereoscopy correlates with a weaker negative component and a delayed positive component in the ERP. It also induces a power decrease in the alpha band and increases in the theta and beta bands. Because EEG responds quickly to stimuli, it could enable adaptive systems that tune the stereoscopic experience to each viewer.
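The band-power shifts described above (alpha decrease, theta/beta increases under discomfort) can be sketched with a simple periodogram comparison. This is a minimal illustration on synthetic sinusoidal traces, not the study's analysis pipeline; the sampling rate, frequencies, and amplitudes are assumed for demonstration.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean periodogram power within [low, high) Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

fs = 256  # Hz (assumed)
t = np.arange(0, 4, 1 / fs)
# Synthetic "comfortable" trace: strong 10 Hz alpha, weak 6 Hz theta.
comfy = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 6 * t)
# Synthetic "uncomfortable" trace: weaker alpha, stronger theta plus 20 Hz beta.
strain = (0.5 * np.sin(2 * np.pi * 10 * t)
          + 1.0 * np.sin(2 * np.pi * 6 * t)
          + 0.8 * np.sin(2 * np.pi * 20 * t))

alpha_c = band_power(comfy, fs, 8, 13)
alpha_s = band_power(strain, fs, 8, 13)
theta_c = band_power(comfy, fs, 4, 8)
theta_s = band_power(strain, fs, 4, 8)
# With these toy signals, alpha power drops and theta power rises in the
# "uncomfortable" trace, mirroring the direction of the reported effects.
```

An adaptive system of the kind the abstract envisions could monitor such band-power ratios online and reduce stereoscopic depth when the alpha/theta balance shifts.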

    A Neurophysiologic Study Of Visual Fatigue In Stereoscopic Related Displays

    Two tasks were investigated in this study. The first investigated the effects of display alignment errors on visual fatigue. The experiment revealed the following results. First, EEG data suggested cognitively induced time-compensation changes, reflected in real-time brain activity as the eyes tried to compensate for the misalignment. Magnification-difference errors showed significant effects on all EEG bands, an indication of likely visual fatigue consistent with the simulator sickness questionnaire (SSQ) score increases observed across all task levels. Vertical-shift errors were most prevalent in the theta and beta bands, probably reflecting heightened alertness (theta band) resulting from possible stress. Rotation errors were significant in the gamma band, suggesting possible cognitive decline through theta-band influence. Second, hemodynamic responses revealed significant differences between the left and right dorsolateral prefrontal cortices due to alignment errors, as well as a significant difference between the main effect of power-band hemisphere and the ATC task sessions. The analyses also revealed significant differences between the dorsal frontal lobes in task processing and interaction effects between the processing lobes and the tasks. The second task investigated the effects of cognitive response variables on visual fatigue. Third, the physiologic indicator of pupil dilation peaked at 0.95 mm at a mean time of 38.1 min, after which dilation began to decrease. After an average saccade rest time of 33.71 min, saccade speeds tended to decrease, a possible sign of fatigue onset. Fourth, a neural-network classifier identified eye-movement response data as the best predictor of visual fatigue, with a classification accuracy of 90.42%. 
Experimental data confirmed that 11.43% of the participants actually experienced visual fatigue symptoms after the prolonged task.
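The two temporal markers reported above (a pupil-dilation peak near 38.1 min and a saccade-speed decline after ~33.71 min) can be illustrated on toy time series. The curve shapes and coefficients below are invented for demonstration and are not the study's measurements; only the peak time and decline onset are taken from the abstract.

```python
import numpy as np

minutes = np.arange(0, 60)  # one sample per minute of task time (toy resolution)

# Toy pupil-dilation trajectory: rises to ~0.95 mm around 38.1 min, then falls.
pupil = 0.95 * np.exp(-((minutes - 38.1) ** 2) / (2 * 15.0 ** 2))
peak_minute = int(np.argmax(pupil))  # minute of maximum dilation

# Toy saccade speed: flat until ~33.71 min, then a linear decline (fatigue onset).
saccade_speed = 400.0 - 1.5 * np.clip(minutes - 33.71, 0, None)  # deg/s
# Fit a line to the post-onset segment; a negative slope flags the decline.
slope_after = np.polyfit(minutes[34:], saccade_speed[34:], 1)[0]
```

A fatigue detector built on such features would watch for the dilation peak and the sign change in the saccade-speed slope rather than rely on absolute values.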

    Toward Simulation-Based Training Validation Protocols: Exploring 3d Stereo with Incremental Rehearsal and Partial Occlusion to Instigate and Modulate Smooth Pursuit and Saccade Responses in Baseball Batting

    “Keeping your eye on the ball” is a long-standing tenet in baseball batting. And yet, there are no protocols for objectively conditioning, measuring, and/or evaluating eye-on-ball coordination performance relative to baseball-pitch trajectories. Although video games and other virtual simulation technologies offer alternatives for training and obtaining objective measures, baseball batting instruction has relied on traditional eye-pitch coordination exercises with qualitative “face validation”, statistics of whole-task batting performance, and/or subjective batter-interrogation methods, rather than on direct, quantitative eye-movement performance evaluations. Further, protocols for validating transfer-of-training (ToT) for video games and other simulation-based training have not been established in general, or for eye-movement training specifically. An exploratory research study was conducted to consider the ecological and ToT validity of a part-task, virtual-fastball simulator implemented in 3D stereo, with a rotary pitching machine standing as proxy for the live-pitch referent. The virtual-fastball and live-pitch simulation couple was designed to facilitate objective eye-movement response measures to live and virtual stimuli. The objective measures 1) served to assess the ecological validity of virtual fastballs, 2) informed the characterization and comparison of eye-movement strategies employed by expert and novice batters, 3) enabled a treatment protocol relying on repurposed incremental-rehearsal and partial-occlusion methods intended to instigate and modulate strategic eye movements, and 4) revealed whether the simulation-based treatment resulted in positive (or negative) ToT in the real task. Results indicated that live fastballs consistently elicited different saccade onset time responses than virtual fastballs. 
Saccade onset times for live fastballs were consistent with catch-up saccades that follow the smooth-pursuit maximum-velocity threshold of approximately 40–70°/s, while saccade onset times for virtual fastballs lagged on the order of 13%. More experienced batters employed more deliberate and timely combinations of smooth pursuit and catch-up saccades than less experienced batters, enabling them to position their eyes to meet the ball near the front edge of home plate. Smooth-pursuit and saccade modulation from the treatment was inconclusive in virtual-pitch pre- and post-treatment comparisons, but live-pitch pre- and post-treatment comparisons indicate ToT improvements. The lagging saccade onset times for virtual pitches suggest possible accommodative-vergence impairment due to the accommodation-vergence conflict inherent in 3D stereo displays.
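The catch-up-saccade logic above rests on a velocity threshold: once the required eye velocity exceeds the smooth-pursuit ceiling, a saccade is triggered. A minimal onset detector under that assumption can be sketched as follows; the 70°/s ceiling comes from the range cited above, while the sampling rate and trajectory are illustrative assumptions.

```python
import numpy as np

PURSUIT_MAX = 70.0  # deg/s, upper end of the smooth-pursuit range cited above

def saccade_onset(eye_pos_deg, fs, threshold=PURSUIT_MAX):
    """Return the first sample index where eye velocity exceeds the
    smooth-pursuit ceiling, i.e. a likely catch-up-saccade onset."""
    vel = np.abs(np.gradient(eye_pos_deg) * fs)  # central-difference velocity, deg/s
    above = np.flatnonzero(vel > threshold)
    return int(above[0]) if above.size else None

fs = 1000  # Hz eye tracker (assumed)
t = np.arange(0, 0.5, 1 / fs)
# Toy gaze trace: smooth pursuit at 40 deg/s, then a 300 deg/s catch-up
# saccade superimposed from t = 0.3 s onward.
pos = 40.0 * t
pos[t >= 0.3] += 300.0 * (t[t >= 0.3] - 0.3)

onset = saccade_onset(pos, fs)  # detected at the 0.3 s mark (sample ~300)
```

Comparing such onset indices between live and virtual pitches is the kind of measure the study uses to quantify the ~13% lag.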

    Psychophysiology-based QoE assessment : a survey

    We present a survey of psychophysiology-based assessment of quality of experience (QoE) in advanced multimedia technologies. We provide a classification of methods relevant to QoE and describe related psychological processes, experimental design considerations, and signal analysis techniques. We summarize multimodal techniques and discuss several important aspects of psychophysiology-based QoE assessment, including the synergies with psychophysical assessment and the need for standardized experimental design. This survey is not intended to be exhaustive but serves as a guideline for those interested in further exploring this emerging field of research.

    Optical Gaze Tracking with Spatially-Sparse Single-Pixel Detectors

    Gaze tracking is an essential component of next-generation displays for virtual reality and augmented reality applications. Traditional camera-based gaze trackers used in next-generation displays are known to be lacking in one or more of the following metrics: power consumption, cost, computational complexity, estimation accuracy, latency, and form factor. We propose the use of discrete photodiodes and light-emitting diodes (LEDs) as an alternative to traditional camera-based gaze tracking while taking all of these metrics into consideration. We begin by developing a rendering-based simulation framework for understanding the relationship between light sources and a virtual model eyeball. Findings from this framework guide the placement of LEDs and photodiodes. Our first prototype uses a neural network to obtain an average error rate of 2.67° at 400 Hz while demanding only 16 mW. By simplifying the implementation to use only LEDs, duplexed as light transceivers, and a more minimal machine-learning model, namely a lightweight supervised Gaussian process regression algorithm, we show that our second prototype achieves an average error rate of 1.57° at 250 Hz using 800 mW. (10 pages, 8 figures; published at the IEEE International Symposium on Mixed and Augmented Reality, ISMAR.)
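The core idea above is a learned regression from a handful of sparse photodiode intensities to gaze angles. The sketch below illustrates that pipeline with a linear forward model and ridge regression as a stand-in for the paper's neural network and Gaussian process regressors; the sensor count, noise level, and linear eye-to-sensor mapping are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy forward model: 4 photodiode intensities respond linearly
# to gaze direction (yaw, pitch). The real prototypes learn this mapping.
true_W = rng.normal(size=(2, 4))            # gaze -> 4 sensor readings
gaze = rng.uniform(-15, 15, size=(200, 2))  # training gaze angles, degrees
sensors = gaze @ true_W + 0.01 * rng.normal(size=(200, 4))  # noisy readings

# Ridge regression stand-in for the learned regressor: sensors -> gaze.
lam = 1e-3
W = np.linalg.solve(sensors.T @ sensors + lam * np.eye(4), sensors.T @ gaze)

# Predict gaze for a held-out sample generated by the same forward model.
test_gaze = np.array([[5.0, -3.0]])
pred = (test_gaze @ true_W) @ W
err_deg = float(np.linalg.norm(pred - test_gaze))  # small residual error
```

The appeal of the approach is that this inference is a few multiply-accumulates per frame, which is what makes milliwatt-scale, hundreds-of-hertz operation plausible.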

    Ocular biomechanics modelling for visual fatigue assessment in virtual environments

    This study objectively quantifies visual fatigue caused by immersion in virtual reality. Visual fatigue is assessed through ocular biomechanics modelling and eye tracking, analysing eye movements and muscle forces to derive a visual fatigue index.

    Real-Time Control of a Video Game Using Eye Movements and Two Temporal EEG Sensors

    EEG-controlled gaming applications range widely from strictly medical to completely nonmedical uses. Games can provide not only entertainment but also strong motivation for practice, thereby achieving better control with a rehabilitation system. In this paper we present real-time control of a video game with eye movements for an asynchronous and noninvasive communication system using two temporal EEG sensors. We used wavelets to detect the instant of eye movement, and time-series characteristics to distinguish between six classes of eye movements. A control interface was developed to test the proposed algorithm in real-time experiments with open and closed eyes. Using visual feedback, a mean classification accuracy of 77.3% was obtained for control with six commands, and a mean classification accuracy of 80.2% was obtained using auditory feedback for control with five commands. The algorithm was then applied to control the direction and speed of character movement in a two-dimensional video game. Results showed that the proposed algorithm had an efficient response speed and timing, with a bit rate of 30 bits/min, demonstrating its efficacy and robustness in real-time control.
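The reported accuracies and bit rate can be related through the standard Wolpaw information-transfer-rate formula commonly used for BCI systems. The abstract does not state which formula or selection rate the authors used, so the 20 selections/min below is an assumed illustrative value.

```python
import math

def wolpaw_bitrate(n_classes, accuracy, selections_per_min):
    """Wolpaw information-transfer rate in bits/min:
    B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)), scaled by selection rate."""
    p, n = accuracy, n_classes
    bits = math.log2(n)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * selections_per_min

# Six commands at 77.3% accuracy, assuming 20 selections per minute:
rate = wolpaw_bitrate(6, 0.773, 20)  # roughly 26 bits/min under these assumptions
```

With this assumed selection rate the formula lands in the same range as the reported 30 bits/min, showing how command count, accuracy, and selection speed trade off against each other.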

    A review on ocular biomechanic models for assessing visual fatigue in virtual reality

    With the widespread availability of affordable virtual reality headsets, virtual environments are rapidly changing the way humans interact with reality, and understanding their effects on the mental and cognitive state is essential. In addition, methods for measuring and assessing visual fatigue in virtual environments are still needed. Because eye movements are tightly coupled to the mental state, analysis of eye movement can add insights toward safer virtual environments. Biomechanical analysis has been used extensively in the analysis of human movement: simulation of scenarios such as injuries and surgeries has provided insights into problems that would otherwise be intractable, including understanding the effects of changing muscle insertion points on range of motion, or how muscle activation affects the motion produced. Extending biomechanical simulation and analysis to eye movement can deepen our understanding of how virtual environments affect our visual and mental capabilities. This paper presents a thorough review of ocular biomechanics and ocular models in the literature. We start with a brief introduction to the anatomy of the eye and eye kinematics. We then describe the properties of the extraocular muscles (EOMs) and highlight the differences between EOMs and skeletal muscle. The challenges facing biomechanical simulation and analysis of eye movement are presented, along with the role of ocular models in assessing visual fatigue. Finally, the compatibility of available biomechanical tools for analysing ocular movements is discussed.

    Emerging ExG-based NUI Inputs in Extended Realities : A Bottom-up Survey

    Incremental and quantitative improvements in two-way interaction with extended realities (XR) are contributing toward a qualitative leap into XR ecosystems that are efficient, user-friendly, and widely adopted. However, there are multiple barriers on the way toward the omnipresence of XR, among them: the computational and power limitations of portable hardware, the social acceptance of novel interaction protocols, and the usability and efficiency of interfaces. In this article, we overview and analyse novel natural user interfaces based on sensing electrical bio-signals that can be leveraged to tackle the challenges of XR input interactions. Electroencephalography-based brain-machine interfaces that enable thought-only hands-free interaction, myoelectric input methods that track body gestures using electromyography, and gaze-tracking electrooculography input interfaces are examples of electrical bio-signal sensing technologies united under the collective concept of ExG. ExG signal acquisition modalities provide a way to interact with computing systems using natural, intuitive actions, enriching interactions with XR. This survey provides a bottom-up overview, starting from (i) the underlying biological aspects and signal acquisition techniques, (ii) ExG hardware solutions, (iii) ExG-enabled applications, (iv) a discussion of the social acceptance of such applications and technologies, and (v) research challenges, application directions, and open problems, evidencing the benefits that ExG-based natural user interface inputs can introduce to the area of XR.