
    Which Ocular Dominance Should Be Considered for Monocular Augmented Reality Devices?

    A monocular augmented reality device allows the user to see information superimposed on the environment. Because it does not stimulate both eyes in the same way, it creates a phenomenon known as binocular rivalry. The question therefore arises as to whether monocular information should be displayed to a particular eye, and whether an ocular dominance test can determine which. This paper contributes to a better understanding of ocular dominance by comparing nine tests. Our results suggest that ocular dominance can be divided into sighting and sensorial dominance. However, different sensorial dominance tests give different results, suggesting that sensorial dominance is composed of distinct components assessed by different tests. A comprehensive test covering all of these components is needed in order to identify the eye to which monocular information should be directed when using monocular augmented reality devices.

    Minified Augmented Reality as a Terrestrial Analog for G-Transitions Effects in Lunar and Interplanetary Spaceflight

    Brief periods of extreme gravitational transition are anticipated during interplanetary spaceflight, including transitions between microgravity, hypogravity, and hypergravity. Rapid sensorimotor adaptation will occur following these G-transitions, which may affect astronaut performance, including gaze control and dynamic visual acuity (DVA). Significant decrements in dynamic visual acuity could lead to mission compromise or failure (e.g., impairing mission-critical tasks or spacecraft maneuvering). It is crucial to provide astronauts with the training necessary to overcome the physiological barriers they are bound to encounter during spaceflight missions. Minifying lenses in augmented reality may serve as an easily applicable, low-cost method to simulate the vestibulo-ocular dysfunction that occurs during gravitational transitions. In this paper, we review the effects of G-transitions on the vestibulo-ocular system and report on the novel development of minified augmented reality as a potential simulator and training tool for future spaceflight. We also report the results of an early validation study, with a mean decrease in DVA (0.370 LogMAR) under an 80% minifying effect and a mean increase in DVA post-minifying (0.030 LogMAR). These early results suggest that minified augmented reality may serve as an accessible terrestrial analog for G-transitions and thus has potential for pre-mission training, particularly for lunar and interplanetary spaceflight.
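    The reported LogMAR figures can be read more concretely with a small worked example. The sketch below is illustrative only: it assumes a hypothetical 20/20 (LogMAR 0.0) baseline, and the conversion constants are standard acuity-chart conventions rather than values from the study.

```python
# Illustrative interpretation of a LogMAR change (not data from the paper).
# LogMAR = log10(MAR), with MAR the minimum angle of resolution in arcminutes;
# 0.1 LogMAR corresponds to one line (5 letters) on a standard ETDRS chart.

def logmar_to_snellen_denominator(logmar: float, base: int = 20) -> float:
    """Convert a LogMAR value to the denominator of a Snellen fraction (base/denominator)."""
    return base * 10 ** logmar

baseline = 0.0        # hypothetical 20/20 baseline acuity
decrement = 0.370     # mean DVA decrease reported under the 80% minifying effect
degraded = baseline + decrement

print(f"Baseline acuity:  20/{logmar_to_snellen_denominator(baseline):.0f}")
print(f"Degraded acuity:  20/{logmar_to_snellen_denominator(degraded):.0f}")
print(f"Approximate ETDRS letters lost: {decrement / 0.1 * 5:.0f}")
```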

    An automated calibration method for non-see-through head mounted displays

    Accurate calibration of a head-mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet existing calibration methods are time consuming, depend on human judgements (making them error prone), and are often limited to optical see-through HMDs. Building on our existing approach to HMD calibration (Gilson et al., 2008), we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside an HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in multiple positions. The centroids of the markers on the calibration object are recovered and their locations re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the HMD display's intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated an HMD in this manner and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors without the need for error-prone human judgements.
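    The final step described above is a standard camera calibration from 3D-2D correspondences. The following is a minimal sketch of that step using OpenCV's general-purpose calibration routine on synthetic, noise-free data; the resolution, grid geometry, and point values are placeholders, and this is not the authors' implementation (which pairs tracked marker centroids with their re-expressed locations in the HMD grid).

```python
import numpy as np
import cv2

# Sketch: paired 3D marker positions and their 2D locations in display pixels
# are fed to a standard calibration routine, which recovers the display's
# intrinsic and extrinsic parameters. Synthetic data stands in for measurements.

rng = np.random.default_rng(0)
width, height = 1280, 1024                       # assumed HMD resolution
K_true = np.array([[1000.0, 0.0, width / 2],
                   [0.0, 1000.0, height / 2],
                   [0.0, 0.0, 1.0]])

# Planar grid of marker positions in the calibration object's own frame (metres).
gx, gy = np.meshgrid(np.linspace(-0.2, 0.2, 6), np.linspace(-0.15, 0.15, 5))
pattern = np.stack([gx.ravel(), gy.ravel(), np.zeros(gx.size)], axis=1)

object_points, image_points = [], []
for i in range(12):                              # twelve poses of the calibration object
    rvec = rng.normal(0.0, 0.3, 3)               # varied orientation per pose
    tvec = np.array([rng.normal(0, 0.05), rng.normal(0, 0.05), 1.2 + 0.05 * i])
    pts2d, _ = cv2.projectPoints(pattern, rvec, tvec, K_true, None)
    object_points.append(pattern.astype(np.float32))
    image_points.append(pts2d.reshape(-1, 2).astype(np.float32))

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, (width, height), None, None)

print(f"Reprojection RMS error: {rms:.3f} px")   # near zero for noise-free data
print("Recovered intrinsics:\n", K.round(1))
# rvecs/tvecs are the per-pose extrinsics (orientation and translation).
```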

    A system for synthetic vision and augmented reality in future flight decks

    Rockwell Science Center is investigating novel human-computer interaction techniques for enhancing situational awareness in future flight decks. One aspect is to provide intuitive displays that present vital information and support spatial awareness by augmenting the real world with an overlay of relevant information registered to it. Such Augmented Reality (AR) techniques can be employed during bad-weather scenarios to permit flying under Visual Flight Rules (VFR) in conditions that would normally require Instrument Flight Rules (IFR). These systems could easily be implemented on head-up displays (HUDs). The advantage of AR systems over purely synthetic vision (SV) systems is that the pilot can relate the information overlay to real objects in the world, whereas SV systems provide a constant virtual view in which inconsistencies can hardly be detected. The development of components for such a system led to a demonstrator implemented on a PC. A camera grabs video images, which are overlaid with registered information. The orientation of the camera is obtained from an inclinometer and a magnetometer; position is acquired from GPS. In a possible implementation in an airplane, the on-board attitude information can be used to obtain correct registration. If visibility is sufficient, computer vision modules can be used to fine-tune the registration by matching visual cues with database features. This technology would be especially useful for landing approaches. The current demonstrator provides a frame rate of 15 fps, using a live video feed as background with an overlay of avionics symbology in the foreground. In addition, terrain rendering from a 1 arc-second digital elevation model database can be overlaid to provide synthetic vision in case of limited visibility. For true outdoor testing (at ground level), the system has been implemented on a wearable computer.
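    The registration step outlined above amounts to projecting geo-referenced points into the camera image given the sensed orientation and position. Below is a minimal sketch assuming a simple pinhole model and a local East-North-Up (ENU) frame; the function names, axis conventions, and parameters are illustrative and are not the demonstrator's actual code.

```python
import numpy as np

# Sketch of overlay registration: orientation (inclinometer + magnetometer) and
# position (GPS, converted to a local ENU frame) define the camera pose; a world
# point is then projected into the video image so symbology can be drawn on it.

def rotation_enu_to_camera(heading, pitch, roll):
    """Rotation from local ENU coordinates to camera coordinates (angles in radians)."""
    ch, sh = np.cos(heading), np.sin(heading)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    r_yaw   = np.array([[ch, sh, 0], [-sh, ch, 0], [0, 0, 1]])
    r_pitch = np.array([[1, 0, 0], [0, cp, sp], [0, -sp, cp]])
    r_roll  = np.array([[cr, 0, -sr], [0, 1, 0], [sr, 0, cr]])
    return r_roll @ r_pitch @ r_yaw

def project(world_pt, cam_pos, heading, pitch, roll, f_px, cx, cy):
    """Project a world point (ENU, metres) into image pixel coordinates."""
    R = rotation_enu_to_camera(heading, pitch, roll)
    x, depth, z = R @ (world_pt - cam_pos)      # x right, depth forward, z up
    if depth <= 0:
        return None                              # behind the camera, not drawn
    return cx + f_px * x / depth, cy - f_px * z / depth

# Example: overlay a runway threshold 1 km ahead and 30 m below the aircraft.
print(project(np.array([0.0, 1000.0, -30.0]), np.zeros(3),
              heading=0.0, pitch=0.0, roll=0.0,
              f_px=800.0, cx=320.0, cy=240.0))
```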

    Development of a dynamic virtual reality model of the inner ear sensory system as a learning and demonstrating tool

    In order to keep track of the position and motion of our body in space, nature has given us a fascinating and very ingenious organ, the inner ear. Each inner ear includes five biological sensors - three angular and two linear accelerometers - which provide the body with the ability to sense angular and linear motion of the head with respect to inertial space. The aim of this paper is to present a dynamic virtual reality model of these sensors. This model, implemented in Matlab/Simulink, simulates rotary chair testing, one of the tests carried out during a diagnosis of the vestibular system. High-quality 3D animations linked to the Simulink model are created by exporting CAD models into Virtual Reality Modeling Language (VRML) files. This virtual environment shows not only the test but also the state of each sensor (excited or inhibited) in real time. Virtual reality is used as a tool for integrated learning of the dynamic behavior of the inner ear, using an ergonomic paradigm of user interactivity (zoom, rotation, mouse interaction, etc.). It can be used as a learning and demonstration tool either in the medical field - to understand the behavior of the sensors during any kind of motion - or in the aeronautical field, to relate inner-ear functioning to certain sensory illusions.
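    To make the sensor dynamics concrete, the sketch below simulates a rotary-chair-like stimulus through the standard torsion-pendulum model of a semicircular canal. It is a Python sketch, not the paper's Simulink implementation, and the time constants are assumed textbook values.

```python
import numpy as np
from scipy import signal

# Illustrative canal model (not the paper's Simulink model): the semicircular
# canals are commonly modelled as an over-damped torsion pendulum. In the
# normalized form used here, the transfer function from head angular velocity
# to the canal (cupula) signal is
#     H(s) = tau1*s / ((1 + tau1*s) * (1 + tau2*s)),
# so mid-band gain is ~1 and the output reads in deg/s-equivalent units.

tau1, tau2 = 10.0, 0.005                         # assumed textbook time constants (s)
canal = signal.TransferFunction([tau1, 0.0], np.polymul([tau1, 1.0], [tau2, 1.0]))

# Rotary-chair-like stimulus: ramp up to 100 deg/s, hold, then stop abruptly.
t = np.linspace(0.0, 60.0, 6001)
omega = np.where(t < 2.0, 50.0 * t, 100.0)       # deg/s
omega = np.where(t > 40.0, 0.0, omega)           # sudden stop at t = 40 s

t_out, cupula, _ = signal.lsim(canal, omega, t)

# After the sudden stop the canal signal swings negative and decays with tau1,
# which produces the post-rotatory sensation probed in rotary chair testing.
print("Canal signal just after the stop:",
      round(cupula[t_out > 40.0].min(), 1), "deg/s-equivalent")
```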

    Laser in Clinical Ophthalmology: Possible Applications, Limitations, and Hazards

    The present status of laser applications in clinical ophthalmology is discussed. The characteristics of conventional light coagulators are compared with those of ruby lasers presently available for clinical use. The limitations and hazards of laser therapy are stressed.

    MMP-3 deficiency alleviates endotoxin-induced acute inflammation in the posterior eye segment

    Matrix metalloproteinase-3 (MMP-3) is known to mediate neuroinflammatory processes by activating microglia, disrupting blood-central nervous system barriers, and supporting neutrophil influx into the brain. The posterior part of the eye - more specifically the retina, the retinal pigment epithelium (RPE), and the blood-retinal barrier - is also affected during neuroinflammation, but a role for MMP-3 in ocular inflammation remains elusive. We investigated whether MMP-3 contributes to acute inflammation in the eye using the endotoxin-induced uveitis (EIU) model. Systemic administration of lipopolysaccharide induced an increase in MMP-3 mRNA and protein expression in the posterior part of the eye. MMP-3 deficiency or knockdown suppressed retinal leukocyte adhesion and leukocyte infiltration into the vitreous cavity in mice subjected to EIU. Moreover, retinal and RPE mRNA levels of intercellular adhesion molecule 1 (Icam1), interleukin 6 (Il6), inducible nitric oxide synthase (Nos2), and tumor necrosis factor alpha (Tnf), which are key molecules involved in EIU, were clearly reduced in MMP-3-deficient mice. In addition, loss of MMP-3 repressed the upregulation of the chemokines monocyte chemoattractant protein 1 (MCP-1) and chemokine (C-X-C motif) ligand 1 (CXCL1). These findings suggest a contribution of MMP-3 during EIU and its potential use as a therapeutic drug target for reducing ocular inflammation.

    Stroboscopic Augmented Reality as an Approach to Mitigate Gravitational Transition Effects During Interplanetary Spaceflight

    During interplanetary spaceflight, periods of extreme gravitational transition will occur, such as transitions between hypergravity, hypogravity, and microgravity. Following gravitational transitions, rapid sensorimotor adaptation or maladaptation may occur, which can affect gaze control and weaken dynamic visual acuity in astronauts. A reduction in dynamic visual acuity during spaceflight could impair mission-critical activities (e.g., control of extraterrestrial machinery/vehicles and other important tasks). Stroboscopic visual training is an emerging technique in which tasks are performed under conditions of intermittent vision; terrestrially, it has been observed to enhance visual performance and perception, including dynamic visual acuity. To mitigate the decreased dynamic visual acuity observed in astronauts following gravitational transitions, stroboscopic vision training may serve as a potential countermeasure. We describe the effects of gravitational transitions on the vestibulo-ocular system and dynamic visual acuity, review terrestrial stroboscopic visual training, and report the novel development of stroboscopic augmented reality as a possible countermeasure for G-transitions in future spaceflight.
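    As a rough illustration of what "intermittent vision" means in practice, the sketch below alternates an AR display between clear and occluded phases at a fixed rate and duty cycle. The rate, duty cycle, and the set_opacity() callback are hypothetical placeholders, not the system described in the paper.

```python
import time

# Minimal stroboscopic occlusion schedule: the display alternates between a
# clear phase (scene visible) and an occluded phase (vision interrupted).
# All timing values and the set_opacity() callback are hypothetical.

def run_stroboscope(set_opacity, rate_hz=4.0, clear_fraction=0.4, duration_s=10.0):
    """Alternate clear/occluded phases at rate_hz for duration_s seconds."""
    period = 1.0 / rate_hz
    clear_time = period * clear_fraction
    occluded_time = period - clear_time
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        set_opacity(0.0)                 # clear: the wearer sees the scene
        time.sleep(clear_time)
        set_opacity(1.0)                 # occluded: vision is interrupted
        time.sleep(occluded_time)

# Example with a stand-in callback that just logs the state.
run_stroboscope(lambda o: print("occluded" if o else "clear"), duration_s=2.0)
```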