3,224 research outputs found

    The visual standards for the selection and retention of astronauts

    A literature search with abstracts on visual performance standards for the selection and retention of astronauts.

    Human operator performance of remotely controlled tasks: Teleoperator research conducted at NASA's George C. Marshall Space Flight Center

    The capabilities within the teleoperator laboratories to perform remote and teleoperated investigations for a wide variety of applications are described. Three major teleoperator issues are addressed: the human operator, the remote control and effecting subsystems, and the human/machine system performance results for specific teleoperated tasks.

    Conceptual design study for a teleoperator visual system, phase 2

    An analysis of the concept for the hybrid stereo-monoscopic television visual system is reported. The visual concept is described along with the following subsystems: illumination, deployment/articulation, telecommunications, visual displays, and the controls and display station.

    Should a movie have two different soundtracks for its stereoscopic and non-stereoscopic versions? A study on the front/rear balance

    No full text
    Few psychoacoustic studies have examined the influence of stereoscopy on the sound mixing of movies, yet very different opinions can be found among the scientific, aesthetic, and technical communities. Some argue that sound needs to be mixed differently for stereoscopic movies, whereas others claim that the image has merely caught up with sound, which was already "three-dimensional" and should therefore not be affected by stereoscopy. In the present experiment, expert subjects were asked to produce surround-sound ambiance mixes for eleven short sequences presented in both stereoscopic and non-stereoscopic versions. The results suggest that the influence of stereoscopy on the front/rear balance depends strongly on the content of the sequence and appears only in a few specific situations.

    Visualization and Analysis Tools for Neuronal Tissue

    The complex nature of neuronal cellular and circuit structure poses challenges for understanding tissue organization. New techniques in electron microscopy allow large datasets to be acquired from serial sections of neuronal tissue. These techniques reveal all cells in an unbiased fashion, so their segmentation produces complex structures that must be inspected and analyzed. Although several software packages provide 3D representations of these structures, they are limited to monoscopic projection and are tailored to the visualization of generic 3D data. Stereoscopic display, on the other hand, has been shown to improve the immersive experience, with significant gains in understanding spatial relationships and identifying important features. To leverage those benefits, we have developed a 3D immersive virtual reality data display system that, besides presenting data visually, allows augmenting and interacting with them in a form that facilitates human analysis. To make the system useful for neuroscientists, we have developed BrainTrek, a suite of software applications for the organization, rendering, visualization, and modification of neuron model scenes. A mid-cost CAVE system provides high-vertex-count rendering of an immersive 3D environment. Standard head and wand tracking allows movement control and modification of the scene via an on-screen 3D menu, while a tablet touch screen provides multiple navigation modes and a 2D menu. Graphics optimization allows theoretically limitless volumes to be presented, and an on-screen mini-map lets users quickly orient themselves. A custom voice note-taking mechanism allows scenes to be described and revisited. Finally, ray-casting support enables numerous analytical features, including 3D distance and volume measurements, computation and presentation of statistics, and point-and-click retrieval and presentation of raw electron microscopy data. The extension of this system to the Unity3D platform provides a low-cost alternative to the CAVE, allowing users to visualize, explore, and annotate 3D cellular data across platforms and modalities: different operating systems, different hardware (e.g., tablets, PCs, or stereo head-mounted displays), and online or offline operation. Such an approach has the potential not only to address the visualization and analysis needs of neuroscientists, but also to become a tool for education and for crowdsourcing the annotation of the large volumes of neuronal data to come.
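
    The measurement features mentioned above (3D distance and volume obtained through ray casting) are not detailed in the abstract; the following is a minimal Python sketch of that kind of computation only, assuming picked 3D points and closed triangle meshes. The function names and formulas are illustrative and do not come from BrainTrek.

    ```python
    import numpy as np

    def distance_between_picks(p1, p2):
        """Euclidean distance between two 3D points selected by ray casting."""
        return float(np.linalg.norm(np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)))

    def mesh_volume(vertices, faces):
        """Volume of a closed, consistently wound triangle mesh (signed-tetrahedron sum)."""
        v = np.asarray(vertices, dtype=float)
        f = np.asarray(faces, dtype=int)
        a, b, c = v[f[:, 0]], v[f[:, 1]], v[f[:, 2]]
        signed = np.einsum('ij,ij->i', a, np.cross(b, c)) / 6.0  # tetrahedra against the origin
        return float(abs(signed.sum()))

    # Two picked points at opposite corners of a unit cube are sqrt(3) apart.
    print(distance_between_picks((0, 0, 0), (1, 1, 1)))
    ```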

    Mitigation Of Motion Sickness Symptoms In 360 Degree Indirect Vision Systems

    The present research attempted to use display design as a means to mitigate the occurrence and severity of motion sickness symptoms and to increase performance through reduced "general effects" in an uncoupled motion environment. Specifically, several visual display manipulations of a 360° indirect vision system were implemented during a target detection task while participants were concurrently immersed in a motion simulator that mimicked off-road terrain completely separate from the target detection route. A multiple regression analysis determined that the Dual Banners display incorporating an artificial horizon (i.e., AH Dual Banners) and perceived attentional control significantly predicted the total severity of motion sickness, as measured by the Simulator Sickness Questionnaire (SSQ). Altogether, 33.6% (adjusted) of the variability in Total Severity was predicted by the variables in the model. Objective measures were assessed prior to, during, and after uncoupled motion. These included performance while immersed in the environment (i.e., target detection and situation awareness), as well as postural stability and cognitive and visual assessment tests (i.e., Grammatical Reasoning and Manikin) both before and after immersion. Response time on Grammatical Reasoning actually decreased after uncoupled motion; however, this was the only significant difference among the performance measures. Assessment of subjective workload (as measured by NASA-TLX) determined that participants in Dual Banners display conditions reported significantly lower perceived physical demand than those with Completely Separated display designs. Further, perceived temporal demand was lower for participants exposed to conditions incorporating an artificial horizon. Subjective sickness (SSQ Total Severity, Nausea, Oculomotor, and Disorientation) was evaluated using non-parametric tests and confirmed that the AH Dual Banners display had significantly lower Total Severity scores than the Completely Separated display with no artificial horizon (i.e., NoAH Completely Separated). Oculomotor scores were also significantly different for these two conditions, with lower scores for AH Dual Banners. The NoAH Completely Separated condition also had marginally higher Oculomotor scores than the Completely Separated display incorporating the artificial horizon (AH Completely Separated). There were no significant differences in sickness symptoms or severity (measured by self-assessment, postural stability, and cognitive and visual tests) between display designs at 30 and 60 minutes post-exposure. Further, the 30- and 60-minute post-exposure measures were not significantly different from baseline scores, suggesting that aftereffects were not present up to 60 minutes post-exposure. It was concluded that incorporating an artificial horizon onto the Dual Banners display will be beneficial in mitigating motion sickness symptoms in manned ground vehicles using 360° indirect vision systems. Screening for perceived attentional control will also be advantageous in situations where selection is possible. However, caution must be taken in generalizing these results to missions with terrain or vehicle speeds different from those used in this study, as well as to longer immersion times.
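
    As a hedged illustration of the kind of multiple regression reported above (not the thesis's actual analysis), the sketch below fits SSQ Total Severity against a display-condition indicator and perceived attentional control; the column names and data file are hypothetical placeholders.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("sickness_study.csv")  # hypothetical data file

    # ah_dual_banners: 1 if the participant used the AH Dual Banners display, else 0
    # attentional_control: perceived attentional control score
    model = smf.ols("ssq_total_severity ~ ah_dual_banners + attentional_control", data=df).fit()

    print(model.summary())      # coefficients and p-values
    print(model.rsquared_adj)   # adjusted R^2 (the study reports ~0.336)
    ```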

    Remotely operated telepresent robotics

    Remotely operated robots with the ability to perform specific tasks are often used in hazardous environments in place of humans to prevent injury or death. Modern remotely operated robots suffer from limitations in accuracy, primarily due to the lack of depth perception and unintuitive hardware controls. This research project investigates an alternative method of vision and control to increase a user's operational performance with remotely controlled robots. The Oculus Rift Development Kit 2.0 is a low-cost device originally developed for the electronic entertainment industry that allows users to experience virtual reality through a head-mounted display. This technology can be adapted to different uses and is used here primarily to give the user real-world stereoscopic 3D vision. Additionally, a wearable controller was trialled with the goal of allowing a robotic arm to mimic the position of the user's arm via a master/slave setup. By incorporating the stated vision and control methods, possible improvements in user accuracy and speed were investigated through experimentation and a conducted study. Results indicated that using the Oculus Rift for stereoscopic vision improved the user's ability to judge distances remotely but was detrimental to the user's ability to operate the robot. The research was conducted under the supervision of the University of Southern Queensland (USQ) and provides useful information for the area of remotely operated telepresent robotics.
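
    The master/slave arm control described above is not specified in implementation terms; the following Python sketch only illustrates the general idea of mapping tracked operator joint angles onto a robot arm, clamped to the robot's joint limits. All joint names and limits are hypothetical and are not taken from the project.

    ```python
    from dataclasses import dataclass

    @dataclass
    class JointLimit:
        lo_deg: float
        hi_deg: float

    # Hypothetical robot joint limits in degrees.
    ROBOT_LIMITS = {
        "shoulder_pitch": JointLimit(-45.0, 90.0),
        "elbow_flex": JointLimit(0.0, 135.0),
        "wrist_roll": JointLimit(-90.0, 90.0),
    }

    def mimic(operator_angles_deg):
        """Clamp tracked operator joint angles into the robot's joint limits."""
        commands = {}
        for joint, angle in operator_angles_deg.items():
            limit = ROBOT_LIMITS[joint]
            commands[joint] = max(limit.lo_deg, min(limit.hi_deg, angle))
        return commands

    # The operator's elbow is bent past the robot's range, so it is clamped to 135 degrees.
    print(mimic({"shoulder_pitch": 30.0, "elbow_flex": 150.0, "wrist_roll": -10.0}))
    ```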