
    On the Visual Input Driving Human Smooth-Pursuit Eye Movements

    Current computational models of smooth-pursuit eye movements assume that the primary visual input is local retinal-image motion (often referred to as retinal slip). However, we show that humans can pursue object motion with considerable accuracy, even in the presence of conflicting local image motion. This finding indicates that the visual cortical area(s) controlling pursuit must be able to perform a spatio-temporal integration of local image motion into a signal related to object motion. We also provide evidence that the object-motion signal that drives pursuit is related to the signal that supports perception. We conclude that current models of pursuit should be modified to include a visual input that encodes perceived object motion and not merely retinal image motion. Finally, our findings suggest that the measurement of eye movements can be used to monitor visual perception, with particular value in applied settings, as this non-intrusive approach would not require interrupting ongoing work or training.

    Evaluating Fault Management Operations Concepts for Next-Generation Spacecraft: What Eye Movements Tell Us

    Performance enhancements associated with selected forms of automation were quantified in a recent human-in-the-loop evaluation of two candidate operational concepts for fault management on next-generation spacecraft. The baseline concept, called Elsie, featured a full suite of "soft" fault management interfaces. However, operators were forced to diagnose malfunctions with minimal assistance from the standalone caution and warning (C&W) system. The other concept, called Besi, incorporated a more capable C&W system with an automated fault diagnosis capability. Results from analyses of participants' eye movements indicate that the greatest empirical benefit of the automation stemmed from eliminating the need for text processing on cluttered, text-rich displays.

    Results from Testing Crew-Controlled Surface Telerobotics on the International Space Station

    During Summer 2013, the Intelligent Robotics Group at NASA Ames Research Center conducted a series of tests to examine how astronauts in the International Space Station (ISS) can remotely operate a planetary rover. The tests simulated portions of a proposed lunar mission, in which an astronaut in lunar orbit would remotely operate a planetary rover to deploy a radio telescope on the lunar far side. Over the course of Expedition 36, three ISS astronauts remotely operated the NASA "K10" planetary rover in an analogue lunar terrain located at the NASA Ames Research Center in California. The astronauts used a "Space Station Computer" (crew laptop), a combination of supervisory control (command sequencing) and manual control (discrete commanding), and Ku-band data communications to command and monitor K10 for 11 hours. In this paper, we present and analyze test results, summarize user feedback, and describe directions for future research.

    Considerations for the Use of Remote Gaze Tracking to Assess Behavior in Flight Simulators

    Complex user interfaces (such as those found in an aircraft cockpit) may be designed from first principles, but inevitably must be evaluated with real users. User gaze data can provide valuable information that can help to interpret other actions that change the state of the system. However, care must be taken to ensure that any conclusions drawn from gaze data are well supported. Through a combination of empirical and simulated data, we identify several considerations and potential pitfalls when measuring gaze behavior in high-fidelity simulators. We show that physical layout, behavioral differences, and noise levels can all substantially alter the quality of fit for algorithms that segment gaze measurements into individual fixations. We provide guidelines to help investigators ensure that conclusions drawn from gaze tracking data are not artifactual consequences of data quality or analysis techniques.
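    The abstract above refers to algorithms that segment gaze measurements into fixations. As a hypothetical illustration (not the authors' method), the sketch below implements one common such algorithm, dispersion-threshold (I-DT) fixation detection; the threshold values and sample format are our assumptions. It also shows why noise matters: measurement noise inflates the dispersion of a true fixation, which can push it over the threshold and change the segmentation.

```python
# Minimal dispersion-threshold (I-DT) fixation detector: a sketch,
# not the algorithm used in the study above. Thresholds are
# illustrative assumptions.

def dispersion(window):
    """Horizontal plus vertical extent of a window of gaze points."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=1.0, min_samples=6):
    """samples: list of (x, y) gaze points, e.g. in degrees of visual
    angle. Returns (start_index, end_index) pairs, one per fixation."""
    fixations = []
    i, n = 0, len(samples)
    while i + min_samples <= n:
        j = i + min_samples
        if dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while dispersion stays under threshold.
            while j < n and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j - 1))
            i = j
        else:
            i += 1
    return fixations
```

    With noiseless synthetic data (a cluster of samples at one location, then another), the detector returns one fixation per cluster; adding noise larger than `max_dispersion` would split or discard those fixations, which is the kind of data-quality artifact the abstract cautions against.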

    Stroboscopic Image Modulation to Reduce the Visual Blur of an Object Being Viewed by an Observer Experiencing Vibration

    A method and apparatus for reducing the visual blur of an object being viewed by an observer experiencing vibration. In various embodiments of the present invention, the visual blur is reduced through stroboscopic image modulation (SIM). A SIM device is operated in an alternating "on/off" temporal pattern according to a SIM drive signal (SDS) derived from the vibration being experienced by the observer. A SIM device (controlled by a SIM control system) operating according to the SDS serves to reduce visual blur by "freezing" the visual image of the viewed object (or reducing the image's motion to a slow drift). In various embodiments, the SIM device is selected from the group consisting of illuminator(s), shutter(s), display control system(s), and combinations of the foregoing (including the use of multiple illuminators, shutters, and display control systems).
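    The core timing idea can be sketched as follows. This is our own illustration, not code from the patent, and the function name, parameters, and the assumption of a purely sinusoidal vibration are ours: the device is switched "on" only during a short window at the same phase of every vibration cycle, so the viewed object is always sampled at nearly the same displacement and appears frozen.

```python
# Hypothetical sketch of a stroboscopic drive signal (SDS) derived
# from a measured sinusoidal vibration. Names and parameters are
# illustrative assumptions, not taken from the patent.

import math

def sim_drive(t, vib_freq_hz, vib_phase_rad=0.0, duty=0.1):
    """Return True when the SIM device should be 'on' at time t (s).

    duty: fraction of each vibration cycle the device stays on;
    a smaller duty means less residual blur but a dimmer image.
    """
    period = 1.0 / vib_freq_hz
    # Time within the current vibration cycle, shifted so the 'on'
    # window tracks the measured vibration phase.
    phase_t = (t - vib_phase_rad / (2.0 * math.pi * vib_freq_hz)) % period
    return phase_t < duty * period
```

    For a 12 Hz vibration and a 10 percent duty cycle, the device is on for roughly 8 ms at the start of each 83 ms cycle; because the on-window always falls at the same vibration phase, the image displacement during viewing is nearly constant from cycle to cycle.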

    Effects of Transverse Seat Vibration on Near-Viewing Readability of Alphanumeric Symbology

    We measured the impacts on human visual function of a range of vibration levels (0.15, 0.3, 0.5, and 0.7 g) at the frequency and along the axis of the anticipated Ares thrust oscillation. We found statistically significant and equivalent decrements in performance on a reading and a numeric processing task at tested vibration levels above 0.3 g (0-to-peak), but no evidence of after-effects. At the smallest font and highest vibration level tested, the average effect was a 50 percent increase in response time and a six-fold increase in errors. Our findings support a preliminary trade space in which currently planned Orion font sizes and text spacing appear to be too small to support accurate and efficient reading at the tested vibration levels above 0.3 g, but not too small to support reading at 0.3 g. This study does not address potential impacts on crew cognitive decision-making or motor control and does not test either the full induced Orion-Ares environment with its sustained Gx-loading or the full complexity of the final Orion seat-helmet-suit interface. A final determination of the Orion-Ares program limit on vibration must take these additional factors into consideration and, thus, may need to be lower than that needed to support effective reading at 1-Gx bias.

    Influence of Combined Whole-Body Vibration Plus G-Loading on Visual Performance

    Recent engineering analyses of the integrated Ares-Orion stack show that vibration levels for Orion crews have the potential to be much higher than those experienced in Gemini, Apollo, and Shuttle vehicles. Of particular concern to the Constellation Program (CxP) is the 12 Hz thrust oscillation (TO) that the Ares-I rocket develops during the final ~20 seconds preceding first-stage separation, at maximum G-loading. While the structural-dynamic mitigations being considered can ensure that vibration due to TO is reduced to below the CxP crew health limit, it remains to be determined how far below this limit vibration must be reduced to enable effective crew performance during launch. Moreover, this "performance" vibration limit will inform the operations concepts (and crew-system interface designs) for this critical phase of flight. While Gemini and Apollo studies provide preliminary guidance, the data supporting the historical limits were obtained using less advanced interface technologies and very different operations concepts. In this study, supported by the Exploration Systems Mission Directorate (ESMD) Human Research Program, we investigated display readability, a fundamental prerequisite for any interaction with electronic crew-vehicle interfaces, while observers were subjected to 12 Hz vibration superimposed on the 3.8 G loading expected for the TO period of ascent. Two age-matched groups of participants (16 general population and 13 Crew Office) performed a numerical display reading task while undergoing sustained 3.8 G loading and whole-body vibration at 0, 0.15, 0.3, 0.5, and 0.7 g in the eyeballs in/out (x-axis) direction. The time-constrained reading task used an Orion-like display with 10- and 14-pt non-proportional sans-serif fonts, and was designed to emulate the visual acquisition and processing essential for crew system monitoring.
    Compared to the no-vibration baseline, we found no significant effect of vibration at 0.15 and 0.3 g on task error rates (ER) or response times (RT). Significant degradations in both ER and RT, however, were observed at 0.5 and 0.7 g for 10-pt, and at 0.7 g for 14-pt font displays. These objective performance measures were mirrored by participants' subjective ratings. Interestingly, we found that the impact of vibration on ER increased with distance from the center of the display, but only for vertical displacements. Furthermore, no significant ER or RT aftereffects were detected immediately following vibration, regardless of amplitude. Lastly, given that our reading task required no specialized spaceflight expertise, our finding that effects were not statistically distinct between our two groups is not surprising. The results from this empirical study provide initial guidance for evaluating the display readability trade-space between text-font size and vibration amplitude. However, the outcome of this work should be considered preliminary in nature for a number of reasons: 1. The single 12 Hz x-axis vibration employed was based on earlier load-cycle models of the induced TO environment at the end of Ares-I first-stage flight; recent analyses of TO mitigation designs suggest that significant concurrent off-axis vibration may also occur. 2. The shirtsleeve environment in which we tested fails to capture the full kinematic and dynamic complexity of the physical interface between the crewmember and the still-to-be-matured helmet-suit-seat designs, and the impact these will have on vibration transmission and consequent performance. 3. By examining performance in this reading and number-processing task, we are assessing only readability, a first and necessary step that does not in itself directly address the performance of more sophisticated operational tasks such as vehicle-health monitoring or manual control of the vehicle.

    Information Presentation

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew on flight vehicles, surface landers and habitats, and during extra-vehicular activities (EVA). Designers of displays and controls for exploration missions must be prepared to select the text formats, label styles, alarms, electronic procedure designs, and cursor control devices that provide for optimal crew performance on exploration tasks. The major areas of work, or subtasks, within the Information Presentation DRP are: 1) Controls, 2) Displays, 3) Procedures, and 4) EVA Operations.