Direct Relationship Between Perceptual and Motor Variability
The time that elapses between stimulus onset and the onset of a saccadic eye movement is longer and more variable than can be explained by neural transmission times and synaptic delays (Carpenter, 1981, in: Eye Movements: Cognition and Visual Perception, Erlbaum). In theory, the noise underlying response-time (RT) variability could arise at any point along the sensorimotor cascade, from sensory noise arising within the early visual processing shared with perception to noise in the motor criterion or commands necessary to trigger movements. These two loci of internal noise can be distinguished empirically: sensory internal noise predicts that response time will correlate with perceived stimulus magnitude, whereas motor internal noise predicts no such correlation. Methods. We used the data described by Liston and Stone (2008, JNS 28:13866-13875), in which subjects performed a 2AFC saccadic brightness-discrimination task and the perceived brightness of the chosen stimulus was then quantified in a second 2AFC perceptual task. Results. We binned each subject's data into quartiles for both signal strength (from dimmest to brightest) and RT (from slowest to fastest) and analyzed the trends in perceived brightness. We found significant effects of both signal strength (as expected) and RT on normalized perceived brightness (both p < 0.0001, 2-way ANOVA), without a significant interaction (p = 0.95, 2-way ANOVA). A plot of normalized perceived brightness versus normalized RT shows that more than half of the variance was shared (r^2 = 0.56, p < 0.0001). To rule out the possibility that some signal-strength-related artifact was generating this effect, we ran a control analysis on pairs of trials with repeated presentations of identical stimuli and found that stimuli are perceived to be brighter on trials with faster saccades (p < 0.001, paired t-test across subjects). Conclusion.
These data show that shared early visual internal noise jitters perceived brightness and the saccadic motor output in parallel. While the present correlation could theoretically result, either directly or indirectly, from some low-level brainstem or retinal mechanism (e.g., arousal, pupil size, photoreceptor noise) that influences both visual and oculomotor circuits, this is unlikely given the earlier finding that the variability in perceived motion direction and smooth-pursuit motor output is highly correlated (Stone and Krauzlis, 2003, JOV 3:725-736), suggesting that cortical circuits contribute to the shared internal noise.
Static and Motion-Based Visual Features Used by Airport Tower Controllers: Some Implications for the Design of Remote or Virtual Towers
Visual motion and other visual cues are used by tower controllers to provide important support for their control tasks at and near airports. These cues are particularly important for anticipated separation. Some of them, which we call visual features, have been identified from structured interviews and discussions with 24 active air traffic controllers or supervisors. The visual information that these features provide has been analyzed with respect to possible ways it could be presented at a remote tower that does not allow a direct view of the airport. Two types of remote towers are possible. One could be based on a plan-view, map-like computer-generated display of the airport and its immediate surroundings. An alternative would present a composite perspective view of the airport and its surroundings, possibly provided by an array of radially mounted cameras positioned at the airport in lieu of a tower. An initial, more detailed analysis of one of the specific landing cues identified by the controllers, landing deceleration, is provided as a basis for evaluating how controllers might detect and use it. Understanding other such cues will help identify the information that may be degraded or lost in a remote or virtual tower not located at the airport. Some initial suggestions for how some of this lost visual information might be presented in displays are offered. Many of the cues considered involve visual motion, though some important static cues are also discussed.
Comprehensive Oculomotor Behavioral Response Assessment (COBRA)
An eye movement-based methodology and assessment tool may be used to quantify many aspects of human dynamic visual processing using a relatively simple and short oculomotor task, noninvasive video-based eye tracking, and validated oculometric analysis techniques. By examining the eye movement responses to a task consisting of a radially organized, appropriately randomized sequence of Rashbass-like step-ramp pursuit-tracking trials, distinct performance measurements may be generated that may be associated with, for example, pursuit initiation (e.g., latency and open-loop pursuit acceleration), steady-state tracking (e.g., gain, catch-up saccade amplitude, and the proportion of the steady-state response consisting of smooth movement), direction tuning (e.g., oblique-effect amplitude, horizontal-vertical asymmetry, and direction noise), and speed tuning (e.g., speed responsiveness and noise). This quantitative approach may quickly provide results (e.g., a multi-dimensional set of oculometrics and a single scalar impairment index) that can be interpreted by one without a high degree of scientific sophistication or extensive training.
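Two of the oculometrics listed above, pursuit latency and steady-state gain, can be illustrated on a synthetic eye-velocity trace. The 10 deg/s ramp speed, the latency-detection threshold, and the exponential velocity profile are assumptions for the sketch, not the actual COBRA parameters or algorithms.

```python
# Illustrative latency and steady-state-gain computation from a synthetic
# eye-velocity trace for one step-ramp trial; all parameters are assumed.
import numpy as np

dt, target_speed = 0.001, 10.0               # 1 kHz sampling; deg/s ramp
t = np.arange(0, 0.8, dt)

# Synthetic response: 150 ms latency, then exponential rise toward 9 deg/s
eye_vel = np.where(t < 0.15, 0.0,
                   9.0 * (1 - np.exp(-np.clip(t - 0.15, 0, None) / 0.05)))

# Latency: first crossing of 10% of target speed (threshold is an assumption)
latency = t[np.argmax(eye_vel > 0.1 * target_speed)]

# Steady-state gain: mean eye velocity late in the trial / target velocity
gain = eye_vel[t > 0.5].mean() / target_speed
print(f"latency ~ {latency * 1000:.0f} ms, gain ~ {gain:.2f}")
```

On real data the same idea applies per trial, with the per-trial measurements then pooled across directions and speeds to form the multi-dimensional oculometric set the abstract describes.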
Effects of Spatio-Temporal Aliasing on Out-the-Window Visual Systems
Designers of out-the-window visual systems face a challenge when attempting to simulate the outside world as viewed from a cockpit. Many methodologies have been developed and adopted to aid in the depiction of particular scene features, or levels of static image detail. However, because aircraft move, it is necessary to also consider the quality of the motion in the simulated visual scene. When motion is introduced in the simulated visual scene, perceptual artifacts can become apparent. A particular artifact related to image motion, spatio-temporal aliasing, will be addressed. The causes of spatio-temporal aliasing will be discussed, and current knowledge regarding the impact of these artifacts on both motion perception and simulator task performance will be reviewed. Methods of reducing the impact of this artifact are also addressed.
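The cause of spatio-temporal aliasing can be reduced to a back-of-envelope Nyquist check: a moving pattern of spatial frequency fs (cycles/deg) drifting at v (deg/s) produces a temporal frequency of fs * v (Hz), and the discretely updated display can only represent temporal frequencies up to half its update rate. The function name and the numbers below are illustrative, not from the paper.

```python
# Nyquist check for spatio-temporal aliasing in a sampled moving image.
def aliases(fs_cpd: float, speed_dps: float, frame_rate_hz: float) -> bool:
    """True if a grating of fs_cpd cycles/deg moving at speed_dps deg/s
    exceeds the temporal Nyquist limit of a frame_rate_hz display."""
    temporal_freq_hz = fs_cpd * speed_dps
    return temporal_freq_hz > frame_rate_hz / 2.0

print(aliases(2.0, 20.0, 60.0))  # 40 Hz vs a 30 Hz Nyquist limit -> True
print(aliases(2.0, 5.0, 60.0))   # 10 Hz vs a 30 Hz Nyquist limit -> False
```

This is why the artifact worsens with fine scene detail, fast self-motion, and low update rates, and why reducing any one of the three (blurring, motion limits, or higher frame rates) mitigates it.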
Considerations for the Use of Remote Gaze Tracking to Assess Behavior in Flight Simulators
Complex user interfaces (such as those found in an aircraft cockpit) may be designed from first principles, but inevitably must be evaluated with real users. User gaze data can provide valuable information that can help to interpret other actions that change the state of the system. However, care must be taken to ensure that any conclusions drawn from gaze data are well supported. Through a combination of empirical and simulated data, we identify several considerations and potential pitfalls when measuring gaze behavior in high-fidelity simulators. We show that physical layout, behavioral differences, and noise levels can all substantially alter the quality of fit for algorithms that segment gaze measurements into individual fixations. We provide guidelines to help investigators ensure that conclusions drawn from gaze tracking data are not artifactual consequences of data quality or analysis techniques.
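One common family of the fixation-segmentation algorithms the abstract refers to is dispersion-threshold identification (I-DT), whose fits are indeed sensitive to noise level and geometry. The sketch below is a minimal I-DT, not the authors' implementation; the thresholds, sample rate, and synthetic one-saccade trace are all assumptions.

```python
# Minimal dispersion-threshold (I-DT) fixation segmentation sketch.
# Gaze samples within a small spatial window for a minimum duration
# are grouped into a fixation; thresholds here are illustrative.
import numpy as np

def idt_fixations(x, y, t, max_dispersion=1.0, min_duration=0.1):
    """Return (start_idx, end_idx) pairs for detected fixations."""
    fixations, i, n = [], 0, len(t)
    while i < n:
        j = i
        # Grow the window while horizontal + vertical extent stays small
        while j + 1 < n and ((x[i:j + 2].max() - x[i:j + 2].min()) +
                             (y[i:j + 2].max() - y[i:j + 2].min())) <= max_dispersion:
            j += 1
        if t[j] - t[i] >= min_duration:
            fixations.append((i, j))
            i = j + 1
        else:
            i += 1  # too short to be a fixation; slide past this sample
    return fixations

# Synthetic 100 Hz trace: one 5-deg saccade at t = 0.5 s, plus small jitter
t = np.arange(0, 1.0, 0.01)
x = np.where(t < 0.5, 0.0, 5.0) + 0.05 * np.sin(37 * t)
y = np.zeros_like(t)
print(idt_fixations(x, y, t))  # two fixations, split at the saccade
```

Raising the tracker noise or shrinking `max_dispersion` fragments fixations, while loosening it merges them across small saccades, which is exactly the kind of quality-of-fit sensitivity the abstract warns about.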
Reaching Errors Under G-Loading (and Vibration)
Humans show increased systematic and random errors when reaching for targets at 3.8Gx, with or without added vibration.