
    Eye movement planning on Single-Sensor-Single-Indicator displays is vulnerable to user anxiety and cognitive load

    In this study, we demonstrate the effects of anxiety and cognitive load on eye movement planning in an instrument flight task adhering to a single-sensor-single-indicator data visualization design philosophy. The task was performed in neutral and anxiety conditions, while a low or high cognitive load auditory n-back task was also performed. Cognitive load led to a reduction in the number of transitions between instruments and impaired task performance. Changes in self-reported anxiety between the neutral and anxiety conditions positively correlated with changes in the randomness of eye movements between instruments, but only when cognitive load was high. Taken together, the results suggest that both cognitive load and anxiety impact gaze behavior, and that these effects should be explored when designing data visualization displays.

    Steering Demands Diminish the Early-P3, Late-P3 and RON Components of the Event-Related Potential of Task-Irrelevant Environmental Sounds

    The current study investigates the demands that steering places on mental resources. Instead of a conventional dual-task paradigm, participants of this study were only required to perform a steering task while task-irrelevant auditory distractor probes (environmental sounds and beep tones) were intermittently presented. The event-related potentials (ERPs) generated by these probes were analyzed for their sensitivity to the steering task’s demands. The steering task required participants to counteract unpredictable roll disturbances, and difficulty was manipulated either by adjusting the bandwidth of the roll disturbance or by varying the complexity of the control dynamics. A mass univariate analysis revealed that steering selectively diminishes the amplitudes of the early P3, late P3, and the re-orientation negativity (RON) to task-irrelevant environmental sounds but not to beep tones. Our findings are in line with a three-stage distraction model, which interprets these ERPs as reflecting the post-sensory detection of the task-irrelevant stimulus, engagement, and re-orientation back to the steering task. This interpretation is consistent with our manipulations of steering difficulty: more participants showed diminished amplitudes for these ERPs in the ‘hard’ steering condition relative to the ‘easy’ condition. To sum up, the current work identifies the spatiotemporal ERP components of task-irrelevant auditory probes that are sensitive to steering demands on mental resources. This provides a non-intrusive method for evaluating mental workload in novel steering environments.
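    A mass univariate analysis of this kind tests every electrode and time point for a condition difference and then corrects for the large number of comparisons. The Python sketch below is a simplified illustration under assumed data shapes, not the authors' pipeline (which would typically rely on cluster-based or permutation corrections): it runs pointwise paired t-tests between two ERP conditions and applies a Benjamini-Hochberg false discovery rate correction.

```python
import numpy as np
from scipy import stats

def mass_univariate_ttest(erp_a, erp_b, alpha=0.05):
    """Pointwise paired t-tests between two ERP conditions.

    Illustrative sketch; data shapes and the FDR correction are assumptions,
    not the analysis reported in the paper.

    erp_a, erp_b : arrays of shape (n_subjects, n_channels, n_times)
    Returns the t-value map and a boolean mask of FDR-significant points.
    """
    t_vals, p_vals = stats.ttest_rel(erp_a, erp_b, axis=0)

    # Benjamini-Hochberg step-up procedure over all channel x time points
    p_flat = p_vals.ravel()
    order = np.argsort(p_flat)
    m = p_flat.size
    thresholds = alpha * np.arange(1, m + 1) / m
    passed = p_flat[order] <= thresholds
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    sig = np.zeros(m, dtype=bool)
    sig[order[:k]] = True
    return t_vals, sig.reshape(p_vals.shape)
```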

    Learned Non-Rigid Object Motion is a View-Invariant Cue to Recognizing Novel Objects

    There is evidence that observers use learned object motion to recognize objects. For instance, studies have shown that reversing the learned direction in which a rigid object rotated in depth impaired recognition accuracy. This motion reversal can be achieved by playing animation sequences of moving objects in reverse frame order. In the current study, we used this sequence-reversal manipulation to investigate whether observers encode the motion of dynamic objects in visual memory, and whether such dynamic representations are encoded in a way that depends on the viewing conditions. Participants first learned dynamic novel objects, presented as animation sequences. Following learning, they were tested on their ability to recognize these learned objects when their animation sequence was shown in the same order as during learning or in the reverse order. In Experiment 1, we found that non-rigid motion contributed to recognition performance; that is, sequence reversal decreased sensitivity across different tasks. In subsequent experiments, we tested the recognition of non-rigidly deforming (Experiment 2) and rigidly rotating (Experiment 3) objects across novel viewpoints. Recognition performance was affected by viewpoint changes in both experiments. Learned non-rigid motion continued to contribute to recognition performance, and this benefit was the same across all viewpoint changes. By comparison, learned rigid motion did not contribute to recognition performance. These results suggest that non-rigid motion provides a source of information for recognizing dynamic objects that is not affected by changes in viewpoint.
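    The sequence-reversal manipulation and the sensitivity measure referred to above are simple to state in code: the reversed condition replays the learned animation in reverse frame order, and recognition sensitivity can be summarized as d' computed from hit and false-alarm counts. The sketch below is an illustration under these assumptions; the function names and the log-linear correction are not taken from the paper.

```python
from scipy.stats import norm

# Illustrative sketch: names and the correction scheme are assumptions.

def reverse_sequence(frames):
    """Sequence-reversal manipulation: replay an animation in reverse frame order."""
    return frames[::-1]

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity (d') with a log-linear correction for 0/1 rates."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)
```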

    VisME: Visual microsaccades explorer

    This work presents a visual analytics approach to explore microsaccade distributions in high-frequency eye tracking data. Research studies often apply filter algorithms and parameter values for microsaccade detection. Even when the same algorithms are employed, different parameter values might be adopted across studies. In this paper, we present a visual analytics system (VisME) to promote reproducibility in the data analysis of microsaccades. It allows users to interactively vary the parameter values for microsaccade filters and evaluate the resulting influence on microsaccade behavior across individuals and at the group level. In particular, we exploit brushing-and-linking techniques that allow the microsaccadic properties of space, time, and movement direction to be extracted, visualized, and compared across multiple views. In a case study, we demonstrate the use of our visual analytics system on data sets collected from natural scene viewing, and a qualitative usability study shows the usefulness of this approach for eye tracking researchers. We believe that interactive tools such as VisME will promote greater transparency in eye movement research by providing researchers with the ability to easily understand complex eye tracking data sets; such tools can also serve as teaching systems. VisME is provided as open source software.
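    Many of the filter parameters that differ across studies belong to velocity-threshold microsaccade detectors of the Engbert and Kliegl type, where a smoothed velocity trace is compared against a median-based threshold. The sketch below is a minimal Python illustration of such a filter, assuming gaze positions in degrees of visual angle; the parameters (`lam`, `min_dur`) are exactly the kind of values a tool like VisME lets users vary, but the code is not taken from VisME itself.

```python
import numpy as np

def detect_microsaccades(x, y, fs=1000.0, lam=6.0, min_dur=3):
    """Velocity-threshold microsaccade detection (Engbert & Kliegl style).

    Illustrative sketch; thresholds and defaults are assumptions, not VisME's.

    x, y    : gaze position traces in degrees of visual angle
    fs      : sampling rate in Hz
    lam     : multiplier of the median-based velocity spread
    min_dur : minimum event duration in samples
    Returns a list of (onset, offset) sample indices.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    dt = 1.0 / fs

    # Velocities smoothed over a five-sample moving window
    vx = (x[4:] + x[3:-1] - x[1:-3] - x[:-4]) / (6.0 * dt)
    vy = (y[4:] + y[3:-1] - y[1:-3] - y[:-4]) / (6.0 * dt)

    # Robust, median-based estimate of the velocity spread per axis
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)

    # A sample is a candidate if it lies outside the elliptic threshold
    crit = (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0

    # Keep runs of consecutive candidates that last at least min_dur samples
    events, start = [], None
    for i, c in enumerate(crit):
        if c and start is None:
            start = i
        elif not c and start is not None:
            if i - start >= min_dur:
                events.append((start, i - 1))
            start = None
    if start is not None and len(crit) - start >= min_dur:
        events.append((start, len(crit) - 1))
    return events
```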

    Effects of anxiety and cognitive load on instrument scanning behavior in a flight simulation

    Previous research has rarely examined the combined influence of anxiety and cognitive load on gaze behavior and performance during complex perceptual-motor tasks. In the current study, participants performed an aviation instrument landing task in neutral and anxiety conditions, while performing a low or high cognitive load auditory n-back task. Both self-reported anxiety and heart rate increased from the neutral to the anxiety conditions, indicating that anxiety was successfully manipulated. Response accuracy and reaction time for the auditory task indicated that cognitive load was also successfully manipulated. Cognitive load negatively impacted flight performance and the frequency of gaze transitions between areas of interest. Performance was maintained in the anxiety conditions, with a concomitant decrease in n-back reaction time, suggesting that this was due to an increase in mental effort. Analyses of individual responses to the anxiety manipulation revealed that changes in anxiety levels from the neutral to the anxiety conditions were positively correlated with changes in visual scanning entropy, a measure of the randomness of gaze behavior, but only when cognitive load was high. This finding lends support to an interactive effect of cognitive anxiety and cognitive load on attentional control.
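    Visual scanning entropy of this kind is commonly computed as the Shannon entropy of the gaze transitions between areas of interest (AOIs), weighted by how often each AOI is visited. The sketch below illustrates the general family of measures for a fixation-by-fixation AOI sequence; it is an assumption about how such a metric can be computed, not the exact formula used in the study.

```python
import numpy as np

def gaze_transition_entropy(aoi_sequence, n_aoi):
    """Shannon entropy of gaze transitions between areas of interest.

    Illustrative sketch; the exact entropy definition in the paper may differ.

    aoi_sequence : sequence of AOI indices (0..n_aoi-1), one per fixation
    n_aoi        : number of AOIs (e.g., flight instruments)
    Higher values indicate more random scanning, lower values more
    stereotyped scan paths.
    """
    # Count transitions between successive fixations on different AOIs
    counts = np.zeros((n_aoi, n_aoi))
    for a, b in zip(aoi_sequence[:-1], aoi_sequence[1:]):
        if a != b:
            counts[a, b] += 1

    total = counts.sum()
    if total == 0:
        return 0.0

    # Row-wise transition probabilities p(j | i)
    row_sums = counts.sum(axis=1, keepdims=True)
    p = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

    # Weight each AOI by how often it is left (empirical stationary distribution)
    pi = row_sums.ravel() / total

    # H = -sum_i pi_i * sum_j p_ij * log2 p_ij
    with np.errstate(divide="ignore", invalid="ignore"):
        logp = np.where(p > 0, np.log2(p), 0.0)
    row_entropy = -np.sum(p * logp, axis=1)
    return float(np.sum(pi * row_entropy))

# Example: a short scan path over three instruments
print(gaze_transition_entropy([0, 1, 0, 2, 1, 0, 2], n_aoi=3))
```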

    Cross-timescale experience evaluation framework for productive teaming

    This paper presents the initial concept for an evaluation framework to systematically evaluate productive teaming (PT). We consider PT as adaptive human-machine interaction between human users and augmented technical production systems. Human-to-human communication within a hybrid team with multiple human actors is also considered, as well as human-human and human-machine communication for remote and for mixed remote and co-located teams. The evaluation comprises objective, performance-related success indicators, behavioral metadata, and measures of human experience. In particular, it considers the affective, attentional, and intentional states of human team members and their influence on interaction dynamics in the team, and investigates appropriate strategies for adjusting dysfunctional dynamics in a satisfactory way, using concepts of companion technology. The timescales under consideration span from seconds to several minutes, with selected studies targeting hour-long interactions and longer-term effects such as effort and fatigue. Two example PT scenarios will be discussed in more detail. To enable generalization and a systematic evaluation, the scenarios’ use cases will be decomposed into more general modules of interaction.

    Virtual Reality Adaptation Using Electrodermal Activity to Support the User Experience

    Virtual reality is increasingly used for tasks such as work and education. Thus, rendering scenarios that neither interfere with such goals nor degrade the user experience is becoming progressively more relevant. We present a physiologically adaptive system that optimizes the virtual environment based on physiological arousal, i.e., electrodermal activity. We investigated the usability of the adaptive system in a simulated social virtual reality scenario. Participants completed an n-back task (primary) and a visual detection task (secondary). Here, we adapted the visual complexity of the secondary task, in the form of the number of non-player characters, to help users accomplish the primary task. We show that an adaptive virtual reality can improve users' comfort by adapting task complexity to physiological arousal. Our findings suggest that physiologically adaptive virtual reality systems can improve users' experience in a wide range of scenarios.
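    One way to realise such a physiologically adaptive loop is to smooth the electrodermal activity (EDA) signal, compare it against a per-user baseline, and scale the number of non-player characters accordingly. The following sketch is a hedged illustration of that idea; the smoothing window, the thresholds, and the update rule are hypothetical and are not the system described in the paper.

```python
import numpy as np

# Illustrative sketch: thresholds and the update rule are assumptions.

def smooth_eda(samples, window=32):
    """Moving-average smoothing of raw skin-conductance samples (microsiemens)."""
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="valid")

def adapt_npc_count(current_npcs, eda_level, baseline,
                    high_factor=1.2, low_factor=0.8,
                    min_npcs=0, max_npcs=20):
    """Lower visual complexity when arousal is high, restore it when arousal is low.

    current_npcs : number of non-player characters currently rendered
    eda_level    : recent smoothed EDA value
    baseline     : user-specific resting EDA level
    """
    if eda_level > high_factor * baseline:
        current_npcs -= 1   # high arousal: remove a distractor
    elif eda_level < low_factor * baseline:
        current_npcs += 1   # low arousal: the secondary task can be richer
    return int(np.clip(current_npcs, min_npcs, max_npcs))
```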

    Eye tracking and visualization. Introduction to the Special Thematic Issue

    There is a growing interest in eye tracking technologies applied to support traditional visualization techniques such as diagrams, charts, maps, or plots, whether static, animated, or interactive. More complex data analyses are required to derive knowledge and meaning from the data. Eye tracking systems serve that purpose in combination with research on biological and computer vision, cognition, perception, visualization, human-computer interaction, as well as usability and user experience. The 10 articles collected in this thematic special issue provide interesting examples of how sophisticated methods of data analysis and representation enable researchers to discover and describe fundamental spatio-temporal regularities in the data. The human visual system, supported by appropriate visualization tools, enables the human operator to solve complex tasks, such as understanding and interpreting three-dimensional medical images, controlling air traffic with radar displays, supporting instrument flight tasks, or interacting with virtual realities. The development and application of new visualization techniques is of major importance for future technological progress.

    Asymmetric saccade reaction times to smooth pursuit

    Before initiating a saccade to a moving target, the brain must take into account the target’s eccentricity as well as its movement direction and speed. We tested how the kinematic characteristics of the target influence the time course of this oculomotor response. Participants performed a step-ramp task in which the target object stepped from a central to an eccentric position and moved at constant velocity either toward the fixation position (foveopetal) or further into the periphery (foveofugal). The step size and target speed were varied. Of particular interest were trials that exhibited an initial saccade prior to a smooth pursuit eye movement. Measured saccade reaction times were longer in the foveopetal than in the foveofugal condition. In the foveopetal (but not the foveofugal) condition, the occurrence of an initial saccade, its reaction time, and the strength of the pre-saccadic pursuit response depended on both the target’s speed and the step size. A common explanation for these results may be found in the neural mechanisms that select between oculomotor response alternatives, i.e., a saccadic or a smooth pursuit response.
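    The step-ramp trials can be summarized as a target position trace: the target steps from fixation to an eccentric position and then moves at constant speed either back toward fixation (foveopetal) or further into the periphery (foveofugal). The sketch below generates such a trace; the parameter values and function name are illustrative assumptions, not those of the experiment.

```python
import numpy as np

def step_ramp_trajectory(step_deg, speed_deg_s, direction,
                         duration_s=1.0, fs=1000.0):
    """Horizontal target position (deg) for one step-ramp trial.

    Illustrative sketch; values are assumptions, not the experiment's settings.

    step_deg    : size of the initial position step from fixation (deg)
    speed_deg_s : constant ramp speed (deg/s)
    direction   : 'foveopetal' (ramp back toward fixation) or 'foveofugal'
    """
    t = np.arange(0.0, duration_s, 1.0 / fs)
    sign = -1.0 if direction == "foveopetal" else 1.0
    return step_deg + sign * speed_deg_s * t

# Example: a 5 deg step with the target moving at 10 deg/s back toward fixation
positions = step_ramp_trajectory(step_deg=5.0, speed_deg_s=10.0,
                                 direction="foveopetal")
```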