Trends and Techniques in Visual Gaze Analysis
Visualizing gaze data is an effective way to quickly interpret eye tracking results. This paper presents a study investigating the benefits and limitations of visual gaze analysis among eye tracking professionals and researchers. The results were used to create a tool for visual gaze analysis within a Master's project.
Comment: pages 89-93, The 5th Conference on Communication by Gaze Interaction - COGAIN 2009: Gaze Interaction For Those Who Want It Most, ISBN: 978-87-643-0475-
Quantifying Pilot Visual Attention in Low Visibility Terminal Operations
Quantifying pilot visual behavior allows researchers to determine not only where a pilot is looking and when, but also holds implications for specific behavioral tracking when these data are coupled with flight technical performance. Remote eye tracking systems have been integrated into simulators at NASA Langley with effectively no impact on the pilot environment. This paper discusses the installation and use of a remote eye tracking system. The data collection techniques from a complex human-in-the-loop (HITL) research experiment are discussed, in particular the data reduction algorithms and logic used to transform raw eye tracking data into quantified visual behavior metrics, and the analysis methods used to interpret visual behavior. The findings suggest superior performance for Head-Up Display (HUD) and improved attentional behavior for Head-Down Display (HDD) implementations of Synthetic Vision System (SVS) technologies for low visibility terminal area operations. Keywords: eye tracking, flight deck, NextGen, human machine interface, aviation
Visual analytics methodology for eye movement studies
Eye movement analysis is gaining popularity as a tool for the evaluation of visual displays and interfaces. However, the existing methods and tools for analyzing eye movements and scanpaths are limited in terms of the tasks they can support and their effectiveness for large data and data with high variation. We have performed an extensive empirical evaluation of a broad range of visual analytics methods used in the analysis of geographic movement data. The methods have been tested for their applicability to eye tracking data and their capability to extract useful knowledge about users' viewing behaviors. This allowed us to select the suitable methods and match them to the analysis tasks they can support. The paper describes how the methods work in application to eye tracking data and provides guidelines for method selection depending on the analysis tasks
Towards a framework for analysis of eye-tracking studies in the three dimensional environment: a study of visual search by experienced readers of endoluminal CT colonography.
Objective: Eye tracking in three dimensions is novel, but established descriptors derived from two-dimensional (2D) studies are not transferable. We aimed to develop metrics suitable for statistical comparison of eye-tracking data obtained from readers of three-dimensional (3D) “virtual” medical imaging, using CT colonography (CTC) as a typical example.
Methods: Ten experienced radiologists were eye tracked while observing eight 3D endoluminal CTC videos. Subsequently, we developed metrics that described their visual search patterns based on concepts derived from 2D gaze studies. Statistical methods were developed to allow analysis of the metrics.
Results: Eye tracking was possible for all readers. Visual dwell on the moving region of interest (ROI) was defined as pursuit of the moving object across multiple frames. Using this concept of pursuit, five categories of metrics were defined that allowed characterization of reader gaze behaviour. These were time to first pursuit, identification and assessment time, pursuit duration, ROI size and pursuit frequency. Additional subcategories allowed us to further characterize visual search between readers in the test population.
Conclusion: We propose metrics for the characterization of visual search of 3D moving medical images. These metrics can be used to compare readers’ visual search patterns and provide a reproducible framework for the analysis of gaze tracking in the 3D environment. Advances in knowledge: This article describes a novel set of metrics that can be used to describe gaze behaviour when eye tracking readers during interpretation of 3D medical images. These metrics build on those established for 2D eye tracking and are applicable to increasingly common 3D medical image displays
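Pursuit-style metrics like those proposed above can be derived from per-frame gaze labels. The sketch below is a minimal illustration, not the authors' implementation; the data shape (one on-ROI flag per video frame) and all names are assumptions.

```python
# Hypothetical sketch of pursuit-style metrics from frame-by-frame gaze samples.
# A "pursuit" is taken to be a run of consecutive frames in which gaze stayed
# on the moving ROI; frame_ms is the assumed frame duration (25 fps -> 40 ms).

def pursuit_metrics(on_roi, frame_ms=40.0):
    """on_roi: list of bools, one per video frame."""
    pursuits = []            # (start_frame, length) of each on-ROI run
    run_start = None
    for i, hit in enumerate(on_roi):
        if hit and run_start is None:
            run_start = i                       # pursuit begins
        elif not hit and run_start is not None:
            pursuits.append((run_start, i - run_start))  # pursuit ends
            run_start = None
    if run_start is not None:                   # pursuit ran to the last frame
        pursuits.append((run_start, len(on_roi) - run_start))

    if not pursuits:
        return {"time_to_first_pursuit_ms": None,
                "total_pursuit_ms": 0.0, "pursuit_count": 0}
    return {
        "time_to_first_pursuit_ms": pursuits[0][0] * frame_ms,
        "total_pursuit_ms": sum(n for _, n in pursuits) * frame_ms,
        "pursuit_count": len(pursuits),
    }
```

Time to first pursuit falls out of the first run's start frame, and pursuit frequency out of the run count; per-pursuit durations could be reported analogously.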
Analysis of Eye-Tracking Data in Visualization and Data Space
Eye-tracking devices can tell us where on the screen a person is looking. Researchers frequently analyze eye-tracking data manually, by examining every frame of a visual stimulus used in an eye-tracking experiment so as to match the 2D screen coordinates provided by the eye tracker to related objects and content within the stimulus. Such a task requires significant manual effort and is not feasible for analyzing data collected from many users, long experimental sessions, or heavily interactive and dynamic visual stimuli. In this dissertation, we present a novel analysis method. We instrument visualizations that have open source code, and leverage real-time information about the layout of the rendered visual content, to automatically relate gaze samples to visual objects drawn on the screen. Since the visual objects shown in a visualization stand for data, the method allows us to directly detect the data that users focus on, or Data of Interest (DOI).
This dissertation has two contributions. First, we demonstrated the feasibility of collecting DOI data for real-life visualizations in a reliable way, which is not self-evident. Second, we formalized the process of collecting and interpreting DOI data, and tested whether automated DOI detection can lead to research workflows and insights not possible with traditional, manual approaches
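The core DOI idea, relating gaze samples to the layout of rendered visual objects, can be sketched as a hit test against object bounding boxes. This is an illustrative simplification under assumed data shapes; the function and field names are hypothetical, not the dissertation's API.

```python
# Illustrative DOI sketch: count how many gaze samples land on each rendered
# visual object (e.g. a bar or node), given the layout an instrumented
# visualization reports. Rectangular bounding boxes are an assumption here;
# a real instrumentation could expose arbitrary shapes.

def data_of_interest(gaze_samples, layout):
    """gaze_samples: [(x, y), ...] in screen coordinates;
    layout: {data_item: (x0, y0, x1, y1)} bounding boxes."""
    counts = {item: 0 for item in layout}
    for gx, gy in gaze_samples:
        for item, (x0, y0, x1, y1) in layout.items():
            if x0 <= gx <= x1 and y0 <= gy <= y1:
                counts[item] += 1   # this gaze sample fell on the item
    return counts
```

Because the layout maps screen objects back to the underlying data items, the resulting counts are per-datum attention estimates rather than raw screen coordinates.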
Eye Tracking Support for Visual Analytics Systems
Visual analytics (VA) research provides helpful solutions for interactive visual data analysis when exploring large and complex datasets. Due to recent advances in eye tracking technology, promising opportunities arise to extend these traditional VA approaches. Therefore, we discuss foundations for eye tracking support in VA systems. We first review and discuss the structure and range of typical VA systems. Based on a widely used VA model, we present five comprehensive examples that cover a wide range of usage scenarios. Then, we demonstrate that the VA model can be used to systematically explore how concrete VA systems could be extended with eye tracking, to create supportive and adaptive analytics systems. This allows us to identify general research and application opportunities, and classify them into research themes. In a call for action, we map the road for future research to broaden the use of eye tracking and advance visual analytics
Eye Tracking: A Perceptual Interface for Content Based Image Retrieval
In this thesis visual search experiments are devised to explore the feasibility of an eye gaze driven search mechanism. The thesis first explores gaze behaviour on images possessing different levels of saliency. Eye behaviour was predominantly attracted by salient locations, but appeared to also require frequent reference to non-salient background regions, which indicated that information from scan paths might prove useful for image search. The thesis then specifically investigates the benefits of eye tracking as an image retrieval interface in terms of speed relative to selection by mouse, and in terms of the efficiency of eye tracking mechanisms in the task of retrieving target images. Results are analysed using ANOVA and significant findings are discussed. Results show that eye selection was faster than a computer mouse, and that experience gained during visual tasks carried out using a mouse would benefit users if they were subsequently transferred to an eye tracking system. Results on the image retrieval experiments show that users are able to navigate to a target image within a database, confirming the feasibility of an eye gaze driven search mechanism. Additional histogram analysis of the fixations, saccades and pupil diameters in the human eye movement data revealed a new method of extracting intentions from gaze behaviour for image search, of which the user was not aware, and which promises even quicker search performance. The research has two implications for Content Based Image Retrieval: (i) improvements in query formulation for visual search and (ii) new methods for visual search using attentional weighting. Furthermore it was demonstrated that users are able to find target images at sufficient speeds, indicating that pre-attentive activity is playing a role in visual search. A review of current eye tracking technology, current applications, visual perception research, and models of visual attention is included.
A review of the potential of the technology for commercial exploitation is also presented
Fauxvea: Crowdsourcing Gaze Location Estimates for Visualization Analysis Tasks
We present the design and evaluation of a method for estimating gaze locations during the analysis of static visualizations using crowdsourcing. Understanding gaze patterns is helpful for evaluating visualizations and user behaviors, but traditional eye-tracking studies require specialized hardware and local users. To avoid these constraints, we developed a method called Fauxvea, which crowdsources visualization tasks on the Web and estimates gaze fixations through cursor interactions without eye-tracking hardware. We ran experiments to evaluate how gaze estimates from our method compare with eye-tracking data. First, we evaluated crowdsourced estimates for three common types of information visualizations and basic visualization tasks using Amazon Mechanical Turk (MTurk). In another experiment, we reproduced findings from a previous eye-tracking study on tree layouts using our method on MTurk. Results from these experiments show that fixation estimates using Fauxvea are qualitatively and quantitatively similar to eye tracking on the same stimulus-task pairs. These findings suggest that crowdsourcing visual analysis tasks with static information visualizations could be a viable alternative to traditional eye-tracking studies for visualization research and design
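One way to turn cursor interactions into fixation estimates, in the spirit of the approach described above though not the actual Fauxvea implementation, is a dwell heuristic: a cursor that rests within a small radius for long enough is treated as an estimated fixation. Thresholds and names below are assumptions for illustration.

```python
# Hedged sketch: derive fixation estimates from a cursor log by detecting
# dwells, i.e. periods where the cursor stays within max_move_px of its
# starting point for at least min_dwell_ms. Not Fauxvea's actual algorithm.

def estimate_fixations(cursor_log, min_dwell_ms=200, max_move_px=10):
    """cursor_log: [(t_ms, x, y), ...] sorted by time.
    Returns [(x, y, dwell_ms), ...] estimated fixations."""
    fixations, start = [], 0
    for i in range(1, len(cursor_log) + 1):
        still_near = (i < len(cursor_log) and
                      abs(cursor_log[i][1] - cursor_log[start][1]) <= max_move_px and
                      abs(cursor_log[i][2] - cursor_log[start][2]) <= max_move_px)
        if not still_near:                      # dwell ended (or log exhausted)
            dwell = cursor_log[i - 1][0] - cursor_log[start][0]
            if dwell >= min_dwell_ms:
                fixations.append((cursor_log[start][1],
                                  cursor_log[start][2], dwell))
            start = i
    return fixations
```

The same dwell output can then be compared against true eye-tracker fixations on identical stimulus-task pairs, which is the kind of validation the paper reports.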
Gazealytics: A Unified and Flexible Visual Toolkit for Exploratory and Comparative Gaze Analysis
We present a novel, web-based visual eye-tracking analytics tool called Gazealytics. Our open-source toolkit features a unified combination of gaze analytics features that support flexible exploratory analysis, along with annotation of areas of interest (AOI) and filter options based on multiple criteria to visually analyse eye tracking data across time and space. Gazealytics features coordinated views unifying spatiotemporal exploration of fixations and scanpaths for various analytical tasks. A novel matrix representation allows analysis of relationships between such spatial or temporal features. Data can be grouped across samples, user-defined AOIs or time windows of interest (TWIs) to support aggregate or filtered analysis of gaze activity. This approach exceeds the capabilities of existing systems by supporting flexible comparison between and within subjects, hypothesis generation, data analysis and communication of insights. We demonstrate in a walkthrough that Gazealytics supports multiple types of eye tracking datasets and analytical tasks
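Grouping gaze data by user-defined AOIs and time windows of interest, as described above, amounts to a two-way partition of fixations. The sketch below illustrates the concept only; the AOI/TWI representations and function names are assumptions, not the Gazealytics API.

```python
# Illustrative sketch of AOI x time-window grouping of fixations.
# aois: named rectangular areas of interest (an assumption; tools may
# support arbitrary shapes); twis: named time windows of interest.

def group_fixations(fixations, aois, twis):
    """fixations: [(t, x, y), ...];
    aois: {name: (x0, y0, x1, y1)}; twis: {name: (t0, t1)}.
    Returns {(aoi_name, twi_name): [fixations in that cell]}."""
    groups = {(a, w): [] for a in aois for w in twis}
    for t, x, y in fixations:
        for a, (x0, y0, x1, y1) in aois.items():
            if not (x0 <= x <= x1 and y0 <= y <= y1):
                continue                        # fixation outside this AOI
            for w, (t0, t1) in twis.items():
                if t0 <= t <= t1:
                    groups[(a, w)].append((t, x, y))
    return groups
```

Each cell of the resulting grouping can then feed aggregate statistics or between-subject comparisons of gaze activity.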