31 research outputs found

    Extracting and Visualizing Data from Mobile and Static Eye Trackers in R and Matlab

    Eye tracking is the process of measuring where people are looking with an eye-tracking device. Eye tracking has been used in many scientific fields, such as education, usability research, sports, psychology, and marketing. Eye-tracking data are often obtained from a static eye tracker or are manually extracted from a mobile eye tracker. Visualization usually plays an important role in the analysis of eye-tracking data. So far, no software package has offered a comprehensive collection of eye-tracking data processing and visualization tools. In this dissertation, we review eye-tracking technology, eye-tracking techniques, the existing software related to eye tracking, and the research on eye tracking for posters and related media. We then discuss the three main goals achieved in this dissertation: (i) development of a Matlab toolbox for automatically extracting mobile eye-tracking data; (ii) development of the linked microposter plots family as a new means for the visualization of eye-tracking data; (iii) development of an R package for automatically extracting and visualizing data from mobile and static eye trackers.
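    The kind of workflow such a toolbox automates can be illustrated with a short, hypothetical sketch (in Python rather than the dissertation's R/Matlab code): loading fixation records and rendering a duration-weighted gaze heat map. The file name, column names, screen resolution, and bin counts are assumptions for illustration only.

```python
# Illustrative sketch only (not the dissertation's R/Matlab API): load fixation
# data and render a simple duration-weighted gaze heat map.
import numpy as np
import matplotlib.pyplot as plt

# Assumed CSV layout: one fixation per row with columns x, y (pixels), duration (ms).
fixations = np.genfromtxt("fixations.csv", delimiter=",", names=True)

heat, xedges, yedges = np.histogram2d(
    fixations["x"], fixations["y"],
    bins=(64, 36), range=[[0, 1920], [0, 1080]],
    weights=fixations["duration"],          # weight each cell by fixation duration
)

plt.imshow(heat.T, origin="upper", extent=[0, 1920, 1080, 0], cmap="hot")
plt.colorbar(label="total fixation duration (ms)")
plt.title("Gaze heat map")
plt.show()
```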

    Learning Visual Importance for Graphic Designs and Data Visualizations

    Knowing where people look and click on visual designs can provide clues about how the designs are perceived and where the most important or relevant content lies. The most important content of a visual design can be used for effective summarization or to facilitate retrieval from a database. We present automated models that predict the relative importance of different elements in data visualizations and graphic designs. Our models are neural networks trained on human clicks and importance annotations on hundreds of designs. We collected a new dataset of crowdsourced importance and analyzed the predictions of our models with respect to ground-truth importance and human eye movements. We demonstrate how such predictions of importance can be used for automatic design retargeting and thumbnailing. User studies with hundreds of MTurk participants validate that, with limited post-processing, our importance-driven applications are on par with, or outperform, current state-of-the-art methods, including natural image saliency. We also provide a demonstration of how our importance predictions can be built into interactive design tools to offer immediate feedback during the design process.
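    As a rough illustration of the general approach (not the paper's actual architecture), the sketch below shows a tiny fully convolutional network in PyTorch that maps a design image to a per-pixel importance map and is trained against human importance annotations. Layer sizes, the sigmoid output, and the binary cross-entropy loss are assumptions made here for brevity.

```python
# Minimal sketch of the idea: a fully convolutional network mapping a design
# image to a per-pixel importance map, supervised by human annotations.
# The architecture and loss are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn

class ImportanceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decode = nn.Sequential(
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(64, 1, 3, padding=1),   # single-channel importance map
        )

    def forward(self, x):
        return torch.sigmoid(self.decode(self.encode(x)))

model = ImportanceNet()
images = torch.rand(4, 3, 240, 320)           # batch of design images
targets = torch.rand(4, 1, 240, 320)          # crowdsourced importance maps (same size)
loss = nn.functional.binary_cross_entropy(model(images), targets)
loss.backward()
```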

    Eye tracking and visualization. Introduction to the Special Thematic Issue

    There is a growing interest in eye-tracking technologies applied to support traditional visualization techniques such as diagrams, charts, maps, or plots, whether static, animated, or interactive. More complex data analyses are required to derive knowledge and meaning from the data. Eye-tracking systems serve that purpose in combination with research in biological and computer vision, cognition, perception, visualization, human-computer interaction, and usability and user experience. The 10 articles collected in this thematic special issue provide interesting examples of how sophisticated methods of data analysis and representation enable researchers to discover and describe fundamental spatio-temporal regularities in the data. The human visual system, supported by appropriate visualization tools, enables the human operator to solve complex tasks, such as understanding and interpreting three-dimensional medical images, controlling air traffic with radar displays, supporting instrument flight tasks, or interacting with virtual realities. The development and application of new visualization techniques is of major importance for future technological progress.

    Neural Network Driven Eye Tracking Metrics and Data Visualization in Metaverse and Virtual Reality Maritime Safety Training

    Understanding the human brain, predicting human performance, and proactively planning, strategizing, and acting on such information has initiated a multidisciplinary scientific alliance to address modern management challenges. This paper integrates several advanced information technologies, such as eye tracking, virtual reality, and neural networks, for cognitive task analysis leading to behavioral analysis of humans performing specific activities. The technology developed and presented in this paper has been tested on a maritime safety training application for command-bridge communication and collision-avoidance procedures. The technology integrates metaverse and virtual reality environments with eye tracking to collect behavioral data, which are analyzed by a neural network to indicate the mental and physical state, attention, and readiness of a seafarer to perform such a critical task. The paper demonstrates the technology architecture, the data collection process, indicative results, and areas for further research.
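    A minimal sketch of the general pattern described, assuming gaze-derived session metrics and binary readiness labels (both invented here for illustration): a small neural network classifier trained on those metrics. The feature set, labels, and network shape are not taken from the paper.

```python
# Hedged sketch: gaze-derived metrics (e.g. fixation count, mean fixation
# duration, saccade amplitude, pupil diameter) fed to a small neural network
# that predicts trainee readiness. All data here are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))        # 200 training sessions, 4 gaze metrics each
y = rng.integers(0, 2, size=200)     # 0 = not ready, 1 = ready (assumed instructor labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```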

    Computer-aided screening of autism spectrum disorder: Eye-tracking study using data visualization and deep learning

    Background: The early diagnosis of autism spectrum disorder (ASD) is highly desirable but remains a challenging task, requiring a set of cognitive tests and hours of clinical examination. In addition, variations in symptoms exist, which can make the identification of ASD even more difficult. Although diagnostic tests are largely developed by experts, they are still subject to human bias. In this respect, computer-assisted technologies can play a key role in supporting the screening process. Objective: This paper follows the path of using eye tracking as an integrated part of ASD screening assessment, based on the characteristic elements of eye gaze. This study adds to the mounting efforts to use eye-tracking technology to support the process of ASD screening. Methods: The proposed approach aims to integrate eye tracking with visualization and machine learning. A group of 59 school-aged participants took part in the study. The participants were invited to watch a set of age-appropriate photographs and videos related to social cognition. Initially, eye-tracking scanpaths were transformed into a visual representation as a set of images. Subsequently, a convolutional neural network was trained to perform the image classification task. Results: The experimental results demonstrated that the visual representation could simplify the diagnostic task while attaining high accuracy; specifically, the convolutional neural network model achieved a promising classification accuracy. This largely suggests that visualizations can successfully encode the information of gaze motion and its underlying dynamics. Further, we explored possible correlations between autism severity and the dynamics of eye movement based on the maximal information coefficient. The findings primarily show that the combination of eye tracking, visualization, and machine learning has strong potential for developing an objective tool to assist in the screening of ASD. Conclusions: Broadly speaking, the proposed approach could be transferable to screening for other disorders, particularly neurodevelopmental disorders.
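    The two-stage idea, rendering scanpaths as images and classifying them with a convolutional neural network, can be sketched as follows. The rasterization scheme, image size, and the tiny CNN are illustrative assumptions rather than the study's exact setup.

```python
# Illustrative sketch of the two-stage pipeline: (1) rasterize each eye-tracking
# scanpath into an image, (2) classify the rendered images with a CNN.
import numpy as np
import torch
import torch.nn as nn

def scanpath_to_image(xy, size=64):
    """Rasterize a sequence of (x, y) gaze points in [0, 1]^2 onto a size x size grid."""
    img = np.zeros((size, size), dtype=np.float32)
    cols = np.clip((xy[:, 0] * (size - 1)).astype(int), 0, size - 1)
    rows = np.clip((xy[:, 1] * (size - 1)).astype(int), 0, size - 1)
    img[rows, cols] = 1.0
    return img

cnn = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(16 * 16 * 16, 2),   # 2 classes: ASD vs. typically developing
)

# One synthetic scanpath, just to show the data flow end to end.
xy = np.random.rand(120, 2)
x = torch.from_numpy(scanpath_to_image(xy)).unsqueeze(0).unsqueeze(0)  # shape (1, 1, 64, 64)
logits = cnn(x)
```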

    Gaze Self-Similarity Plot - A New Visualization Technique

    Eye tracking has become a valuable way of extending knowledge of human behavior based on visual patterns. One of the most important elements of such an analysis is the presentation of the obtained results, which proves to be a challenging task. Traditional visualization techniques such as scanpaths or heat maps may reveal interesting information; nonetheless, many useful features remain invisible, especially when the temporal characteristics of eye movement are taken into account. This paper introduces a technique called the gaze self-similarity plot (GSSP) that may be applied to visualize both spatial and temporal eye-movement features on a single two-dimensional plot. The technique is an extension of the idea of recurrence plots, commonly used in time series analysis. The paper presents the basic concepts of the proposed approach (two types of GSSP), complemented with examples of the kinds of information that may be disclosed, and concludes by outlining possible areas of application for the GSSP.
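    Since the GSSP extends recurrence plots, the underlying idea can be sketched with a plain recurrence plot of gaze samples: compute pairwise distances between gaze positions over time and mark which pairs of time points are spatially close. The Euclidean metric and the distance threshold are assumptions; the two GSSP variants in the paper encode the distances differently.

```python
# Recurrence-plot-style sketch of the idea behind the GSSP: pairwise distances
# between gaze samples, thresholded to show which time points revisit the same
# region. Threshold and metric are illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt

gaze = np.random.rand(300, 2)                  # (time, [x, y]) gaze samples, here synthetic
diff = gaze[:, None, :] - gaze[None, :, :]     # pairwise displacement vectors
dist = np.linalg.norm(diff, axis=-1)           # Euclidean distance matrix
recurrence = dist < 0.1                        # "close" pairs of time points

plt.imshow(recurrence, origin="lower", cmap="binary")
plt.xlabel("time (sample index)")
plt.ylabel("time (sample index)")
plt.title("Recurrence-style gaze self-similarity plot")
plt.show()
```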

    Visual Multi-Metric Grouping of Eye-Tracking Data

    We present an algorithmic and visual grouping of participants and eye-tracking metrics derived from recorded eye-tracking data. Our method utilizes two well-established visualization concepts. First, parallel coordinates are used to provide an overview of the metrics, their interactions, and their similarities, which helps select suitable metrics that describe characteristics of the eye-tracking data. Furthermore, parallel coordinates plots enable an analyst to test the effects of combining a subset of metrics into a newly derived eye-tracking metric. Second, a similarity matrix visualization is used to visually represent the affine combination of metrics, utilizing an algorithmic grouping of subjects that leads to distinct visual groups of similar behavior. To keep the diagrams of the matrix visualization simple and understandable, we visually encode our eye-tracking data into the cells of a similarity matrix of participants. The algorithmic grouping is performed with a clustering based on the affine combination of metrics, which is also the basis for computing the values of the similarity matrix. To illustrate the usefulness of our visualization, we applied it to an eye-tracking data set on the reading behavior of metro maps, with up to 40 participants. Finally, we discuss limitations and scalability issues of the approach, focusing on visual and perceptual issues.
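    A hedged sketch of the grouping step: normalize the metrics, form an affine combination (weights summing to one), cluster participants on the derived metric, and display a participant-by-participant similarity matrix ordered by cluster. The metric count, weights, k-means clustering, and similarity definition are illustrative choices, not the paper's exact method.

```python
# Sketch of the grouping step under the assumptions above: affine combination of
# normalized metrics, clustering of participants, cluster-ordered similarity matrix.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
metrics = rng.random((40, 3))                 # 40 participants x 3 normalized metrics (synthetic)
weights = np.array([0.5, 0.3, 0.2])           # affine combination: weights sum to 1
combined = metrics @ weights                  # one derived metric per participant

labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(combined.reshape(-1, 1))
order = np.argsort(labels)                    # group participants by cluster

similarity = 1.0 - np.abs(combined[order, None] - combined[None, order])
plt.imshow(similarity, cmap="viridis")
plt.colorbar(label="similarity of derived metric")
plt.title("Participant similarity, ordered by cluster")
plt.show()
```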