TobiiGlassesPySuite: An open-source suite for using the Tobii Pro Glasses 2 in eye-tracking studies
In this paper we present the TobiiGlassesPySuite, an open-source suite we
implemented for using the Tobii Pro Glasses 2 wearable eye-tracker in custom
eye-tracking studies. We provide a platform-independent solution for
controlling the device and for managing the recordings. The software consists
of Python modules, integrated into a single package, accompanied by sample
scripts and recordings. The proposed solution aims to provide methods beyond those of the manufacturer's software, allowing users to make fuller use of the device's capabilities and of the existing software. Our suite is available for download from the repository indicated in the paper and is usable under the terms of the GNU GPL v3.0 license.
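A minimal usage sketch of how such a controller module might be driven from a Python script is given below. The import path, the TobiiGlassesController class name, the example IP address, and the method names are assumptions for illustration and may differ from the actual API documented in the repository.

```python
# Hypothetical sketch of driving a Tobii Pro Glasses 2 controller module.
# Class, module, and method names below are assumptions, not the documented API.
import time

from tobiiglassespysuite import TobiiGlassesController  # assumed import path/class

# Connect to the glasses over the local network (address is an example).
glasses = TobiiGlassesController("192.168.71.50")

glasses.start_streaming()            # assumed call: begin receiving live gaze data
try:
    for _ in range(100):
        sample = glasses.get_data()  # assumed accessor for the latest gaze sample
        print(sample)
        time.sleep(0.01)
finally:
    glasses.stop_streaming()         # always release the live data stream
```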
EyeMMV toolbox: An eye movement post-analysis tool based on a two-step spatial dispersion threshold for fixation identification
Eye movement recordings and their analysis constitute an effective way to examine visual perception, and there is a particular need for computer software that supports such data analysis. The present study describes the development of a new toolbox, called EyeMMV (Eye Movements Metrics & Visualizations), for post-experimental eye movement analysis. Fixation events are detected with a newly introduced algorithm based on a two-step spatial dispersion threshold. Furthermore, EyeMMV is designed to support all well-known eye tracking metrics and visualization techniques. The results of the fixation identification algorithm are compared with those of a dispersion-type algorithm with a moving window implemented in another open-source analysis tool; the two approaches produce strongly correlated outputs. The EyeMMV software is developed in the MATLAB scripting language and the source code is distributed through GitHub under the third version of the GNU General Public License (link: https://github.com/krasvas/EyeMMV).
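To illustrate the general idea of dispersion-threshold fixation identification, the sketch below clusters consecutive gaze samples with a coarse spatial tolerance and then refines each cluster with a tighter tolerance and a minimum duration. It is a simplified illustration rather than a reimplementation of EyeMMV; the parameter values and the exact refinement rule are assumptions.

```python
# Simplified sketch of a two-step spatial-dispersion fixation detector.
# Illustrative only; thresholds t1, t2 (pixels) and min_duration (ms) are assumed.
import math

def detect_fixations(samples, t1=50.0, t2=25.0, min_duration=100.0):
    """samples: list of (x, y, timestamp_ms) gaze records in time order."""
    fixations, cluster = [], []

    def centroid(pts):
        xs, ys = [p[0] for p in pts], [p[1] for p in pts]
        return sum(xs) / len(xs), sum(ys) / len(ys)

    def flush(cluster):
        if len(cluster) < 2:
            return
        cx, cy = centroid(cluster)
        # Step 2: keep only points within t2 of the provisional centre.
        core = [p for p in cluster if math.hypot(p[0] - cx, p[1] - cy) <= t2]
        if len(core) >= 2 and core[-1][2] - core[0][2] >= min_duration:
            fx, fy = centroid(core)
            fixations.append((fx, fy, core[0][2], core[-1][2]))

    for p in samples:
        if not cluster:
            cluster.append(p)
            continue
        cx, cy = centroid(cluster)
        # Step 1: coarse spatial tolerance t1 groups consecutive samples.
        if math.hypot(p[0] - cx, p[1] - cy) <= t1:
            cluster.append(p)
        else:
            flush(cluster)
            cluster = [p]
    flush(cluster)
    return fixations
```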
Detection of moving point symbols on cartographic backgrounds
The present paper reports an experimental cartographic study that examines the minimum duration threshold required for the detection, by central vision, of a moving point symbol on cartographic backgrounds. The threshold is investigated using backgrounds with distinct levels of information. The experimental process is based on the collection (under free viewing conditions) and the analysis of eye movement recordings. The computation of fixation-derived statistical metrics allows the calculation of the examined threshold as well as the study of the general visual reaction of map users. The critical duration threshold calculated within the present study corresponds to a time span of around 400 ms. The results of the analysis provide meaningful evidence on these issues, while the suggested approach can be applied to the examination of perception thresholds related to changes occurring in dynamic stimuli.
Motion velocity as a preattentive feature in cartographic symbolization
The present study examines the preattentive processing of dynamic point symbols used in cartographic symbology. More specifically, we explore different motion types of geometric symbols on a map together with various motion velocity distribution scales. The main hypothesis is that, in specific cases, the motion velocity of dynamic point symbols is a feature that can be perceived preattentively on a map. In a controlled laboratory experiment with 103 participants and eye tracking methods, we used administrative border maps with animated symbols. Participants’ task was to find and precisely identify the fastest changing symbol. It turned out that not every type of motion could be perceived preattentively, even though the motion distribution scale did not change; the same applied to the symbols’ shape. Eye movement analysis revealed that successful detection was closely related to fixation on the target after initial preattentive vision. This confirms the significant role of the motion velocity distribution and of the symbols’ shape in the cartographic design of animated maps.
The eyes know it: FakeET -- An Eye-tracking Database to Understand Deepfake Perception
We present FakeET, an eye-tracking database to understand human visual perception of deepfake videos. Given that the principal purpose of deepfakes is to deceive human observers, FakeET is designed to understand and evaluate the ease with which viewers can detect synthetic video artifacts. FakeET contains viewing patterns compiled from 40 users via the Tobii desktop eye-tracker for 811 videos from the Google Deepfake dataset, with a minimum of two viewings per video. Additionally, EEG responses acquired via the Emotiv sensor are also available. The compiled data confirm (a) distinct eye movement characteristics for real vs. fake videos; (b) the utility of the eye-track saliency maps for spatial forgery localization and detection; and (c) Error Related Negativity (ERN) triggers in the EEG responses, and the ability of the raw EEG signal to distinguish between real and fake videos.
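As an illustration of how gaze-based saliency maps of the kind mentioned above can be derived from raw gaze points, the sketch below accumulates fixation locations into a Gaussian-smoothed density map. The frame size, the smoothing sigma, and the use of SciPy are assumptions for illustration, not details taken from the paper.

```python
# Sketch: build a gaze saliency map by accumulating fixation points and
# smoothing with a Gaussian kernel. Frame size and sigma are assumed values.
import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_saliency_map(fixations, width=1920, height=1080, sigma=40.0):
    """fixations: iterable of (x, y) pixel coordinates from one or more viewers."""
    density = np.zeros((height, width), dtype=np.float64)
    for x, y in fixations:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= xi < width and 0 <= yi < height:
            density[yi, xi] += 1.0            # count each fixation at its pixel
    saliency = gaussian_filter(density, sigma=sigma)  # spatial smoothing
    if saliency.max() > 0:
        saliency /= saliency.max()            # normalize to [0, 1]
    return saliency
```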
Aggregated Gaze Data Visualization Using Contiguous Irregular Cartograms
Gaze data visualization constitutes one of the most critical processes in eye-tracking analysis. Considering that modern devices are able to collect gaze data at extremely high frequencies, visualizing the aggregated gaze data they produce is quite challenging. In the present study, contiguous irregular cartograms are used as a method to visualize eye-tracking data captured from several observers during the observation of a visual stimulus. The approach uses a statistical grayscale heatmap as its main input and is therefore independent of the total number of recorded raw gaze points. Indicative examples, based on different parameters/conditions and heatmap grid sizes, are provided in order to highlight their influence on the final image of the produced visualization. Moreover, two analysis metrics, referred to as center displacement (CD) and area change (AC), are proposed and implemented in order to quantify the geometric changes (in both position and area) that accompany the topological transformation of the initial heatmap grid cells, and to deliver specific guidelines for the execution of the algorithm. The provided visualizations are generated using open-source software in a geographic information system.
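A rough sketch of how per-cell metrics of this kind might be computed is shown below. The paper's exact definitions of CD and AC are not reproduced here, so the centroid-distance and relative-area formulas are assumptions for illustration.

```python
# Sketch: per-cell centre displacement (CD) and area change (AC) between an
# original heatmap grid cell and its transformed (cartogram) counterpart.
# The formulas below are illustrative assumptions, not the paper's definitions.
import math

def center_displacement(orig_centroid, new_centroid):
    """Euclidean distance between a cell's original and transformed centroid."""
    dx = new_centroid[0] - orig_centroid[0]
    dy = new_centroid[1] - orig_centroid[1]
    return math.hypot(dx, dy)

def area_change(orig_area, new_area):
    """Relative change of a cell's area after the cartogram transformation."""
    return (new_area - orig_area) / orig_area

# Example: summarize the deformation of a small (hypothetical) grid.
cells = [
    # (original centroid, original area, transformed centroid, transformed area)
    ((10.0, 10.0), 100.0, (12.5, 9.0), 130.0),
    ((30.0, 10.0), 100.0, (29.0, 11.0), 80.0),
]
cd_values = [center_displacement(c0, c1) for c0, _, c1, _ in cells]
ac_values = [area_change(a0, a1) for _, a0, _, a1 in cells]
print("mean CD:", sum(cd_values) / len(cd_values))
print("mean AC:", sum(ac_values) / len(ac_values))
```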
Monitoring Human Visual Behavior during the Observation of Unmanned Aerial Vehicles (UAVs) Videos
The present article describes an experimental study examining human visual behavior during the observation of unmanned aerial vehicle (UAV) videos. The experiment is based on the collection and the quantitative and qualitative analysis of eye tracking data. The results highlight that UAV flight altitude serves as a dominant specification affecting the visual attention process, while the presence of sky in the video background seems to be the least influential factor in this procedure. Additionally, the main surrounding environment, the main size of the observed object, as well as the main perceived angle between the UAV’s flight plane and the ground, appear to have an equivalent influence on observers’ visual reaction during the exploration of such stimuli. Moreover, the provided heatmap visualizations indicate the most salient locations in the UAV videos used. All produced data (raw gaze data, fixation and saccade events, and heatmap visualizations) are freely distributed to the scientific community as a new dataset (EyeTrackUAV) that can serve as an objective ground truth in future studies.
Eye Tracking Research in Cartography: Looking into the Future
Eye tracking has served as one of the most objective and valuable tools for the examination of both perceptual and cognitive processes related to maps. The aim of the present article is to concisely present the contribution of eye tracking research to cartography, indicating the existing literature as well as the current research trends in the examined domain. The authors discuss the existing challenges and provide their perspectives on the future outlook of cartographic eye tracking experimentation by reporting specific key approaches that could be integrated.