Benefits of temporal information for appearance-based gaze estimation
State-of-the-art appearance-based gaze estimation methods, usually based on deep learning, rely mainly on static features. However, the temporal trace of eye gaze contains useful information for estimating a given gaze point. For example, approaches that leverage sequential eye gaze information have shown promising results in remote or low-resolution scenarios with off-the-shelf cameras. How much the temporal gaze trace contributes in higher-resolution, higher-frame-rate imaging systems, which capture more detailed information about the eye, is still unclear. In this paper, we investigate whether temporal sequences of eye images, captured with a high-resolution, high-frame-rate head-mounted virtual reality system, can be leveraged to enhance the accuracy of an end-to-end appearance-based deep-learning model for gaze estimation. Performance is compared against a static-only version of the model. Results demonstrate statistically significant benefits of temporal information, particularly for the vertical component of gaze.

Comment: In ACM Symposium on Eye Tracking Research & Applications (ETRA), 202
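The static-versus-temporal comparison described in the abstract can be illustrated with a minimal sketch. This is not the paper's model: the feature dimensions, the linear regressor, and the exponential smoothing used as a stand-in for a recurrent component are all hypothetical, chosen only to show how a temporal estimator consumes a frame sequence where a static one consumes a single frame.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): each eye image is assumed to be
# reduced to a D-dimensional appearance feature; gaze is a 2D point
# (horizontal, vertical).
D, T = 16, 8  # feature size, sequence length in frames

def static_estimate(feat, W):
    """Static-only baseline: regress gaze from a single frame's features."""
    return W @ feat

def temporal_estimate(feats, W, alpha=0.7):
    """Temporal variant: accumulate features over the frame sequence
    (here a simple exponential smoothing standing in for a recurrent
    layer) before regressing, so the estimate can use the gaze trace."""
    h = feats[0]
    for f in feats[1:]:
        h = alpha * h + (1 - alpha) * f  # recurrent-style accumulation
    return W @ h

W = rng.normal(size=(2, D))       # toy regression weights
frames = rng.normal(size=(T, D))  # features for a sequence of frames

g_static = static_estimate(frames[-1], W)  # uses only the latest frame
g_temporal = temporal_estimate(frames, W)  # uses the whole sequence
print(g_static.shape, g_temporal.shape)    # both are 2D gaze estimates
```

In the paper's setting, both branches would share a learned image encoder, and the study asks whether the sequence-consuming branch measurably outperforms the single-frame one at high resolution and frame rate.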
An end-to-end review of gaze estimation and its interactive applications on handheld mobile devices
In recent years, we have witnessed an increasing number of interactive systems on handheld mobile devices that use gaze as a sole or complementary interaction modality. This trend is driven by the enhanced computational power of these devices, the higher resolution and capability of their cameras, and the improved gaze estimation accuracy obtained from advanced machine learning techniques, especially deep learning. As the literature is progressing rapidly, there is a pressing need to review the state of the art, delineate its boundaries, and identify the key research challenges and opportunities in gaze estimation and interaction. This paper aims to serve that purpose by presenting an end-to-end, holistic view of the area: from gaze-capturing sensors, to gaze estimation workflows, to deep learning techniques, to gaze-interactive applications.

Postprint. Peer reviewed.