
    Unobtrusive and pervasive video-based eye-gaze tracking

    Eye-gaze tracking has long been considered a desktop technology, finding its use inside the traditional office setting where the operating conditions can be controlled. However, recent advancements in mobile technology and a growing interest in capturing natural human behaviour have motivated an emerging interest in tracking eye movements under unconstrained real-life conditions, referred to as pervasive eye-gaze tracking. This critical review focuses on emerging passive and unobtrusive video-based eye-gaze tracking methods in the recent literature, with the aim of identifying the different research avenues being followed in response to the challenges of pervasive eye-gaze tracking. Different eye-gaze tracking approaches are discussed in order to bring out their strengths and weaknesses, and to identify limitations, within the context of pervasive eye-gaze tracking, that have yet to be addressed by the computer vision community.

    A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods in Consumer Platforms

    This paper presents a review of research on eye-gaze estimation techniques and applications, a field that has progressed in diverse ways over the past two decades. Several generic eye-gaze use cases are identified: desktop, TV, head-mounted, automotive and handheld devices. Analysis of the literature leads to the identification of several platform-specific factors that influence gaze tracking accuracy. A key outcome of this review is the recognition of the need to develop standardized methodologies for the performance evaluation of gaze tracking systems, and to achieve consistency in their specification and comparative evaluation. To address this need, the concept of a methodological framework for the practical evaluation of different gaze tracking systems is proposed. (25 pages, 13 figures; accepted for publication in IEEE Access in July 2017.)
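    A standard metric in such evaluations is angular accuracy: the angle between the estimated and true gaze directions. As a minimal illustrative sketch (the function name, pixel pitch and viewing distance below are our assumptions, not values from the paper), an on-screen gaze error in pixels can be converted to degrees of visual angle:

        import math

        def angular_error_deg(est_px, true_px, pixel_pitch_mm, viewing_dist_mm):
            """Approximate gaze error in degrees of visual angle.

            est_px, true_px: (x, y) on-screen points in pixels.
            pixel_pitch_mm: physical size of one pixel (assumed, e.g. 0.25 mm).
            viewing_dist_mm: eye-to-screen distance (assumed, e.g. 600 mm).
            Assumes the offset is small relative to the viewing distance,
            so the flat-screen displacement subtends the angle directly.
            """
            dx = (est_px[0] - true_px[0]) * pixel_pitch_mm
            dy = (est_px[1] - true_px[1]) * pixel_pitch_mm
            offset_mm = math.hypot(dx, dy)
            return math.degrees(math.atan2(offset_mm, viewing_dist_mm))

        # Example: a 40-pixel error on a 0.25 mm/px display viewed at 60 cm
        print(angular_error_deg((980, 540), (940, 540), 0.25, 600.0))  # ~0.95 deg

    Reporting error in degrees rather than pixels is what makes results comparable across desktop, handheld and head-mounted platforms, which is the kind of consistency the proposed framework targets.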

    Eye and Gaze Tracking Algorithm for Collaborative Learning System

    Our work focuses on the interdisciplinary field of the detailed analysis of behaviours exhibited by individuals during sessions of distributed collaboration. With a particular focus on ergonomics, we propose new mechanisms to be integrated into existing tools to enable increased productivity in distributed learning and working. Our technique is to record ocular movements (eye tracking) in order to analyze various scenarios of distributed collaboration in the context of computer-based training. In this article, we present a low-cost oculometric device that is capable of making ocular measurements without interfering with the natural behaviour of the subject. We expect that this device could be employed anywhere that a natural, non-intrusive method of observation is required, and its low cost permits it to be readily integrated into existing popular tools, particularly e-learning campuses.
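    Behavioural analyses of this kind typically reduce the raw gaze stream to fixations before interpretation. The article does not publish its algorithm; as a hedged sketch, the widely used dispersion-threshold method (I-DT) can extract fixations from (x, y, t) gaze samples, with the threshold values below chosen purely for illustration:

        def _dispersion(window):
            """Dispersion of a window of (x, y, t) samples, in pixels."""
            xs = [p[0] for p in window]
            ys = [p[1] for p in window]
            return (max(xs) - min(xs)) + (max(ys) - min(ys))

        def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
            """Dispersion-threshold (I-DT) fixation detection.

            samples: list of (x, y, t) gaze points, t in seconds.
            max_dispersion: dispersion limit in pixels (illustrative value).
            min_duration: minimum fixation length in seconds (illustrative).
            Returns (centroid_x, centroid_y, start_t, end_t) tuples.
            """
            fixations, i, n = [], 0, len(samples)
            while i < n:
                # Grow a window until it spans at least min_duration.
                j = i
                while j < n and samples[j][2] - samples[i][2] < min_duration:
                    j += 1
                if j >= n:
                    break
                if _dispersion(samples[i:j + 1]) <= max_dispersion:
                    # Extend while dispersion stays under the threshold.
                    while j + 1 < n and _dispersion(samples[i:j + 2]) <= max_dispersion:
                        j += 1
                    window = samples[i:j + 1]
                    cx = sum(p[0] for p in window) / len(window)
                    cy = sum(p[1] for p in window) / len(window)
                    fixations.append((cx, cy, window[0][2], window[-1][2]))
                    i = j + 1
                else:
                    i += 1
            return fixations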

    Pupil Position by an Improved Technique of YOLO Network for Eye Tracking Application

    Eye-gaze tracking is the real-time collection of information about a person's eye movements and the direction of their gaze. Eye-gaze trackers are devices that measure the locations of the pupils to detect and track changes in the direction of the user's gaze. Analysis of eye movements has numerous applications, from psychological studies to human-computer interaction systems and interactive robotics controls. Real-time eye-gaze monitoring requires an accurate and reliable iris-center localization technique. In this study, deep learning is used to construct a pupil tracking approach for wearable eye trackers. The method uses the deep-learning You Only Look Once (YOLO) model to accurately estimate and predict the pupil's central location under bright, natural (visible) light. Testing pupil tracking performance with the upgraded YOLOv7 yields an accuracy rate of 98.50% and a precision rate close to 96.34%, implemented in PyTorch.
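    The paper's improved YOLOv7 pipeline is not reproduced here; as an illustrative sketch of the general approach only (the ultralytics package stands in for the modified YOLOv7, and the weights file pupil_yolo.pt is hypothetical), a YOLO-style detector can be run on eye-camera frames and the center of the highest-confidence detection taken as the pupil position:

        import cv2
        from ultralytics import YOLO  # stand-in API; the paper uses a modified YOLOv7

        # Hypothetical weights fine-tuned for a single "pupil" class.
        model = YOLO("pupil_yolo.pt")

        def pupil_center(frame):
            """Return the (x, y) pupil-center estimate for one eye-camera
            frame, or None if no pupil is detected."""
            results = model(frame, verbose=False)[0]
            if len(results.boxes) == 0:
                return None
            # Take the highest-confidence detection.
            box = max(results.boxes, key=lambda b: float(b.conf))
            x1, y1, x2, y2 = box.xyxy[0].tolist()
            return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

        cap = cv2.VideoCapture(0)  # eye-camera index is an assumption
        ok, frame = cap.read()
        if ok:
            print(pupil_center(frame))
        cap.release()

    Accuracy and precision figures like those reported would then be obtained by comparing such center estimates against hand-labelled pupil positions over a test set.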