    3D Face tracking and gaze estimation using a monocular camera

    Estimating a user’s gaze direction is an emerging user-interaction technology with numerous applications where current input methods are becoming less effective. In this paper, a new method is presented for estimating the gaze direction using Canonical Correlation Analysis (CCA), which finds a linear relationship between two datasets describing the face pose and the corresponding changes in facial appearance. Iris tracking is then performed by blob detection using a 4-connected component labelling algorithm. Finally, a gaze vector is calculated based on the gathered eye properties. Results obtained from datasets and real-time input confirm the robustness of this method.
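
The 4-connected component labelling step used for iris blob detection can be sketched in a few lines. The flood-fill implementation below is an illustrative version of the general technique, not the paper's code:

```python
import numpy as np

def label_4connected(mask):
    """Label 4-connected components in a binary mask (minimal sketch)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for start in zip(*np.nonzero(mask)):
        if labels[start]:
            continue                       # pixel already belongs to a blob
        current += 1
        stack = [start]
        labels[start] = current
        while stack:                       # iterative flood fill
            r, c = stack.pop()
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < mask.shape[0] and 0 <= nc < mask.shape[1]
                        and mask[nr, nc] and not labels[nr, nc]):
                    labels[nr, nc] = current
                    stack.append((nr, nc))
    return labels, current

# Toy binary "iris" mask: two blobs that touch only diagonally,
# so 4-connectivity keeps them separate.
mask = np.array([[1, 1, 0, 0],
                 [1, 0, 0, 1],
                 [0, 0, 1, 1]], dtype=bool)
labels, n = label_4connected(mask)
print(n)  # 2
```

With 4-connectivity only the up/down/left/right neighbours count, which is why the two diagonally touching regions above remain distinct blobs.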

    Precise Non-Intrusive Real-Time Gaze Tracking System for Embedded Setups

    This paper describes a non-intrusive real-time gaze detection system, characterized by precise determination of a subject's pupil centre. A narrow field-of-view (NFV) camera, focused on one of the subject's eyes, follows the head movements in order to keep the pupil centred in the image. When a tracking error is observed, feedback provided by a second, wide field-of-view (WFV) camera allows quick recovery of the tracking process. Illumination is provided by four infrared LED blocks synchronised with the electronic shutter of the eye camera. The characteristic shape of the corneal glints produced by these illuminators makes it possible to optimize the image-processing algorithms developed for gaze detection in this system. The illumination power used has been limited to well below the maximum recommended levels. After an initial calibration procedure, the line of gaze is determined from the vector defined by the pupil centre and a valid glint. Glints are validated using the iris outline to avoid glint distortion produced by changes in the curvature of the ocular globe. To minimize measurement error in the pupil-glint vector, algorithms are proposed to determine the pupil centre at sub-pixel resolution. Although the paper describes a desk-mounted prototype, the final implementation is to be installed on board a conventional car as an embedded system to determine the driver's line of gaze.
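
A sub-pixel pupil centre can be obtained from an intensity-weighted centroid, and the pupil-glint vector then follows by subtraction. The weighting scheme and glint position below are hypothetical stand-ins, not the paper's exact algorithms:

```python
import numpy as np

def subpixel_centre(weight_map):
    """Intensity-weighted centroid of a pupil patch, at sub-pixel
    resolution (a sketch; the paper proposes its own refinement)."""
    ys, xs = np.mgrid[0:weight_map.shape[0], 0:weight_map.shape[1]]
    w = weight_map.astype(float)
    total = w.sum()
    return float((ys * w).sum() / total), float((xs * w).sum() / total)

# Toy dark-pupil weight map: weight is high where the (inverted) image
# is dark, i.e. inside the pupil region.
pupil_weight = np.zeros((5, 5))
pupil_weight[1:4, 2:5] = 1.0            # pupil region centred at row 2, col 3

cy, cx = subpixel_centre(pupil_weight)
glint = np.array([4.0, 4.0])            # hypothetical validated glint position
gaze_vec = np.array([cy, cx]) - glint   # pupil-glint vector for the line of gaze
print(cy, cx)  # 2.0 3.0
```

In a real image the weights would not be binary, so the centroid generally falls between pixel centres, which is what gives the sub-pixel resolution.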

    Detecting Gaze Direction Using Robot-Mounted and Mobile-Device Cameras

    Two common channels through which humans communicate are speech and gaze. Eye gaze is an important mode of communication: it allows people to better understand each other's intentions, desires, interests, and so on. The goal of this research is to develop a framework for gaze-triggered events which can be executed on a robot and on mobile devices and allows experiments to be performed. We experimentally evaluate the framework and the techniques for extracting gaze direction, based on a robot-mounted camera or a mobile-device camera, which are implemented in the framework. We investigate the impact of light on the accuracy of gaze estimation, and also how the overall accuracy depends on user eye and head movements. Our research shows that light intensity is important, and the placement of the light source is crucial. All the robot-mounted gaze detection modules we tested were found to be similar with regard to accuracy. The framework we developed was tested in a human-robot interaction experiment involving a job-interview scenario. The flexible structure of this scenario allowed us to test different components of the framework in varied real-world scenarios, which was very useful for progressing towards our long-term research goal of designing intuitive gaze-based interfaces for human-robot communication.

    Unobtrusive and pervasive video-based eye-gaze tracking

    Eye-gaze tracking has long been considered a desktop technology that finds its use inside the traditional office setting, where the operating conditions may be controlled. Nonetheless, recent advancements in mobile technology and a growing interest in capturing natural human behaviour have motivated an emerging interest in tracking eye movements within unconstrained real-life conditions, referred to as pervasive eye-gaze tracking. This critical review focuses on the emerging passive and unobtrusive video-based eye-gaze tracking methods in recent literature, with the aim of identifying the different research avenues being followed in response to the challenges of pervasive eye-gaze tracking. Different eye-gaze tracking approaches are discussed in order to bring out their strengths and weaknesses, and to identify any limitations, within the context of pervasive eye-gaze tracking, that have yet to be considered by the computer vision community.

    Fast and Accurate Algorithm for Eye Localization for Gaze Tracking in Low Resolution Images

    Iris centre localization in low-resolution visible images is a challenging problem in the computer vision community due to noise, shadows, occlusions, pose variations, eye blinks, etc. This paper proposes an efficient method for determining the iris centre in low-resolution images in the visible spectrum, so that even low-cost consumer-grade webcams can be used for gaze tracking without any additional hardware. A two-stage algorithm is proposed for iris centre localization that exploits the geometrical characteristics of the eye. In the first stage, a fast convolution-based approach is used to obtain the coarse location of the iris centre (IC). The IC location is then refined in the second stage using boundary tracing and ellipse fitting. The algorithm has been evaluated on public databases such as BioID and Gi4E and is found to outperform state-of-the-art methods.
    Comment: 12 pages, 10 figures, IET Computer Vision, 201
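
The first, convolution-based stage can be sketched as correlating the inverted eye image with a disc template and taking the response peak as the coarse iris centre. The disc kernel and synthetic image below are simplified stand-ins for the paper's actual kernel and data:

```python
import numpy as np

def coarse_iris_centre(eye_img, radius=2):
    """Stage-1 sketch: correlate the inverted image with a disc template
    and return the peak as the coarse iris centre (illustrative only)."""
    inv = eye_img.max() - eye_img.astype(float)   # iris is dark -> now bright
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    disc = (yy**2 + xx**2 <= radius**2).astype(float)
    h, w = inv.shape
    k = 2 * radius + 1
    scores = np.empty((h - k + 1, w - k + 1))
    for r in range(scores.shape[0]):              # valid-mode correlation
        for c in range(scores.shape[1]):
            scores[r, c] = (inv[r:r + k, c:c + k] * disc).sum()
    r, c = np.unravel_index(np.argmax(scores), scores.shape)
    return int(r + radius), int(c + radius)       # centre in image coordinates

# Synthetic eye image: bright sclera with a dark disc (the iris) at (6, 8).
img = np.full((12, 16), 200.0)
yy, xx = np.mgrid[0:12, 0:16]
img[(yy - 6)**2 + (xx - 8)**2 <= 4] = 30.0

print(coarse_iris_centre(img))  # (6, 8)
```

The coarse estimate is only pixel-accurate; the paper's second stage (boundary tracing and ellipse fitting) is what refines it further.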

    LimbusTrack: Stable Eye-Tracking in Imperfect Light Conditions

    We are aware of only one serious effort at developing a cheap, accurate, wearable eye tracker: the open-source openEyes project. However, its method of ocular feature detection is prone to failure in variable lighting conditions. To address this deficiency, we have developed a cheap wearable eye tracker. At the heart of our development are novel techniques that allow operation under variable illumination.

    Robust Head Mounted Wearable Eye Tracking System for Dynamical Calibration

    In this work, a new head-mounted eye tracking system is presented. Based on computer vision techniques, the system integrates eye images and head movement in real time, performing robust gaze point tracking. Nystagmus movements due to the vestibulo-ocular reflex are monitored and integrated. The system proposed here is a strongly improved version of a previous platform called HATCAM, which was robust against changes in illumination conditions. The new version, called HAT-Move, is equipped with an accurate inertial motion unit to detect head movement, enabling eye gaze estimation even in dynamic conditions. HAT-Move performance was investigated in a group of healthy subjects in both static and dynamic conditions, i.e. with the head kept still or free to move. Evaluation was performed in terms of the amplitude of the angular error between the real coordinates of the fixated points and those computed by the system in two experimental setups, specifically in laboratory settings and in a 3D virtual reality (VR) scenario. The results showed that HAT-Move achieves an eye gaze angular error of about 1 degree along both the horizontal and vertical directions.
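
The angular-error metric used in this kind of evaluation is simply the angle between the true and estimated gaze directions. The gaze vectors below are hypothetical examples, not data from the paper:

```python
import numpy as np

def angular_error_deg(v_true, v_est):
    """Angle in degrees between true and estimated gaze direction vectors."""
    v1 = v_true / np.linalg.norm(v_true)
    v2 = v_est / np.linalg.norm(v_est)
    cos = np.clip(np.dot(v1, v2), -1.0, 1.0)   # clip guards against rounding
    return float(np.degrees(np.arccos(cos)))

# Hypothetical fixation: true direction straight ahead, estimate off by 1 degree.
true_dir = np.array([0.0, 0.0, 1.0])
est_dir = np.array([np.tan(np.radians(1.0)), 0.0, 1.0])
print(round(angular_error_deg(true_dir, est_dir), 3))  # 1.0
```

Reporting error as an angle rather than in pixels makes results comparable across screen distances and camera resolutions, which is why it is the standard gaze-tracking metric.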