113 research outputs found

    Geometry Issues of Gaze Estimation


    Precise Non-Intrusive Real-Time Gaze Tracking System for Embedded Setups

    This paper describes a non-intrusive real-time gaze detection system characterized by precise determination of a subject's pupil centre. A narrow field-of-view (NFV) camera, focused on one of the subject's eyes, follows head movements in order to keep the pupil centred in the image. When a tracking error is observed, feedback provided by a second, wide field-of-view (WFV) camera allows quick recovery of the tracking process. Illumination is provided by four infrared LED blocks synchronised with the electronic shutter of the eye camera. The characteristic shape of the corneal glints produced by these illuminators allows the image processing algorithms developed for this system to be optimized for gaze detection. The illumination power used in this system has been kept well below maximum recommended levels. After an initial calibration procedure, the line of gaze is determined from the vector defined by the pupil centre and a valid glint. Glints are validated using the iris outline to avoid glint distortion produced by changes in the curvature of the ocular globe. To minimize measurement error in the pupil-glint vector, algorithms are proposed to determine the pupil centre at sub-pixel resolution. Although the paper describes a desk-mounted prototype, the final implementation is to be installed on board a conventional car as an embedded system to determine the driver's line of gaze.
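    The final step described above — mapping a calibrated pupil-glint vector to a point of regard — is often implemented as a polynomial regression fitted during the calibration procedure. The sketch below is a minimal illustration of that idea, assuming a quadratic mapping fitted by least squares; the function names and the specific polynomial form are assumptions, not the paper's actual algorithm.

    ```python
    import numpy as np

    def fit_gaze_mapping(vectors, screen_points):
        """Fit a quadratic polynomial mapping from pupil-glint vectors
        (vx, vy) to screen coordinates (sx, sy) by least squares,
        using calibration samples collected while the subject fixates
        known screen targets."""
        vx, vy = np.asarray(vectors, dtype=float).T
        # Design matrix: [1, vx, vy, vx*vy, vx^2, vy^2]
        A = np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])
        coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float),
                                     rcond=None)
        return coeffs  # shape (6, 2): one column per screen coordinate

    def estimate_gaze(coeffs, vector):
        """Map one pupil-glint vector to an estimated screen point."""
        vx, vy = vector
        features = np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])
        return features @ coeffs
    ```

    A 3x3 grid of calibration targets (nine samples for six coefficients per axis) is a common minimal choice for this kind of mapping.
    
    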

    Models for gaze tracking systems

    One of the most confusing aspects encountered when first approaching gaze tracking technology is the wide variety, in terms of hardware, of available systems that all solve the same problem: determining the point the subject is looking at. The calibration process generally makes it possible to adjust non-intrusive trackers, based on quite different hardware and image features, to the subject. The drawback of this simple procedure is that it lets the system work properly but at the expense of a lack of control over the intrinsic behavior of the tracker. The objective of this article is to overcome this obstacle and explore more deeply the elements of a video-oculographic system (eye, camera, lighting, and so forth) from a purely mathematical and geometrical point of view. The main contribution is to determine the minimum number of hardware elements and image features needed to establish the point the subject is looking at. A model has been constructed based on pupil contour and multiple lighting, and successfully tested with real subjects. In addition, theoretical aspects of video-oculographic systems have been thoroughly reviewed in order to build a theoretical basis for further studies.

    An easy iris center detection method for eye gaze tracking system

    Iris center detection accuracy has a great impact on eye gaze tracking system performance. This paper proposes an easy and efficient iris center detection method based on modeling the geometric relationship between the detected rough iris center and the two corners of the eye. The method fully considers four states of the iris within the eye region: center, left, right, and upper. The proposed active edge detection algorithm is used to extract iris edge points for ellipse fitting. In addition, the paper presents a predicted edge point algorithm to counteract the decrease in ellipse fitting accuracy that occurs when part of the iris becomes hidden by rolling into the nasal or temporal eye corner. Evaluation of the method on our eye database shows a global average accuracy of 94.3%. Compared with existing methods, our method achieves the highest iris center detection accuracy. Additionally, in order to test the performance of the proposed method in gaze tracking, this paper presents the results of gaze estimation achieved by our eye gaze tracking system.
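    The ellipse-fitting step described above can be sketched with a simple least-squares conic fit: fit a general conic to the extracted edge points, then take the ellipse centre as the point where both partial derivatives of the conic vanish. This is a generic illustration under those assumptions, not the paper's active edge detection or predicted edge point algorithm.

    ```python
    import numpy as np

    def fit_ellipse_center(points):
        """Fit the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1 to edge
        points by least squares and return the ellipse centre.
        The centre solves grad = 0:
            2a*x + b*y + d = 0
            b*x + 2c*y + e = 0
        """
        x, y = np.asarray(points, dtype=float).T
        D = np.column_stack([x**2, x * y, y**2, x, y])
        a, b, c, d, e = np.linalg.lstsq(D, np.ones_like(x), rcond=None)[0]
        centre = np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])
        return centre
    ```

    Because only the centre is needed for the pupil/iris-based gaze vector, the full ellipse parameters (axes, rotation) never have to be extracted from the conic coefficients.
    
    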

    A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods in Consumer Platforms

    This paper presents a review of research on eye gaze estimation techniques and applications, which has progressed in diverse ways over the past two decades. Several generic eye gaze use-cases are identified: desktop, TV, head-mounted, automotive, and handheld devices. Analysis of the literature leads to the identification of several platform-specific factors that influence gaze tracking accuracy. A key outcome of this review is the realization that standardized methodologies are needed for the performance evaluation of gaze tracking systems, to achieve consistency in their specification and comparative evaluation. To address this need, the concept of a methodological framework for practical evaluation of different gaze tracking systems is proposed.
    Comment: 25 pages, 13 figures, Accepted for publication in IEEE Access in July 201

    The human eye as human-machine interface

    Eye tracking as an interface to operate a computer has been under research for a while, and new systems are still being developed that offer encouragement to those whose illnesses incapacitate them from using any other form of interaction with a computer. Although they use computer vision processing and a camera, these systems are usually based on head-mounted technology and are therefore considered contact-type systems. This paper describes the implementation of a human-computer interface based on a fully non-contact eye tracking vision system, intended to allow people with tetraplegia to interface with a computer. As an assistive technology, a graphical user interface with special features was developed, including a virtual keyboard for user communication, fast access to pre-stored phrases and multimedia, and even internet browsing. The system was developed with a focus on low cost, user-friendly functionality, and user independence and autonomy.
    The authors would like to thank Mr. Abel, his wife and Mr. Sampaio for their important contributions to the success of this work. This work was supported by the Automation and Robotics Laboratory of the Algoritmi Research Center at the University of Minho in Guimaraes. This work is funded by FEDER through the Operational Competitiveness Programme (COMPETE) and by national funds through the Foundation for Science and Technology (FCT) in the scope of project FCOMP-01-0124-FEDER-022674.
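    Gaze-driven virtual keyboards like the one described above commonly trigger a key when the gaze dwells on it for a fixed time. The sketch below illustrates that dwell-time selection logic in a minimal form; the class name, the one-second default, and the re-arm-for-repeat behaviour are assumptions for illustration, not the paper's actual implementation.

    ```python
    class DwellSelector:
        """Dwell-time key selection for a gaze-driven virtual keyboard:
        a key is 'pressed' when the gaze stays on it for dwell_s seconds."""

        def __init__(self, dwell_s=1.0):
            self.dwell_s = dwell_s
            self.current_key = None   # key currently under the gaze point
            self.enter_time = None    # when the gaze entered that key

        def update(self, key, now):
            """Feed the key under the gaze point (or None for no key) and
            the current time in seconds; return the key to trigger, or None."""
            if key != self.current_key:
                # Gaze moved to a different key: restart the dwell timer.
                self.current_key = key
                self.enter_time = now
                return None
            if key is not None and now - self.enter_time >= self.dwell_s:
                self.enter_time = now  # re-arm so a held gaze can repeat the key
                return key
            return None
    ```

    In a real system `update` would be called once per video frame with the key under the current gaze estimate; short gaze dropouts can be tolerated by debouncing `key` before it reaches the selector.
    
    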