
    Unobtrusive and pervasive video-based eye-gaze tracking

    Eye-gaze tracking has long been considered a desktop technology that finds its use inside the traditional office setting, where the operating conditions may be controlled. Nonetheless, recent advancements in mobile technology and a growing interest in capturing natural human behaviour have motivated an emerging interest in tracking eye movements within unconstrained real-life conditions, referred to as pervasive eye-gaze tracking. This critical review focuses on emerging passive and unobtrusive video-based eye-gaze tracking methods in recent literature, with the aim of identifying the different research avenues being followed in response to the challenges of pervasive eye-gaze tracking. Different eye-gaze tracking approaches are discussed in order to bring out their strengths and weaknesses, and to identify any limitations, within the context of pervasive eye-gaze tracking, that have yet to be considered by the computer vision community.

    A best view selection in meetings through attention analysis using a multi-camera network

    Human activity analysis is an essential task in ambient intelligence and computer vision. The main focus lies in the automatic analysis of ongoing activities from a multi-camera network. One possible application is meeting analysis, which explores the dynamics in meetings using low-level data and infers high-level activities. However, the detection of such activities is still very challenging due to the often corrupted or imprecise low-level data. In this paper, we present an approach to understanding the dynamics in meetings using a multi-camera network consisting of fixed ambient and portable close-up cameras. As a particular application, we aim to find the most informative video stream, for example as a representative view for a remote participant. Our contribution is threefold: first, we estimate the extrinsic parameters of the portable close-up cameras based on head positions; secondly, we find common overlapping areas based on the consensus of people’s orientation; and thirdly, the most informative view for a remote participant is estimated using the common overlapping areas. We evaluated our proposed approach and compared it to a motion estimation method. Experimental results show that we can reach an accuracy of 74% compared to manually selected views.
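    The view-selection idea above can be sketched in a few lines, under strong simplifying assumptions: 2-D room coordinates, a fixed look-ahead distance to approximate the common overlapping area, and a nearest-camera score. The function `best_view` and the 1.5 m projection distance are illustrative choices, not the paper's actual formulation.

```python
import numpy as np

def best_view(head_positions, head_orientations, camera_positions):
    """Pick the camera closest to the participants' common focus area.

    head_positions: (N, 2) head locations in room coordinates (metres).
    head_orientations: (N, 2) unit vectors for each person's orientation.
    camera_positions: (M, 2) camera locations.
    Returns the index of the selected camera.
    """
    # Approximate the common overlapping area from the consensus of
    # orientations: project each head a fixed distance forward and average.
    focus = np.mean(head_positions + 1.5 * head_orientations, axis=0)
    # Score cameras by distance to the focus area; the nearest one is
    # taken as the most informative view in this simplified sketch.
    dists = np.linalg.norm(camera_positions - focus, axis=1)
    return int(np.argmin(dists))
```

    A real system would also check each camera's field of view and occlusions before committing to a stream.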

    Gaze estimation using sclera and iris extraction

    Tracking the gaze of an individual provides important information in understanding the behavior of that person. Gaze tracking has been widely used in a variety of applications, from tracking consumers' gaze fixation on advertisements and controlling human-computer devices, to understanding behaviors of patients with various types of visual and/or neurological disorders such as autism. Gaze patterns can be identified using different methods, but most of them require the use of specialized equipment which can be prohibitively expensive for some applications. In this dissertation, we investigate the possibility of using sclera and iris regions captured in a webcam sequence to estimate gaze pattern. The sclera and iris regions in the video frame are first extracted by using an adaptive thresholding technique. The gaze pattern is then determined based on the areas of different sclera and iris regions and the distances between tracked points along the irises. The technique is novel in that sclera regions are often ignored in the eye tracking literature, while we have demonstrated that they can be easily extracted from images captured by a low-cost camera and are useful in determining the gaze pattern. The accuracy and computational efficiency of the proposed technique are demonstrated by experiments with human subjects.
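    The area-based idea can be illustrated with a minimal sketch: threshold a grayscale eye crop into iris (dark) and sclera (bright) regions, then compare the sclera area on either side of the iris centroid. The fixed thresholds, the 1.3 ratio, and the function name `gaze_direction` are assumptions for illustration; the dissertation uses adaptive thresholding rather than fixed cut-offs.

```python
import numpy as np

def gaze_direction(eye_gray, iris_thresh=60, sclera_thresh=180):
    """Rough horizontal gaze label from sclera/iris areas in an eye crop.

    eye_gray: 2-D uint8 array of the eye region.
    Dark pixels are treated as iris, bright pixels as sclera; the ratio of
    sclera area left vs right of the iris centroid indicates direction.
    """
    iris = eye_gray < iris_thresh
    sclera = eye_gray > sclera_thresh
    if not iris.any():
        return "unknown"
    cx = int(np.mean(np.nonzero(iris)[1]))      # iris centroid column
    left = int(sclera[:, :cx].sum())            # sclera area left of iris
    right = int(sclera[:, cx:].sum())           # sclera area right of iris
    if left > 1.3 * right:
        return "right"   # more sclera exposed on the left side of the image
    if right > 1.3 * left:
        return "left"
    return "center"
```

    Whether "right" means the subject's right or the camera's right depends on whether the frame is mirrored, so the labels here are only relative.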

    Pupil Position by an Improved Technique of YOLO Network for Eye Tracking Application

    Eye gaze tracking is the real-time collection of information about a person's eye movements and gaze direction. Eye gaze trackers are devices that measure the locations of the pupils to detect and track changes in the direction of the user's gaze. There are numerous applications for analyzing eye movements, from psychological studies to human-computer interaction-based systems and interactive robotics controls. Real-time eye gaze monitoring requires an accurate and reliable iris center localization technique. Deep learning technology is used to construct a pupil tracking approach for wearable eye trackers in this study. This pupil tracking method uses the deep-learning You Only Look Once (YOLO) model to accurately estimate and anticipate the pupil's central location under conditions of bright, natural light (visible to the naked eye). Testing pupil tracking performance with the upgraded YOLOv7 results in an accuracy rate of 98.50% and a precision rate close to 96.34% using PyTorch.
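    A detector like YOLO returns bounding boxes, so the remaining step is turning the most confident pupil box into a pixel-coordinate centre. The helper below is a hypothetical post-processing sketch (the names `best_pupil_center` and the normalized `(cx, cy, w, h, conf)` tuple layout are assumptions), not the paper's pipeline, which runs an actual YOLOv7 network in PyTorch.

```python
def best_pupil_center(detections, img_w, img_h, conf_thresh=0.5):
    """Pick the pupil centre from YOLO-style detections.

    detections: list of (cx, cy, w, h, confidence) tuples with coordinates
    normalized to [0, 1], as in YOLO's output convention.
    Returns the pixel (x, y) of the most confident detection above the
    threshold, or None if nothing qualifies.
    """
    kept = [d for d in detections if d[4] >= conf_thresh]
    if not kept:
        return None
    cx, cy, _w, _h, _conf = max(kept, key=lambda d: d[4])
    # The box centre in normalized units maps directly to pixels.
    return (cx * img_w, cy * img_h)
```

    Keeping only the single best box is reasonable here because each eye image contains at most one pupil.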

    Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography

    The manuscript proposes and evaluates a real-time algorithm for estimating eye gaze angle based solely on single-channel electrooculography (EOG), which can be obtained directly from the ear canal using conductive ear moulds. In contrast to conventional high-pass filtering, we used an algorithm that calculates absolute eye gaze angle via statistical analysis of detected saccades. The estimated eye positions of the new algorithm were still noisy. However, the performance in terms of Pearson product-moment correlation coefficients was significantly better than the conventional approach in some instances. The results suggest that in-ear EOG signals captured with conductive ear moulds could serve as a basis for lightweight and portable horizontal eye gaze angle estimation suitable for a broad range of applications, for instance hearing aids that steer the directivity of their microphones in the direction of the user's eye gaze.
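    The core idea, accumulating only fast saccadic changes while ignoring slow drift, can be sketched as follows. This is a simplified illustration, not the paper's statistical algorithm: the velocity threshold, the linear `gain` (degrees per EOG unit), and the function name are all assumed values.

```python
import numpy as np

def gaze_from_saccades(eog, fs, vel_thresh=50.0, gain=10.0):
    """Integrate detected saccades in a single-channel EOG trace into an
    absolute horizontal gaze angle estimate (degrees).

    eog: 1-D sequence of EOG samples; fs: sampling rate in Hz.
    vel_thresh: velocity (signal units/s) above which a change counts as
    saccadic; slower changes (electrode drift) are ignored, which is what
    makes this more stable than integrating the raw signal directly.
    gain: assumed degrees of gaze per EOG signal unit.
    Returns one angle estimate per input sample.
    """
    eog = np.asarray(eog, dtype=float)
    vel = np.diff(eog) * fs            # sample-to-sample velocity
    angle = 0.0
    angles = [angle]
    for v in vel:
        if abs(v) > vel_thresh:        # saccadic movement detected
            angle += gain * v / fs     # accumulate its amplitude
        angles.append(angle)
    return np.array(angles)
```

    Because drift never crosses the velocity threshold, it contributes nothing to the integrated angle, whereas a high-pass filter would instead distort the slow components of genuine gaze holds.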

    Using smooth pursuit calibration for difficult-to-calibrate participants

    Although the 45-dot calibration routine of a previous study (Blignaut, 2016) provided very good accuracy, it requires intense mental effort, and the routine proved to be unsuccessful for young children who struggle to maintain concentration. The calibration procedures that are normally used for difficult-to-calibrate participants, such as autistic children and infants, do not suffice, since they are not accurate enough and the reliability of research results might be jeopardised. Smooth pursuit has been used before for calibration and is applied in this paper as an alternative routine for participants who are difficult to calibrate with conventional routines. Gaze data is captured at regular intervals, and many calibration targets are generated while the eyes are following a moving target. The procedure could take anything between 30 s and 60 s to complete, but since an interesting target and/or a conscious task may be used, participants are assisted in maintaining concentration. It was proven that the accuracy that can be attained through calibration with a moving target along an even horizontal path is not significantly worse than the accuracy that can be attained with a standard method of watching dots appearing in random order. The routine was applied successfully to a group of children with ADD, ADHD and learning disabilities. This result is important as it provides for easier calibration, especially in the case of participants who struggle to keep their gaze focused and stable on a stationary target for long enough.
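    Once many (raw gaze, target position) pairs have been collected along the moving target's path, calibration reduces to fitting a mapping between the two point sets. The sketch below fits a simple affine transform by least squares; the function names and the affine (rather than higher-order polynomial) model are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def fit_pursuit_calibration(raw, target):
    """Least-squares affine mapping from raw tracker output to screen
    coordinates, fitted on samples gathered while the eye follows a
    moving target (many calibration points instead of a few fixed dots).

    raw, target: (N, 2) arrays of matched samples.
    Returns a (3, 2) matrix A such that target ~= [x, y, 1] @ A.
    """
    raw = np.asarray(raw, dtype=float)
    target = np.asarray(target, dtype=float)
    X = np.hstack([raw, np.ones((len(raw), 1))])   # homogeneous coords
    A, *_ = np.linalg.lstsq(X, target, rcond=None)
    return A

def apply_calibration(A, point):
    """Map one raw gaze sample through the fitted affine transform."""
    x, y = point
    return np.array([x, y, 1.0]) @ A
```

    Because smooth pursuit yields hundreds of samples rather than 9 or 45 fixation points, the least-squares fit averages out momentary tracking noise, which is part of why the routine tolerates participants who cannot hold a steady fixation.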