
    Unobtrusive and pervasive video-based eye-gaze tracking

    Eye-gaze tracking has long been considered a desktop technology that finds its use inside the traditional office setting, where the operating conditions may be controlled. Nonetheless, recent advancements in mobile technology and a growing interest in capturing natural human behaviour have motivated an emerging interest in tracking eye movements within unconstrained real-life conditions, referred to as pervasive eye-gaze tracking. This critical review focuses on emerging passive and unobtrusive video-based eye-gaze tracking methods in recent literature, with the aim of identifying the different research avenues being followed in response to the challenges of pervasive eye-gaze tracking. Different eye-gaze tracking approaches are discussed in order to bring out their strengths and weaknesses, and to identify any limitations, within the context of pervasive eye-gaze tracking, that have yet to be considered by the computer vision community.

    Appearance-Based Gaze Estimation in the Wild

    Appearance-based gaze estimation is believed to work well in real-world settings, but existing datasets have been collected under controlled laboratory conditions and methods have not been evaluated across multiple datasets. In this work we study appearance-based gaze estimation in the wild. We present the MPIIGaze dataset that contains 213,659 images we collected from 15 participants during natural everyday laptop use over more than three months. Our dataset is significantly more variable than existing ones with respect to appearance and illumination. We also present a method for in-the-wild appearance-based gaze estimation using multimodal convolutional neural networks that significantly outperforms state-of-the-art methods in the most challenging cross-dataset evaluation. We present an extensive evaluation of several state-of-the-art image-based gaze estimation algorithms on three current datasets, including our own. This evaluation provides clear insights and allows us to identify key research challenges of gaze estimation in the wild.
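    Cross-dataset evaluations of gaze estimators such as the one above are conventionally reported as mean angular error between predicted and ground-truth 3D gaze direction vectors. A minimal sketch of that metric (the function name and example vectors are illustrative, not taken from the paper):

```python
import math

def angular_error_deg(g_pred, g_true):
    """Angular error in degrees between two 3D gaze direction vectors."""
    dot = sum(p * t for p, t in zip(g_pred, g_true))
    norm = (math.sqrt(sum(p * p for p in g_pred))
            * math.sqrt(sum(t * t for t in g_true)))
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_angle = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos_angle))

# A prediction tilted slightly upward relative to the true direction.
print(round(angular_error_deg((0.0, 0.0, -1.0), (0.0, 0.1, -1.0)), 2))  # → 5.71
```

    Errors are averaged over all test images; in a cross-dataset protocol the model is trained on one dataset and this mean is computed on another.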

    Detecting Gaze Direction Using Robot-Mounted and Mobile-Device Cameras

    Two common channels through which humans communicate are speech and gaze. Eye gaze is an important mode of communication: it allows people to better understand each other's intentions, desires, interests, and so on. The goal of this research is to develop a framework for gaze-triggered events which can be executed on a robot and mobile devices and allows experiments to be performed. We experimentally evaluate the framework and the techniques for extracting gaze direction, based on a robot-mounted camera or a mobile-device camera, which are implemented in the framework. We investigate the impact of light on the accuracy of gaze estimation, and also how the overall accuracy depends on user eye and head movements. Our research shows that the light intensity is important, and the placement of the light source is crucial. All the robot-mounted gaze detection modules we tested were found to be similar with regard to accuracy. The framework we developed was tested in a human-robot interaction experiment involving a job-interview scenario. The flexible structure of this scenario allowed us to test different components of the framework in varied real-world scenarios, which was very useful for progressing towards our long-term research goal of designing intuitive gaze-based interfaces for human-robot communication.
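    The abstract does not publish the framework's API, but the core idea of a gaze-triggered event can be sketched as a dwell-time trigger: a callback fires once the estimated gaze point has stayed inside a target region long enough. All names, the region format, and the dwell logic below are assumptions for illustration:

```python
class GazeTrigger:
    """Fires a callback when gaze dwells inside a screen region.

    Hypothetical sketch: region is (x, y, w, h) in screen coordinates,
    timestamps are in seconds. Not the framework's actual API.
    """

    def __init__(self, region, dwell_s, callback):
        self.region = region
        self.dwell_s = dwell_s
        self.callback = callback
        self._entered_at = None  # time the gaze entered the region

    def _inside(self, x, y):
        rx, ry, rw, rh = self.region
        return rx <= x <= rx + rw and ry <= y <= ry + rh

    def update(self, x, y, t):
        """Feed one gaze sample; returns True if the event fired."""
        if not self._inside(x, y):
            self._entered_at = None  # gaze left: reset the dwell clock
            return False
        if self._entered_at is None:
            self._entered_at = t
        if t - self._entered_at >= self.dwell_s:
            self._entered_at = None  # re-arm after firing
            self.callback()
            return True
        return False

fired = []
trigger = GazeTrigger(region=(100, 100, 50, 50), dwell_s=0.5,
                      callback=lambda: fired.append("look"))
for t in [0.0, 0.2, 0.4, 0.6]:          # gaze samples inside the region
    trigger.update(120, 120, t)
print(fired)  # → ['look']
```

    In a robot- or mobile-mounted setup, `update` would be driven by each frame's gaze estimate rather than the fixed samples shown here.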

    Gaze Estimation Using Sclera and Iris Extraction

    Tracking the gaze of an individual provides important information in understanding the behavior of that person. Gaze tracking has been widely used in a variety of applications, from tracking consumers' gaze fixation on advertisements and controlling human-computer devices, to understanding behaviors of patients with various types of visual and/or neurological disorders such as autism. Gaze pattern can be identified using different methods, but most of them require the use of specialized equipment which can be prohibitively expensive for some applications. In this dissertation, we investigate the possibility of using sclera and iris regions captured in a webcam sequence to estimate gaze pattern. The sclera and iris regions in the video frame are first extracted by using an adaptive thresholding technique. The gaze pattern is then determined based on areas of different sclera and iris regions and distances between tracked points along the irises. The technique is novel as sclera regions are often ignored in the eye tracking literature, while we have demonstrated that they can be easily extracted from images captured by a low-cost camera and are useful in determining the gaze pattern. The accuracy and computational efficiency of the proposed technique are demonstrated by experiments with human subjects.
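    The dissertation's specific thresholding parameters are not given in the abstract, but mean-based adaptive thresholding of an eye-region crop can be sketched as follows: each pixel is compared to the mean of its local neighbourhood, so bright sclera pixels and dark iris pixels separate even under uneven lighting. The block size, offset, and toy image below are assumptions for illustration:

```python
def adaptive_threshold(img, block=3, c=0):
    """Label each pixel 1 if it is at least (local mean - c), else 0.

    img is a list of rows of grayscale values: a toy stand-in for the
    webcam eye-region crop the dissertation segments.
    """
    h, w = len(img), len(img[0])
    r = block // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Gather the block x block neighbourhood, clipped at the border.
            vals = [img[j][i]
                    for j in range(max(0, y - r), min(h, y + r + 1))
                    for i in range(max(0, x - r), min(w, x + r + 1))]
            mean = sum(vals) / len(vals)
            out[y][x] = 1 if img[y][x] >= mean - c else 0
    return out

# Toy eye strip: bright sclera (200) flanking a dark iris (40).
eye = [[200, 200, 40, 40, 200, 200]] * 3
mask = adaptive_threshold(eye, block=3)
print(mask[1])  # → [1, 1, 0, 0, 1, 1]
```

    The relative areas of the resulting bright (sclera) and dark (iris) regions on each side of the eye are then what the abstract describes using to infer gaze direction.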

    An Optokinetic Nystagmus Detection Method for Use With Young Children

    Sangi, M., Thompson, B., & Turuwhenua, J. (2015). An Optokinetic Nystagmus Detection Method for Use With Young Children. IEEE Journal of Translational Engineering in Health and Medicine, 3, 1600110. http://doi.org/10.1109/JTEHM.2015.2410286 ©IEEE. The detection of vision problems in early childhood can prevent neurodevelopmental disorders such as amblyopia. However, accurate clinical assessment of visual function in young children is challenging. Optokinetic nystagmus (OKN) is a reflexive sawtooth motion of the eye that occurs in response to drifting stimuli, and it may allow for objective measurement of visual function in young children if appropriate child-friendly eye tracking techniques are available. In this paper, we present offline tools to detect the presence and direction of the optokinetic reflex in children using consumer-grade video equipment. Our methods are tested on video footage of children (N = 5 children and 20 trials) taken as they freely observed visual stimuli that induced horizontal OKN. Using results from an experienced observer as a baseline, we found the sensitivity and specificity of our OKN detection method to be 89.13% and 98.54%, respectively, across all trials. Our OKN detection results also compared well (85%) with results obtained from a clinically trained assessor. In conclusion, our results suggest that OKN presence and direction can be measured objectively in children using consumer-grade equipment and readily implementable algorithms.
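    The reported sensitivity and specificity follow the standard definitions over true/false positives and negatives. A quick sketch of the computation; the counts below are illustrative values chosen to reproduce the reported percentages, since the paper's actual per-trial counts are not given here:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only, not the study's data.
sens, spec = sensitivity_specificity(tp=41, fn=5, tn=135, fp=2)
print(f"sensitivity={sens:.2%} specificity={spec:.2%}")
# → sensitivity=89.13% specificity=98.54%
```

    Here a "positive" is a segment where OKN is present, so sensitivity measures how often true OKN is detected and specificity how often its absence is correctly reported.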