29 research outputs found

    A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods in Consumer Platforms

    This paper presents a review of the research on eye-gaze estimation techniques and applications, which has progressed in diverse ways over the past two decades. Several generic eye-gaze use cases are identified: desktop, TV, head-mounted, automotive and handheld devices. Analysis of the literature leads to the identification of several platform-specific factors that influence gaze-tracking accuracy. A key outcome of this review is the realization that standardized methodologies are needed for the performance evaluation of gaze-tracking systems, and for consistency in their specification and comparative evaluation. To address this need, the concept of a methodological framework for the practical evaluation of different gaze-tracking systems is proposed. Comment: 25 pages, 13 figures; accepted for publication in IEEE Access in July 2017.
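
    The review's call for standardized performance evaluation can be made concrete with the two metrics most gaze-tracking studies report: accuracy (mean angular offset from the target) and precision (sample-to-sample dispersion). The Python sketch below illustrates these conventional metrics only; it is not the methodological framework proposed in the paper, and the viewing distance and noise level are arbitrary example values.

```python
import numpy as np

def angular_error_deg(gaze_vec, target_vec):
    """Angle in degrees between an estimated gaze direction and the true target direction."""
    g = gaze_vec / np.linalg.norm(gaze_vec)
    t = target_vec / np.linalg.norm(target_vec)
    return np.degrees(np.arccos(np.clip(np.dot(g, t), -1.0, 1.0)))

def accuracy_and_precision(gaze_vecs, target_vec):
    """Accuracy: mean angular offset from the target.
    Precision: RMS of sample-to-sample angular differences (a common convention)."""
    errors = np.array([angular_error_deg(g, target_vec) for g in gaze_vecs])
    accuracy = errors.mean()
    inter_sample = np.array([angular_error_deg(gaze_vecs[i], gaze_vecs[i + 1])
                             for i in range(len(gaze_vecs) - 1)])
    precision_rms = np.sqrt(np.mean(inter_sample ** 2))
    return accuracy, precision_rms

# Example: 100 noisy gaze samples recorded while fixating a target 60 cm straight ahead.
rng = np.random.default_rng(0)
target = np.array([0.0, 0.0, 600.0])                     # mm
samples = target + rng.normal(0.0, 5.0, size=(100, 3))   # roughly half a degree of noise
print(accuracy_and_precision(samples, target))
```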

    Probabilistic Approach to Robust Wearable Gaze Tracking

    Creative Commons Attribution License (CC BY 4.0). This paper presents a method for computing the gaze point from camera data captured with a wearable gaze-tracking device. The method utilizes a physical model of the human eye, advanced Bayesian computer vision algorithms, and Kalman filtering, resulting in high accuracy and low noise. Our C++ implementation can process camera streams at 30 frames per second in real time. The performance of the system is validated in an exhaustive experimental setup with 19 participants, using a self-made device. Owing to the eye model and binocular cameras used, the system is accurate at all distances and invariant to device movement. We also test our system against a best-in-class commercial device, which it outperforms in spatial accuracy and precision. The software and hardware instructions, as well as the experimental data, are published as open source. Peer reviewed.
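
    As a rough illustration of the filtering stage mentioned in the abstract, the sketch below applies a constant-velocity Kalman filter to noisy 2D gaze points. It is a minimal stand-in: the paper's C++ pipeline combines a physical eye model and Bayesian computer vision with its filtering, and the state model, time step and noise parameters here are illustrative choices, not values from the paper.

```python
import numpy as np

class GazeKalman2D:
    """Minimal constant-velocity Kalman filter for smoothing 2D gaze points.
    Illustrative only; not the paper's eye-model-based pipeline."""

    def __init__(self, dt=1.0 / 30.0, process_var=50.0, meas_var=4.0):
        self.x = np.zeros(4)                       # state: [px, py, vx, vy]
        self.P = np.eye(4) * 1e3                   # state covariance
        self.F = np.eye(4)                         # constant-velocity motion model
        self.F[0, 2] = dt
        self.F[1, 3] = dt
        self.H = np.zeros((2, 4))                  # we only observe the position
        self.H[0, 0] = 1.0
        self.H[1, 1] = 1.0
        self.Q = np.eye(4) * process_var           # process noise
        self.R = np.eye(2) * meas_var              # measurement noise

    def update(self, z):
        # Predict with the motion model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct with the new gaze measurement z = (x, y).
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                          # smoothed gaze point

kf = GazeKalman2D()
for z in [(100, 100), (103, 99), (101, 104), (250, 251)]:   # last sample: a large jump
    print(kf.update(z))
```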

    Three-dimensional non-parametric method for limbus detection

    Purpose: To present a novel non-parametric algorithm for detecting the position of the human eye limbus in three dimensions, and a new dynamic method for measuring the full 360° visible iris boundary, known as the white-to-white distance, along the eye's horizontal line.
    Methods: The study included 88 participants aged 23 to 65 years (37.7±9.7), 47 females and 41 males. Clinical characteristics, height data, apex coordinates and 1024×1280-pixel digital images of the eyes were acquired with an Eye Surface Profiler and processed by custom-built MATLAB code. A dynamic, light-intensity-frequency-based white-to-white detection process and a novel three-dimensional method for limbus detection are presented.
    Results: Significant differences (p<0.001) between nasal-temporal and superior-inferior white-to-white distances were found in both right and left eyes (nasal-temporal direction: 11.74±0.42 mm in right eyes and 11.82±0.47 mm in left eyes; superior-inferior direction: 11.52±0.45 mm in right eyes and 11.55±0.46 mm in left eyes). Average limbus nasal-temporal diameters were 13.64±0.55 mm for right eyes and 13.74±0.40 mm for left eyes, while the superior-inferior diameters were 13.65±0.54 mm and 13.75±0.38 mm for right and left eyes, respectively. No significant difference in limbus contours was observed, either between the nasal-temporal (p = 0.91) and superior-inferior (p = 0.83) directions or between the right (p = 0.18) and left (p = 0.16) eyes. Evidence of a tilt towards the nasal-temporal side in the three-dimensional shape of the limbus was found. The right eyes' mean limbus contour tilt was -0.3±1.35° around the X-axis and 1.76±0.9° around the Y-axis; likewise, the left eyes' mean tilt was 0.77±1.25° around the X-axis and -1.54±0.89° around the Y-axis.
    Conclusions: The white-to-white distance in the human eye is significantly larger in the nasal-temporal direction than in the superior-inferior direction, whereas the limbus diameter does not vary significantly between these directions. The 3D measurements show that the limbus contour does not lie in one plane and tends to be higher on the nasal-inferior side of the eye.
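
    To make the reported tilt angles concrete, the sketch below fits a least-squares plane to a set of 3D limbus points and reads the tilt about the X and Y axes from the plane normal. This is only one plausible way to compute such angles; the paper's non-parametric limbus detection on Eye Surface Profiler data is not reproduced here, and the synthetic contour is an invented example.

```python
import numpy as np

def limbus_plane_tilt(points_xyz):
    """Fit a least-squares plane to 3D limbus points and return its tilt (degrees)
    about the X and Y axes, derived from the plane normal. Illustrative sketch only."""
    pts = np.asarray(points_xyz, dtype=float)
    centred = pts - pts.mean(axis=0)
    # The plane normal is the direction of least variance (smallest singular value).
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    normal = vt[-1]
    if normal[2] < 0:                          # orient the normal towards +Z
        normal = -normal
    tilt_about_x = np.degrees(np.arctan2(normal[1], normal[2]))
    tilt_about_y = np.degrees(np.arctan2(normal[0], normal[2]))
    return tilt_about_x, tilt_about_y

# Example: a synthetic limbus contour of radius ~6.8 mm whose height varies along X,
# i.e. it is tilted by about 1.8 degrees about the Y-axis (sign depends on convention).
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
x, y = 6.8 * np.cos(theta), 6.8 * np.sin(theta)
z = np.tan(np.radians(1.8)) * x
print(limbus_plane_tilt(np.column_stack([x, y, z])))
```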

    Eye tracking: empirical foundations for a minimal reporting guideline

    In this paper, we present a review of how the various aspects of any study using an eye tracker (the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "Empirically based minimal reporting guideline").
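
    For a sense of what a minimal report might contain, the sketch below collects the aspects named in the abstract (instrument, methodology, environment, participants, data quality) into a simple data structure. The specific fields and example values are illustrative assumptions, not the guideline actually derived in the paper.

```python
from dataclasses import dataclass, asdict

@dataclass
class EyeTrackingReport:
    """Illustrative container for items a minimal eye-tracking report might cover.
    Fields are assumptions based on the aspects named in the abstract."""
    tracker_model: str
    sampling_rate_hz: float
    calibration_procedure: str
    stimulus_and_task: str
    recording_environment: str
    n_participants: int
    accuracy_deg: float          # reported data-quality accuracy
    precision_deg: float         # reported data-quality precision
    data_loss_percent: float

report = EyeTrackingReport(
    tracker_model="<tracker name>", sampling_rate_hz=500.0,
    calibration_procedure="9-point grid, validated",
    stimulus_and_task="reading task on a 24-inch monitor",
    recording_environment="dim laboratory, chin rest",
    n_participants=32,
    accuracy_deg=0.5, precision_deg=0.3, data_loss_percent=4.2,
)
print(asdict(report))
```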

    Towards Energy Efficient Mobile Eye Tracking for AR Glasses through Optical Sensor Technology

    After the introduction of smartphones and smartwatches, Augmented Reality (AR) glasses are considered the next breakthrough in the field of wearables. While the transition from smartphones to smartwatches was based mainly on established display technologies, the display technology of AR glasses presents a technological challenge. Many display technologies, such as retina projectors, rely on continuous adaptive control of the display based on the user's pupil position. Furthermore, head-mounted systems require an adaptation and extension of established interaction concepts to provide the user with an immersive experience. Eye tracking is therefore a crucial technology for AR glasses, enabling both optimized display technology and gaze-based interaction concepts. Available eye-tracking technologies, such as Video Oculography (VOG), do not meet the requirements of AR glasses, especially regarding power consumption, robustness, and integrability. To overcome these limitations and push mobile eye tracking for AR glasses forward, novel laser-based eye-tracking sensor technologies are researched in this thesis, contributing a significant scientific advancement towards energy-efficient mobile eye tracking for AR glasses. In the first part of the thesis, novel scanned-laser eye-tracking sensor technologies for AR glasses with retina projectors as the display technology are researched. The goal is to overcome the disadvantages of VOG systems and to enable eye tracking that is efficient and robust to ambient light and slippage through optimized sensing methods and algorithms. The second part of the thesis researches the use of static Laser Feedback Interferometry (LFI) sensors as a low-power, always-on sensor modality for detecting user interaction via gaze gestures and for context recognition through Human Activity Recognition (HAR) on AR glasses. The static LFI sensors can measure the distance to the eye and the eye's surface velocity at an outstanding sampling rate, and they offer high integrability regardless of the display technology. In the third part of the thesis, a model-based eye-tracking approach built on the static LFI sensor technology is researched. By fusing multiple LFI sensors, the approach achieves eye tracking with an extremely high sampling rate, which enables display-resolution-enhancement methods such as foveated rendering for AR glasses and Virtual Reality (VR) systems. The scientific contributions of this work lead to a significant advance in the field of mobile eye tracking for AR glasses through the introduction of novel sensor technologies that, in particular, enable robust eye tracking in uncontrolled environments. Furthermore, the scientific contributions of this work have been published in internationally renowned journals and conferences.
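
    One small, generic piece of the multi-sensor fusion idea can be sketched as inverse-variance weighting of independent per-sensor gaze-angle estimates, as below. This is not the thesis's model-based approach, which derives gaze from the distance and surface-velocity measurements of several static LFI sensors; the (yaw, pitch) estimates and variances here are made-up example values.

```python
import numpy as np

def fuse_gaze_estimates(estimates_deg, variances_deg2):
    """Inverse-variance (maximum-likelihood under independent Gaussian noise) fusion
    of per-sensor gaze-angle estimates. Generic sketch, not the thesis's method."""
    est = np.asarray(estimates_deg, dtype=float)     # shape (n_sensors, 2): yaw, pitch
    var = np.asarray(variances_deg2, dtype=float)    # shape (n_sensors, 2)
    w = 1.0 / var                                    # weight = inverse variance
    fused = (w * est).sum(axis=0) / w.sum(axis=0)    # weighted mean per angle
    fused_var = 1.0 / w.sum(axis=0)                  # variance of the fused estimate
    return fused, fused_var

# Example: three sensors report slightly different (yaw, pitch) angles in degrees.
estimates = [(10.2, -3.1), (9.8, -2.8), (10.5, -3.3)]
variances = [(0.4, 0.4), (0.2, 0.2), (0.6, 0.6)]
print(fuse_gaze_estimates(estimates, variances))
```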

    Social gaze


    Implementation of new assistive technologies for people affected by Autistic Spectrum Disorders (ASDs)

    Individuals with Autistic Spectrum Disorders (ASDs) have impairments in the processing of social and emotional information. The number of children known to have autism has increased dramatically since the 1980s. This has sensitized the scientific community to the design and development of technologies suitable for treating autistic patients in order to broaden their emotive responsiveness, such as the employment of robotic systems to engage proactive interactive responses in children with ASDs. My PhD work focuses on the design and development of new technologies for therapy with individuals affected by ASD. The main challenge of my work has been to design and develop a novel control architecture able to reproduce the brain's characteristics in terms of highly concurrent processing, flexibility and the ability to learn new behaviors. The main difficulties in implementing Artificial Neural Networks (ANNs) in hardware, in terms of accuracy, gate complexity and speed performance, are discussed. A new wearable eye-tracking system able to investigate attention disorders early in infancy is proposed. Technological choices are emphasized with respect to unobtrusive and ecological design, to adapt the device for infants. New algorithms to increase the system's robustness under illumination changes and during the calibration process have been developed and are presented herein. Experimental tests prove the effectiveness of the solutions. Considerations on future research directions are addressed, stressing the multiple application fields of the designed device.
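
    As a loose illustration of illumination robustness in pupil detection, the sketch below thresholds each frame at a fixed fraction of its own intensity distribution rather than at a fixed grey level, so the detection adapts when overall brightness changes. It is a generic toy example on synthetic images, not the algorithm developed in this PhD work.

```python
import numpy as np

def pupil_center(gray_image, dark_fraction=0.01):
    """Estimate the pupil centre as the centroid of the darkest pixels.
    Thresholding at a fraction of the frame's intensity distribution (rather than
    a fixed grey level) keeps the detector stable under global illumination changes.
    Generic illustration only."""
    img = np.asarray(gray_image, dtype=float)
    threshold = np.quantile(img, dark_fraction)   # adapts to the current frame
    ys, xs = np.nonzero(img <= threshold)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())     # (x, y) in pixel coordinates

# Example: a synthetic 200x200 eye image with a dark pupil centred at (120, 80),
# rendered under two different global illumination levels.
yy, xx = np.mgrid[0:200, 0:200]
pupil_mask = ((xx - 120) ** 2 + (yy - 80) ** 2) < 15 ** 2
for brightness in (180, 90):                      # bright frame vs dim frame
    frame = np.full((200, 200), float(brightness))
    frame[pupil_mask] = brightness * 0.2          # the pupil is much darker
    print(pupil_center(frame))                    # ~ (120.0, 80.0) in both cases
```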