
    Retinally stabilized differential resolution television display

    A remote television viewing system employing an eye tracker is disclosed, wherein a small region of the image appears in high resolution and the remainder appears in low resolution. The eye tracker monitors the position of the viewer's line of sight, and its position data is transmitted to the remote television camera and control. Both the remote camera and the television display are adapted to have selectable high-resolution and low-resolution raster scan modes. The position data from the eye tracker determines the point at which the high-resolution scan is to commence. The video data defining the observed image is encoded in a novel format: in each data field, the data representing the position of the high-resolution region of predetermined size appears first, followed by the high-resolution zone video data and then the low-resolution region data. As the viewer's line of sight relative to the displayed image changes, the position of the high-resolution region changes to track it.
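    The field layout described above (position header first, then high-resolution zone data, then low-resolution data) can be sketched as a simple serialization scheme. This is a minimal illustration, not the patent's actual bit format: the 16-bit coordinate header and the function names are assumptions.

    ```python
    import struct

    # Hypothetical field layout sketched from the abstract: each data field
    # carries the (x, y) position of the fixed-size high-resolution region
    # first, then the high-resolution zone pixels, then the low-resolution
    # remainder. The ">HH" header (two unsigned 16-bit ints) is an assumption.
    HEADER = struct.Struct(">HH")

    def encode_field(x, y, hires_pixels, lores_pixels):
        """Pack one video field: position header, hi-res data, lo-res data."""
        return HEADER.pack(x, y) + bytes(hires_pixels) + bytes(lores_pixels)

    def decode_field(field, hires_len):
        """Split a field back into the position and the two pixel streams.

        hires_len is known to the receiver because the high-resolution
        region has a predetermined size.
        """
        x, y = HEADER.unpack_from(field)
        body = field[HEADER.size:]
        return (x, y), body[:hires_len], body[hires_len:]

    # Round-trip example with toy pixel data.
    pos, hi, lo = decode_field(encode_field(320, 240, b"\x01\x02", b"\x03"), 2)
    ```

    Because the high-resolution region has a predetermined size, only its position needs to be transmitted per field; the decoder can split the byte stream at a fixed offset.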

    Experimental study of visual accommodation Final report

    Visual accommodation experimental studies, with optometer, visual display unit, and eye tracker instrumentation development.

    Eye Tracker Accuracy: Quantitative Evaluation of the Invisible Eye Center Location

    Purpose. We present a new method to evaluate the accuracy of an eye-tracker-based eye localization system. Measuring the accuracy of an eye tracker's primary intention, the estimated point of gaze, is usually done with volunteers and a set of fixation points used as ground truth. However, verifying the accuracy of the location estimate of a volunteer's eye center in 3D space is not easily possible, because the eye center is an intangible point hidden by the iris. Methods. We evaluate the eye location accuracy by using an eye phantom instead of the eyes of volunteers. For this, we developed a testing stage with a realistic artificial eye and a corresponding kinematic model, which we trained with μCT data. This enables us to precisely evaluate the eye location estimate of an eye tracker. Results. We show that the proposed testing stage with the corresponding kinematic model is suitable for such a validation. Further, we evaluate a particular eye-tracker-based navigation system and show that this system is able to successfully determine the eye center with sub-millimeter accuracy. Conclusions. We show the suitability of the evaluated eye tracker for eye interventions, using the proposed testing stage and the corresponding kinematic model. The results further enable specific enhancement of the navigation system to potentially obtain even better results.

    Wearable computing: Will it make people prosocial?

    We recently reported that people who wear an eye tracker modify their natural looking behaviour in a prosocial manner. This change in looking behaviour represents a potential concern for researchers who wish to use eye trackers to understand the functioning of human attention. On the other hand, it may offer a real boon to manufacturers and consumers of wearable computing (e.g., Google Glass), for if wearable computing causes people to behave in a prosocial manner, then the public's fear that people with wearable computing will invade their privacy is unfounded. Critically, both of these divergent implications are grounded in the assumption that the prosocial behavioural effect of wearing an eye tracker is sustained for a prolonged period of time. Our study reveals that on the very first wearing of an eye tracker, and in less than 10 min, the prosocial effect of an eye tracker is abolished, but by drawing attention back to the eye tracker, the implied presence effect is easily reactivated. This suggests that eye trackers induce a transient social presence effect, which is rendered dormant when attention is shifted away from the source of implied presence. This is good news for researchers who use eye trackers to measure attention and behaviour, and could be bad news for advocates of wearable computing in everyday life.

    A cheap portable eye-tracker solution for common setups

    We analyze the feasibility of a cheap eye tracker whose hardware consists of a single webcam and a Raspberry Pi device. Our aim is to discover the limits of such a system and to see whether it provides an acceptable performance. We base our work on the open-source Opengazer (Zielinski, 2013) and we propose several improvements to create a robust, real-time system which can work on a computer with a 30 Hz sampling rate. After assessing the accuracy of our eye tracker in elaborated experiments involving 12 subjects under 4 different system setups, we install it on a Raspberry Pi to create a portable stand-alone eye tracker which achieves 1.42° horizontal accuracy with a 3 Hz refresh rate for a building cost of 70 Euros.
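    Accuracy figures like the 1.42° above are expressed in degrees of visual angle rather than pixels, because the angular error is independent of screen geometry. A minimal sketch of the conversion follows; the screen width, resolution, and viewing distance are illustrative assumptions, not values from the study.

    ```python
    import math

    # Convert an on-screen gaze error measured in pixels into degrees of
    # visual angle. Screen dimensions and viewing distance are assumed
    # defaults for illustration only.
    def pixel_error_to_degrees(err_px, screen_width_cm=34.0,
                               screen_width_px=1280, distance_cm=60.0):
        err_cm = err_px * screen_width_cm / screen_width_px
        # Subtended angle of a segment of length err_cm centred on the
        # line of sight at the given viewing distance.
        return math.degrees(2 * math.atan2(err_cm / 2, distance_cm))
    ```

    With these assumed dimensions, a 56-pixel error works out to roughly 1.42° of visual angle, which gives a feel for what the reported accuracy means on a desktop setup.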

    Sampling rate influences saccade detection in mobile eye tracking of a reading task

    The purpose of this study was to compare saccade detection characteristics in two mobile eye trackers with different sampling rates in a natural task. Gaze data of 11 participants were recorded in one 60 Hz and one 120 Hz mobile eye tracker and compared directly to the saccades detected by a 1000 Hz stationary tracker while a reading task was performed. Saccades and fixations were detected using a velocity-based algorithm and their properties analyzed. Results showed that there was no significant difference in the number of detected fixations, but mean fixation durations differed between the 60 Hz mobile and the stationary eye tracker. The 120 Hz mobile eye tracker showed a significant increase in the detection rate of saccades and an improved estimation of the mean saccade duration, compared to the 60 Hz eye tracker. To conclude, for the detection and analysis of fast eye movements, such as saccades, it is better to use a 120 Hz mobile eye tracker.
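    A velocity-based classifier of the kind mentioned above can be sketched as a simple velocity-threshold (I-VT) scheme: samples whose angular velocity exceeds a threshold are labelled saccades, the rest fixations. This is a minimal one-dimensional illustration; the 30°/s threshold and the sample data are assumptions, not the study's parameters.

    ```python
    # Minimal I-VT sketch: classify each gaze sample by the angular velocity
    # between it and the previous sample. Threshold is an assumed default.
    def classify_ivt(gaze_deg, sampling_hz, threshold_deg_s=30.0):
        """gaze_deg: gaze positions in degrees along one axis."""
        dt = 1.0 / sampling_hz
        labels = ["fixation"]  # first sample has no preceding velocity
        for prev, cur in zip(gaze_deg, gaze_deg[1:]):
            velocity = abs(cur - prev) / dt  # degrees per second
            labels.append("saccade" if velocity > threshold_deg_s
                          else "fixation")
        return labels

    # At 120 Hz, a 1-degree jump between samples is 120 deg/s -> saccade.
    labels = classify_ivt([0.0, 0.02, 1.02, 2.0, 2.01], sampling_hz=120)
    ```

    The sketch also hints at why sampling rate matters: at 60 Hz a short saccade may fall entirely between two samples, so fewer saccades are detected and their durations are estimated more coarsely than at 120 Hz or 1000 Hz.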

    EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays

    While gaze holds a lot of promise for hands-free interaction with public displays, remote eye trackers with their confined tracking box restrict users to a single stationary position in front of the display. We present EyeScout, an active eye tracking system that combines an eye tracker mounted on a rail system with a computational method to automatically detect and align the tracker with the user's lateral movement. EyeScout addresses key limitations of current gaze-enabled large public displays by offering two novel gaze-interaction modes for a single user: in "Walk then Interact" the user can walk up to an arbitrary position in front of the display and interact, while in "Walk and Interact" the user can interact even while on the move. We report on a user study showing that EyeScout is well perceived by users, extends a public display's sweet spot into a sweet line, and reduces gaze interaction kick-off time to 3.5 seconds -- a 62% improvement over state-of-the-art solutions. We discuss sample applications that demonstrate how EyeScout can enable position- and movement-independent gaze interaction with large public displays.