
    Real Time Face-Tracking And Iris Localization

    Robust, non-intrusive human eye detection has long been a fundamental and challenging problem in computer vision. Beyond being a problem in its own right, it can ease the task of locating other facial features for recognition and human-computer interaction purposes. Many previous works can determine the locations of the human eyes, but the goal of this thesis is more than a vision system with eye detection capability. Our aim is to design a real-time, robust, scale-invariant face tracker that indicates human eye movement by localizing the iris, using image processing and a circle fitting technique. As a result, our eye tracker was successfully implemented with a non-intrusive webcam and low error.
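    The abstract names the ingredients (webcam capture, scale-invariant face tracking, eye localization) without giving the pipeline, so the following is only a minimal sketch of that kind of system, assuming OpenCV Haar cascades for the face and eye stages; the thesis's actual method may differ.

```python
# Hypothetical real-time face and eye detection from a webcam using
# OpenCV Haar cascades; a sketch, not the thesis's actual pipeline.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)  # non-intrusive consumer webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # detectMultiScale searches over image scales, one common way to get
    # the scale invariance the abstract mentions
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
        roi = gray[y:y + h // 2, x:x + w]  # eyes lie in the upper face half
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            cv2.rectangle(frame, (x + ex, y + ey),
                          (x + ex + ew, y + ey + eh), (0, 255, 0), 2)
    cv2.imshow("face/eye tracker", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```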

    Real-time eye tracking and iris localization

    Robust, non-intrusive human eye detection has long been a fundamental and challenging problem in computer vision. Beyond being a problem in its own right, it can ease the task of locating other facial features for recognition and human-computer interaction purposes. Many previous works can determine the locations of the human eyes, but the goal of this paper is more than a vision system with eye detection capability. Our aim is to design a real-time face tracker and iris localizer using an edge point detection method from image processing and a circle fitting technique. As a result, our eye tracker was successfully implemented with a non-intrusive webcam and low error.
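    Since this abstract explicitly combines edge point detection with circle fitting for iris localization, a circular Hough transform is a natural illustration: cv2.HoughCircles runs a Canny edge detector internally and fits circles by accumulator voting over the resulting edge points. The parameter values below are illustrative assumptions, not the paper's.

```python
# Hypothetical iris localization inside an already-cropped grayscale eye
# patch: Hough circle fitting over Canny edge points. Parameters are
# illustrative assumptions, not the paper's values.
import cv2
import numpy as np

def localize_iris(eye_gray):
    """Return (cx, cy, r) of the best-fitting iris circle, or None."""
    blurred = cv2.GaussianBlur(eye_gray, (5, 5), 0)  # suppress eyelash noise
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1,
        minDist=eye_gray.shape[0],         # expect one iris per patch
        param1=100,                        # Canny high threshold (edge points)
        param2=20,                         # accumulator vote threshold
        minRadius=eye_gray.shape[0] // 8,
        maxRadius=eye_gray.shape[0] // 2)
    if circles is None:
        return None
    cx, cy, r = np.round(circles[0, 0]).astype(int)
    return cx, cy, r
```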

    The Computer-Controlled Oculometer: A Prototype Interactive Eye Movement Tracking System

    One kind of eye movement tracking device with great potential is the digital computer-controlled Oculometer, an instrument which non-invasively measures the subject's point of regard, as well as pupil diameter and blink occurrence. In conjunction with a computer-generated display which can change in real time as a function of the subject's eye motions, the computer-controlled Oculometer makes possible a variety of interactive measurement and control systems. Practical applications of such schemes have had to await the development of an instrument design which does not inconvenience the subject and which conveniently interfaces with a digital computer (see ref. 1). This report describes an Oculometer subsystem and an eye-tracking/control program designed for use with the PDP-6 computer of the MIT Project MAC Artificial Intelligence Group. The Oculometer electro-optic subsystem utilizes near-infrared light reflected specularly off the front surface of the subject's cornea and diffusely off the retina, producing a bright pupil with an overriding corneal highlight. An electro-optic scanning-aperture vidissector within the unit, driven by a digital eye-tracking algorithm programmed into the PDP-6 computer, detects and tracks the centers of the corneal highlight and the bright pupil to give eye movement measurements. A computer-controlled, moving-mirror head motion tracker directly coupled to the vidissector tracker permits the subject reasonable freedom of movement. Applications of this system suggested by the work reported here include: (a) using the eye as a control device, (b) recording eye fixation and exploration patterns, (c) game playing, (d) training machines, and (e) psychophysiological testing and recording.
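    As a rough modern illustration of the bright-pupil/corneal-highlight idea described above (not the original vidissector algorithm, which ran on a PDP-6), one can threshold a near-IR frame at two intensity levels and take centroids; the thresholds here are arbitrary assumptions.

```python
# Sketch of recovering the two centers the Oculometer tracked: under
# near-IR illumination the pupil appears bright and the corneal glint
# brighter still, so two intensity thresholds plus centroids suffice.
# Thresholds are arbitrary assumptions for illustration.
import numpy as np

def pupil_and_glint_centers(ir_frame, pupil_thresh=180, glint_thresh=240):
    """ir_frame: 2-D uint8 near-IR image. Returns (pupil_xy, glint_xy)."""
    def centroid(mask):
        ys, xs = np.nonzero(mask)
        if xs.size == 0:
            return None
        return xs.mean(), ys.mean()

    glint = centroid(ir_frame >= glint_thresh)        # overriding highlight
    pupil = centroid((ir_frame >= pupil_thresh) &     # bright pupil region,
                     (ir_frame < glint_thresh))       # glint pixels excluded
    return pupil, glint
```

    The pupil-to-glint offset varies with gaze direction, which is the relation a frame-to-frame tracking loop like the PDP-6 program can exploit.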

    EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays

    While gaze holds a lot of promise for hands-free interaction with public displays, remote eye trackers with their confined tracking box restrict users to a single stationary position in front of the display. We present EyeScout, an active eye tracking system that combines an eye tracker mounted on a rail system with a computational method to automatically detect and align the tracker with the user's lateral movement. EyeScout addresses key limitations of current gaze-enabled large public displays by offering two novel gaze-interaction modes for a single user: in "Walk then Interact" the user can walk up to an arbitrary position in front of the display and interact, while in "Walk and Interact" the user can interact even while on the move. We report on a user study showing that EyeScout is well perceived by users, extends a public display's sweet spot into a sweet line, and reduces gaze interaction kick-off time to 3.5 seconds, a 62% improvement over state-of-the-art solutions. We discuss sample applications that demonstrate how EyeScout can enable position- and movement-independent gaze interaction with large public displays.
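    The abstract states the control problem (keep a rail-mounted tracker aligned with the user's lateral movement) but not its solution; a minimal proportional-control sketch of such a loop follows. The hardware bindings read_user_x, read_carriage_x, and set_carriage_velocity and all constants are hypothetical, not EyeScout's actual method.

```python
# Hypothetical proportional control loop keeping a rail-mounted eye
# tracker aligned with the user's lateral position. All hardware
# bindings and constants below are assumptions for illustration.
import time

KP = 2.0      # proportional gain: carriage m/s per metre of error
V_MAX = 1.0   # carriage speed limit, m/s
DT = 0.02     # 50 Hz control rate

def follow_user(read_user_x, read_carriage_x, set_carriage_velocity):
    """Callbacks return/accept positions in metres and velocity in m/s."""
    while True:
        error = read_user_x() - read_carriage_x()  # lateral offset
        v = max(-V_MAX, min(V_MAX, KP * error))    # clamped P-control
        set_carriage_velocity(v)
        time.sleep(DT)
```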

    Enriching student experience through access to novel technology


    Towards a human eye behavior model by applying Data Mining Techniques on Gaze Information from IEC

    In this paper, we first present what Interactive Evolutionary Computation (IEC) is and briefly describe how we have combined this artificial intelligence technique with an eye tracker for visual optimization. Next, in order to correctly parameterize our application, we present results from applying data mining techniques to gaze information from experiments conducted on about 80 human subjects.
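    The abstract does not specify how raw gaze information was prepared for data mining, but a standard first step is grouping samples into fixations, for instance with dispersion-threshold identification (I-DT). A generic sketch with illustrative thresholds, not the paper's pipeline:

```python
# Generic dispersion-threshold (I-DT) fixation detection: grow a window
# of gaze samples while its spatial dispersion stays small, and keep
# windows that last long enough. Thresholds are illustrative.
def idt_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """samples: time-ordered (t, x, y) tuples.
    Returns (t_start, t_end, cx, cy) fixations."""
    fixations, i = [], 0
    while i < len(samples):
        j = i
        # extend the window while its bounding box stays tight enough
        while j + 1 < len(samples):
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if samples[j][0] - samples[i][0] >= min_duration:
            win = samples[i:j + 1]
            cx = sum(s[1] for s in win) / len(win)
            cy = sum(s[2] for s in win) / len(win)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations
```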

    Eye Tracking Consumer Purchase Behavior Within Physical and Virtual Environments

    Understanding how consumers observe and make purchase decisions within a retail context is now both accessible and efficient through eye tracking. Eye tracking package design aesthetics helps us understand and predict what consumers are looking at and how likely a package is to be selected. Typically, this research is conducted in an immersive retail setting where consumers can shop as they would in a normal store-shopping context: a store is stocked with products, and a participant shops throughout it while wearing an eye tracker that gathers data on where their attention fixates within a given set of shelves. Although a physical store provides the most realistic context, a virtual store could offer a more economical, cost-effective, and customizable way to measure consumer visual attention to packaging design aesthetics. Beginning with the CUshop™ Consumer Experience Laboratory, a virtual store design and context was established by replicating the existing fixtures in CUshop™. Using the virtual technology available at the Sonoco Institute of Packaging Design and Graphics, a digital replication of CUshop™ was created, starting with 3D modeling the store and generating the exact content to be displayed using real-time rendering software. To investigate the process of measuring consumer attention in each environment, the same study was conducted in both stores, examining shelf performance of eleven different barbecue sauce brands. Gaze data, travel time, purchase decisions, and presence scores from a modified Witmer-Singer survey helped demonstrate the feasibility of gathering valid results from a virtual store context. Results indicated that there was not enough evidence to establish a comparison between the physical and virtual store experiments. Presence scores also did not indicate significant differences between the two store environments. Analysis suggests that with a larger participant population and more immersive hardware, such as head-mounted displays, eye tracking in virtual stores could be a valid process to complement studies already conducted in real store contexts.
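    The abstract does not detail how gaze data was summarized, but shelf studies like this typically reduce fixations to dwell time per area of interest (AOI), for example one AOI per product facing. A minimal sketch under that assumption, with hypothetical AOI rectangles and fixation format:

```python
# Hypothetical dwell-time aggregation: map fixations onto rectangular
# areas of interest (AOIs), e.g. one per product facing on the shelf.
# AOI coordinates and the fixation format are assumptions.
def dwell_per_aoi(fixations, aois):
    """fixations: (t_start, t_end, x, y); aois: {name: (x0, y0, x1, y1)}."""
    dwell = {name: 0.0 for name in aois}
    for t0, t1, x, y in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[name] += t1 - t0
                break  # AOIs assumed non-overlapping
    return dwell

# Example: two fixations, two facings
aois = {"sauce_A": (0, 0, 120, 200), "sauce_B": (120, 0, 240, 200)}
print(dwell_per_aoi([(0.0, 0.4, 60, 100), (0.5, 0.8, 180, 90)], aois))
```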

    Adaptive Sampling for Low Latency Vision Processing
