
    A Real-Time Video-based Eye Tracking Approach for Driver Attention Study

    Knowing the driver's point of gaze has significant potential to enhance driving safety, and eye movements can serve as an indicator of a driver's attention state. However, the primary obstacle to integrating eye gaze into today's large-scale, real-world driving attention studies is the lack of a reliable, low-cost eye-tracking system. In this paper, we investigate such a real-time system for collecting a driver's eye gaze in a real-world driving environment. A novel eye-tracking approach is proposed, based on a low-cost head-mounted eye tracker. Our approach first detects the corneal reflection and pupil edge points, and then fits an ellipse to those points. The proposed approach works under varying illumination and driving conditions with a simple, inexpensive head-mounted eye tracker, and can therefore be used widely in large-scale experiments. The experimental results show that our approach can reliably estimate eye position with an average accuracy of 0.34 degrees of visual angle in indoor experiments and 2--5 degrees in real driving environments.
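    As a rough illustration of the pipeline this abstract describes (corneal-reflection detection followed by an ellipse fit to pupil edge points), the Python/OpenCV sketch below shows one way such a step could look. The thresholds, blur size, and the assumption of a grayscale IR eye-camera frame are illustrative assumptions, not the authors' actual parameters.

```python
# Minimal sketch of a corneal-reflection + pupil-ellipse step (assumed parameters).
import cv2
import numpy as np

def detect_pupil_and_glint(eye_gray):
    """eye_gray: single-channel eye-camera frame (assumed IR-illuminated)."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)

    # Corneal reflection (glint): the brightest spot in the smoothed image.
    _, _, _, glint_xy = cv2.minMaxLoc(blurred)

    # Pupil: a dark blob; threshold, keep the largest contour, and fit an
    # ellipse to its edge points.
    _, dark = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None, glint_xy
    pupil_contour = max(contours, key=cv2.contourArea)
    if len(pupil_contour) < 5:          # cv2.fitEllipse needs at least 5 points
        return None, glint_xy
    pupil_ellipse = cv2.fitEllipse(pupil_contour)   # ((cx, cy), (w, h), angle)

    # The pupil-center-to-glint vector would then feed a calibrated gaze mapping.
    return pupil_ellipse, glint_xy
```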

    Owl and Lizard: Patterns of Head Pose and Eye Pose in Driver Gaze Classification

    Accurate, robust, inexpensive gaze tracking in the car can help keep a driver safe by facilitating the more effective study of how to improve (1) vehicle interfaces and (2) the design of future Advanced Driver Assistance Systems. In this paper, we estimate head pose and eye pose from monocular video using methods developed extensively in prior work, and we ask two new questions. First, how much better can we classify driver gaze using head and eye pose versus head pose alone? Second, are there individual-specific gaze strategies that strongly correlate with how much gaze classification improves with the addition of eye pose information? We answer these questions by evaluating data drawn from an on-road study of 40 drivers. The main insight of the paper is conveyed through the analogy of an "owl" and a "lizard", which describes the degree to which the eyes and the head move when shifting gaze. When the head moves a lot ("owl"), little classification improvement is attained by estimating eye pose on top of head pose. On the other hand, when the head stays still and only the eyes move ("lizard"), classification accuracy increases significantly when eye pose is added. We characterize how that accuracy varies between people, gaze strategies, and gaze regions.
    Comment: Accepted for publication in IET Computer Vision. arXiv admin note: text overlap with arXiv:1507.0476
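    The paper's central comparison (gaze-region classification from head pose alone versus head pose plus eye pose) can be sketched as a simple experiment. The feature layout, the random-forest classifier, and the cross-validation setup below are assumptions for illustration, not the authors' pipeline.

```python
# Sketch of the head-pose-only vs. head+eye-pose comparison (assumed setup).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def compare_gaze_classifiers(head_pose, eye_pose, gaze_region_labels):
    """head_pose: (N, 3) yaw/pitch/roll; eye_pose: (N, 2) eye yaw/pitch;
    gaze_region_labels: (N,) discrete gaze regions (road, mirrors, ...)."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)

    acc_head = cross_val_score(clf, head_pose, gaze_region_labels, cv=5).mean()
    acc_both = cross_val_score(
        clf, np.hstack([head_pose, eye_pose]), gaze_region_labels, cv=5
    ).mean()

    # A large gap suggests a "lizard" driver (eyes move, head stays still);
    # a small gap suggests an "owl" driver (the head carries most of the gaze shift).
    return acc_head, acc_both
```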

    Video surveillance for monitoring driver's fatigue and distraction

    Fatigue and distraction in drivers represent a serious risk to road safety. For both types of driver behavior problems, image analysis of the eyes, mouth, and head movements provides valuable information. In this paper we present a system for monitoring fatigue and distraction in drivers by evaluating their performance using image processing. We extract visual features related to nods, yawns, eye closure and opening, and mouth movements to detect fatigue as well as to identify diversion of attention from the road. Evaluating four video sequences with different drivers, we achieve an average sensitivity and specificity of 98.3% and 98.8% for detection of driver fatigue, and 97.3% and 99.2% for detection of driver distraction.
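    One of the cues listed above, eye closure, is commonly summarized per frame as an eye aspect ratio and over time as a PERCLOS-style closure fraction. The sketch below assumes six eye-contour landmarks per eye and illustrative thresholds; it is not the authors' implementation.

```python
# Sketch of an eye-closure cue: per-frame eye aspect ratio (EAR) and a
# PERCLOS-style rolling closure fraction (assumed landmark layout and thresholds).
from collections import deque
import numpy as np

def eye_aspect_ratio(eye_pts):
    """eye_pts: (6, 2) array of eye-contour landmarks, ordered around the outline."""
    vertical = (np.linalg.norm(eye_pts[1] - eye_pts[5])
                + np.linalg.norm(eye_pts[2] - eye_pts[4]))
    horizontal = np.linalg.norm(eye_pts[0] - eye_pts[3])
    return vertical / (2.0 * horizontal)

class ClosureMonitor:
    """Tracks the fraction of recent frames with closed eyes (PERCLOS-like)."""
    def __init__(self, window=90, ear_threshold=0.2):
        self.window = deque(maxlen=window)   # e.g. ~3 s at 30 fps
        self.ear_threshold = ear_threshold

    def update(self, ear):
        self.window.append(ear < self.ear_threshold)
        return sum(self.window) / len(self.window)   # closure fraction in [0, 1]
```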