
    A Review on Implementation of Real Time Image Processing for a Human Eye Computer Interaction System

    People with physical disabilities cannot fully enjoy the benefits provided by computer systems, because the conventional mouse and keyboard were designed for able-bodied users. Human eye-computer interaction is therefore important for reducing the communication barrier between man and machine. The main aim of the proposed system is to design and implement a human-computer interaction system that tracks the direction of the human gaze; pupil detection and tracking is a key step in developing such a system. The system identifies the gaze direction of the user's eye (right, left, up, or down) using contactless devices, and is based on iris tracking. The iris, an eye feature that is circular in shape and can be detected easily, is widely used as the starting point for detection and tracking.
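    The direction-classification step described in the abstract can be sketched as follows. This is only an illustrative assumption of how pupil position within the detected eye region maps to the four directions; the `classify_gaze` helper, its coordinate convention, and the dead-zone `margin` are not taken from the paper.

    ```python
    def classify_gaze(pupil_x, pupil_y, eye_w, eye_h, margin=0.15):
        """Classify gaze direction from the pupil centre's position inside
        the detected eye region (top-left origin, pixel coordinates).
        `margin` is a hypothetical dead-zone fraction around the centre."""
        # Normalise pupil coordinates to [-0.5, 0.5] relative to the eye centre
        nx = pupil_x / eye_w - 0.5
        ny = pupil_y / eye_h - 0.5
        if abs(nx) <= margin and abs(ny) <= margin:
            return "center"
        # Pick the axis with the larger deviation from centre
        if abs(nx) > abs(ny):
            return "right" if nx > 0 else "left"
        return "down" if ny > 0 else "up"
    ```

    A real system would feed this with per-frame pupil coordinates from the iris detector; the dead zone keeps small fixational eye movements from being misread as commands.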

    Implementation of Real Time Image Processing for a Human Eye Computer Interaction System

    People with physical disabilities cannot fully enjoy the benefits provided by computer systems, because the conventional mouse and keyboard were designed for able-bodied users. A number of barriers have stood in the way of integrating eye tracking into everyday applications, including the intrusiveness, robustness, availability, and price of eye-tracking systems. The goal of this thesis is to lower these barriers so that eye tracking can be used to enhance current human-computer interfaces. The main aim of the proposed system is to design and implement a human-computer interaction system, based on iris tracking, that follows the direction of the human gaze; pupil detection and tracking is a key step in developing such a system. The system identifies the gaze direction of the user's eye (right, left, up, or down) and introduces a novel idea for controlling the computer mouse with the eyes: cursor movement follows the position where the user's eyesight focuses, and mouse clicks are simulated by blinking.
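    The blink-to-click idea can be sketched as a small per-frame state machine: a deliberate blink (eye closed for several consecutive frames, then reopened) fires one click, while very short natural blinks and long eye closures are ignored. The class name and the frame thresholds below are illustrative assumptions, not the thesis's implementation.

    ```python
    class BlinkClicker:
        """Turn per-frame eye-open/closed observations into click events."""

        def __init__(self, min_closed_frames=3, max_closed_frames=15):
            self.min_closed = min_closed_frames  # shorter runs: natural blink, ignore
            self.max_closed = max_closed_frames  # longer runs: eyes resting, ignore
            self.closed_run = 0                  # consecutive closed-eye frames seen

        def update(self, eye_closed):
            """Feed one frame's eye state; return True when a click fires."""
            if eye_closed:
                self.closed_run += 1
                return False
            # Eye just reopened: fire only for a deliberate-length closure
            fired = self.min_closed <= self.closed_run <= self.max_closed
            self.closed_run = 0
            return fired
    ```

    At a typical 30 fps camera rate the default thresholds correspond to closures of roughly 0.1 to 0.5 seconds; both bounds would need tuning per user and per camera.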

    Real-Time Gaze Tracking with a Consumer-Grade Video Camera

    Eye gaze can be a rich source of information for identifying the particular interests of human users. Eye gaze tracking has been widely used across research areas in recent years, for example in psychology, in visual system design, and to improve user interaction with computer systems. In this paper, we present an IR-based gaze tracking framework that can easily be coupled to common user applications and allows for real-time gaze estimation. Compared to other gaze tracking systems, ours uses only affordable consumer-grade hardware and still achieves fair accuracy. To evaluate the usability of our gaze tracking system, we performed a user study with persons of different genders and ethnicities.

    Eye Gaze Tracking for Human Computer Interaction

    With a growing number of computer devices around us, and the increasing time we spend interacting with such devices, we are strongly interested in finding new interaction methods which ease the use of computers or increase interaction efficiency. Eye tracking seems to be a promising technology to achieve this goal. This thesis researches interaction methods based on eye-tracking technology. After a discussion of the limitations of the eyes regarding accuracy and speed, including a general discussion of Fitts' law, the thesis follows three different approaches to utilizing eye tracking for computer input. The first approach researches eye gaze as a pointing device in combination with a touch sensor for multimodal input and presents a method using a touch-sensitive mouse. The second approach examines people's ability to perform gestures with the eyes for computer input and the separation of gaze gestures from natural eye movements. The third approach deals with the information inherent in the movement of the eyes and its application to assist the user. The thesis presents a usability tool for recording interaction and gaze activity. It also describes algorithms for reading detection. All approaches present results based on user studies conducted with prototypes developed for the purpose.
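    The Fitts' law discussion can be made concrete with the standard Shannon formulation, MT = a + b * log2(D/W + 1), which predicts movement time from target distance D and target width W. The coefficients `a` and `b` below are placeholder values, not numbers from the thesis; they are normally fitted per input device from pointing experiments.

    ```python
    import math

    def fitts_movement_time(distance, width, a=0.1, b=0.15):
        """Predict pointing movement time in seconds with the Shannon
        formulation of Fitts' law. `a` (intercept) and `b` (slope, s/bit)
        are device-dependent regression coefficients; the defaults here
        are illustrative placeholders only."""
        index_of_difficulty = math.log2(distance / width + 1)  # in bits
        return a + b * index_of_difficulty
    ```

    The formulation helps explain a known limitation of gaze pointing: eye-tracker accuracy puts a floor on the usable target width W, so shrinking on-screen targets raises the index of difficulty faster for gaze than for a mouse.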

    Pupil Position by an Improved Technique of YOLO Network for Eye Tracking Application

    Eye gaze tracking is the real-time collection of information about a person's eye movements and the direction of their look. Eye gaze trackers are devices that measure the locations of the pupils to detect and track changes in the direction of the user's gaze. There are numerous applications for analyzing eye movements, from psychological studies to human-computer interaction systems and interactive robotics controls. Real-time eye gaze monitoring requires an accurate and reliable iris center localization technique. In this study, deep learning is used to construct a pupil tracking approach for wearable eye trackers. The approach uses the deep-learning You Only Look Once (YOLO) model to accurately estimate and predict the pupil's central location under bright, natural-light conditions (visible to the naked eye). Testing pupil tracking performance with the upgraded YOLOv7 yields an accuracy rate of 98.50% and a precision rate close to 96.34%, implemented in PyTorch.
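    A detector in the YOLO family predicts a bounding box per object, so a pupil-localization pipeline typically reduces the predicted pupil box to its centre point and scores it against annotated centres. The sketch below shows that reduction and a simple within-tolerance accuracy metric; the function names, the top-left box convention, and the pixel tolerance are illustrative assumptions, not the paper's exact evaluation protocol.

    ```python
    def pupil_center_from_bbox(x, y, w, h):
        """Reduce a predicted pupil bounding box (top-left corner x, y
        plus width and height, in pixels) to its centre point."""
        return (x + w / 2.0, y + h / 2.0)

    def detection_accuracy(predictions, ground_truth, tol=5.0):
        """Fraction of frames where the predicted pupil centre lies within
        `tol` pixels (Euclidean distance) of the annotated centre."""
        hits = sum(
            1
            for (px, py), (gx, gy) in zip(predictions, ground_truth)
            if ((px - gx) ** 2 + (py - gy) ** 2) ** 0.5 <= tol
        )
        return hits / len(ground_truth)
    ```

    Reported accuracy figures like the 98.50% above depend heavily on the chosen pixel tolerance, which is why papers normally state it alongside the score.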

    GazeDrone: Mobile Eye-Based Interaction in Public Space Without Augmenting the User

    Gaze interaction holds a lot of promise for seamless human-computer interaction. At the same time, current wearable mobile eye trackers require user augmentation that negatively impacts natural user behavior while remote trackers require users to position themselves within a confined tracking range. We present GazeDrone, the first system that combines a camera-equipped aerial drone with a computational method to detect sidelong glances for spontaneous (calibration-free) gaze-based interaction with surrounding pervasive systems (e.g., public displays). GazeDrone does not require augmenting each user with on-body sensors and allows interaction from arbitrary positions, even while moving. We demonstrate that drone-supported gaze interaction is feasible and accurate for certain movement types. It is well-perceived by users, in particular while interacting from a fixed position as well as while moving orthogonally or diagonally to a display. We present design implications and discuss opportunities and challenges for drone-supported gaze interaction in public