32,054 research outputs found

    Gaze Controlled Human-Computer Interface

    The goal of the Gaze Controlled Human Computer Interface project is to design and construct a non-invasive gaze-tracking system that determines where a user is looking on a computer screen in real time. To accomplish this, a fixed illumination source consisting of Infrared (IR) Light Emitting Diodes (LEDs) is used to produce corneal reflections on the user’s eyes. These reflections are captured with a video camera and compared to the relative location of the user’s pupils. From this comparison, a correlation matrix can be created and the approximate location on the screen at which the user is looking can be determined. The final objective is to allow the user to manipulate a cursor on the computer screen simply by looking at different boxes in a grid on the monitor. The project includes the design of the hardware setup to provide a suitable environment for glint detection, image processing of the user’s eyes to determine pupil location, the implementation of a probabilistic algorithm to determine an appropriate matrix transformation, and performance analysis with various users.
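
    As a rough illustration of this kind of calibration and mapping step, the sketch below fits a least-squares second-order polynomial from pupil-glint difference vectors to screen coordinates. The function names, design matrix, and polynomial order are assumptions chosen for illustration; the project's actual correlation matrix and probabilistic algorithm are not specified here.

        # Hypothetical sketch: map pupil-glint difference vectors to screen
        # coordinates with a least-squares polynomial fit, assuming a short
        # calibration in which the user fixates known grid points.
        import numpy as np

        def fit_gaze_mapping(pupil_glint, screen_pts):
            """pupil_glint: (N, 2) vectors (pupil centre minus glint centre).
            screen_pts: (N, 2) known screen coordinates fixated during calibration."""
            x, y = pupil_glint[:, 0], pupil_glint[:, 1]
            # Second-order polynomial design matrix (a common choice, assumed here).
            A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
            coeffs, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
            return coeffs                                # (6, 2) mapping matrix

        def predict_gaze(coeffs, vec):
            """Estimate the (screen_x, screen_y) a new pupil-glint vector points at."""
            x, y = vec
            return np.array([1.0, x, y, x * y, x**2, y**2]) @ coeffs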

    Manoeuvring drone (Tello Talent) using eye gaze and/or finger gestures

    The project aims to combine hand and eye input to control a Tello Talent drone using computer vision, machine learning, and an eye-tracking device for gaze detection and interaction. Its main purpose is gaming, experimentation, and education for the coming generation; it is also very useful for people who cannot use their hands, since they can manoeuvre the drone with their eye movements, and hopefully this will bring them some fun. The idea is inspired by the progress of innovative technologies such as machine learning, computer vision, and object detection, which offer a wide field of applications in diverse domains; many researchers are improving and inventing new intelligent ways of controlling drones by combining computer vision, machine learning, artificial intelligence, and related techniques. The project can help anyone, even people without prior knowledge of programming, computer vision, or eye-tracking theory: they learn the basics of drone concepts, object detection, and programming, integrate the different hardware and software involved, and then play. As a final objective, they are able to build a simple application that controls the drone using movements of the hands, the eyes, or both, taking into account the operating conditions and safety requirements specified by the manufacturers of the drone and the eye-tracking device. The Tello Talent drone is built around a series of features, functions, and scripts that have already been developed, are embedded in the autopilot memory, and are accessible to users via an SDK protocol. The SDK serves as an easy guide to developing simple and complex applications and allows the user to develop a variety of flight-mission programs. Several experiments were carried out to determine which scenario best detects hand movements and extracts the key points in real time on a computer with low computing power. As a result, I found that Google's artificial intelligence research group offers an open-source platform dedicated to developing this kind of application: MediaPipe, a customizable machine-learning solution for live streaming video. In this project, MediaPipe and the eye-tracking module are the fundamental tools for developing and realizing the application.
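
    The sketch below illustrates one way such an integration could look, using MediaPipe Hands to track the index fingertip from a webcam and the open-source djitellopy wrapper to send velocity commands over the Tello SDK. The control mapping, gains, and key bindings are illustrative assumptions rather than the project's actual code.

        # Hypothetical sketch: steer a Tello with the index fingertip, using
        # MediaPipe Hands for detection and djitellopy (one common wrapper
        # around the Tello SDK) for the drone commands.
        import cv2
        import mediapipe as mp
        from djitellopy import Tello

        tello = Tello()
        tello.connect()
        tello.takeoff()

        hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.6)
        cap = cv2.VideoCapture(0)                    # webcam watching the pilot's hand

        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                tip = results.multi_hand_landmarks[0].landmark[8]   # index fingertip
                # Map the fingertip offset from the image centre to the -100..100
                # left/right and up/down velocities expected by send_rc_control.
                lr = int((tip.x - 0.5) * 200)
                ud = int((0.5 - tip.y) * 200)
                tello.send_rc_control(lr, 0, ud, 0)
            else:
                tello.send_rc_control(0, 0, 0, 0)    # hover when no hand is seen
            cv2.imshow("hand tracking", frame)
            if cv2.waitKey(1) & 0xFF == ord('q'):    # press q to land and quit
                break

        tello.land()
        cap.release()
        cv2.destroyAllWindows()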

    Unobtrusive and pervasive video-based eye-gaze tracking

    Eye-gaze tracking has long been considered a desktop technology that finds its use inside the traditional office setting, where the operating conditions may be controlled. Nonetheless, recent advancements in mobile technology and a growing interest in capturing natural human behaviour have motivated an emerging interest in tracking eye movements within unconstrained real-life conditions, referred to as pervasive eye-gaze tracking. This critical review focuses on emerging passive and unobtrusive video-based eye-gaze tracking methods in recent literature, with the aim to identify different research avenues that are being followed in response to the challenges of pervasive eye-gaze tracking. Different eye-gaze tracking approaches are discussed in order to bring out their strengths and weaknesses, and to identify any limitations, within the context of pervasive eye-gaze tracking, that have yet to be considered by the computer vision community.

    TransparentHMD: Revealing the HMD User's Face to Bystanders

    While the eyes are very important in human communication, once a user puts on a head-mounted display (HMD), the face is obscured from the outside world's perspective. This leads to communication problems when bystanders approach or collaborate with an HMD user. We introduce TransparentHMD, which employs a head-coupled perspective technique to produce the illusion of a transparent HMD for bystanders. We created a self-contained system based on a mobile device mounted on the HMD with its screen facing bystanders. By tracking the relative position of the bystander using the smartphone's camera, we render an adapting perspective view in real time that creates the illusion of a transparent HMD. By revealing the user's face to bystanders, our easy-to-implement system opens up opportunities to investigate a plethora of research questions, particularly related to collaborative VR systems.
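
    A minimal sketch of the head-coupled perspective idea follows, assuming the bystander's eye position relative to the phone screen has already been estimated from the camera image. It builds the standard off-axis (asymmetric-frustum) projection matrix, which is one common way to realise such a rendering and not necessarily the paper's exact pipeline.

        # Hypothetical sketch: off-axis projection for head-coupled perspective.
        # The rendered face appears fixed "behind" the display because the
        # frustum is re-derived from the viewer's position every frame.
        import numpy as np

        def head_coupled_projection(eye, screen_w, screen_h, near=0.01, far=10.0):
            """eye: (x, y, z) of the bystander in metres, origin at the screen centre,
            x right, y up, z out of the screen towards the viewer."""
            ex, ey, ez = eye
            # Frustum edges on the near plane, scaled from the screen edges by near/ez.
            left   = (-screen_w / 2 - ex) * near / ez
            right  = ( screen_w / 2 - ex) * near / ez
            bottom = (-screen_h / 2 - ey) * near / ez
            top    = ( screen_h / 2 - ey) * near / ez
            return np.array([
                [2 * near / (right - left), 0, (right + left) / (right - left), 0],
                [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
                [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
                [0, 0, -1, 0],
            ])

        # The scene must also be translated by -eye before this matrix is applied,
        # so that the projection is taken from the bystander's point of view.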

    GazeTouchPass: Multimodal Authentication Using Gaze and Touch on Mobile Devices

    We propose a multimodal scheme, GazeTouchPass, that combines gaze and touch for shoulder-surfing-resistant user authentication on mobile devices. GazeTouchPass allows passwords with multiple switches between input modalities during authentication. This requires attackers to simultaneously observe the device screen and the user's eyes to find the password. We evaluate the security and usability of GazeTouchPass in two user studies. Our findings show that GazeTouchPass is usable and significantly more secure than single-modal authentication against basic and even advanced shoulder-surfing attacks.
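
    The sketch below illustrates how such a multimodal password could be represented and how the number of modality switches might be counted; the token names and helper functions are hypothetical and not taken from the paper.

        # Hypothetical sketch: a GazeTouchPass-style password as an ordered sequence
        # of tokens from two modalities, e.g. gaze directions ("L", "R") and touch
        # digits ("0".."9").
        GAZE_TOKENS = {"L", "R"}

        def modality(token):
            return "gaze" if token in GAZE_TOKENS else "touch"

        def modality_switches(password):
            """Count how often consecutive tokens change modality; more switches force
            an observer to split attention between the screen and the user's eyes."""
            return sum(modality(a) != modality(b) for a, b in zip(password, password[1:]))

        def authenticate(entered, stored):
            return entered == stored                 # exact ordered match of all tokens

        secret = ["L", "7", "R", "2"]                # gaze-touch-gaze-touch example
        print(modality_switches(secret))             # -> 3
        print(authenticate(["L", "7", "R", "2"], secret))   # -> True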