
    Interacting with objects in the environment using gaze tracking glasses and speech

    This work explores the combination of gaze and speech to interact with objects in the environment. A head-mounted wireless gaze tracker in the form of gaze tracking glasses is used for mobile monitoring of a subject's point of regard in the surrounding environment. In the proposed system, a mobile subject gazes at an object of interest in the environment, which opens an interaction window with that object; a specific interaction command is then given to the object using speech. The gaze tracking glasses were built from low-cost hardware consisting of a safety-glasses frame and wireless eye tracking and scene cameras. An open source gaze estimation algorithm is used for eye tracking and estimation of the user's gaze. The Windows Speech Recognition engine is used to recognize voice commands, and a visual marker recognition library identifies objects in the environment through the scene camera. Combining these elements, the resulting system lets a subject move freely in an environment, select the object they want to interact with using gaze (identification), and transmit a control command to it by uttering a speech command (control).
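
    As a rough illustration of the identification-then-control flow the abstract describes, the Python sketch below models dwell-based gaze selection of a marker-tagged object followed by a speech command routed to that object. The marker IDs, object names, command vocabulary, dwell threshold, and the simulated gaze and speech streams are hypothetical placeholders, not the authors' implementation (which relies on a wireless scene camera, a visual marker recognition library, and the Windows Speech Recognition engine).

    # A minimal sketch of the gaze-then-speech interaction loop: sustained gaze on a
    # marker "selects" the object (identification), and a recognized phrase is then
    # routed to that object as a command (control). All names and inputs are illustrative.

    OBJECTS = {  # environment objects keyed by the visual marker ID seen by the scene camera
        7: {"name": "desk lamp", "commands": {"turn on", "turn off"}},
        12: {"name": "fan", "commands": {"turn on", "turn off", "speed up"}},
    }

    DWELL_TIME = 1.0  # seconds of sustained gaze needed to open an interaction window

    def interaction_loop(gaze_events, next_phrase):
        """gaze_events: iterable of (timestamp, marker_id or None) fixation samples.
        next_phrase: callable returning the latest recognized phrase, or None."""
        selected = None        # object whose interaction window is currently open
        dwell_start = None     # when the current fixation on a marker began
        last_marker = None

        for timestamp, marker_id in gaze_events:
            # Restart the dwell timer whenever the fixated marker changes.
            if marker_id != last_marker:
                dwell_start, last_marker = timestamp, marker_id

            # Open the interaction window once the dwell threshold is reached.
            if marker_id in OBJECTS and timestamp - dwell_start >= DWELL_TIME:
                if selected is not OBJECTS[marker_id]:
                    selected = OBJECTS[marker_id]
                    print(f"Interaction window opened for: {selected['name']}")

            # While an object is selected, route recognized speech to it as a command.
            phrase = next_phrase()
            if selected and phrase in selected["commands"]:
                print(f"-> {selected['name']}: {phrase}")

    # Simulated input: the subject fixates marker 7 for two seconds, then says "turn on".
    gaze = [(t * 0.25, 7) for t in range(8)]
    speech_queue = [None] * 6 + ["turn on", None]
    interaction_loop(gaze, lambda: speech_queue.pop(0) if speech_queue else None)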