
Environmental control by remote eye tracking

By Fangmin Shi and Alastair G. Gale


This is a conference paper; the original version is available online.

Eye movement interfacing can be found in some specially designed environmental control systems (ECSs) for people with severe disabilities. Typically, the user sits in front of a computer monitor; their eye gaze direction is detected and controls the cursor position on the screen. The ECS screen usually consists of a number of icons representing different controllable devices, and an eye fixation landing within a pre-defined icon area activates a selection for control. Such systems are widely used in homes, offices, schools, hospitals, and long-term care facilities.

Wellings and Unsworth (1997) demonstrated that user-friendly interface design is the weak link in ECS technology, in particular for severely disabled people. Disabled individuals need straightforward control of their immediate surroundings, so making a detailed menu selection by techniques such as eye–screen interaction can be a difficult and tedious process for some individuals. This situation can be exacerbated by real-world issues, such as eye tracking systems that do not tolerate the user's head movement.

This paper presents a different approach to environmental control using eye gaze selection, in which the control options applicable to a given device are automatically pre-selected by the user directly looking at the device in their environment. This intuitive method therefore minimises the amount of navigation that the user must perform. To date, two main methods have been employed to achieve this direct eye–device control. The initial development, using a head-mounted eye tracker, was previously reported (Shi et al., 2006). This paper describes the subsequent development of the system (Shi et al., 2007) using a remote eye tracker, which is simply situated in front of the user with no need for any attachment to them.
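The icon-based selection mechanism described above (a fixation landing within a pre-defined icon area triggers a selection) can be sketched in a few lines. This is a minimal illustration only, not the authors' implementation: the `Icon` and `DwellSelector` names, the rectangle hit-test, and the dwell-time threshold are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Icon:
    """A controllable-device icon occupying a rectangle on the ECS screen."""
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, gx: float, gy: float) -> bool:
        # True if the gaze point (gx, gy) falls inside this icon's area.
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

class DwellSelector:
    """Report a selection once gaze has rested on one icon for dwell_ms."""
    def __init__(self, icons, dwell_ms: int = 800):
        self.icons = icons
        self.dwell_ms = dwell_ms
        self._current = None      # icon the gaze is currently over, if any
        self._entered_at = None   # timestamp when the gaze entered it

    def update(self, gx: float, gy: float, t_ms: int):
        """Feed one gaze sample; return the selected Icon or None."""
        hit = next((i for i in self.icons if i.contains(gx, gy)), None)
        if hit is not self._current:
            # Gaze moved to a different icon (or off all icons): restart dwell.
            self._current, self._entered_at = hit, t_ms
            return None
        if hit is not None and t_ms - self._entered_at >= self.dwell_ms:
            self._entered_at = t_ms  # reset so one fixation fires only once
            return hit
        return None
```

A caller would feed gaze samples from the tracker at its frame rate, e.g. `selector.update(gx, gy, timestamp_ms)`, and act whenever a non-`None` icon is returned.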

Topics: eye-tracking, environmental control, object recognition, disabled people
Publisher: © COGAIN
Year: 2007
