110,986 research outputs found

    The Computer-Controlled Oculometer: A Prototype Interactive Eye Movement Tracking System

    One kind of eye movement tracking device which has great potential is the digital computer-controlled Oculometer, an instrument which non-invasively measures the point of regard of the subject, as well as pupil diameter and blink occurrence. In conjunction with a computer-generated display which can change in real time as a function of the subject's eye motions, the computer-controlled Oculometer makes possible a variety of interactive measurement and control systems. Practical applications of such schemes have had to await the development of an instrument design which does not inconvenience the subject, and which conveniently interfaces with a digital computer (see ref. 1). This report describes an Oculometer subsystem and an eye-tracking/control program designed for use with the PDP-6 computer of the MIT Project MAC Artificial Intelligence Group. The Oculometer electro-optic subsystem utilizes near-infrared light reflected specularly off the front surface of the subject's cornea and diffusely off the retina, producing a bright pupil with an overriding corneal highlight. An electro-optic scanning-aperture vidissector within the unit, driven by a digital eye-tracking algorithm programmed into the PDP-6 computer, detects and tracks the centers of the corneal highlight and the bright pupil to give eye movement measurements. A computer-controlled, moving-mirror head motion tracker directly coupled to the vidissector tracker permits the subject reasonable freedom of movement. Various applications of this system, which are suggested by the work reported here, include: (a) using the eye as a control device, (b) recording eye fixation and exploration patterns, (c) game playing, (d) training machines, and (e) psychophysiological testing and recording.
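The bright-pupil/corneal-highlight scheme the abstract describes can be sketched roughly as follows. This is a minimal illustration only: the linear calibration, the gain and offset values, and the coordinate conventions are assumptions for the sketch, not the report's actual PDP-6 algorithm.

```python
# Sketch of pupil-centre / corneal-highlight gaze estimation.
# The corneal highlight stays roughly fixed relative to the illuminator
# while the pupil centre moves with the eye, so their difference vector
# varies with gaze direction and is largely insensitive to small head shifts.

def gaze_vector(pupil_center, corneal_highlight):
    """Difference vector between the tracked pupil centre and the
    corneal highlight, both in image (vidissector) coordinates."""
    px, py = pupil_center
    cx, cy = corneal_highlight
    return (px - cx, py - cy)

def point_of_regard(vec, gain=(120.0, 120.0), offset=(512.0, 384.0)):
    """Map the pupil-highlight vector to display coordinates with a
    simple linear calibration (gain/offset values are placeholders
    that a per-subject calibration would determine)."""
    vx, vy = vec
    gx, gy = gain
    ox, oy = offset
    return (ox + gx * vx, oy + gy * vy)
```

A zero vector (pupil centre coincident with the highlight) maps to the calibration offset, i.e. the assumed display centre.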

    Investigating the effectiveness of an efficient label placement method using eye movement data

    This paper focuses on improving the efficiency and effectiveness of dynamic and interactive maps in relation to the user. A label placement method with improved algorithmic efficiency is presented. Since this algorithm influences the actual placement of the name labels on the map, we tested whether the more efficient algorithm also creates more effective maps, that is, how well the information is processed by the user. We tested 30 participants while they were working on a dynamic and interactive map display. Their task was to locate geographical names on each of the presented maps. Their eye movements were registered together with the time at which a given label was found. The gathered data reveal no difference in the users' response times, nor in the number and duration of fixations, between the two map designs. The results of this study show that the efficiency of label placement algorithms can be improved without disturbing the user's cognitive map. Consequently, we created a more efficient map without affecting its effectiveness towards the user.
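The comparison the abstract reports, checking whether response times and fixation counts differ between the two map designs, is the kind of two-sample test that can be sketched with a Welch t statistic. This is a generic illustration under assumed data, not the paper's actual analysis pipeline or its statistics.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with possibly
    unequal variances: (mean_a - mean_b) / sqrt(var_a/n_a + var_b/n_b)."""
    va, vb = variance(a), variance(b)
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / math.sqrt(va / na + vb / nb)

# Hypothetical per-participant response times (seconds) for two designs;
# values are made up purely to show the call.
design_a = [3.1, 2.8, 3.4, 3.0, 2.9]
design_b = [3.0, 3.2, 2.9, 3.1, 3.0]
t_stat = welch_t(design_a, design_b)
```

A t statistic near zero, as the paper's null result suggests, would indicate no detectable difference between the designs.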

    GazeDrone: Mobile Eye-Based Interaction in Public Space Without Augmenting the User

    Gaze interaction holds a lot of promise for seamless human-computer interaction. At the same time, current wearable mobile eye trackers require user augmentation that negatively impacts natural user behavior, while remote trackers require users to position themselves within a confined tracking range. We present GazeDrone, the first system that combines a camera-equipped aerial drone with a computational method to detect sidelong glances for spontaneous (calibration-free) gaze-based interaction with surrounding pervasive systems (e.g., public displays). GazeDrone does not require augmenting each user with on-body sensors and allows interaction from arbitrary positions, even while moving. We demonstrate that drone-supported gaze interaction is feasible and accurate for certain movement types. It is well perceived by users, in particular while interacting from a fixed position as well as while moving orthogonally or diagonally to a display. We present design implications and discuss opportunities and challenges for drone-supported gaze interaction in public.
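The "sidelong glance" detection the abstract mentions can be approximated, for illustration, by thresholding how far the pupil sits from the centre of the detected eye region. The function below is a hypothetical sketch under that assumption; it is not GazeDrone's published method, and the threshold value is a placeholder.

```python
def is_sidelong_glance(pupil_x, eye_left_x, eye_right_x, threshold=0.30):
    """Classify a sidelong glance from the pupil's horizontal offset
    relative to the eye-region centre, normalised by eye width.

    A pupil displaced more than `threshold` of the half-independent
    eye width from centre is treated as looking sideways (e.g. toward
    a display the person is not facing head-on).
    """
    width = eye_right_x - eye_left_x
    centre = (eye_left_x + eye_right_x) / 2.0
    offset = abs(pupil_x - centre) / width
    return offset > threshold
```

In a calibration-free setting like this, a coarse binary cue (glancing vs. not glancing) is what makes spontaneous interaction possible without per-user setup.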

    Pilots’ visual scan pattern and situation awareness in flight operations

    Introduction: Situation awareness (SA) is considered an essential prerequisite for safe flying. If the impact of visual scanning patterns on a pilot’s situation awareness could be identified in flight operations, then eye-tracking tools could be integrated with flight simulators to improve training efficiency. Method: Participating in this research were 18 qualified, mission-ready fighter pilots. The equipment included high-fidelity, fixed-base flight simulators and mobile head-mounted eye-tracking devices to record a subject’s eye movements and SA while performing air-to-surface tasks. Results: There were significant differences in pilots’ percentage of fixation in three operating phases: preparation (M = 46.09, SD = 14.79), aiming (M = 24.24, SD = 11.03), and release and break-away (M = 33.98, SD = 14.46). Also, there were significant differences in pilots’ pupil sizes, which were largest in the aiming phase (M = 27,621, SD = 6390.8), followed by release and break-away (M = 27,173, SD = 5830.46), then preparation (M = 25,710, SD = 6078.79), which was the smallest. Furthermore, pilots with better SA performance showed lower perceived workload (M = 30.60, SD = 17.86), and pilots with poor SA performance showed higher perceived workload (M = 60.77, SD = 12.72). Pilots’ percentage of fixation and average fixation duration among five different areas of interest showed significant differences as well. Discussion: Eye-tracking devices can capture pilots’ visual scan patterns and SA performance in a way that traditional flight simulators alone cannot. Therefore, integrating eye-tracking devices into the simulator may be a useful method for promoting SA training in flight operations, and can provide an in-depth understanding of the mechanism of visual scan patterns and information processing to improve training effectiveness in aviation.
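The "percentage of fixation" metric the study reports per phase and per area of interest can be computed from a fixation log as the share of total fixation time spent in each region. The sketch below assumes a simple list of (area, duration) pairs; the field names and millisecond units are illustrative assumptions, not the study's recording format.

```python
from collections import defaultdict

def fixation_share(fixations):
    """Percentage of total fixation time spent in each area of interest.

    `fixations` is a list of (aoi, duration_ms) pairs, e.g. one entry
    per detected fixation. Returns a dict mapping each AOI to its
    percentage of the summed fixation time.
    """
    totals = defaultdict(float)
    for aoi, duration_ms in fixations:
        totals[aoi] += duration_ms
    grand_total = sum(totals.values())
    return {aoi: 100.0 * t / grand_total for aoi, t in totals.items()}
```

Computing this per operating phase (preparation, aiming, release and break-away) would yield exactly the kind of per-phase fixation percentages the Results section compares.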