30 research outputs found

    Magic Pointing for Eyewear Computers


    TobiiGlassesPySuite: An open-source suite for using the Tobii Pro Glasses 2 in eye-tracking studies

    In this paper we present the TobiiGlassesPySuite, an open-source suite we implemented for using the Tobii Pro Glasses 2 wearable eye tracker in custom eye-tracking studies. We provide a platform-independent solution for controlling the device and for managing the recordings. The software consists of Python modules integrated into a single package, accompanied by sample scripts and recordings. The proposed solution aims to provide methods beyond those of the manufacturer's software, allowing users to make fuller use of the device's capabilities and of the existing software. Our suite is available for download from the repository indicated in the paper and is usable under the terms of the GNU GPL v3.0 license.
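    The suite itself is not reproduced here, but the recording-management workflow it wraps can be sketched with a stand-in class. All names and the device address below are illustrative assumptions, not the suite's actual API:

    ```python
    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical sketch of the workflow such a suite manages: connect to
    # the glasses on their network interface, then start and stop named
    # recordings. Class and method names are stand-ins, not the real API.
    @dataclass
    class GlassesSession:
        address: str                                   # device IP (illustrative)
        recordings: List[str] = field(default_factory=list)
        _current: str = ""

        def start_recording(self, name: str) -> None:
            if self._current:
                raise RuntimeError("a recording is already in progress")
            self._current = name

        def stop_recording(self) -> None:
            if not self._current:
                raise RuntimeError("no recording in progress")
            self.recordings.append(self._current)      # archive the finished recording
            self._current = ""

    session = GlassesSession(address="192.168.71.50")  # address is an example value
    session.start_recording("trial_01")
    session.stop_recording()
    print(session.recordings)  # ['trial_01']
    ```

    A real controller would issue these commands over the device's network API; the point of the sketch is the session/recording life cycle the suite exposes in a platform-independent way.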

    Pupil Center as a Function of Pupil Diameter


    6th international workshop on pervasive eye tracking and mobile eye-based interaction

    Previous work on eye tracking and eye-based human-computer interfaces has mainly concentrated on using the eyes in traditional desktop settings. With the recent growth of interest in wearable computers, such as smartwatches, smart eyewear, and low-cost mobile eye trackers, eye-based interaction techniques for mobile computing are becoming increasingly important. PETMEI 2016 focuses on the pervasive eye tracking paradigm as a trailblazer for mobile eye-based interaction, taking eye tracking out into the wild, to mobile and pervasive settings. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking.

    An investigation of the distribution of gaze estimation errors in head mounted gaze trackers using polynomial functions

    Second-order polynomials are commonly used for estimating the point of gaze in head-mounted eye trackers. Studies on remote (desktop) eye trackers show that although some non-standard third-order polynomial models can provide better accuracy, higher-order polynomials do not necessarily yield better results. Unlike remote setups, however, where gaze is estimated over a relatively narrow field-of-view surface (e.g. less than 30x20 degrees on typical computer displays), head-mounted gaze trackers (HMGT) often need to cover a relatively wide field of view to ensure that gaze is detected in the scene image even for extreme eye angles. In this paper we investigate the distribution of the gaze estimation error throughout the scene camera image when polynomial functions are used. Using simulated scenarios, we describe the effects of four different sources of error: interpolation, extrapolation, parallax, and radial distortion. We show that using third-order polynomials results in more accurate gaze estimates in HMGT, and that wide-angle lenses might be beneficial in terms of error reduction.
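    The kind of polynomial regression such trackers use can be illustrated with an ordinary least-squares fit from pupil coordinates to scene coordinates. This is a minimal sketch on synthetic calibration data, not the paper's estimation pipeline; the function names and the test mapping are illustrative:

    ```python
    import numpy as np

    def poly_features(px, py, order=2):
        """Bivariate polynomial terms up to `order` (for order=2: 1, y, y^2, x, xy, x^2)."""
        cols = []
        for i in range(order + 1):
            for j in range(order + 1 - i):
                cols.append((px ** i) * (py ** j))
        return np.column_stack(cols)

    def fit_gaze_mapping(pupil, gaze, order=2):
        """Least-squares polynomial regression from pupil to scene-camera coordinates."""
        A = poly_features(pupil[:, 0], pupil[:, 1], order)
        coef, *_ = np.linalg.lstsq(A, gaze, rcond=None)
        return coef

    def predict_gaze(coef, pupil, order=2):
        return poly_features(pupil[:, 0], pupil[:, 1], order) @ coef

    # Synthetic calibration set generated from a known quadratic mapping
    rng = np.random.default_rng(0)
    pupil = rng.uniform(-1, 1, size=(50, 2))
    gaze = np.column_stack([
        0.5 + 2.0 * pupil[:, 0] + 0.3 * pupil[:, 0] ** 2,
        -0.2 + 1.5 * pupil[:, 1] + 0.1 * pupil[:, 0] * pupil[:, 1],
    ])
    coef = fit_gaze_mapping(pupil, gaze, order=2)
    err = np.abs(predict_gaze(coef, pupil) - gaze).max()
    print(err < 1e-8)  # True: a second-order fit recovers noiseless quadratic data exactly
    ```

    Raising `order` to 3 adds four more terms per axis; the paper's point is that this helps in wide field-of-view head-mounted setups, where extrapolation and distortion errors dominate.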

    BimodalGaze: Seamlessly Refined Pointing with Gaze and Filtered Gestural Head Movement

    Eye gaze is a fast and ergonomic modality for pointing, but limited in precision and accuracy. In this work, we introduce BimodalGaze, a novel technique for seamless head-based refinement of a gaze cursor. The technique leverages insights into eye-head coordination to separate natural from gestural head movement. This allows users to quickly shift their gaze to targets over larger fields of view with naturally combined eye-head movement, and to refine the cursor position with gestural head movement. In contrast to an existing baseline, head refinement is invoked automatically, and only if a target is not already acquired by the initial gaze shift. Study results show that users reliably achieve fine-grained target selection, but we observed a higher rate of initial selection errors affecting overall performance. An in-depth analysis of user performance provides insight into the classification of natural versus gestural head movement, for improvement of BimodalGaze and other potential applications.
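    The natural-versus-gestural separation can be sketched as a velocity heuristic: during a natural eye-head gaze shift, head and gaze-in-world move together, whereas during a gestural refinement the head moves while the vestibulo-ocular reflex keeps gaze in the world nearly still. The thresholds below are illustrative assumptions, not the paper's classifier:

    ```python
    def classify_head_movement(head_vel, gaze_world_vel,
                               head_thresh=5.0, gaze_thresh=2.0):
        """Heuristic sketch; velocities and thresholds in deg/s are illustrative.

        - head still                      -> no head input at all
        - head moving, gaze-in-world still -> gestural (deliberate cursor nudge)
        - head and gaze-in-world moving    -> natural (combined eye-head gaze shift)
        """
        if head_vel < head_thresh:
            return "stationary"
        if gaze_world_vel < gaze_thresh:
            return "gestural"
        return "natural"

    print(classify_head_movement(20.0, 18.0))  # natural
    print(classify_head_movement(20.0, 0.5))   # gestural
    ```

    A practical system would additionally filter the velocity signals over a short window to suppress noise, which is presumably part of what "filtered gestural head movement" refers to in the title.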

    A gaze interactive assembly instruction with pupillometric recording

    This paper presents a study of a gaze interactive digital assembly instruction that provides concurrent logging of pupil data in a realistic task setting. The instruction allows hands-free gaze dwells as a substitute for finger clicks, and supports image rotation as well as image zooming by head movements. A user study in two LEGO toy stores with 72 children showed it to be immediately usable by 64 of them. Data logging of view-times and pupil dilations was possible for 59 participants. On average, the children spent half of the time attending to the instruction (S.D. 10.9%). The recorded pupil size showed a decrease throughout the building process, except when the child had to back-step: a regression was found to be followed by a pupil dilation. The main contribution of this study is to demonstrate gaze-tracking technology capable of supporting both robust interaction and concurrent, non-intrusive recording of gaze and pupil data in the wild. Previous research has found pupil dilation to be associated with changes in task effort. However, other factors like fatigue, head motion, or ambient light may also have an impact. The final section summarizes our approach to this complexity of real-task pupil data collection and makes suggestions for how future applications may utilize pupil information.
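    The dwell-for-click mechanism such an instruction relies on can be sketched as a simple accumulator: a selection fires once gaze has stayed on the same target long enough, and looking away resets the timer. The dwell threshold here is an assumed value, not the study's:

    ```python
    class DwellSelector:
        """Sketch of dwell-based selection (the 100 ms threshold below is illustrative)."""

        def __init__(self, dwell_ms=800):
            self.dwell_ms = dwell_ms
            self.target = None       # target currently under gaze
            self.elapsed = 0.0       # accumulated dwell time on that target

        def update(self, target, dt_ms):
            """Feed one gaze sample; return the target when a dwell 'click' fires."""
            if target != self.target:
                self.target, self.elapsed = target, 0.0   # gaze moved: restart timer
                return None
            self.elapsed += dt_ms
            if self.target is not None and self.elapsed >= self.dwell_ms:
                self.elapsed = 0.0                        # fire once, then re-arm
                return self.target
            return None

    selector = DwellSelector(dwell_ms=100)
    events = [selector.update("button", 50) for _ in range(3)]  # 50 ms samples
    print(events)  # [None, None, 'button']
    ```

    The same accumulator pattern extends naturally to logging view-times per target, since it already tracks how long gaze rests on each element.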

    A comparison of post-saccadic oscillations in European-Born and China-Born British University Undergraduates

    Previous research has revealed that people from different genetic, racial, biological, and/or cultural backgrounds may display fundamental differences in eye-tracking behavior. These differences may have a cognitive origin, may lie at a lower level within the neurophysiology of the oculomotor network, or may be related to environmental factors. In this paper we investigate one of the physiological aspects of eye movements, known as post-saccadic oscillations, and show that this type of eye movement differs markedly between two populations. We compared the post-saccadic oscillations recorded by a video-based eye tracker between two groups of participants: European-born and Chinese-born British students. We recorded eye movements from 42 Caucasian participants, defined as White British or White European, and 52 Chinese-born participants, all aged 18 to 36, during a prosaccade task. The post-saccadic oscillations were extracted from the gaze data and compared between the two groups in terms of their first overshoot and undershoot. The results revealed that the shape of the post-saccadic oscillations varied significantly between the two groups, which may indicate differences in a multitude of genetic, cultural, physiological, anatomical, or environmental factors. We further show that these differences in post-saccadic oscillations can influence oculomotor characteristics such as saccade duration. We conclude that genetic, racial, biological, and/or cultural differences can affect the morphology of recorded eye movement data and should be considered when studying eye movements and oculomotor fixation and saccadic behaviors.
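    Extracting the first overshoot and undershoot from a post-saccadic trace can be sketched as follows. This is a simplified stand-in for the paper's feature extraction, and the sample trace is synthetic:

    ```python
    def pso_extrema(trace, baseline=None):
        """Amplitudes of the first overshoot and first undershoot of a
        post-saccadic position trace, relative to its settling value."""
        if baseline is None:
            baseline = trace[-1]                 # assume the signal has settled by the end
        signal = [x - baseline for x in trace]
        overshoot = undershoot = 0.0
        # first local maximum above the settling value
        for i in range(1, len(signal) - 1):
            if signal[i] > signal[i - 1] and signal[i] >= signal[i + 1] and signal[i] > 0:
                overshoot = signal[i]
                break
        # first local minimum below the settling value
        for i in range(1, len(signal) - 1):
            if signal[i] < signal[i - 1] and signal[i] <= signal[i + 1] and signal[i] < 0:
                undershoot = signal[i]
                break
        return overshoot, undershoot

    # Synthetic trace (deg): saccade lands high, rings once, settles at 10.0
    trace = [10.2, 10.6, 10.4, 9.8, 9.95, 10.0, 10.0]
    o, u = pso_extrema(trace)
    print(round(o, 2), round(u, 2))  # 0.6 -0.2
    ```

    A full pipeline would first detect saccade offset (e.g. by a velocity criterion) and apply this only to the window immediately after it; here the trace is assumed to start at the oscillation.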