Abstract
Recent developments in the field of eye-gaze tracking by video-oculography indicate a growing interest in unobtrusive tracking in real-life scenarios, a new paradigm referred to as pervasive eye-gaze tracking. Among the challenges associated with this paradigm, the capability of a tracking platform to integrate well into devices with built-in imaging hardware and to permit natural head movement during tracking is of particular importance in less constrained scenarios. The work presented in this paper builds on our earlier work, which addressed the problem of estimating the on-screen point-of-regard from iris center movements captured by a camera integrated inside a notebook computer, by proposing a method that approximates the head movements in conjunction with the iris movements, thereby alleviating the requirement for a stationary head pose. Following iris localization by an appearance-based method, linear mapping functions for the iris and head movements are computed during a brief calibration procedure, permitting the image information to be mapped to a point-of-regard on the monitor screen. Once the point-of-regard has been calculated as a function of the iris and head movements, separate Kalman filters improve upon the noisy point-of-regard estimates to smooth the trajectory of the mouse cursor on the monitor screen. Quantitative and qualitative results obtained from two validation procedures reveal an improvement in estimation accuracy under natural head movement over the results achieved in our earlier work.
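The pipeline summarized above, in which calibrated linear mapping functions convert iris and head movements into an on-screen point-of-regard that is then smoothed by Kalman filtering, can be illustrated with a brief sketch. The code below is a minimal illustration rather than the authors' implementation: the affine form of the mapping, the constant-velocity state model, and the noise parameters q and r are assumptions chosen for clarity.

```python
import numpy as np

def fit_linear_mapping(features, targets):
    """Fit an assumed affine map from (iris_dx, iris_dy, head_dx, head_dy)
    displacements to screen coordinates, using calibration samples.
    features: (N, 4) array; targets: (N, 2) array of known screen points."""
    X = np.hstack([features, np.ones((features.shape[0], 1))])  # add bias term
    coeffs, *_ = np.linalg.lstsq(X, targets, rcond=None)
    return coeffs  # shape (5, 2)

def estimate_por(coeffs, feature):
    """Map a single displacement vector to an on-screen point-of-regard."""
    return np.append(feature, 1.0) @ coeffs  # shape (2,)

class Kalman1D:
    """Constant-velocity Kalman filter for one screen coordinate,
    used here to smooth the noisy point-of-regard estimates."""
    def __init__(self, q=1e-3, r=5.0):
        self.x = np.zeros(2)                # state: [position, velocity]
        self.P = np.eye(2)                  # state covariance
        self.F = np.array([[1.0, 1.0],      # constant-velocity transition
                           [0.0, 1.0]])
        self.H = np.array([[1.0, 0.0]])     # only the position is observed
        self.Q = q * np.eye(2)              # process noise (assumed)
        self.R = np.array([[r]])            # measurement noise (assumed)

    def update(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct with the new noisy measurement z
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]                    # smoothed position

# Example usage with synthetic calibration data (hypothetical values).
calib_features = np.random.randn(9, 4)
calib_targets = np.random.rand(9, 2) * [1366, 768]   # assumed screen size
coeffs = fit_linear_mapping(calib_features, calib_targets)
kx, ky = Kalman1D(), Kalman1D()
por = estimate_por(coeffs, np.random.randn(4))
smoothed = (kx.update(por[0]), ky.update(por[1]))
```

In practice, one filter per screen axis keeps the smoothing independent in x and y, mirroring the use of separate Kalman filters described in the abstract; the displacement features and noise settings would be replaced by those obtained from the actual iris localization and calibration steps.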