
    Human Performance Modeling For Two-Dimensional Dwell-Based Eye Pointing

    Recently, Zhang et al. (2010) proposed an effective performance model for dwell-based eye pointing. However, their model was based on a specific circular target condition, without the ability to predict the performance of acquiring conventional rectangular targets. Thus, the applicability of such a model is limited. In this paper, we extend their one-dimensional model to two-dimensional (2D) target conditions. Carrying out two experiments, we have evaluated the abilities of different model candidates to find the most appropriate one. The new index of difficulty we redefine for 2D eye pointing (IDeye) properly reflects the asymmetrical impact of target width and height, with the latter exceeding the former, and consequently the IDeye model can accurately predict the performance for 2D targets. Importantly, we also find that this asymmetry holds for varying movement directions. Based on the results of our study, we provide useful implications and recommendations for gaze-based interactions.
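
    The width/height asymmetry described in this abstract can be sketched with a generic Fitts-style 2D index of difficulty. This is an illustrative stand-in, not the paper's actual IDeye formulation, and the height-weighting factor k is an assumption:

```python
import math

def id_eye_2d(distance, width, height, k=1.5):
    """Illustrative 2D index of difficulty for dwell-based eye pointing.

    NOT the exact IDeye model from the paper: a generic Fitts-style
    extension in which target height is weighted more heavily than
    width (k > 1), mirroring the reported asymmetry.
    """
    effective = math.sqrt((distance / width) ** 2 + k * (distance / height) ** 2)
    return math.log2(effective + 1)

# With k > 1, shrinking height raises difficulty more than shrinking width:
print(id_eye_2d(200, 80, 40) > id_eye_2d(200, 40, 80))  # True
```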

    Suppression of biodynamic interference in head-tracked teleoperation

    The utility of helmet-tracked sights to provide pointing commands for teleoperation of cameras, lasers, or antennas in aircraft is degraded by the presence of uncommanded, involuntary head motion, referred to as biodynamic interference. This interference limits the achievable precision required in pointing tasks. The noise contributions due to biodynamic interference consist of an additive component which is correlated with aircraft vibration and an uncorrelated, nonadditive component, referred to as remnant. An experimental simulation study is described which investigated the improvements achievable in pointing and tracking precision using dynamic display shifting in the helmet-mounted display. The experiment was conducted in a six-degree-of-freedom motion base simulator with an emulated helmet-mounted display. Highly experienced pilot subjects performed precision head-pointing tasks while manually flying a visual flight-path tracking task. Four schemes using adaptive and low-pass filtering of the head motion were evaluated to determine their effects on task performance and pilot workload in the presence of whole-body vibration characteristic of helicopter flight. The results indicate that, for tracking tasks involving continuously moving targets, improvements of up to 70 percent can be achieved in percent on-target dwelling time and of up to 35 percent in rms tracking error, with the adaptive plus low-pass filter configuration. The results with the same filter configuration for the task of capturing randomly-positioned, stationary targets show an increase of up to 340 percent in the number of targets captured and an improvement of up to 24 percent in the average capture time. The adaptive plus low-pass filter combination was considered to exhibit the best overall display dynamics by each of the subjects.
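
    The filtering schemes named in this abstract can be sketched with their standard building blocks. The exact filter designs from the study are not given here; this is a minimal sketch assuming a first-order IIR low-pass for tremor and an LMS adaptive filter for the vibration-correlated component, with all parameter values illustrative:

```python
import numpy as np

def lowpass(signal, alpha=0.1):
    """First-order IIR low-pass: attenuates high-frequency head tremor."""
    out = np.empty(len(signal))
    acc = float(signal[0])
    for i, x in enumerate(signal):
        acc += alpha * (x - acc)   # exponential smoothing step
        out[i] = acc
    return out

def lms_cancel(head, vibration, mu=0.01, taps=8):
    """LMS adaptive filter: subtracts the component of `head` that is
    linearly correlated with the measured `vibration` reference,
    leaving the voluntary head motion plus remnant."""
    w = np.zeros(taps)
    out = np.zeros(len(head))
    for n in range(taps, len(head)):
        x = vibration[n - taps:n][::-1]   # most recent reference samples
        est = w @ x                       # predicted correlated interference
        out[n] = head[n] - est            # residual = voluntary motion
        w += mu * out[n] * x              # adapt weights toward correlation
    return out
```

An adaptive-plus-low-pass chain, as evaluated in the study, would apply `lowpass` to the output of `lms_cancel`.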

    Eye Pointing in Stereoscopic Displays

    This study investigated eye pointing in stereoscopic displays. Ten participants performed 18 tapping tasks in stereoscopic displays with three different levels of parallax (at the screen, and 20 cm and 50 cm in front of the screen). The results showed that parallax had significant effects on hand movement time, eye movement time, and index of performance in hand click and eye gaze. Movement time was shorter and performance was better when the target was at the screen, compared to the conditions where the targets were seen at 20 cm and 50 cm in front of the screen. Furthermore, the findings of this study support that eye movement in stereoscopic displays follows Fitts' law. The proposed algorithm was effective for eye-gaze selection, improving the fit of eye movement in stereoscopic displays to the model.
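
    The Fitts' law relationship and the index of performance mentioned above can be written out explicitly. The coefficient values below are illustrative placeholders, not the study's fitted values:

```python
import math

def fitts_mt(distance, width, a=0.2, b=0.15):
    """Shannon form of Fitts' law: MT = a + b * log2(D/W + 1).

    a (s) and b (s/bit) are illustrative regression coefficients;
    a study like this one would fit them per parallax condition.
    """
    return a + b * math.log2(distance / width + 1)

def index_of_performance(distance, width, movement_time):
    """Throughput in bits/s: index of difficulty over observed movement time."""
    return math.log2(distance / width + 1) / movement_time
```

A lower fitted `b` (equivalently, higher throughput) at the screen-depth condition would correspond to the better performance reported there.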

    Exploration of an oculometer-based model of pilot workload

    Potential relationships between eye behavior and pilot workload are discussed. A Honeywell Mark IIA oculometer was used to obtain the eye data in a fixed-base transport aircraft simulation facility. The data were analyzed to determine those parameters of eye behavior which were related to changes in level of task difficulty of the simulated manual approach and landing on instruments. A number of trends and relationships between eye variables and pilot ratings were found. A preliminary equation was written based on the results of a stepwise linear regression. High variability in time spent on various instruments was related to differences in scanning strategy among pilots. A more detailed analysis of individual runs by individual pilots was performed to investigate the source of this variability more closely. Results indicated a high degree of intra-pilot variability in instrument scanning. No consistent workload-related trends were found. Pupil diameter, which had demonstrated a strong relationship to task difficulty, was extensively re-examined.
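
    The stepwise linear regression mentioned above can be sketched as a greedy forward-selection loop over candidate eye variables. The predictor names and selection criterion here are hypothetical, for illustration only; the study's actual variables and procedure are not given in this abstract:

```python
import numpy as np

def forward_stepwise(X, y, names, max_terms=2):
    """Greedy forward selection: at each step, add the predictor that
    most reduces the residual sum of squares of an ordinary
    least-squares fit. Illustrative sketch only."""
    chosen, remaining = [], list(range(X.shape[1]))
    for _ in range(max_terms):
        best, best_rss = None, np.inf
        for j in remaining:
            cols = chosen + [j]
            A = np.column_stack([np.ones(len(y)), X[:, cols]])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((y - A @ coef) ** 2)
            if rss < best_rss:
                best, best_rss = j, rss
        chosen.append(best)
        remaining.remove(best)
    return [names[j] for j in chosen]
```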

    Enhancing the Performance of Eye and Head Mice: A Validated Assessment Method and an Investigation into the Performance of Eye and Head Based Assistive Technology Pointing Devices

    This work poses the question "Could eye and head based assistive technology device interaction performance approach that of basic hand mouse interaction?" To this end, the work constructs, validates, and applies a detailed and comprehensive pointing device assessment method suitable for assistive technology direct pointing devices; it then uses this method to enhance these devices, and finally demonstrates that such enhanced eye or head based pointing can approach basic hand mouse interaction and be a viable and usable interaction method for people with high-level motor disabilities. Eye and head based pointing devices, or eye and head mice, are often used by people with high-level motor disabilities to enable computer interaction in place of a standard desktop hand mouse. The performance of these pointing devices when used for direct manipulation on a standard graphical user interface has generally been regarded as poor in comparison to that of a standard desktop hand mouse, putting users of head and eye mice at a disadvantage when interacting with computers. This performance has not previously been investigated in depth, and the reasons why these devices seem to demonstrate poor performance have not been determined in detail. Few proven methods of enhancing the performance of these devices based on their direct-manipulation performance have been demonstrated and investigated. Importantly, and key to this work, no validated assessment method has previously been constructed to allow such an investigation. This work therefore investigates the performance of eye and head based pointing devices during direct manipulation by constructing and verifying a test method suitable for their detailed performance assessment.
It then uses this method to determine the factors influencing the performance of eye and head mice during direct manipulation. Finally, after identifying these factors, the work hypothesises, and then demonstrates, that applying suitable methods to address them can enhance the performance of eye and head mice. It shows that the performance of these enhanced devices can approach that of standard desktop hand mice given highly experienced users, a supporting modality for object manipulation, and a supporting interface enhancement for object size magnification; thus demonstrating that these devices can approach and equal the performance of basic hand mouse interaction.

    TapGazer: Text Entry with Finger Tapping and Gaze-Directed Word Selection


    An investigation into gaze-based interaction techniques for people with motor impairments

    The use of eye movements to interact with computers offers opportunities for people with impaired motor ability to overcome the difficulties they often face using hand-held input devices. Computer games have become a major form of entertainment, and also provide opportunities for social interaction in multi-player environments. Games are also being used increasingly in education to motivate and engage young people. It is important that young people with motor impairments are able to benefit from, and enjoy, them. This thesis describes a program of research conducted over a 20-year period starting in the early 1990s that has investigated interaction techniques based on gaze position intended for use by people with motor impairments. The work investigates how to make standard software applications accessible by gaze, so that no particular modification to the application is needed. The work divides into three phases. In the first phase, ways of using gaze to interact with the graphical user interfaces of office applications were investigated, designed around the limitations of gaze interaction. Of these, overcoming the inherent inaccuracies of pointing by gaze at on-screen targets was particularly important. In the second phase, the focus shifted from office applications towards immersive games and on-line virtual worlds. Different means of using gaze position and patterns of eye movements, or gaze gestures, to issue commands were studied. Most of the testing and evaluation studies in this phase, like the first, used participants without motor impairments. The third phase of the work then studied the applicability of the research findings thus far to groups of people with motor impairments, and in particular, the means of adapting the interaction techniques to individual abilities.
In summary, the research has shown that collections of specialised gaze-based interaction techniques can be built as an effective means of completing the tasks in specific types of games, and how these techniques can be adapted to the differing abilities of individuals with motor impairments.