
    Human Performance Modeling For Two-Dimensional Dwell-Based Eye Pointing

    Recently, Zhang et al. (2010) proposed an effective performance model for dwell-based eye pointing. However, their model was based on a specific circular target condition and cannot predict performance for acquiring conventional rectangular targets, which limits its applicability. In this paper, we extend their one-dimensional model to two-dimensional (2D) target conditions. In two experiments, we evaluated several candidate models to identify the most appropriate one. The index of difficulty we redefine for 2D eye pointing (IDeye) properly reflects the asymmetrical impact of target width and height, with the latter exceeding the former, and consequently the IDeye model can accurately predict performance for 2D targets. Importantly, we also find that this asymmetry holds across varying movement directions. Based on these results, we provide useful implications and recommendations for gaze-based interaction.
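The shape of such a model can be illustrated with a toy sketch. The width/height weights and the Fitts-style linear coefficients below are hypothetical placeholders, not the IDeye definition from the paper; they merely encode the reported asymmetry in which target height matters more than width:

```python
import math

def id_eye_2d(amplitude, width, height, w_weight=0.7, h_weight=1.3):
    """Hypothetical 2D index of difficulty for dwell-based eye pointing.

    Weights target height more heavily than width, mirroring the
    asymmetry the abstract reports; the actual IDeye definition is
    given in the paper itself.
    """
    effective_size = w_weight * width + h_weight * height
    return math.log2(amplitude / effective_size + 1)

def predict_time(amplitude, width, height, a=0.4, b=0.25):
    """Fitts-style linear model T = a + b * ID (coefficients illustrative)."""
    return a + b * id_eye_2d(amplitude, width, height)
```

Under this weighting, shrinking the height raises the index of difficulty more than shrinking the width by the same amount, which is the asymmetry the model is meant to capture.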

    Suppression of biodynamic interference in head-tracked teleoperation

    The utility of helmet-tracked sights to provide pointing commands for teleoperation of cameras, lasers, or antennas in aircraft is degraded by the presence of uncommanded, involuntary head motion, referred to as biodynamic interference. This interference limits the achievable precision in pointing tasks. The noise contributions due to biodynamic interference consist of an additive component, which is correlated with aircraft vibration, and an uncorrelated, nonadditive component, referred to as remnant. An experimental simulation study is described which investigated the improvements achievable in pointing and tracking precision using dynamic display shifting in the helmet-mounted display. The experiment was conducted in a six-degree-of-freedom motion-base simulator with an emulated helmet-mounted display. Highly experienced pilot subjects performed precision head-pointing tasks while manually flying a visual flight-path tracking task. Four schemes using adaptive and low-pass filtering of the head motion were evaluated to determine their effects on task performance and pilot workload in the presence of whole-body vibration characteristic of helicopter flight. The results indicate that, for tracking tasks involving continuously moving targets, the adaptive plus low-pass filter configuration achieves improvements of up to 70 percent in percentage on-target dwelling time and up to 35 percent in rms tracking error. For the task of capturing randomly positioned, stationary targets, the same filter configuration yields an increase of up to 340 percent in the number of targets captured and an improvement of up to 24 percent in average capture time. Each of the subjects considered the adaptive plus low-pass filter combination to exhibit the best overall display dynamics.
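The two noise components suggest a two-stage mitigation: an adaptive canceller for the vibration-correlated part and a low-pass filter for the uncorrelated remnant. A minimal sketch under that assumption, using a textbook LMS filter and a one-pole smoother rather than the study's actual filter configurations:

```python
import numpy as np

def lms_cancel(vibration_ref, head_signal, mu=0.01, taps=8):
    """Adaptive (LMS) cancellation of the component of the head signal
    that is correlated with a measured vibration reference.
    Illustrative only; not the filters evaluated in the study."""
    w = np.zeros(taps)
    out = np.zeros_like(head_signal)
    for n in range(taps, len(head_signal)):
        x = vibration_ref[n - taps:n][::-1]   # recent reference samples
        y = w @ x                              # estimate of correlated interference
        e = head_signal[n] - y                 # residual = command + remnant
        w += 2 * mu * e * x                    # LMS weight update
        out[n] = e
    return out

def low_pass(signal, alpha=0.1):
    """One-pole low-pass to attenuate the uncorrelated remnant."""
    y = np.zeros_like(signal)
    for n in range(1, len(signal)):
        y[n] = y[n - 1] + alpha * (signal[n] - y[n - 1])
    return y
```

With a purely vibration-driven head signal, the LMS stage drives the residual toward zero once the weights converge.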

    Eye Pointing in Stereoscopic Displays

    This study investigated eye pointing in stereoscopic displays. Ten participants performed 18 tapping tasks in stereoscopic displays with three levels of parallax (at the screen, and 20 cm and 50 cm in front of the screen). The results showed that parallax had significant effects on hand movement time, eye movement time, and index of performance for both hand click and eye gaze. Movement time was shorter and performance better when the target was at the screen than when the targets appeared 20 cm or 50 cm in front of it. Furthermore, the findings of this study support that eye movement in stereoscopic displays follows Fitts' law. The proposed algorithm was effective for eye-gaze selection and improved the goodness of fit of eye movement data in stereoscopic displays.
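Testing Fitts'-law conformance typically means regressing movement time on index of difficulty and checking the linear fit. A small illustrative least-squares fit (synthetic numbers, not the study's data):

```python
import numpy as np

def fit_fitts(ids, times):
    """Least-squares fit of the Fitts'-law model MT = a + b * ID.
    Returns the intercept a and slope b; used to check how well
    movement-time data conform to the law."""
    A = np.vstack([np.ones_like(ids), ids]).T
    (a, b), *_ = np.linalg.lstsq(A, times, rcond=None)
    return a, b
```

Given perfectly linear data the fit recovers the generating coefficients exactly; real gaze data would be judged by the residuals and R².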

    Dynamics of eye-hand coordination are flexibly preserved in eye-cursor coordination during an online, digital, object interaction task

    Do patterns of eye-hand coordination observed during real-world object interactions apply to digital, screen-based object interactions? We adapted a real-world object interaction task (physically transferring cups in sequence about a tabletop) into a two-dimensional screen-based task (dragging-and-dropping circles in sequence with a cursor). We collected gaze (with webcam eye-tracking) and cursor position data from 51 fully remote, crowd-sourced participants who performed the task on their own computers. We applied real-world time-series data segmentation strategies to resolve the self-paced movement sequence into phases of object interaction and rigorously cleaned the webcam eye-tracking data. In this preliminary investigation, we found that: 1) real-world eye-hand coordination patterns persist and adapt in this digital context, and 2) remote, online cursor-tracking and webcam eye-tracking are useful tools for capturing visuomotor behaviours during this ecologically valid human-computer interaction task. We discuss how these findings might inform design principles and further investigations into natural behaviours that persist in digital environments.
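A common way to segment such self-paced cursor data is a speed threshold that separates transport movements from near-stationary interaction phases. A minimal sketch of that idea, with a hypothetical threshold rather than the study's actual segmentation strategy:

```python
import numpy as np

def segment_phases(x, y, t, v_thresh=0.05):
    """Split a cursor trajectory into 'transport' (moving) and
    'interact' (near-stationary) phases using a simple speed threshold.
    Returns (label, start, end) tuples indexing the inter-sample intervals."""
    dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
    speed = np.hypot(dx, dy) / dt
    moving = speed > v_thresh
    phases, start = [], 0
    for i in range(1, len(moving)):
        if moving[i] != moving[i - 1]:   # label change closes a phase
            phases.append(("transport" if moving[start] else "interact", start, i))
            start = i
    phases.append(("transport" if moving[start] else "interact", start, len(moving)))
    return phases
```

Real segmentation pipelines usually add smoothing and minimum-duration rules on top of the raw threshold.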

    Enhancing the Performance of Eye and Head Mice: A Validated Assessment Method and an Investigation into the Performance of Eye and Head Based Assistive Technology Pointing Devices

    This work poses the question "Could eye and head based assistive technology device interaction performance approach that of basic hand mouse interaction?" To this aim, the work constructs, validates, and applies a detailed and comprehensive pointing device assessment method suitable for assistive technology direct pointing devices. It then uses this method to enhance these devices, and finally demonstrates that such enhanced eye or head based pointing can approach basic hand mouse interaction and be a viable and usable interaction method for people with high-level motor disabilities. Eye and head based pointing devices, or eye and head mice, are often used by people with high-level motor disabilities to enable computer interaction in place of a standard desktop hand mouse. The performance of these pointing devices for direct manipulation on a standard graphical user interface has generally been regarded as poor in comparison with a standard desktop hand mouse, putting users of head and eye mice at a disadvantage when interacting with computers. This performance has not previously been investigated in depth, and the reasons why these devices seem to perform poorly have not been determined in detail. Few proven methods for enhancing their performance during direct manipulation have been demonstrated and investigated. Importantly, and key to this work, no validated assessment method has been constructed to allow such an investigation. This work therefore investigates the performance of eye and head based pointing devices during direct manipulation by constructing and verifying a test method suitable for their detailed performance assessment.
It then uses this method to determine the factors influencing the performance of eye and head mice during direct manipulation. Finally, after identifying these factors, this work hypothesises, and then demonstrates, that applying suitable methods to address them can result in enhanced performance for eye and head mice. It shows that the performance of these enhanced devices can approach that of standard desktop hand mice given highly experienced users, a supporting modality for object manipulation, and a supporting interface enhancement for object size magnification; thus demonstrating that these devices can approach and equal the performance of basic hand mouse interaction.
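Pointing device assessment methods commonly report ISO 9241-9-style throughput, which folds endpoint accuracy into an effective target width. A sketch of that standard computation (the general formula, not necessarily the assessment method constructed in this work):

```python
import math

def throughput(amplitude, endpoint_sd, movement_time):
    """ISO 9241-9-style throughput in bits/s.

    The effective width W_e = 4.133 * SD of the endpoint error, so the
    effective index of difficulty ID_e = log2(A / W_e + 1), and
    throughput = ID_e / MT.
    """
    w_e = 4.133 * endpoint_sd
    id_e = math.log2(amplitude / w_e + 1)
    return id_e / movement_time
```

Because accuracy enters through the effective width, a device that is fast but scattered and one that is slow but precise can be compared on a single bits-per-second scale.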

    Non-Linear Signal Processing methods for UAV detections from a Multi-function X-band Radar

    This article develops the applicability of non-linear processing techniques such as Compressed Sensing (CS), Principal Component Analysis (PCA), the Iterative Adaptive Approach (IAA), and Multiple-Input-Multiple-Output (MIMO) for enhanced UAV detection using portable radar systems. The combined scheme has many advantages and the potential for better detection and classification accuracy. Some of the benefits are discussed here with a phased array platform in mind, the novel portable phased array Radar (PWR) by Agile RF Systems (ARS), which offers quadrant outputs. CS and IAA both show promising results when applied to micro-Doppler processing of radar returns, owing to the sparse nature of the target Doppler frequencies. This shows promise in reducing the dwell time and increasing the rate at which a volume can be interrogated. Real-time processing of target information with iterative and non-linear solutions is now possible with the advent of GPU-based graphics processing hardware. Simulations show promising results.
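The sparsity premise behind applying CS and IAA to micro-Doppler processing can be illustrated with a toy spectrum: a slow-time return containing a few rotor tones concentrates its energy in a handful of Doppler bins, so very few measurements carry most of the information. The snippet below is naive top-k peak picking on an FFT spectrum, not CS or IAA proper:

```python
import numpy as np

def sparse_doppler_peaks(returns, k=2):
    """Toy sparsity illustration: pick the k strongest Doppler bins
    from a slow-time radar return.  Real CS/IAA solvers exploit the
    same sparsity with far fewer samples; this is only the premise."""
    spectrum = np.abs(np.fft.rfft(returns))
    peak_bins = np.argsort(spectrum)[-k:]          # k largest bins
    freqs = np.fft.rfftfreq(len(returns), d=1.0)   # normalised frequencies
    return sorted(freqs[peak_bins])
```

For a return built from two tones, the two recovered bins land on the tone frequencies.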

    An investigation into gaze-based interaction techniques for people with motor impairments

    The use of eye movements to interact with computers offers opportunities for people with impaired motor ability to overcome the difficulties they often face using hand-held input devices. Computer games have become a major form of entertainment and also provide opportunities for social interaction in multi-player environments. Games are also used increasingly in education to motivate and engage young people, so it is important that young people with motor impairments are able to benefit from, and enjoy, them. This thesis describes a program of research conducted over a 20-year period, starting in the early 1990s, that has investigated gaze-based interaction techniques intended for use by people with motor impairments. The work investigates how to make standard software applications accessible by gaze, so that no particular modification to the application is needed. The work divides into three phases. In the first phase, ways of using gaze to interact with the graphical user interfaces of office applications were investigated and designed around the limitations of gaze interaction. Of these limitations, the inherent inaccuracy of pointing by gaze at on-screen targets was particularly important to overcome. In the second phase, the focus shifted from office applications towards immersive games and on-line virtual worlds. Different means of using gaze position and patterns of eye movements, or gaze gestures, to issue commands were studied. Most of the testing and evaluation studies in this phase, like the first, used participants without motor impairments. The third phase then studied the applicability of the research findings to groups of people with motor impairments and, in particular, the means of adapting the interaction techniques to individual abilities.
In summary, the research has shown that collections of specialised gaze-based interaction techniques can be built as an effective means of completing tasks in specific types of games, and how these techniques can be adapted to the differing abilities of individuals with motor impairments.
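One of the basic primitives such gaze interaction work builds on is dwell selection, which fires once the gaze has rested on a target for long enough. A minimal sketch with illustrative timing parameters (the thesis's own techniques are more elaborate):

```python
def dwell_select(gaze_samples, target_rect, dwell_ms=800, sample_ms=20):
    """Fire a selection once gaze stays inside target_rect for dwell_ms.

    gaze_samples: sequence of (x, y) points sampled every sample_ms.
    target_rect:  (x0, y0, x1, y1) bounding box.
    Returns the sample index at which selection triggers, or None.
    Parameters are illustrative, not from the thesis."""
    x0, y0, x1, y1 = target_rect
    needed = dwell_ms // sample_ms     # consecutive in-target samples required
    run = 0
    for i, (gx, gy) in enumerate(gaze_samples):
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            run += 1
            if run >= needed:
                return i
        else:
            run = 0                    # any excursion resets the dwell timer
    return None
```

The dwell duration trades selection speed against accidental "Midas touch" activations, which is why such techniques need per-user tuning.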

    Auxilio: A Sensor-Based Wireless Head-Mounted Mouse for People with Upper Limb Disability

    Upper limb disability may result from accidents, neurological disorders, or birth defects, restricting the affected individuals' ability to interact with a computer using a generic optical mouse. Our work proposes the design and development of a working prototype of a sensor-based wireless head-mounted Assistive Mouse Controller (AMC), Auxilio, facilitating computer interaction for people with upper limb disability. Combining commercially available, low-cost motion and infrared sensors, Auxilio relies solely on head and cheek movements for mouse control. Its performance has been juxtaposed with that of a generic optical mouse in different pointing tasks as well as in typing tasks using a virtual keyboard. Furthermore, our work also analyzes the usability of Auxilio using the System Usability Scale. The results of different experiments reveal the practicality and effectiveness of Auxilio as a head-mounted AMC for empowering the upper limb disabled community.
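The System Usability Scale mentioned above has a standard scoring rule: each odd-numbered item contributes (response − 1), each even-numbered item contributes (5 − response), and the sum is scaled by 2.5 to give a 0–100 score. A direct implementation of that generic formula (not specific to the Auxilio evaluation):

```python
def sus_score(responses):
    """System Usability Scale score (0-100) from ten 1-5 Likert responses.

    Standard SUS scoring: odd items (index 0, 2, ...) contribute r - 1,
    even items contribute 5 - r; the total is multiplied by 2.5."""
    assert len(responses) == 10, "SUS has exactly ten items"
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

An all-neutral questionnaire (all 3s) scores exactly 50, which is a useful sanity check when wiring up the analysis.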