
    Gaze–mouse coordinated movements and dependency with coordination demands in tracing.

    Eye movements have been shown to lead hand movements in tracing tasks, where subjects move their fingers along a predefined trace. It remained an open question whether this leading relationship holds when tracing with a pointing device such as a mouse and, more importantly, whether tasks demanding more or less gaze–mouse coordination would alter this pattern of behaviour, in terms of both the spatial and the temporal lead of gaze position over mouse movement. A three-level gaze–mouse coordination-demand paradigm was developed to address these questions, and a substantial dataset of 1350 trials was collected and analysed. The linear correlation of gaze and mouse movements, the statistical distribution of the lead time, and the lead distance between gaze and mouse-cursor positions were all considered, and we propose a new method to quantify lead time in gaze–mouse coordination. The results support and extend previous empirical findings that gaze often leads mouse movements. We found that the gaze–mouse coordination demands of the task were positively correlated with the gaze lead, both spatially and temporally. However, mouse movements were synchronised with, or even led, gaze in the simple straight-line condition, which demanded the least gaze–mouse coordination.
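    The abstract does not spell out the proposed lead-time quantification, but a common way to estimate how far one trajectory temporally leads another is cross-correlation. The sketch below illustrates that generic idea only; the function name, signature, and the choice of peak cross-correlation are assumptions, not the paper's method.

    ```python
    import numpy as np

    def estimate_lead_time(gaze, mouse, dt):
        """Estimate how long gaze leads the mouse cursor, in seconds.

        gaze, mouse: 1-D arrays of positions along one axis, sampled every
        dt seconds. A positive return value means gaze leads the mouse;
        a negative value means the mouse leads gaze.
        """
        g = gaze - gaze.mean()
        m = mouse - mouse.mean()
        # Cross-correlate the mouse trace against the gaze trace at every lag.
        corr = np.correlate(m, g, mode="full")
        lags = np.arange(-len(g) + 1, len(m))
        # The lag with the strongest correlation is the best alignment shift.
        return lags[np.argmax(corr)] * dt
    ```

    For example, if the mouse trace is the gaze trace delayed by five samples at 100 Hz, the estimate comes out at 0.05 s of gaze lead.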


    Rationalizing the Need of Architecture-Driven Testing of Interactive Systems

    Part 3: Task Modelling and Task-Based Approaches
    Testing interactive systems is known to be a complex task that cannot be exhaustive: the infinite number of combinations of user inputs and the complexity of information presentation exceed the practical limits of exhaustive, analytical approaches to testing [31]. Most interactive-software testing techniques are produced by applying and tuning techniques from the field of software testing to address the specificities of interactive applications; when some elements cannot be taken into account by the software testing technique, they are usually ignored. In this paper we follow the opposite approach, starting from a generic architecture for interactive systems (covering both software and hardware elements) to identify testing problems and testing needs in a systematic way. This architecture-driven approach makes it possible to identify how software-testing knowledge and techniques can support interactive-systems testing, but also where the interactive-systems engineering community should invest in order to test its idiosyncrasies too.

    Effect of Control-Display Gain and Mapping and Use of Armrests on Accuracy in Temporally Limited Touchless Gestural Steering Tasks

    Touchless gestural controls are becoming an important natural input technique for interaction with emerging virtual environments, but design parameters that improve task performance while at the same time reducing user fatigue require investigation. This experiment aims to understand how control-display (CD) parameters such as gain and mapping, as well as the use of armrests, affect gesture accuracy in specific movement directions. Twelve participants completed temporally constrained two-dimensional steering tasks using free-hand fingertip gestures in several conditions. Use of an armrest, increased CD gain, and horizontal mapping significantly reduced success rate. The results show that optimal transfer functions for gestures will depend on the movement direction as well as on arm-support features.
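    The two manipulated CD parameters can be pictured as a simple transfer function from fingertip displacement to cursor displacement: gain scales the motion, and the mapping selects which hand-movement plane drives the cursor. The sketch below is a minimal illustration under my own axis conventions (forward hand motion mapped to upward cursor motion in the horizontal condition); the function and its names are assumptions, not the experiment's actual transfer function.

    ```python
    def cursor_delta(hand_delta, gain=1.0, mapping="vertical"):
        """Map a 3-D fingertip displacement (dx, dy, dz) to a 2-D cursor
        displacement, both in metres.

        gain:    control-display gain; the cursor moves gain x hand distance.
        mapping: "vertical"   - hand moves in a plane parallel to the display.
                 "horizontal" - hand moves in a plane over the desk, so
                                forward motion (dz) drives vertical cursor
                                motion (assumed convention).
        """
        dx, dy, dz = hand_delta
        if mapping == "vertical":
            return gain * dx, gain * dy
        if mapping == "horizontal":
            return gain * dx, gain * dz
        raise ValueError(f"unknown mapping: {mapping}")
    ```

    With gain above 1, small hand movements produce large cursor movements, which trades reduced physical effort against the finer motor control the steering task demands; this is one plausible reading of why increased CD gain reduced success rate.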