
    Usability Analysis of an off-the-shelf Hand Posture Estimation Sensor for Freehand Physical Interaction in Egocentric Mixed Reality

    This paper explores freehand physical interaction in egocentric Mixed Reality by performing a usability study on the use of hand posture estimation sensors. We report on precision, interactivity, and usability metrics in a task-based user study, exploring the importance of additional visual cues when interacting. A total of 750 interactions were recorded from 30 participants performing 5 different interaction tasks (Move; Rotate: Pitch (Y axis) and Yaw (Z axis); Uniform scale: enlarge and shrink). Additional visual cues resulted in a shorter average time to interact; however, no consistent statistical differences were found between groups for performance and precision results. The group with additional visual cues gave the system an average System Usability Scale (SUS) score of 72.33 (SD = 16.24), while the other group scored 68.0 (SD = 18.68). Overall, additional visual cues led to the system being perceived as more usable, even though the two conditions had limited effect on precision and interactivity metrics.
