
    Identifying the Usability Factors of Mid-Air Hand Gestures for 3D Virtual Model Manipulation

    Although manipulating 3D virtual models with mid-air hand gestures offers the benefits of natural interaction and freedom from the sanitation problems of touch surfaces, many factors can influence the usability of such an interaction paradigm. In this research, the authors conducted experiments to study vision-based mid-air hand gestures for scaling, translating, and rotating a 3D virtual car displayed on a large screen. An Intel RealSense 3D camera was employed for hand gesture recognition. A two-hand gesture of grabbing and then moving the hands apart or together was applied to enlarging or shrinking the 3D virtual car. A one-hand gesture of grabbing and then moving was applied to translating a car component. A two-hand gesture of grabbing and moving the hands relative to each other along the circumference of a horizontal circle was applied to rotating the car. Seventeen graduate students were invited to participate in the experiments and to provide evaluations of and comments on gesture usability. The results indicated that the width and depth of the detection range were the key usability factors for two-hand gestures with linear motions. For dynamic gestures with quick transitions and motions from open to closed hand poses, ensuring gesture recognition robustness was extremely important. Furthermore, even for a gesture with ergonomic postures, an inappropriate control-response ratio could result in fatigue due to repeated exertions of hand gestures needed to achieve precise control in 3D model manipulation tasks.
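
    The sketch below is an illustrative reading of the three gesture mappings described in the abstract, not the authors' implementation. It assumes hand positions arrive as (x, y, z) tuples in metres from a depth camera such as the Intel RealSense, and it introduces a hypothetical control-response (C-R) gain parameter to show how the ratio discussed above would enter the mapping.

    ```python
    # Hypothetical sketch of the gesture-to-transform mappings described above.
    # Not the study's code: hand coordinates, units, and the cr_gain parameter
    # are assumptions for illustration only.
    import math

    def scale_factor(left, right, left0, right0, cr_gain=1.0):
        """Two-hand scaling: compare current and initial inter-hand distance."""
        d = math.dist(left, right)
        d0 = math.dist(left0, right0)
        return 1.0 + cr_gain * (d / d0 - 1.0)

    def translation(hand, hand0, cr_gain=1.0):
        """One-hand translation: displacement of the grabbing hand, scaled by the C-R gain."""
        return tuple(cr_gain * (c - c0) for c, c0 in zip(hand, hand0))

    def yaw_rotation(left, right, left0, right0, cr_gain=1.0):
        """Two-hand rotation: change in heading of the left-to-right hand vector
        projected onto the horizontal (x-z) plane, in radians."""
        a = math.atan2(right[2] - left[2], right[0] - left[0])
        a0 = math.atan2(right0[2] - left0[2], right0[0] - left0[0])
        return cr_gain * (a - a0)

    # Example: hands grabbed 0.30 m apart, then moved to 0.45 m apart -> scale 1.5
    print(scale_factor((-0.225, 0, 0.6), (0.225, 0, 0.6),
                       (-0.15, 0, 0.6), (0.15, 0, 0.6)))
    ```

    In such a mapping, a C-R gain that is too low forces the repeated grab-release-regrab cycles the abstract identifies as a fatigue source, while a gain that is too high makes precise positioning difficult.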