Unsupervised and supervised methods for automatic human hand pose recognition for robotics and human robot interaction

Abstract

Hand Pose Recognition (HPR) plays an important role in human-computer interaction (HCI) and human-robot interaction (HRI). However, because the hand has many degrees of freedom (DoF) across its joints and its poses are highly variable, high-precision hand pose estimation remains a challenging problem. In this work, a two-stage HPR system is proposed. In the first stage, I implement a hand pose reconstruction algorithm and a non-vision-based unsupervised HPR method. I describe the general procedure, model construction, and experimental results of tracking hand kinematics with an extended Kalman filter (EKF) driven by data recorded from active surface markers. The hand model has 26 DoF, comprising the global hand posture and the digits. The reconstructions obtained from four subjects were then used in an unsupervised method for recognizing hand actions during grasping and manipulation tasks, achieving a high degree of accuracy. In the second stage, I implement a vision-based supervised HPR method in which Deep Neural Networks (DNNs) automatically learn features from hand posture images. The images are frames extracted from videos of grasping and manipulation tasks; each video is divided into intervals, and a supervisor associates a specific action with each interval. Experiments verify that the proposed system achieves a recognition accuracy of up to 85.12%.
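To make the first stage concrete, the following is a minimal sketch of one EKF predict/update cycle of the kind the abstract describes. It is not the thesis implementation: the random-walk process model, the linear stand-in for the marker measurement model, and all dimensions (a 26-DoF state, an assumed set of 21 markers with 3-D coordinates) are illustrative assumptions.

```python
import numpy as np

def ekf_step(x, P, z, f, h, F_jac, H_jac, Q, R):
    """One predict/update cycle of an extended Kalman filter."""
    # Predict: propagate the state estimate and covariance through the process model
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q

    # Update: correct the prediction with the marker measurement z
    H = H_jac(x_pred)
    y = z - h(x_pred)                        # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative setup (not from the thesis): 26-DoF state, 21 markers x 3 coordinates
n, m = 26, 63
f = lambda x: x                              # assumed random-walk process model
F_jac = lambda x: np.eye(n)
C = 0.01 * np.random.randn(m, n)             # stand-in linearized marker model
h = lambda x: C @ x
H_jac = lambda x: C
x, P, z = np.zeros(n), np.eye(n), np.zeros(m)
x, P = ekf_step(x, P, z, f, h, F_jac, H_jac, 1e-4 * np.eye(n), 1e-2 * np.eye(m))
```

For the second, vision-based stage, a minimal PyTorch sketch of a frame-level action classifier is shown below; the architecture, input resolution, and number of action classes are assumptions chosen only to illustrate how DNN features could be learned from hand posture frames.

```python
import torch
import torch.nn as nn

class HandPoseClassifier(nn.Module):
    """Small CNN mapping a hand-posture frame to one of n_actions labels (illustrative)."""
    def __init__(self, n_actions: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_actions)

    def forward(self, x):                    # x: (batch, 3, H, W) video frames
        return self.classifier(self.features(x).flatten(1))

# Hypothetical usage: score a batch of 8 RGB frames for 5 assumed action classes
logits = HandPoseClassifier(n_actions=5)(torch.randn(8, 3, 96, 96))
```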
