Gesture imitation and recognition using Kinect sensor and extreme learning machines

Abstract

This study presents a framework that recognizes and imitates human upper-body motions in real time. The framework consists of two parts. In the first part, a transformation algorithm is applied to 3D human motion data captured by a Kinect sensor, converting the data into the robot's joint angles. The NAO humanoid robot then successfully imitates the human upper-body motions in real time. In the second part, a human action recognition algorithm is implemented for upper-body gestures, and a human action dataset of upper-body movements is created. Each action is performed 10 times by 24 users, and the collected joint angles are divided into six action classes. Extreme Learning Machines (ELMs) are used to classify the human actions; Feed-Forward Neural Network (FNN) and K-Nearest Neighbor (K-NN) classifiers are used for comparison. According to the comparative results, ELMs achieve good human action recognition performance.
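The abstract does not spell out the transformation algorithm, but the core step it describes is recovering robot joint angles from Kinect skeleton positions. Below is a minimal sketch of one common approach, assuming the shoulder, elbow, and wrist arrive as 3D coordinates and that the result is clamped to the robot joint's range; the function names and the joint limits are illustrative assumptions, not values taken from the paper or NAO's documentation.

```python
import numpy as np

def joint_angle(a, b, c):
    """Interior angle at point b (radians) formed by 3D points a-b-c."""
    u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))  # guard against rounding

def elbow_roll(shoulder, elbow, wrist, lo=0.03, hi=1.54):
    """Map human elbow flexion onto a robot elbow joint, clamped to an
    assumed joint range [lo, hi] in radians (illustrative limits only)."""
    # A fully extended arm gives an interior angle of pi, i.e. zero flexion.
    return float(np.clip(np.pi - joint_angle(shoulder, elbow, wrist), lo, hi))
```

The same pattern, applied per joint and per frame, yields the angle stream that both drives the robot and, in the second part of the framework, serves as the feature vector for classification.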
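An ELM is a single-hidden-layer feed-forward network whose input weights and biases are drawn at random and never trained; only the output weights are fit, in closed form, by a least-squares solve. The NumPy sketch below shows this pattern for the six-class setting described above; the class and parameter names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

class ELMClassifier:
    """Minimal Extreme Learning Machine for multi-class classification."""

    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Random input weights/biases stay fixed; sigmoid hidden activations.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        n_features = X.shape[1]
        n_classes = int(y.max()) + 1
        self.W = self.rng.standard_normal((n_features, self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = self._hidden(X)
        # One-hot targets; output weights solved in closed form via the
        # Moore-Penrose pseudoinverse: beta = pinv(H) @ T.
        T = np.eye(n_classes)[y]
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)
```

Here X would hold the joint-angle feature vectors and y the action labels 0-5. Because only the output weights are learned, training reduces to a single pseudoinverse, which is what makes ELMs attractive for real-time recognition pipelines like this one.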

This paper was published in Fırat Üniversitesi Kurumsal Açık Arşiv.
