Human posture recognition based on multiple features and rule learning

Abstract

The use of skeleton data for human posture recognition is a key research topic in the field of human-computer interaction. To improve the accuracy of human posture recognition, a new algorithm based on multiple features and rule learning is proposed in this paper. Firstly, a 219-dimensional vector that includes angle features and distance features is defined. Specifically, the angle and distance features are defined in terms of the local relationships between joints and the global spatial locations of joints. Then, during human posture classification, the rule learning method is combined with the Bagging and random subspace methods to generate diverse training samples and feature subsets, improving the classification performance of the sub-classifiers on different samples. Finally, the performance of the proposed algorithm is evaluated on four human posture datasets. The experimental results show that the algorithm can effectively recognize many kinds of human postures, and that the results obtained by the rule-based learning method are more interpretable than those of traditional machine learning methods and CNNs.
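Below is a minimal Python sketch (not the authors' implementation) of the two ideas the abstract describes: hand-crafted angle and distance features computed from skeleton joints, and an ensemble that combines Bagging with the random subspace method. The joint layout, the feature enumeration, and the use of a shallow decision tree as a stand-in for the rule learner are illustrative assumptions.

    # Sketch of skeleton-based posture features plus a bagging/random-subspace ensemble.
    # The 219-dimensional descriptor in the paper is not reproduced exactly; all
    # enumerations below are for illustration only.
    import numpy as np
    from sklearn.ensemble import BaggingClassifier
    from sklearn.tree import DecisionTreeClassifier

    def joint_angle(a, b, c):
        """Angle (radians) at joint b formed with joints a and c -- a local relation."""
        v1, v2 = a - b, c - b
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
        return np.arccos(np.clip(cos, -1.0, 1.0))

    def posture_features(joints):
        """joints: (N, 3) array of 3D joint positions for one skeleton frame.
        Returns a flat vector of angle features (local relations between joints)
        and pairwise distance features (global spatial layout of joints)."""
        n = len(joints)
        angles = [joint_angle(joints[i], joints[j], joints[k])
                  for j in range(n)
                  for i in range(n)
                  for k in range(i + 1, n)
                  if i != j and k != j]
        dists = [np.linalg.norm(joints[i] - joints[j])
                 for i in range(n) for j in range(i + 1, n)]
        return np.array(angles + dists)

    # Ensemble: bootstrap samples (Bagging) + random feature subsets (random subspace).
    # A shallow decision tree serves as a proxy for the rule learner; each fitted tree
    # can be read off as a small set of if-then rules.
    clf = BaggingClassifier(
        estimator=DecisionTreeClassifier(max_depth=4),
        n_estimators=30,
        max_samples=0.8,    # fraction of training examples drawn per sub-classifier
        max_features=0.5,   # fraction of features drawn per sub-classifier
        bootstrap=True,
        random_state=0,
    )
    # Usage: clf.fit(X_train, y_train); clf.predict(X_test)

In scikit-learn, max_samples controls the bagged fraction of training examples and max_features the random subspace fraction, so a single BaggingClassifier can approximate both resampling strategies mentioned in the abstract.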
