
    Dynamic gesture recognition using transformation invariant hand shape recognition

    In this thesis a detailed framework is presented for accurate real-time gesture recognition. Our approach to developing a hand-shape classifier, trained using computer animation, is described along with its application in dynamic gesture recognition. The system operates in real time with high accuracy, using a single low-resolution camera and running in Matlab on a conventional PC under Windows XP. The hand-shape classifier outlined in this thesis uses transformation-invariant subspaces created using Principal Component Analysis (PCA). These subspaces are built from a large vocabulary generated in a systematic manner using computer animation. In recognising dynamic gestures we utilise both hand shape and hand position information; these are two of the main features used by humans in distinguishing gestures. Hidden Markov Models (HMMs) are trained and employed to recognise this combination of hand shape and hand position features. During the course of this thesis we describe in detail the inspiration and motivation behind our research and its possible applications. Throughout, our emphasis is on achieving a high-speed system that works in real time with high accuracy.
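The subspace idea in this abstract can be sketched as follows: fit one PCA subspace per hand-shape class and classify a new image vector by which subspace reconstructs it with the smallest error. This is only an illustrative sketch, not the thesis's implementation; the class names, feature dimension, and synthetic data are assumptions.

```python
# Hypothetical sketch: one PCA subspace per hand-shape class,
# classification by smallest reconstruction error. Data is synthetic.
import numpy as np

def fit_subspace(samples, n_components):
    """Fit a PCA subspace to (n_samples, n_features) image vectors."""
    mean = samples.mean(axis=0)
    centered = samples - mean
    # SVD of the centered data gives the principal axes; keep the top few.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def reconstruction_error(x, mean, basis):
    """Distance from x to the affine subspace defined by (mean, basis)."""
    centered = x - mean
    projection = basis.T @ (basis @ centered)
    return np.linalg.norm(centered - projection)

def classify(x, subspaces):
    """Pick the class whose subspace reconstructs x best."""
    return min(subspaces, key=lambda c: reconstruction_error(x, *subspaces[c]))

rng = np.random.default_rng(0)
# Toy stand-ins for two hand-shape classes in a 64-dim feature space.
open_hand = rng.normal(0.0, 1.0, (50, 64))
fist = rng.normal(5.0, 1.0, (50, 64))
subspaces = {
    "open_hand": fit_subspace(open_hand, 5),
    "fist": fit_subspace(fist, 5),
}
print(classify(fist[0], subspaces))  # expected: fist
```

In practice the transformation invariance the thesis mentions would come from how the training vocabulary is generated and normalised, not from the projection step itself.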

    Human gesture classification by brute-force machine learning for exergaming in physiotherapy

    In this paper, a novel approach for human gesture classification on skeletal data is proposed for the application of exergaming in physiotherapy. Unlike existing methods, we propose to use a general classifier such as Random Forests to recognize dynamic gestures. The temporal dimension is handled afterwards by majority voting in a sliding window over the consecutive predictions of the classifier. The gestures can have partially similar postures, such that the classifier will decide on the dissimilar postures. This brute-force classification strategy is permitted because dynamic human gestures show sufficiently dissimilar postures. Online continuous human gesture recognition can classify dynamic gestures at an early stage, which is a crucial advantage when controlling a game by automatic gesture recognition. Also, ground truth can be easily obtained, since all postures in a gesture get the same label, without any discretization into consecutive postures. This way, new gestures can be easily added, which is advantageous in adaptive game development. We evaluate our strategy by a leave-one-subject-out cross-validation on a self-captured stealth game gesture dataset and the publicly available Microsoft Research Cambridge-12 Kinect (MSRC-12) dataset. On the first dataset we achieve an accuracy rate of 96.72%. Furthermore, we show that Random Forests perform better than Support Vector Machines. On the second dataset we achieve an accuracy rate of 98.37%, which is on average 3.57% better than existing methods.
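The two-stage strategy described above (a per-posture classifier, then majority voting over a sliding window of predictions) can be sketched briefly. This is a minimal illustration under assumptions, not the paper's code: the gesture labels, feature dimension, and window size are made up.

```python
# Sketch: per-frame Random Forest predictions smoothed by majority voting
# in a sliding window. Toy skeletal data; labels and sizes are illustrative.
from collections import Counter, deque

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Toy "postures": 100 frames x 15 joint features per gesture class.
X = np.vstack([rng.normal(c, 0.5, (100, 15)) for c in (0.0, 2.0)])
y = np.repeat(["wave", "kick"], 100)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def smooth_predictions(frames, window=7):
    """Majority vote over the last `window` per-frame predictions."""
    recent = deque(maxlen=window)
    smoothed = []
    for frame in frames:
        recent.append(clf.predict(frame.reshape(1, -1))[0])
        smoothed.append(Counter(recent).most_common(1)[0][0])
    return smoothed

stream = rng.normal(2.0, 0.5, (10, 15))  # ten consecutive "kick" postures
print(smooth_predictions(stream)[-1])  # expected: kick
```

The voting window is what gives the early-classification property the abstract mentions: a stable label emerges after only a few frames, well before the gesture completes.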

    A real-time human-robot interaction system based on gestures for assistive scenarios

    Natural and intuitive human interaction with robotic systems is key to developing robots that assist people in an easy and effective way. In this paper, a Human Robot Interaction (HRI) system able to recognize gestures usually employed in human non-verbal communication is introduced, and an in-depth study of its usability is performed. The system deals with dynamic gestures such as waving or nodding, which are recognized using a Dynamic Time Warping approach based on gesture-specific features computed from depth maps. A static gesture consisting of pointing at an object is also recognized. The pointed location is then estimated in order to detect candidate objects the user may be referring to. When the pointed object is unclear to the robot, a disambiguation procedure by means of either a verbal or gestural dialogue is performed. This skill would allow the robot to pick up an object on behalf of a user who has difficulty doing so themselves. The overall system, which is composed of NAO and Wifibot robots, a Kinect v2 sensor and two laptops, is first evaluated in a structured lab setup. Then, a broad set of user tests was completed, allowing correct performance to be assessed in terms of recognition rates, ease of use and response times.
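Dynamic Time Warping, the core matching technique named in this abstract, can be shown in a few lines: compare an observed feature sequence against stored gesture templates and pick the nearest one under the DTW distance. This is a generic textbook sketch, not the paper's implementation; the 1-D toy signals and gesture names are assumptions.

```python
# Minimal DTW sketch: nearest-template gesture classification.
# Templates and signals are 1-D toys standing in for depth-map features.
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping moves.
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]

def classify(sequence, templates):
    """Nearest-template classification under DTW."""
    return min(templates, key=lambda g: dtw_distance(sequence, templates[g]))

templates = {
    "wave": np.sin(np.linspace(0, 4 * np.pi, 40)),  # oscillating motion
    "nod": np.concatenate([np.linspace(0, 1, 20), np.linspace(1, 0, 20)]),
}
observed = np.sin(np.linspace(0, 4 * np.pi, 55))  # a slower wave
print(classify(observed, templates))  # expected: wave
```

The point of DTW here is exactly what the toy shows: the observed gesture is performed at a different speed than the template, and the warping absorbs that temporal mismatch.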

    Dynamic Hand Gesture Classification Based on Radar Micro-Doppler Signatures

    Dynamic hand gesture recognition is of great importance for human-computer interaction. In this paper, we present a method to discriminate four kinds of dynamic hand gestures (snapping fingers, flipping fingers, hand rotation and calling) using a radar micro-Doppler sensor. Two micro-Doppler features are extracted from the time-frequency spectrum and a support vector machine is used to classify these four kinds of gestures. The experimental results on measured data demonstrate that the proposed method can produce a classification accuracy higher than 88.56%.

    Effect of sparsity-aware time–frequency analysis on dynamic hand gesture classification with radar micro-Doppler signatures

    Dynamic hand gesture recognition is of great importance in human-computer interaction. In this study, the authors investigate the effect of sparsity-driven time-frequency analysis on hand gesture classification. The time-frequency spectrogram is first obtained by sparsity-driven time-frequency analysis. Then three empirical micro-Doppler features are extracted from the time-frequency spectrogram and a support vector machine is used to classify six kinds of dynamic hand gestures. The experimental results on measured data demonstrate that, compared to traditional time-frequency analysis techniques, sparsity-driven time-frequency analysis provides improved accuracy and robustness in dynamic hand gesture classification.
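Both radar abstracts above share the same pipeline: compute a time-frequency spectrogram, extract a few empirical micro-Doppler features, and classify with an SVM. A hedged sketch of that pipeline follows; the synthetic signals, the specific features (Doppler centroid and bandwidth) and the gesture labels are illustrative assumptions, not taken from either paper.

```python
# Sketch of a spectrogram-feature + SVM pipeline on synthetic signals.
# Feature choices and gesture labels are illustrative assumptions.
import numpy as np
from scipy.signal import spectrogram
from sklearn.svm import SVC

rng = np.random.default_rng(2)

def make_gesture(freq):
    """Synthetic micro-Doppler return: a sinusoidally modulated tone."""
    t = np.linspace(0, 1, 1024)
    return np.sin(2 * np.pi * freq * t * (1 + 0.3 * np.sin(2 * np.pi * 3 * t)))

def features(signal):
    """Two empirical features: Doppler centroid and bandwidth."""
    f, _, sxx = spectrogram(signal, fs=1024, nperseg=128)
    power = sxx.mean(axis=1)
    centroid = (f * power).sum() / power.sum()
    bandwidth = np.sqrt(((f - centroid) ** 2 * power).sum() / power.sum())
    return [centroid, bandwidth]

# Three noisy training examples per (made-up) gesture class.
X = [features(make_gesture(f) + rng.normal(0, 0.1, 1024))
     for f in (50, 50, 50, 200, 200, 200)]
y = ["snap", "snap", "snap", "rotate", "rotate", "rotate"]
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict([features(make_gesture(200))])[0])
```

The sparsity-driven variant studied in the second paper would replace the plain `spectrogram` call with a sparse time-frequency reconstruction; the feature extraction and SVM stages are unchanged.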