A robust method for VR-based hand gesture recognition using density-based CNN
Many VR-based medical applications have been developed to help patients with reduced mobility caused by accidents, diseases, or other injuries perform physical therapy efficiently. VR-based applications are considered a more effective aid for individual physical therapy because of their low-cost equipment, their flexibility in time and space, and the reduced need for assistance from a physical therapist. A challenge in developing VR-based physical therapy is understanding body-part movement accurately and quickly. We propose a robust pipeline for understanding hand motion accurately. We retrieved our data from movement sensors such as the HTC Vive and Leap Motion. Given a sequence of palm positions, we represent our data as binary 2D images of the gesture shape. Our dataset consists of 14 kinds of hand gestures recommended by a physiotherapist. Given 33 3D points mapped into binary images as input, we trained our proposed density-based CNN. Our CNN model accounts for the characteristics of this input: many blank pixel blocks, single-pixel-thick shapes, and binary values. With pyramid kernel sizes applied in the feature-extraction part and a classification layer using softmax as the loss function, the model achieved 97.7% accuracy.
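The mapping from a 3D palm trajectory to a binary gesture image could be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: it assumes the depth axis is dropped, the trajectory is normalized into the image grid, and consecutive samples are connected to form the single-pixel-thick stroke the abstract describes.

```python
import numpy as np

def points_to_binary_image(points_3d, size=64):
    """Map a sequence of 3D palm positions to a binary 2D gesture image.

    Hypothetical sketch: projects onto the x-y plane (assumption),
    normalizes the trajectory into a size x size grid, and rasterizes
    segments between consecutive samples as a thin binary stroke.
    """
    pts = np.asarray(points_3d, dtype=float)[:, :2]  # drop the depth axis
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    span = np.where(maxs - mins == 0, 1.0, maxs - mins)  # avoid divide-by-zero
    grid = ((pts - mins) / span * (size - 1)).astype(int)

    img = np.zeros((size, size), dtype=np.uint8)
    for (x0, y0), (x1, y1) in zip(grid[:-1], grid[1:]):
        # rasterize each segment by linear interpolation between endpoints
        n = max(abs(x1 - x0), abs(y1 - y0)) + 1
        xs = np.linspace(x0, x1, n).round().astype(int)
        ys = np.linspace(y0, y1, n).round().astype(int)
        img[ys, xs] = 1
    return img
```

The resulting image has the "many blank block pixels" and "single-pixel thickness" properties the abstract attributes to the CNN's input.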
Dynamic Hand Gesture Recognition of Arabic Sign Language using Hand Motion Trajectory Features
In this paper we propose a system for dynamic hand gesture recognition of Arabic Sign Language. The proposed system takes a dynamic gesture video stream as input, extracts the hand area, and computes hand motion features, then uses these features to recognize the gesture. The system identifies the hand blob using the YCbCr color space to detect the skin color of the hand, and classifies the input pattern based on a correlation-coefficient matching technique. The significance of the system is its simplicity and its ability to recognize gestures independent of the skin color and physical structure of the performers. Experimental results show a gesture recognition rate of 85.6% for 20 different signs performed by 8 different signers.
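YCbCr-based skin detection of the kind described above could be sketched as below. The conversion coefficients follow the standard BT.601 RGB-to-YCbCr transform; the Cb/Cr threshold ranges are common defaults from the skin-detection literature, not values taken from this paper.

```python
import numpy as np

def skin_mask_ycbcr(rgb):
    """Return a boolean skin mask for an RGB uint8 image.

    Converts to the Cb/Cr chrominance channels (BT.601 coefficients)
    and thresholds them; the ranges Cb in [77, 127] and Cr in
    [133, 173] are typical literature defaults (assumption).
    """
    rgb = rgb.astype(float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)
```

Because the threshold operates on chrominance only, the luminance (Y) channel is ignored, which is what gives such methods some robustness to lighting and skin-tone variation.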
Study in User Preferred Pen Gestures for Controlling a Virtual Character
Controlling a virtual character with a pen input device is difficult. Pen input devices require freeform gestures, and users are not confined to a particular mapping of a key or button that is exactly repeatable. This is a problem because an intuitive motion gesture for one user might not be intuitive for another. In this paper, we explore user-preferred input gestures for character control through user experiments. Most previous pen input gesture sets are based on the preference of the developer; our goal is to find common pen gesture features for common commands through user experiments. For our experiment, we chose navigational motions that are common for controlling a character in a virtual world. Users were asked to make gestures for a set of navigational motions according to their intuition. We then analyzed the gesture data, outlined gesture design guidelines, and compared the resulting gestures to those used in existing applications that use pen devices for character control.
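Comparing freeform pen gestures across users typically requires normalizing strokes first, since no two drawings have the same number of samples. A common preprocessing step (illustrative only; the study collected and analyzed gestures rather than prescribing a recognizer) is to resample each stroke to a fixed number of equidistant points:

```python
import math

def resample(stroke, n=32):
    """Resample a pen stroke (list of (x, y) points) to n points spaced
    equally along the path, so strokes from different users can be
    compared point-by-point. Illustrative sketch, not from the paper.
    """
    dists = [math.dist(a, b) for a, b in zip(stroke, stroke[1:])]
    interval = sum(dists) / (n - 1)  # target spacing along the path
    out = [stroke[0]]
    pts = list(stroke)
    acc = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval and d > 0:
            # interpolate a new point at the target spacing
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    if len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out
```

After resampling (and usually translation/scale normalization), gestures can be compared with simple point-wise distances, which is how common template-based pen-gesture recognizers operate.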