
    South African sign language dataset development and translation : a glove-based approach

    There has been a definite breakdown of communication between the hearing and the Deaf communities. This communication gap drastically affects many facets of a Deaf person’s life, including education, job opportunities and quality of life. Researchers have turned to technology to remedy this issue through Automatic Sign Language Translation. While there has been successful research around the world, this is not possible in South Africa, as no South African Sign Language (SASL) database is available. This research aims to develop a SASL static gesture database using a data glove, as the first step towards a comprehensive database that encapsulates the entire language. Because commercial data gloves are expensive, a low-cost data glove was developed as part of this research for the application of Automatic Sign Language Translation. The database and data glove were used together with neural networks to perform gesture classification, in order to evaluate the gesture data collected for the database. The project comprises three main sections: data glove development, database creation and gesture classification. The data glove was developed by critically reviewing the relevant literature, testing the sensors and then evaluating the overall glove for repeatability and reliability. The final prototype was used by five participants to collect 31 different static gestures in three scenarios, ranging from isolated gesture collection to continuous data collection. The data were cleaned and used to train a neural network for classification, and several training algorithms were compared to see which attained the highest classification accuracy. The data glove performed well, achieving results superior to some research and on par with other researchers’ results.
    The data glove achieved a repeatable angle resolution of 3.27 degrees with a standard deviation of 1.418 degrees, well within the 15-degree resolution specified for the research. The device remained low-cost: more than $100 cheaper than other custom research data gloves and hundreds of dollars cheaper than commercial data gloves. A database was created from five participants, comprising 1550 type 1 gestures, 465 type 2 gestures and 93 type 3 gestures. The Resilient Back-Propagation and Levenberg-Marquardt training algorithms were considered for the neural network; Levenberg-Marquardt achieved the superior classification accuracy of 99.61%, 77.42% and 81.72% on the type 1, type 2 and type 3 data respectively.
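    As a rough illustration of the classification stage described above, the following minimal sketch trains a one-hidden-layer network on synthetic stand-in data. All shapes, channel counts and the plain gradient-descent optimiser are assumptions; the thesis itself compares Resilient Back-Propagation and Levenberg-Marquardt, which are not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden, n_classes = 10, 32, 31    # assumed glove channels / 31 gestures

# Synthetic stand-in for the glove database (the real one used 5 participants).
X = rng.normal(size=(500, n_features))
y = rng.integers(0, n_classes, size=500)
Y = np.eye(n_classes)[y]                         # one-hot targets

W1 = rng.normal(scale=0.1, size=(n_features, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_classes)); b2 = np.zeros(n_classes)

for _ in range(200):
    H = np.tanh(X @ W1 + b1)                     # hidden activations
    logits = H @ W2 + b2
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)            # softmax probabilities
    G = (P - Y) / len(X)                         # cross-entropy gradient
    dW2 = H.T @ G; db2 = G.sum(axis=0)
    dH = G @ W2.T * (1 - H**2)                   # back-prop through tanh
    dW1 = X.T @ dH; db1 = dH.sum(axis=0)
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= 0.5 * g                             # plain gradient step

pred = np.argmax(np.tanh(X @ W1 + b1) @ W2 + b2, axis=1)
print("train accuracy:", (pred == y).mean())
```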

    Real-time Immersive human-computer interaction based on tracking and recognition of dynamic hand gestures

    With the fast-developing and ever-growing use of computer-based technologies, human-computer interaction (HCI) plays an increasingly pivotal role. In virtual reality (VR), HCI technologies provide not only a better understanding of three-dimensional shapes and spaces, but also sensory immersion and physical interaction. With hand-based HCI being a key modality for object manipulation and gesture-based communication, providing users with a natural, intuitive, effortless, precise, real-time method for HCI based on dynamic hand gestures is challenging: hand postures are formed by multiple joints with high degrees of freedom, hand movements are fast, with highly variable trajectories and rapid direction changes, and interaction between hands and objects in the virtual world demands precision. Presented in this thesis is the design and development of a novel real-time HCI system based on a unique combination of a pair of data gloves with fibre-optic curvature sensors to acquire finger joint angles, a hybrid inertial/ultrasonic tracking system to capture hand position and orientation, and a stereoscopic display system to provide immersive visual feedback. The potential and effectiveness of the proposed system are demonstrated through a number of applications, namely hand gesture based virtual object manipulation and visualisation, hand gesture based direct sign writing, and hand gesture based finger spelling. For virtual object manipulation and visualisation, the system is shown to allow a user to select, translate, rotate, scale, release and visualise virtual objects (presented using graphics and volume data) in three-dimensional space using natural hand gestures in real time.
    For direct sign writing, the system is shown to immediately display the corresponding SignWriting symbols signed by a user using three different signing sequences and a range of complex hand gestures. These consist of various combinations of hand postures (with each finger open, half-bent, closed, adducted or abducted), eight hand orientations in horizontal/vertical planes, three palm-facing directions, and various hand movements (which can have eight directions in horizontal/vertical planes, and can be repetitive, straight/curved, clockwise/anti-clockwise). The development includes a special visual interface giving not only a stereoscopic view of hand gestures and movements, but also structured visual feedback for each stage of the signing sequence. An excellent basis is therefore formed for developing a full HCI based on all human gestures by integrating the proposed system with facial expression and body posture recognition methods. Furthermore, for finger spelling, the system is shown to recognise five vowels signed by two hands using British Sign Language in real time.

    Sign language recognition using wearable electronics: Implementing K-nearest neighbors with dynamic time warping and convolutional neural network algorithms

    We propose a sign language recognition system based on wearable electronics and two different classification algorithms. The wearable electronics consist of a sensory glove and inertial measurement units that capture finger, wrist, and arm/forearm movements. The classifiers were k-Nearest Neighbors with Dynamic Time Warping (a non-parametric method) and Convolutional Neural Networks (a parametric method). Ten sign-words from the Italian Sign Language were considered: cose, grazie, maestra, together with words with international meaning such as google, internet, jogging, pizza, television, twitter, and ciao. The signs were repeated one hundred times each by seven people, five males and two females, aged 29–54 years (SD 10.34). The classifiers achieved an accuracy of 96.6% ± 3.4 (SD) for k-Nearest Neighbors with Dynamic Time Warping and 98.0% ± 2.0 (SD) for the Convolutional Neural Networks. Our wearable electronics were among the most complete, and the classifiers performed at the top when compared with other relevant works reported in the literature.
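    The non-parametric branch of the system, k-Nearest Neighbors over Dynamic Time Warping distances, can be sketched as follows. The 1-D toy sequences and labels are hypothetical; the real system works on multi-channel glove and IMU streams.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def knn_dtw_predict(train_seqs, train_labels, query, k=1):
    """Label the query with a majority vote among its k DTW-nearest neighbours."""
    dists = [dtw_distance(query, s) for s in train_seqs]
    nearest = np.argsort(dists)[:k]
    labels = [train_labels[i] for i in nearest]
    return max(set(labels), key=labels.count)

# Toy usage: two "sign-word" templates with different temporal shapes.
train = [np.sin(np.linspace(0, 3, 40)), np.cos(np.linspace(0, 3, 50))]
labels = ["ciao", "pizza"]
print(knn_dtw_predict(train, labels, np.sin(np.linspace(0, 3, 45))))  # → ciao
```

    DTW absorbs the different sequence lengths (40 vs. 45 samples), which is why it suits signs performed at varying speeds.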

    STUDY OF HAND GESTURE RECOGNITION AND CLASSIFICATION

    The aim is to recognize different hand gestures and achieve efficient classification of the static and dynamic hand movements used for communication. Static and dynamic hand movements are first captured using gesture recognition devices including the Kinect device, hand movement sensors, connecting electrodes, and accelerometers. These gestures are processed using hand gesture recognition algorithms such as multivariate fuzzy decision trees, hidden Markov models (HMM), dynamic time warping, latent regression forests, support vector machines, and surface electromyography. Hand movements made by both single and double hands are captured by gesture capture devices under proper illumination conditions. The captured gestures are processed for occlusions and close finger interactions to identify the right gesture, classify it, and ignore intermittent gestures. Real-time hand gesture recognition needs robust algorithms such as HMM to detect only the intended gesture. Classified gestures are then compared for effectiveness against standard training and test datasets such as sign language alphabets and the KTH dataset. Hand gesture recognition plays a very important role in applications such as sign language recognition, robotics, television control, rehabilitation, and music orchestration.
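    As one example of the listed approaches, an HMM-based recogniser scores an observation sequence against per-gesture models and picks the best-scoring one. A minimal sketch with hypothetical toy models (two states, a three-symbol alphabet of quantised hand poses):

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of obs under an HMM (pi: initial, A: transition,
    B: emission), via the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum()); alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum()); alpha /= alpha.sum()
    return loglik

# Two toy 2-state HMMs sharing initial and transition probabilities.
pi = np.array([0.9, 0.1])
A  = np.array([[0.8, 0.2], [0.2, 0.8]])
B_wave  = np.array([[0.7, 0.2, 0.1], [0.1, 0.2, 0.7]])  # favours symbols 0 and 2
B_point = np.array([[0.1, 0.8, 0.1], [0.1, 0.8, 0.1]])  # favours symbol 1

obs = [0, 0, 2, 2, 0]                       # resembles the "wave" pattern
scores = {"wave": forward_loglik(obs, pi, A, B_wave),
          "point": forward_loglik(obs, pi, A, B_point)}
print(max(scores, key=scores.get))          # → wave
```

    In a real recogniser the emission and transition matrices would be trained per gesture (e.g. with Baum-Welch), and a score threshold would reject unintended gestures.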

    Designing 3D scenarios and interaction tasks for immersive environments

    In today’s world, immersive reality, such as virtual and mixed reality, is one of the most attractive research fields. Virtual Reality (VR) has huge potential in scientific and educational domains by providing users with real-time interaction and manipulation. The key concern in immersive technologies is to provide a high level of immersive sensation to the user, which is one of the main challenges in this field. Wearable technologies play a key role in enhancing the immersive sensation and the degree of embodiment in virtual and mixed reality interaction tasks. This project report presents an application study in which the user interacts with virtual objects, such as grabbing objects and opening or closing doors and drawers, while wearing a sensory cyberglove developed in our lab (Cyberglove-HT). Furthermore, it presents a methodology for inertial measurement unit (IMU)-based gesture recognition. The interaction tasks and 3D immersive scenarios were designed in Unity 3D. Additionally, we developed inertial sensor-based gesture recognition by employing a Long Short-Term Memory (LSTM) network. To distinguish the effect of wearable technologies on the user experience in immersive environments, we conducted an experimental study comparing the Cyberglove-HT to standard VR controllers (HTC Vive Controller). The quantitative and subjective results indicate that we were able to enhance the immersive sensation and self-embodiment with the Cyberglove-HT. A publication [1] resulted from this work, which was developed in the framework of the R&D project Human Tracking and Perception in Dynamic Immersive Rooms (HTPDI).
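    The LSTM-based recognition mentioned above can be illustrated with a minimal numpy sketch of a single LSTM layer consuming a window of 6-axis IMU samples. All sizes and the random weights are placeholders; a trained network such as the one described would use learned parameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; W, U, b stack the input/forget/cell/output gates."""
    z = W @ x + U @ h + b
    n = len(h)
    i, f = sigmoid(z[:n]), sigmoid(z[n:2 * n])
    g, o = np.tanh(z[2 * n:3 * n]), sigmoid(z[3 * n:])
    c = f * c + i * g                        # updated cell state
    h = o * np.tanh(c)                       # updated hidden state
    return h, c

rng = np.random.default_rng(1)
n_in, n_hid, n_gestures = 6, 16, 8           # assumed: accel+gyro axes, sizes
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
W_out = rng.normal(scale=0.1, size=(n_gestures, n_hid))

window = rng.normal(size=(50, n_in))         # 50 IMU samples (placeholder data)
h = c = np.zeros(n_hid)
for x in window:
    h, c = lstm_step(x, h, c, W, U, b)
pred = int(np.argmax(W_out @ h))             # gesture index from final state
print("predicted gesture:", pred)
```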

    Assessment of Hand Gestures Using Wearable Sensors and Fuzzy Logic

    Hand dexterity and motor control are critical in our everyday lives because a significant portion of the daily motions we perform are with our hands and require some degree of repetition and skill. Therefore, development of technologies for hand and extremity rehabilitation is a significant area of research that will directly help patients recovering from hand debilities sustained from causes ranging from stroke and Parkinson’s disease to trauma and common injuries. Cyclic activity recognition and assessment is appropriate for hand and extremity rehabilitation because a majority of our essential motions are cyclic in nature. For a patient on the road to regaining functional independence with daily skills, improvement in cyclic motions constitutes an important and quantifiable rehabilitation goal. However, challenges with hand rehabilitation sensor technologies prevent acquisition of long-term, continuous, accurate and actionable motion data. These challenges include complicated and uncomfortable system assemblies and a lack of integration with consumer electronics for easy readout. In our research, we have developed a glove-based system in which inertial measurement unit (IMU) sensors are used synergistically with flexible sensors to minimize the number of IMU sensors. The classification capability of our system is improved by a fuzzy logic data analysis algorithm. We tested a total of 25 different subjects using a glove-based apparatus to gather data on two-dimensional motions with one accelerometer and on three-dimensional motions with one accelerometer and two flexible sensors. Our research provides an approach with the potential to combine activity recognition and activity assessment using simple sensor systems to help patients recover and improve their overall quality of life.
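    To illustrate the fuzzy-logic assessment idea, the sketch below grades one motion cycle with triangular membership functions. The feature (cycle duration in seconds) and the category bounds are hypothetical, not taken from the study.

```python
def tri(x, a, b, c):
    """Triangular membership function: peaks at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def grade_cycle(duration):
    """Fuzzify one cycle duration and return the dominant linguistic label."""
    memberships = {
        "fast":   tri(duration, 0.0, 0.5, 1.0),   # hypothetical bounds (s)
        "normal": tri(duration, 0.5, 1.0, 1.5),
        "slow":   tri(duration, 1.0, 1.5, 2.5),
    }
    return max(memberships, key=memberships.get), memberships

label, mu = grade_cycle(0.9)
print(label, round(mu["normal"], 2))  # → normal 0.8
```

    A 0.9 s cycle is partly "fast" (membership 0.2) and mostly "normal" (0.8); a full assessment system would aggregate such memberships over many cycles and features before defuzzifying to a rehabilitation score.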