10,137 research outputs found

    STATIC HAND GESTURE RECOGNITION IN THE INDONESIAN SIGN LANGUAGE SYSTEM BASED ON BACKPROPAGATION NEURAL NETWORKS

    The main objective of this research is pattern recognition of static hand gestures in Indonesian sign language. Pattern recognition of static hand gestures from images comprises three phases: 1) segmentation of the image to isolate the hands and face, 2) feature extraction, and 3) pattern classification. This research used image data for 15 classes of static words. Segmentation is performed in HSV colour space with a threshold filter based on skin colour. Feature extraction is performed with a level-2 Haar wavelet decomposition filter. Classification is done with a backpropagation neural network with 4096 neurons in the input layer, 75 neurons in the hidden layer and 15 neurons in the output layer. The system was tested on 225 validation samples and achieved an accuracy of 69%. Keywords: artificial neural networks, feature extraction, hand gesture, segmentation, static
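
    The sketch below (not the authors' code) illustrates the pipeline this abstract describes, assuming OpenCV, PyWavelets and scikit-learn. The HSV threshold values and the 256x256 input size are assumptions; the input size is chosen so that the level-2 Haar approximation band flattens to the stated 4096 input neurons.

    ```python
    # Hedged sketch: HSV skin segmentation, level-2 Haar wavelet features,
    # and a 4096-75-15 backpropagation MLP, as outlined in the abstract.
    # Threshold ranges and the 256x256 input size are illustrative assumptions.
    import cv2
    import numpy as np
    import pywt
    from sklearn.neural_network import MLPClassifier

    def segment_skin(bgr_image):
        """Keep skin-coloured pixels via HSV thresholding (threshold values assumed)."""
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        lower, upper = np.array([0, 40, 60]), np.array([25, 255, 255])
        mask = cv2.inRange(hsv, lower, upper)
        return cv2.bitwise_and(bgr_image, bgr_image, mask=mask)

    def haar_features(bgr_image):
        """Level-2 Haar approximation of a 256x256 grayscale image -> 4096-D vector."""
        gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
        gray = cv2.resize(gray, (256, 256)).astype(np.float32) / 255.0
        approx = pywt.wavedec2(gray, "haar", level=2)[0]   # 64x64 approximation band
        return approx.ravel()                              # 4096 features

    # 4096-75-15 architecture from the abstract; solver and epoch count are assumptions.
    clf = MLPClassifier(hidden_layer_sizes=(75,), activation="logistic",
                        solver="sgd", max_iter=500)
    # X: stacked feature vectors of segmented training images, y: 15 word labels.
    # clf.fit(X, y); clf.predict(haar_features(segment_skin(test_img)).reshape(1, -1))
    ```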

    LATVIAN SIGN LANGUAGE RECOGNITION CLASSIFICATION POSSIBILITIES

    There is a lack of an automated sign language recognition system in Latvia, while many other countries are already equipped with one. The Latvian deaf community needs such a system, which would allow people with special needs to communicate more easily in governmental and public places. The aim of this paper is to recognize the Latvian sign language alphabet using a classification approach with artificial neural networks, as a first step in developing an integrated Latvian Sign Language recognition system. Communication in daily life is generally vocal, but body language has its own significance and many areas of application: sign languages are used for various purposes, and for people who are deaf or unable to speak they play an essential role. Gestures are the very first form of communication. The paper presents sign language recognition possibilities using the centre of gravity method. This work motivated us to continue with further research on hand gesture classification and sign clustering
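
    A minimal sketch of what a centre-of-gravity feature might look like is given below; the radial-histogram descriptor and bin count are illustrative assumptions, not the paper's exact formulation.

    ```python
    # Hedged sketch of a centre-of-gravity feature for a binarized hand image,
    # in the spirit of the method named above. The radial histogram is an assumed
    # descriptor, not necessarily the paper's own.
    import cv2
    import numpy as np

    def centre_of_gravity(mask):
        """Centroid (cx, cy) of a non-empty binary hand mask via image moments."""
        m = cv2.moments(mask, binaryImage=True)
        return m["m10"] / m["m00"], m["m01"] / m["m00"]

    def radial_feature(mask, bins=36):
        """Histogram of foreground-pixel angles around the centroid (assumed descriptor)."""
        cx, cy = centre_of_gravity(mask)
        ys, xs = np.nonzero(mask)
        angles = np.arctan2(ys - cy, xs - cx)
        hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
        return hist / max(hist.sum(), 1)   # normalised so the feature is scale-invariant

    # Fixed-length vectors like this could then be fed to an ANN classifier,
    # e.g. sklearn.neural_network.MLPClassifier, as the abstract suggests.
    ```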

    Designing 2D Interfaces For 3D Gesture Retrieval Utilizing Deep Learning

    Gesture retrieval can be defined as the process of retrieving the correct meaning of a hand movement from a pre-assembled gesture dataset. The purpose of the research discussed here is to design and implement a gesture interface system that facilitates retrieval over an American Sign Language gesture set using a mobile device. The principal challenge is the normalization of 2D gestures generated from the mobile device interface and 3D gestures captured from video samples into a common data structure that can be consumed by deep learning networks. This thesis covers the convolutional neural networks and autoencoders used to transform 2D gestures into the correct form before they are classified by a convolutional neural network. The architecture and implementation of the front-end and back-end systems, and each of their respective responsibilities, are discussed. Lastly, the thesis presents the experimental results, breaks down the final classification accuracy of 83%, and explains how this work could be further improved by using depth-based videos for the 3D data
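
    The sketch below illustrates the two-stage idea only in outline, assuming Keras/TensorFlow; the 64x64 input, layer sizes, 26-class output and the training flow in the comments are assumptions rather than the thesis' actual design.

    ```python
    # Hedged sketch of the two-stage idea in this abstract: a convolutional
    # autoencoder maps rasterised 2D gestures into a common representation,
    # and a separate CNN classifies the result. All sizes are assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    def build_autoencoder(shape=(64, 64, 1)):
        """Small convolutional autoencoder that reconstructs a 64x64 gesture image."""
        inp = layers.Input(shape=shape)
        x = layers.Conv2D(16, 3, activation="relu", padding="same")(inp)
        x = layers.MaxPooling2D(2)(x)
        x = layers.Conv2D(8, 3, activation="relu", padding="same")(x)
        x = layers.UpSampling2D(2)(x)
        out = layers.Conv2D(1, 3, activation="sigmoid", padding="same")(x)
        ae = models.Model(inp, out)
        ae.compile(optimizer="adam", loss="mse")
        return ae

    def build_classifier(shape=(64, 64, 1), n_classes=26):
        """Plain CNN classifier applied to the autoencoder's normalized output."""
        return models.Sequential([
            layers.Input(shape=shape),
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(2),
            layers.Conv2D(64, 3, activation="relu"),
            layers.MaxPooling2D(2),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dense(n_classes, activation="softmax"),
        ])

    # Assumed training flow: fit the autoencoder to map 2D touch gestures toward
    # matching renderings of the 3D video gestures, then train the classifier on
    # the autoencoder's outputs.
    # ae = build_autoencoder(); ae.fit(x_2d, x_3d_target, epochs=20)
    # clf = build_classifier(); clf.compile("adam", "sparse_categorical_crossentropy")
    # clf.fit(ae.predict(x_2d), labels, epochs=20)
    ```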

    South African sign language dataset development and translation : a glove-based approach

    There is a definite breakdown of communication between the hearing and the Deaf communities. This communication gap drastically affects many facets of a Deaf person's life, including education, job opportunities and quality of life. Researchers have turned to technology to remedy this issue through automatic sign language translation. While there has been successful research around the world, this is not yet possible in South Africa, as no South African Sign Language (SASL) database is available. This research aims to develop a SASL static gesture database using a data glove, as a first step towards a comprehensive database that encapsulates the entire language. Because commercial data gloves are expensive, a low-cost data glove was developed as part of this research for the application of automatic sign language translation. The database and data glove are used together with neural networks to perform gesture classification, in order to evaluate the gesture data collected for the database.

    The project is broken down into three main sections: data glove development, database creation and gesture classification. The data glove was developed by critically reviewing the relevant literature, testing the sensors and then evaluating the overall glove for repeatability and reliability. The final prototype was constructed, and five participants were used to collect 31 different static gestures in three scenarios, ranging from isolated gesture collection to continuous data collection. This data was cleaned and used to train a neural network for classification, and several training algorithms were compared to see which attained the highest classification accuracy. The data glove performed well, achieving results superior to some research and on par with other researchers' results: a repeatable angular resolution of 3.27 degrees with a standard deviation of 1.418 degrees, far finer than the 15-degree resolution required for this research. The device remained low-cost, more than $100 cheaper than other custom research data gloves and hundreds of dollars cheaper than commercial data gloves. A database was created from five participants, comprising 1550 type 1 gestures, 465 type 2 gestures and 93 type 3 gestures. The Resilient Backpropagation and Levenberg-Marquardt training algorithms were considered for the neural network; Levenberg-Marquardt achieved the superior classification accuracy of 99.61%, 77.42% and 81.72% on the type 1, type 2 and type 3 data respectively
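
    A hedged sketch of the glove-to-classifier path described above is given below; the serial port, 10-sensor layout, linear calibration and the use of scikit-learn (rather than the Resilient Backpropagation or Levenberg-Marquardt training used in the research) are all assumptions for illustration.

    ```python
    # Hedged sketch: raw flex-sensor readings arrive over serial, are mapped to
    # joint angles by a per-sensor linear calibration, and the resulting vector
    # is classified by a small neural network. Port, sensor count, calibration
    # and classifier choice are assumptions, not the thesis' implementation.
    import numpy as np
    import serial                                   # pyserial
    from sklearn.neural_network import MLPClassifier

    N_SENSORS = 10                                  # assumed: two flex sensors per finger

    def read_sample(port="/dev/ttyUSB0", baud=115200):
        """Read one comma-separated line of raw ADC values from the glove."""
        with serial.Serial(port, baud, timeout=1) as ser:
            line = ser.readline().decode("ascii", errors="ignore").strip()
        return np.array([float(v) for v in line.split(",")][:N_SENSORS])

    def to_angles(raw, flat, bent):
        """Linear calibration: map each raw reading onto a 0-90 degree bend."""
        return 90.0 * (raw - flat) / (bent - flat)

    # flat/bent are per-sensor readings captured with the hand open and fully closed.
    clf = MLPClassifier(hidden_layer_sizes=(40,), max_iter=1000)
    # X: calibrated angle vectors for the 31 static gestures, y: gesture labels.
    # clf.fit(X, y); clf.predict(to_angles(read_sample(), flat, bent).reshape(1, -1))
    ```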

    Gesture and sign language recognition with temporal residual networks
