
    An integrated sign language recognition system

    Doctor Educationis

    Research has shown that five parameters are required to recognize any sign language gesture: hand shape, location, orientation and motion, as well as facial expressions. The South African Sign Language (SASL) research group at the University of the Western Cape has created systems to recognize sign language gestures using single parameters. Using a single parameter can cause ambiguities between similarly signed gestures, restricting the possible vocabulary size. This research pioneers work at the group towards combining multiple parameters to achieve a larger recognition vocabulary. The proposed methodology combines hand location and hand shape recognition into a single combined recognition system. The system is shown to recognize a large vocabulary of 50 signs at a high average accuracy of 74.1%. This vocabulary is much larger than that of existing SASL recognition systems, and the system achieves a higher accuracy than these systems in spite of the larger vocabulary. It is also shown to be highly robust to variations in test subjects such as skin colour, gender and body dimension. Furthermore, this work pioneers research at the group towards continuously recognizing signs from a video stream, whereas existing systems recognized a single sign at a time. To this end, a highly accurate continuous gesture segmentation strategy is proposed and shown to accurately recognize sentences consisting of five isolated SASL gestures.
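The idea of combining hand location and hand shape into one recognition system can be sketched as a simple late fusion of two per-parameter classifiers. The sign lexicon, parameter names and probabilities below are purely illustrative assumptions, not taken from the thesis:

```python
# Illustrative sketch of combining two per-parameter recognisers.
# The lexicon and all names here are hypothetical examples.

# Each sign is defined by a (hand shape, hand location) pair.
LEXICON = {
    ("flat", "chest"): "please",
    ("flat", "chin"): "thank_you",
    ("fist", "chest"): "sorry",
}

def combine(shape_probs, location_probs):
    """Late fusion: score each sign by the product of the probabilities
    its shape and location received from the two recognisers, and
    return the highest-scoring sign."""
    best_sign, best_score = None, 0.0
    for (shape, location), sign in LEXICON.items():
        score = shape_probs.get(shape, 0.0) * location_probs.get(location, 0.0)
        if score > best_score:
            best_sign, best_score = sign, score
    return best_sign

# A shape-only recogniser cannot distinguish "please" from "thank_you"
# (both use a flat hand); adding location resolves the ambiguity.
sign = combine({"flat": 0.9, "fist": 0.1}, {"chin": 0.8, "chest": 0.2})
```

This illustrates why a single parameter restricts vocabulary size: any two signs sharing that parameter become indistinguishable, whereas the joint score separates them.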

    Autonomous facial expression recognition using the facial action coding system

    Magister Scientiae - MSc

    The South African Sign Language research group at the University of the Western Cape is in the process of creating a fully-fledged machine translation system to automatically translate between South African Sign Language and English. A major component of the system is the ability to accurately recognise facial expressions, which are used to convey emphasis, tone and mood within South African Sign Language sentences. Traditionally, facial expression recognition research has taken one of two paths: either recognising whole facial expressions, of which there are six (anger, disgust, fear, happiness, sadness and surprise, as well as the neutral expression); or recognising the fundamental components of facial expressions as defined by the Facial Action Coding System, in the form of Action Units. Action Units are directly related to the motion of specific muscles in the face, and combinations of them are used to form any facial expression. This research investigates enhanced recognition of whole facial expressions by means of a hybrid approach that combines traditional whole facial expression recognition with Action Unit recognition to achieve an enhanced classification approach.
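The Action Unit path can be sketched as mapping detected AU sets onto whole-expression prototypes. This is an illustrative sketch, not the thesis implementation; the AU combinations follow commonly cited FACS prototypes (e.g. happiness as AU6 + AU12), and the matching rule is a simple assumed heuristic:

```python
# Hypothetical sketch: classify a whole expression from detected FACS
# Action Units. Prototype AU sets follow commonly cited examples.

EXPRESSION_PROTOTYPES = {
    "happiness": {6, 12},       # cheek raiser + lip corner puller
    "surprise": {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "sadness": {1, 4, 15},      # inner/outer brow action + lip corner depressor
}

def classify_expression(detected_aus):
    """Return the expression whose prototype AUs best overlap the
    detected set (Jaccard similarity); 'neutral' if nothing matches."""
    best, best_score = "neutral", 0.0
    for expression, prototype in EXPRESSION_PROTOTYPES.items():
        overlap = len(prototype & detected_aus) / len(prototype | detected_aus)
        if overlap > best_score:
            best, best_score = expression, overlap
    return best

classify_expression({6, 12})  # matches the happiness prototype
```

A hybrid system, as the abstract describes, would combine a score like this with a whole-expression classifier rather than relying on either alone.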

    A comparison of machine learning techniques for hand shape recognition

    Magister Scientiae - MSc

    There are five fundamental parameters that characterize any sign language gesture: hand shape, orientation, motion, location and facial expressions. The SASL group at the University of the Western Cape has created systems to recognize each of these parameters in an input video stream. Most of these systems make use of the Support Vector Machine technique for the classification of data due to its high accuracy. It is, however, unknown how other machine learning techniques compare to Support Vector Machines in the recognition of each of these parameters. This research lays the foundation for determining the optimum machine learning technique for each parameter by comparing Support Vector Machines to Artificial Neural Networks and Random Forests in the context of South African Sign Language hand shape recognition. Li, a previous researcher in the SASL group, created a state-of-the-art hand shape recognition system that uses Support Vector Machines to classify hand shapes. This research re-implements Li's feature extraction procedure but investigates the use of Artificial Neural Networks and Random Forests in place of Support Vector Machines as a comparison. The machine learning techniques are optimized and trained to recognize ten SASL hand shapes and compared in terms of classification accuracy, training time, optimization time and classification time.
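The comparison protocol described here (same data, same metrics, different classifiers) can be sketched as a small evaluation harness. This is an assumed, self-contained illustration, not Li's system: the classifiers are assumed to follow the common fit/predict interface (as in scikit-learn), and a trivial majority-class baseline stands in so the sketch runs without external libraries:

```python
# Hypothetical harness comparing classifiers on the same split by
# accuracy, training time and classification time.

import time
from collections import Counter

class MajorityClassifier:
    """Stand-in classifier with the usual fit/predict interface;
    an SVM, ANN or Random Forest would slot in the same way."""
    def fit(self, X, y):
        self.label = Counter(y).most_common(1)[0][0]
    def predict(self, X):
        return [self.label] * len(X)

def evaluate(clf, X_train, y_train, X_test, y_test):
    t0 = time.perf_counter()
    clf.fit(X_train, y_train)
    train_time = time.perf_counter() - t0
    t0 = time.perf_counter()
    predictions = clf.predict(X_test)
    classify_time = time.perf_counter() - t0
    accuracy = sum(p == t for p, t in zip(predictions, y_test)) / len(y_test)
    return {"accuracy": accuracy, "train_s": train_time, "classify_s": classify_time}

# Toy feature vectors standing in for extracted hand shape features.
X_train, y_train = [[0.1], [0.2], [0.9]], ["A", "A", "B"]
X_test, y_test = [[0.15], [0.85]], ["A", "B"]
report = evaluate(MajorityClassifier(), X_train, y_train, X_test, y_test)
```

Running each candidate technique through the same `evaluate` call is what makes the accuracy and timing figures directly comparable.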

    South African Sign Language Hand Shape and Orientation Recognition on Mobile Devices Using Deep Learning

    Magister Scientiae - MSc

    In order to classify a South African Sign Language gesture, five fundamental parameters need to be considered: hand shape, hand orientation, hand motion, hand location and facial expressions. The research in this thesis utilises Deep Learning techniques, specifically Convolutional Neural Networks, to recognise hand shapes in various hand orientations. The research focuses on two of the five fundamental parameters, i.e., recognising six South African Sign Language hand shapes for each of five different hand orientations. These hand shape and orientation combinations are recognised by means of a video stream captured on a mobile device. The efficacy of Convolutional Neural Networks for gesture recognition is judged with respect to classification accuracy and classification speed in both a desktop and an embedded context. The research methodology employed was Design Science Research, a set of analytical techniques and perspectives for performing research in the fields of Information Systems and Computer Science. Design Science Research necessitates the design of an artefact and the analysis thereof in order to better understand its behaviour in that context. National Research Foundation (NRF)
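The core operation a Convolutional Neural Network repeats at every layer is a 2-D convolution over the image. The pure-Python sketch below is illustrative only (a real mobile implementation would use a deep learning framework); the image and kernel values are made-up examples:

```python
# Minimal sketch of the core CNN operation: a valid-mode 2-D
# convolution (no padding, stride 1), in pure Python for clarity.

def conv2d(image, kernel):
    """Slide the kernel over the image and sum elementwise products."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge kernel responds where intensity changes horizontally,
# the kind of low-level feature early CNN layers learn to detect.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
feature_map = conv2d(image, kernel)
```

Stacking many such learned kernels, with nonlinearities and pooling between them, is what lets a CNN map a hand image to one of the thirty shape-and-orientation classes.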

    Guidelines for and evaluation of the design of technology-supported lessons to teach basic programming principles to deaf and hard of hearing learners: a case study of a school for the deaf

    Deaf and Hard of Hearing (DHH) learners are part of a diverse population with unique learning challenges, strengths and needs. Learning material should be developed specifically for them to provide for their needs and capitalise on their strengths. These materials should include visual material and strategies as well as sign language. Furthermore, DHH learners have the same capacity for learning as hearing learners. However, in South Africa, DHH learners do not have adequate access to training in computer-related subjects, and therefore no material exists that has been developed specifically for DHH learners who want to learn a programming language. This research provides guidelines on the way technology-supported lessons can be designed to teach basic programming principles using the programming language Scratch, to DHH learners. Provision was made for the South African context where limited technology is available at most schools for DHH learners, but where most educators have access to Microsoft Office applications – specifically MS PowerPoint. Two goals were pursued. The primary goal of this research project was to determine the user experience (UX) of the participants (both learners and educators) during and after using and attending the technology-supported lessons. This was achieved through a case study. Four UX evaluation elements were evaluated in this project. They were: usability, accessibility, emotional user reaction, and hedonic aspects. Questionnaires, semi-structured interviews as well as participant-observation were used to determine the UX of participants. The UX evaluation provided sufficient evidence to claim that UX of participants was satisfactory, and therefore the guidelines that were developed to create technology-supported lessons to teach basic programming principles to DHH learners were appropriate. 
The secondary goal was to develop guidelines for the design of technology-supported lessons to teach programming to DHH learners, and to apply these guidelines to develop a high-fidelity, fully functional prototype – a set of technology-supported lessons. This was achieved through a prototype construction research strategy. The lessons consisted of two vocabulary lessons and one programming lesson. The words taught in the vocabulary lessons were either terms appearing in the interface of Scratch, or words needed in the explanation of programming principles and the Scratch context. The programming lesson (a PowerPoint slide show) was a guide for the educator to present the content in a logical way and not leave out important information. It used multimedia techniques (colour, pictures, animation) to explain programming concepts and to display the tasks to be completed to the learners, so that they could remember the sequence of the steps. Practical strategies were included in the guidelines to address the learning challenges DHH learners experience in the following areas: comprehension skills, application of knowledge and knowledge organisation, relational and individual-item orientations, metacognition, memory, and distractibility. The guidelines refer to techniques and principles that can be followed to design the interface and navigation tools of a technology-supported lesson; enhance communication with DHH learners and provide support for them to work independently; specify the educator's role and attitude when facilitating or presenting programming lessons; and structure a programming lesson.