Application of Machine Learning in Malaysia Sign Language Translation

Abstract

Sign language is a vital form of communication for individuals with hearing impairments and is used by deaf communities worldwide. In Malaysia, Malaysian Sign Language (MySL) is the primary sign language of the deaf community. However, sign languages have their own grammatical rules and structures that are unfamiliar to most hearing individuals, which can lead to misunderstandings and communication barriers. This project aimed to develop a sign language translator capable of recognizing 24 letters of the alphabet from hand gestures. The system uses a dataset of sign language alphabet images collected and trained using Teachable Machine. The translator's performance was evaluated by analyzing its response time. The results show that 18 of the 24 letters can be recognized within 5 seconds, with promising accuracy. By bridging the communication gap between deaf and hearing individuals, the findings of this study have substantial potential to improve interaction and understanding between the two communities. The sign language translator represents a meaningful step towards inclusive communication and improved accessibility for individuals with hearing impairments.
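
The sketch below illustrates, under assumptions not stated in the abstract, one plausible way such a system could load an image model exported from Teachable Machine in Keras format and measure the response time of a single classification. The file names (keras_model.h5, labels.txt, sample_sign.jpg), input size, and normalization are illustrative defaults of Teachable Machine exports, not the authors' actual code.

```python
# Minimal sketch (assumed workflow, not the authors' implementation):
# classify one hand-gesture image with a Teachable Machine Keras export
# and record the wall-clock response time.
import time

import numpy as np
from PIL import Image
import tensorflow as tf

MODEL_PATH = "keras_model.h5"   # typical Teachable Machine Keras export name (assumed)
LABELS_PATH = "labels.txt"      # class labels written alongside the export (assumed)
IMAGE_SIZE = (224, 224)         # Teachable Machine image models take 224x224 input


def load_labels(path):
    """Read one class label per line, e.g. '0 A', '1 B', ..."""
    with open(path, "r", encoding="utf-8") as f:
        return [line.strip().split(" ", 1)[-1] for line in f if line.strip()]


def preprocess(image_path):
    """Resize the image and scale pixel values to the [-1, 1] range."""
    img = Image.open(image_path).convert("RGB").resize(IMAGE_SIZE)
    arr = np.asarray(img, dtype=np.float32) / 127.5 - 1.0
    return np.expand_dims(arr, axis=0)  # add batch dimension -> (1, 224, 224, 3)


def classify(model, labels, image_path):
    """Return the predicted letter, its probability, and the response time in seconds."""
    batch = preprocess(image_path)
    start = time.perf_counter()
    probs = model.predict(batch, verbose=0)[0]
    elapsed = time.perf_counter() - start
    return labels[int(np.argmax(probs))], float(np.max(probs)), elapsed


if __name__ == "__main__":
    model = tf.keras.models.load_model(MODEL_PATH, compile=False)
    labels = load_labels(LABELS_PATH)
    letter, confidence, seconds = classify(model, labels, "sample_sign.jpg")
    print(f"Predicted letter: {letter} ({confidence:.2f}) in {seconds:.2f} s")
```

Timing the `predict` call in this way would correspond to the per-letter response-time metric reported in the abstract, although the study's exact measurement procedure is not described here.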
