    An Interactive American Sign Language Dictionary: Using Neural Networks to Evaluate ASL

    American Sign Language is the language of the Deaf and hard-of-hearing in the United States, and those who wish to communicate with this community need to learn ASL. In lieu of a real teacher, there is a wealth of online resources one can consult when learning sign language -- module-based learning platforms, video dictionaries, YouTube tutorials -- but nearly all of them are references in the English-to-ASL direction. This project is an attempt to tackle the much larger, much grander concept of an ASL-to-English video dictionary, one that would accept videos of isolated ASL signs and output their meaning in English. My thesis takes on a small chunk of this larger project -- it trains and tests a very simple Convolutional Neural Network on frames of 10 different ASL signs that undergo little preprocessing. As neural networks have not yet been applied, in their pure form and without the help of a Microsoft Kinect or Leap Motion Controller, to American Sign Language data, this project is the first of its kind.
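    To make the setup concrete, the following is a minimal sketch of what "a simple CNN classifying a frame into one of 10 sign classes" looks like as a forward pass. All shapes, filter counts, and weights here are illustrative assumptions, not details taken from the thesis; it shows only the conv -> ReLU -> pool -> dense -> softmax pipeline such a network computes.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def conv2d(image, kernels):
        """Valid convolution of a 2-D grayscale image with a bank of 2-D kernels."""
        h, w = image.shape
        n, kh, kw = kernels.shape
        out = np.zeros((n, h - kh + 1, w - kw + 1))
        for k in range(n):
            for i in range(h - kh + 1):
                for j in range(w - kw + 1):
                    out[k, i, j] = np.sum(image[i:i + kh, j:j + kw] * kernels[k])
        return out

    def max_pool(maps, size=2):
        """Non-overlapping max pooling over each feature map."""
        n, h, w = maps.shape
        h2, w2 = h // size, w // size
        return maps[:, :h2 * size, :w2 * size].reshape(n, h2, size, w2, size).max(axis=(2, 4))

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    # Hypothetical 32x32 frame, 8 random 3x3 filters, dense layer to 10 sign classes.
    frame = rng.random((32, 32))
    kernels = rng.standard_normal((8, 3, 3))
    features = max_pool(np.maximum(conv2d(frame, kernels), 0))  # conv -> ReLU -> pool
    flat = features.reshape(-1)                                 # 8 * 15 * 15 = 1800 features
    weights = rng.standard_normal((10, flat.size)) * 0.01
    probs = softmax(weights @ flat)                             # one probability per sign
    print(probs.shape)  # (10,)
    ```

    In a trained network the kernels and dense weights would be learned from labeled frames; here they are random, so the output is only a well-formed probability vector, not a meaningful prediction.
    
    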