16 research outputs found

    Signature Recognition and Verification with ANN Using Moment Invariant Method

    In this paper, we present an off-line signature recognition and verification system based on the moment invariant method and an ANN. Two separate neural networks are designed: one for signature recognition and another for verification (i.e., for detecting forgery). Both networks use a four-step process. The first step separates the signature from its background. The second step performs normalization and digitization of the original signature. Moment invariant vectors are obtained in the third step, and the last step implements signature recognition and verification
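
    A minimal sketch of the four-step process described above, assuming OpenCV and scikit-learn; the threshold, image size, and network size are illustrative choices, not the paper's.

        # Off-line signature pipeline sketch: background separation, normalization,
        # Hu moment invariants, and a small ANN classifier.
        import cv2
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def moment_invariant_vector(path, size=(128, 64)):
            """Steps 1-3: separate the signature from its background, normalize, compute Hu moments."""
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            # Step 1: separate the signature from its background (Otsu threshold).
            _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
            # Step 2: normalize the digitized signature to a fixed size.
            binary = cv2.resize(binary, size, interpolation=cv2.INTER_NEAREST)
            # Step 3: the seven Hu moment invariants, log-scaled for numerical stability.
            hu = cv2.HuMoments(cv2.moments(binary)).flatten()
            return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

        # Step 4: one ANN for recognition (signer identity); a second, analogous
        # network would be trained for verification (genuine vs. forged).
        def train_recognition_net(image_paths, signer_labels):
            X = np.array([moment_invariant_vector(p) for p in image_paths])
            net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
            return net.fit(X, signer_labels)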

    A Practical License Plate Recognition System for Real-Time Environments

    A computer vision system to recognize the license plates of vehicles in real-time environments is presented in this study. The images of moving vehicles are taken with a digital camera and analyzed in real time. An artificial neural network (ANN) is used to locate the area and position of the license plate. The system has the following stages: (i) image acquisition and determination of the location of the vehicle license plate (VLP), (ii) segmentation of the VLP into separate characters using image processing techniques, and (iii) recognition of each character in the VLP using a feedforward ANN and assembly of the recognized characters. Performance results are presented at the end
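
    A hedged sketch of stages (ii) and (iii) above, assuming OpenCV and a scikit-learn-style classifier for the character ANN; plate localization (stage (i)) is omitted, and the sizes and thresholds are illustrative.

        # Segment the located plate into characters and classify each glyph.
        import cv2
        import numpy as np

        def segment_characters(plate_gray, min_area=40):
            """Stage (ii): split the plate image into individual character images."""
            _, binary = cv2.threshold(plate_gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
            n, _, stats, _ = cv2.connectedComponentsWithStats(binary)
            boxes = [stats[i] for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > min_area]
            boxes.sort(key=lambda s: s[cv2.CC_STAT_LEFT])      # left-to-right reading order
            glyphs = []
            for x, y, w, h, _ in boxes:
                glyph = cv2.resize(binary[y:y + h, x:x + w], (16, 16))
                glyphs.append(glyph.flatten() / 255.0)          # feature vector for the ANN
            return glyphs

        def recognize_plate(plate_gray, char_net, alphabet):
            """Stage (iii): classify each segmented glyph and assemble the plate string."""
            # char_net is assumed to be a trained feedforward classifier (e.g. an MLP);
            # alphabet maps its integer labels back to characters.
            return "".join(alphabet[char_net.predict([g])[0]] for g in segment_characters(plate_gray))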

    Recognition of Finger Spelling of American Sign Language with Artificial Neural Network Using Position/Orientation Sensors and Data Glove

    An American Sign Language (ASL) finger-spelling and alphabet gesture recognition system was designed with an ANN and constructed to translate the ASL alphabet into the corresponding printed and spoken English letters. The system uses a sensory Cyberglove and a Flock of Birds 3-D motion tracker to extract the gestures. The finger joint angle data obtained from strain gauges in the sensory glove define the hand shape, while the data from the tracker describe the trajectory and orientation. The data flow from these devices is controlled by a motion trigger. The data are then processed by an alphabet recognition network to generate the words and names. Our goal is to establish an ASL finger-spelling system using these devices in real time. We trained and tested our system for the ASL alphabet, names, and word spelling. Our test results show that the accuracy of recognition is 96%
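
    A rough sketch of the data flow described above: glove joint angles plus tracker position/orientation, gated by a motion trigger, fed to an alphabet recognition network. The sensor dimensions, trigger threshold, and network size are assumptions; the real system reads the Cyberglove and the Flock of Birds tracker through their own interfaces.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        NUM_JOINT_ANGLES = 18   # finger joint angles from the glove's strain gauges (assumed count)
        NUM_POSE_VALUES = 6     # x, y, z position plus roll, pitch, yaw from the tracker

        def build_feature_vector(joint_angles, pose):
            """Concatenate hand-shape (glove) and position/orientation (tracker) data."""
            assert len(joint_angles) == NUM_JOINT_ANGLES and len(pose) == NUM_POSE_VALUES
            return np.concatenate([joint_angles, pose])

        def motion_trigger(pose_history, threshold=0.01):
            """Gate the data flow: fire only when the hand is nearly still (a posed letter)."""
            velocity = np.linalg.norm(pose_history[-1][:3] - pose_history[-2][:3])
            return velocity < threshold

        # The alphabet recognition network: one output class per ASL letter.
        alphabet_net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000)
        # alphabet_net.fit(training_vectors, training_letters)  # trained offline from recorded samples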

    American Sign Language Word Recognition with a Sensory Glove using Artificial Neural Networks

    An American Sign Language (ASL) recognition system is being developed using artificial neural networks (ANNs) to translate ASL words into English. The system uses a sensory glove called the Cyberglove (TM) and a Flock of Birds (R) 3-D motion tracker to extract the gesture features. The data regarding finger joint angles obtained from strain gauges in the sensory glove define the hand shape, while the data from the tracker describe the trajectory of hand movements. The data from these devices are processed by a velocity network with noise reduction and feature extraction and by a word recognition network. Some global and local features are extracted for each ASL word. A neural network is used as a classifier of this feature vector. Our goal is to continuously recognize ASL signs using these devices in real time. We trained and tested the ANN model for 50 ASL words with a different number of samples for every word. The test results show that our feature vector extraction method and neural networks can be used successfully for isolated word recognition. This system is flexible and open for future extension. (C) 2011 Elsevier Ltd. All rights reserved
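
    A hedged sketch of the two-network idea described above: a velocity stage that uses hand speed to delimit an isolated sign, followed by a word recognition network over global features of the segmented sign. The speed threshold and feature layout are assumptions, not the paper's values.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def segment_by_velocity(positions, speed_threshold=0.05):
            """Return (start, end) frame indices where the hand is moving, i.e. one sign."""
            speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1)
            moving = np.where(speeds > speed_threshold)[0]
            return (moving[0], moving[-1] + 1) if moving.size else (0, len(positions))

        def word_features(joint_angles, positions, start, end):
            """Global features of one sign: mean hand shape, bounding box, net movement."""
            shape = joint_angles[start:end].mean(axis=0)
            bbox = positions[start:end].max(axis=0) - positions[start:end].min(axis=0)
            movement = positions[end - 1] - positions[start]
            return np.concatenate([shape, bbox, movement])

        # The word recognition network maps one such feature vector to one of the isolated words.
        word_net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000)
        # word_net.fit(feature_matrix, word_labels)  # trained offline from recorded signs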

    Linguistic Properties Based on American Sign Language Isolated Word Recognition with Artificial Neural Networks Using a Sensory Glove and Motion Tracker

    Sign language (SL), which is a highly visual-spatial, linguistically complete, and natural language, is the main mode of communication among deaf people. Described in this paper are two different American Sign Language (ASL) word recognition systems developed using artificial neural networks (ANN) to translate ASL words into English. Feature vectors of signing words taken at five time instants were used in the first system, while histograms of feature vectors of signing words were used in the second system. The systems use a sensory glove, Cyberglove™, and a Flock of Birds® 3-D motion tracker to extract the gesture features. The finger joint angle data obtained from strain gauges in the sensory glove define the hand shape, and the data from the tracker describe the trajectory of hand movement. In both systems, the data from these devices were processed by two neural networks: a velocity network and a word recognition network. The velocity network uses hand speed to determine the duration of words. Signs are defined by feature vectors such as hand shape, hand location, orientation, movement, bounding box, and distance. The second network was used as a classifier to convert ASL signs into words based on these features or on histograms of these features. We trained and tested our ANN models with 60 ASL words using varying numbers of samples per word. The two methods were compared with each other. Our test results show that the recognition accuracies of these two systems are 92% and 95%, respectively
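
    A small sketch contrasting the two input representations described above: feature vectors sampled at five time instants (first system) versus histograms of the per-frame feature vectors (second system). The bin count and sampling scheme are assumptions.

        import numpy as np

        def five_instant_features(frames):
            """First system: stack the feature vectors at five evenly spaced time instants."""
            idx = np.linspace(0, len(frames) - 1, 5).astype(int)
            return np.concatenate([frames[i] for i in idx])

        def histogram_features(frames, bins=8):
            """Second system: per-dimension histograms over the whole sign, concatenated."""
            frames = np.asarray(frames)                      # shape (T, D)
            hists = [np.histogram(frames[:, d], bins=bins, density=True)[0]
                     for d in range(frames.shape[1])]
            return np.concatenate(hists)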

    Signature Recognition and Verification with ANN

    In this paper, we present an off-line signature recognition and verification system based on the moment invariant method and an ANN. Two separate neural networks are designed: one for signature recognition and another for verification (i.e., for detecting forgery). Both networks use a four-step process. The first step separates the signature from its background. The second step performs normalization and digitization of the original signature. Moment invariant vectors are obtained in the third step, and the last step implements signature recognition and verification

    A Computer Vision and HCI System for Robotic Arm Control

    We developed a computer vision based object classification method and a human-computer interaction (HCI) system that uses a Cyber Glove and a Flock of Birds® motion tracker to translate hand gestures into commands for a robotic arm. Our system consists of a conveyor, a simulated robotic manipulator, an HCI setup, and a vision system. First, the speed and position of objects on a moving conveyor are dynamically computed using standard motion estimation techniques. Then, the objects are classified by an artificial neural network (ANN) using various image processing techniques and a moment invariant method. After the classification step, object-specific commands are issued using hand gestures, which are translated into a command sequence via the HCI system to further control the robotic arm. The system was successfully tested for different objects moving on a conveyor. The hardware implementation and an overview of the algorithms used, along with the results obtained, are presented
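
    An illustrative sketch of parts of this pipeline, assuming OpenCV: object position and speed on the conveyor from frame differencing and centroid tracking, plus a hypothetical gesture-to-command table for the HCI stage. None of the names, thresholds, or commands come from the paper.

        import cv2
        import numpy as np

        def object_centroid(prev_gray, curr_gray, diff_thresh=25):
            """Frame differencing isolates the moving object; its centroid gives the position."""
            diff = cv2.absdiff(prev_gray, curr_gray)
            _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
            m = cv2.moments(mask)
            if m["m00"] == 0:
                return None                                   # nothing moved between frames
            return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

        def conveyor_speed(prev_centroid, curr_centroid, fps):
            """Pixels per second travelled between two consecutive frames."""
            return np.linalg.norm(curr_centroid - prev_centroid) * fps

        # Hypothetical mapping from a recognized hand gesture to an arm command,
        # issued through the HCI layer after the ANN has classified the object.
        GESTURE_TO_COMMAND = {"fist": "GRIP", "open_palm": "RELEASE", "point_left": "MOVE_LEFT"}

        def issue_command(gesture_label, send):
            send(GESTURE_TO_COMMAND.get(gesture_label, "HOLD"))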

    The Anomaly- and Signature-Based IDS for Network Security Using Hybrid Inference Systems

    With the expansion of communication in today’s world and the possibility of interaction between people over communication networks regardless of distance, securing the data and information being exchanged has received much attention from researchers. Various methods have been proposed for this purpose; one of the most important is the intrusion detection system (IDS), which quickly detects intrusions into the network and informs the manager or responsible people so that a set of operations can be carried out to reduce the damage caused by the intruders. The main challenges of the proposed intrusion detection systems are the number of erroneous warning messages they generate and their low percentage of accurate intrusion detection. In this research, the Suricata IDS/IPS is deployed along with a neural network (NN) model for the detection of malicious traffic in the targeted network: a metaheuristic is used for feature selection, the neural network for detection, and fuzzy logic for the anomaly-based detection. The latest stable version of Kali Linux 2020.3 is used as the attacking system against web applications and different types of operating systems. The proposed method achieves 96.111% accuracy for detecting network intrusion
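
    A hedged sketch of how a hybrid inference could fuse the two detectors named above: a normalized severity from a matching Suricata signature (e.g. parsed from its eve.json alert log) and an anomaly score from the neural network, combined with simple fuzzy memberships. The membership shapes and weights are illustrative, not the paper's parameters.

        import numpy as np

        def high(x, lo=0.4, hi=0.9):
            """Piecewise-linear 'high' fuzzy membership: 0 below lo, 1 above hi."""
            return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

        def fuse(signature_severity, anomaly_score):
            """Intrusion confidence in [0, 1] combining both detectors."""
            sig = high(signature_severity)   # normalized severity of a matching signature rule
            ano = high(anomaly_score)        # neural-network anomaly score in [0, 1]
            # A matching signature counts fully; an anomaly with no matching signature is
            # discounted to keep the false-alarm rate down, the key challenge named above.
            return max(sig, 0.6 * ano)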

    American Sign Language Recognition Using Multi-Dimensional Hidden Markov Models

    An American Sign Language (ASL) recognition system developed based on multi-dimensional Hidden Markov Models (HMM) is presented in this paper. A Cyberglove™ sensory glove and a Flock of Birds® motion tracker are used to extract the features of ASL gestures. The data obtained from the strain gauges in the glove define the hand shape, while the data from the motion tracker describe the trajectory of hand movement. Our objective is to continuously recognize ASL gestures using these input devices in real time. With the features extracted from the sensory data, we specify multi-dimensional states for ASL signs in the HMM processor. The system gives an average of 95% correct recognition for the 26 letters of the alphabet and 36 basic handshapes in ASL after it has been trained with 8 samples. New gestures can be accommodated in the system with an interactive learning processor. The developed system forms a sound foundation for continuous recognition of full ASL signs
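
    A simplified sketch of HMM-based sign classification, assuming the hmmlearn package: one Gaussian HMM per sign is trained on the glove/tracker feature sequences, and an unknown gesture is assigned to the best-scoring model. This is a plain single-stream HMM, a simplification of the multi-dimensional HMM the paper describes.

        import numpy as np
        from hmmlearn import hmm

        def train_sign_models(samples_by_sign, n_states=5):
            """samples_by_sign: {sign: list of (T_i x D) feature sequences} -> {sign: fitted HMM}."""
            models = {}
            for sign, sequences in samples_by_sign.items():
                X = np.vstack(sequences)                  # stack all samples of this sign
                lengths = [len(s) for s in sequences]     # sequence boundaries for hmmlearn
                model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
                models[sign] = model.fit(X, lengths)
            return models

        def recognize(sequence, models):
            """Pick the sign whose HMM gives the highest log-likelihood for a (T x D) sequence."""
            return max(models, key=lambda sign: models[sign].score(sequence))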

    American Sign Language Word Recognition with a Sensory Glove Using Artificial Neural Networks

    An American Sign Language (ASL) recognition system is being developed using artificial neural networks (ANN) to translate ASL words into English. The system uses a sensory glove, Cyberglove™, and a Flock of Birds 3-D motion tracker to extract the gesture features. The finger joint angle data obtained from strain gauges in the sensory glove define the hand shape, while the data from the tracker describe the trajectory of hand movement. The data from these devices are processed by two neural networks: a velocity network and a word recognition network. Our goal is to continuously recognize ASL gestures using these devices in real time. We trained and tested our ANN model for 50 ASL words with different numbers of samples per word. Our test results show that the accuracy of recognition is 94%