4 research outputs found

    Connected Component Algorithm for Gestures Recognition

    This paper presents a head and hand gesture recognition system for Human Computer Interaction (HCI). Head and hand gestures are an important modality for human-computer interaction, and a vision-based recognition system can give computers the capability of understanding and responding to them. The aim of this paper is to propose a real-time vision system for application within a multimedia interaction environment. The recognition system consists of four modules: image capture, image extraction, pattern matching, and command determination. When a hand or head gesture is shown in front of the camera, the hardware performs the respective action: gestures are matched against a stored database of gestures using pattern matching, and corresponding to the matched gesture, the hardware is moved in the left, right, forward, or backward direction. An algorithm for optimizing connected component labeling in gesture recognition is proposed, which makes use of segmentation in two images. The connected component algorithm scans an image and groups its pixels into components based on pixel connectivity, i.e. all pixels in a connected component share similar intensity values and are in some way connected with each other. Once all groups have been determined, each pixel is labeled with a color according to the component it was assigned to
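
    The connected component labeling the abstract describes can be sketched as a breadth-first flood fill: scan the image, and whenever an unlabeled pixel is found, assign it a new label and propagate that label to connected neighbors of equal intensity. This is a minimal illustrative sketch (the paper's actual algorithm adds optimizations and uses two segmented images), assuming 4-connectivity and exact intensity equality:

    ```python
    from collections import deque

    def label_components(image):
        """Label 4-connected components of equal-intensity pixels.

        image: 2D list of pixel intensity values.
        Returns a 2D list of integer labels; pixels sharing a label
        belong to the same connected component.
        """
        rows, cols = len(image), len(image[0])
        labels = [[0] * cols for _ in range(rows)]
        next_label = 0
        for sr in range(rows):
            for sc in range(cols):
                if labels[sr][sc]:
                    continue  # already assigned to a component
                next_label += 1
                labels[sr][sc] = next_label
                queue = deque([(sr, sc)])
                # Flood the new label to all reachable equal-intensity pixels.
                while queue:
                    r, c = queue.popleft()
                    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        nr, nc = r + dr, c + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and not labels[nr][nc]
                                and image[nr][nc] == image[r][c]):
                            labels[nr][nc] = next_label
                            queue.append((nr, nc))
        return labels
    ```

    In the labeled output, each integer plays the role of the per-component color the abstract mentions; rendering is then a lookup from label to display color.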

    USING HAND RECOGNITION IN TELEROBOTICS

    The objective of this project is to recognize selected hand gestures and imitate each recognized gesture using a robot. A telerobotics system that relies on computer vision to create the human-machine interface was built. Hand tracking was used as an intuitive control interface, as it represents a natural interaction medium. The system tracks the operator's hand and the gesture it represents, and relays the appropriate signal to the robot to perform the respective action in real time. The study focuses on two gestures, open hand and closed hand, as the NAO robot is not equipped with a dexterous hand. Numerous object recognition algorithms were compared, and a SURF-based object detector was used. The system was successfully implemented and was able to recognise the two gestures in 3D space using images from a 2D video camera
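
    The relay step described above, mapping a recognized gesture to a robot action, reduces to a small dispatch table. The gesture labels and command strings below are illustrative assumptions, not the project's actual API:

    ```python
    # Hypothetical mapping from the two recognized gestures to robot commands.
    GESTURE_COMMANDS = {
        "open_hand": "open_gripper",
        "closed_hand": "close_gripper",
    }

    def relay_command(gesture, send):
        """Relay the command for a recognized gesture to the robot.

        gesture: label produced by the recognizer (e.g. "open_hand").
        send: callable that transmits a command string to the robot.
        Unknown labels are ignored, so spurious detections do nothing.
        Returns the command that was sent, or None.
        """
        command = GESTURE_COMMANDS.get(gesture)
        if command is not None:
            send(command)
        return command
    ```

    Ignoring unknown labels keeps the loop safe when the detector misfires between frames, which matters for a real-time control interface.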

    DESIGN AND EVALUATION OF A NONVERBAL COMMUNICATION PLATFORM BETWEEN ASSISTIVE ROBOTS AND THEIR USERS

    Assistive robotics will become integral to the everyday lives of a human population that is increasingly mobile, older, urban-centric, and networked. The overwhelming demands on healthcare delivery alone will compel the adoption of assistive robotics. How will we communicate with such robots, and how will they communicate with us? This research makes the case for a relatively 'artificial' mode of nonverbal human-robot communication, one that is non-disruptive, non-competitive, and non-invasive, and that we envision will be willingly invited into our private and working lives over time. This research proposes a non-verbal communication (NVC) platform conveyed by familiar lights and sounds, and elaborated here are experiments with our NVC platform in a rehabilitation hospital. The NVC is embedded into the Assistive Robotic Table (ART), developed within our lab, which supports the well-being of an expanding population of older adults and people with limited mobility. The broader aim of this research is to afford people robot assistants that exist and interact with them in the recesses, rather than in the foreground, of their intimate and social lives. With support from our larger research team, I designed and evaluated several alternative modes of nonverbal robot communication, with the objective of establishing a nonverbal human-robot communication loop that evolves with users and can be modified by them. The study was conducted with 10-13 clinicians -- doctors and occupational, physical, and speech therapists -- at a local rehabilitation hospital, through three iterative design and evaluation phases and a final usability study session. In our test case at the rehabilitation hospital, medical staff iteratively refined our NVC platform, stated a willingness to use it, and declared NVC a desirable research path. In addition, these clinicians provided requirements for human-robot interaction (HRI) in clinical settings, suggesting great promise for our mode of human-robot communication in this and other applications and environments involving intimate HRI