
    Virtual reality: Theoretical basis, practical applications

    Virtual reality (VR) is a powerful multimedia visualization technique offering a range of mechanisms through which many new experiences can be made available. This paper deals with the basic nature of VR, the technologies needed to create it, and its potential, especially for helping disabled people. It also offers an overview of several existing VR systems.

    Gesture Recognition and Control Part 1 - Basics, Literature Review & Different Techniques

    This exploratory paper series reviews the technological aspects of the Gesture Controlled User Interface (GCUI) and identifies trends in technology, application, and usability. It finds that GCUIs now afford realistic opportunities for specific application areas, especially for users who are uncomfortable with more commonly used input devices. It also collates chronological research information covering past work in the literature review. Researchers have investigated different types of gestures, their uses, applications, technologies, issues, and results from existing research.

    Low Cost Open Source Modal Virtual Environment Interfaces Using Full Body Motion Tracking and Hand Gesture Recognition

    Virtual environments provide insightful and meaningful ways to explore data sets through immersive experiences. One of the ways immersion is achieved is through natural interaction methods instead of only a keyboard and mouse. Intuitive tracking systems for natural interfaces suitable for such environments are often expensive. Recently, however, devices such as gesture tracking gloves and skeletal tracking systems have emerged in the consumer market. This project integrates gestural interfaces into an open source virtual reality toolkit using consumer-grade input devices and generates a set of tools to enable multimodal gestural interface creation. The AnthroTronix AcceleGlove is used to augment body tracking data from a Microsoft Kinect with fine-grained hand gesture data. The tools are found to be useful, as a sample gestural interface is implemented using them. The project concludes by suggesting studies targeting gestural interfaces using such devices, as well as other areas for further research.
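
    The abstract describes the architecture only at a high level. As a rough Python sketch of the kind of fusion it describes (coarse skeletal joints from the Kinect augmented with fine-grained glove gesture labels), the snippet below pairs the latest sample from each device into a single multimodal event; the frame layouts, field names, and gesture label are illustrative assumptions, not the project's actual toolkit API.

        # Hypothetical sketch: fusing coarse skeletal-tracking data with
        # fine-grained glove gesture labels into one multimodal event.
        # Frame layouts and field names are assumed for illustration only.
        from dataclasses import dataclass
        from typing import Dict, Tuple

        @dataclass
        class MultimodalEvent:
            joints: Dict[str, Tuple[float, float, float]]  # body joints from the skeletal tracker
            hand_gesture: str                               # label reported by the glove classifier
            timestamp: float

        def fuse(skeleton_frame: dict, glove_frame: dict) -> MultimodalEvent:
            # A real system would synchronize the two streams by timestamp;
            # here we simply pair the most recent sample from each device.
            return MultimodalEvent(
                joints=skeleton_frame["joints"],
                hand_gesture=glove_frame["gesture"],
                timestamp=max(skeleton_frame["time"], glove_frame["time"]),
            )

        # Usage with fabricated frames:
        skeleton = {"joints": {"right_hand": (0.42, 1.10, 1.85)}, "time": 12.30}
        glove = {"gesture": "pinch", "time": 12.31}
        event = fuse(skeleton, glove)
        print(event.hand_gesture, event.joints["right_hand"])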

    Measurement of the Flexible Bending Force of the Index and Middle Fingers for Virtual Interaction

    Abstract: This paper presents the development of a new low-cost dataglove based on fingertip bending tracking for measuring finger flexion during various virtual interaction activities, offered as a way to enhance rehabilitation services and improve quality of life, especially for disabled people. The purpose of the research is to design a flexible controller for studying the virtual interaction of the index and middle fingers, which are important in a variety of contexts. The system analyzes finger flexion using flexible bend sensors as the key intermediary for tracking fingertip positions and orientations. The main purpose of the low-cost dataglove is to provide natural input control for interaction in virtual, multimodal, and tele-presence environments, since such input devices can monitor the dexterity and flexibility of human hand motion. Preliminary experimental results show that the dataglove is capable of measuring several human degrees of freedom (DoF), "translating" them into commands for interaction in the virtual world.
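
    The abstract does not give the conversion the authors use. As a minimal Python sketch of how a flex (bend) sensor in such a dataglove is typically read, the snippet below maps a raw ADC reading to an approximate bend angle by linear interpolation between calibrated "flat" and "fully bent" readings; the calibration constants and ADC range are assumptions.

        # Minimal sketch (assumed calibration values): converting a flex-sensor
        # ADC reading into an approximate finger bend angle by linear
        # interpolation between a calibrated flat reading and a fully bent one.

        ADC_FLAT = 310    # assumed raw reading with the finger straight
        ADC_BENT = 620    # assumed raw reading with the finger fully flexed
        MAX_ANGLE = 90.0  # degrees of flexion represented by ADC_BENT

        def bend_angle(adc_value: int) -> float:
            """Map a raw ADC sample to degrees of flexion, clamped to [0, MAX_ANGLE]."""
            fraction = (adc_value - ADC_FLAT) / (ADC_BENT - ADC_FLAT)
            return max(0.0, min(MAX_ANGLE, fraction * MAX_ANGLE))

        # Example: a mid-range reading maps to roughly half of full flexion.
        print(bend_angle(465))   # ~45 degrees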

    STUDY OF HAND GESTURE RECOGNITION AND CLASSIFICATION

    The aim is to recognize different hand gestures and achieve efficient classification of the static and dynamic hand movements used for communication. Static and dynamic hand movements are first captured using gesture recognition devices including the Kinect, hand movement sensors, connecting electrodes, and accelerometers. These gestures are processed using hand gesture recognition algorithms such as multivariate fuzzy decision trees, hidden Markov models (HMM), dynamic time warping, latent regression forests, support vector machines, and surface electromyography. Movements made with one or both hands are captured by the devices under proper illumination conditions. The captured gestures are processed for occlusions and close finger interactions to identify the correct gesture, classify it, and ignore intermittent gestures. Real-time hand gesture recognition needs robust algorithms such as HMMs to detect only the intended gesture. Classified gestures are then compared for effectiveness against training and test standard datasets such as sign language alphabets and the KTH dataset. Hand gesture recognition plays a very important role in applications such as sign language recognition, robotics, television control, rehabilitation, and music orchestration.
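
    The abstract lists several matching techniques without implementation detail. As one illustrative Python sketch, the dynamic time warping (DTW) distance it mentions can be computed between a captured gesture trajectory and a stored template as below; the 2-D trajectories are fabricated for the example.

        # Illustrative sketch: dynamic time warping (DTW) distance between a
        # captured gesture trajectory and a stored template, one of the
        # matching techniques named in the abstract. Trajectories are fabricated.
        import numpy as np

        def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
            """DTW distance between two sequences of feature vectors (n x d and m x d)."""
            n, m = len(a), len(b)
            cost = np.full((n + 1, m + 1), np.inf)
            cost[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    d = np.linalg.norm(a[i - 1] - b[j - 1])
                    cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
            return float(cost[n, m])

        # A captured swipe and a slightly time-stretched template of the same shape:
        captured = np.array([[0, 0], [1, 0], [2, 0], [3, 0]], dtype=float)
        template = np.array([[0, 0], [0.5, 0], [1, 0], [2, 0], [3, 0]], dtype=float)
        print(dtw_distance(captured, template))  # small value => similar gestures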

    Hand gesture recognition using Kinect.

    Hand gesture recognition (HGR) is an important research topic because some situations require silent communication with sign languages. Computational HGR systems assist silent communication and help people learn a sign language. In this thesis, a novel method for contact-less HGR using Microsoft Kinect for Xbox is described, and a real-time HGR system is implemented with Microsoft Visual Studio 2010. Two different scenarios for HGR are provided: the Popular Gesture set with nine gestures, and the Numbers set with nine gestures. The system allows users to select a scenario; it is able to detect hand gestures made by users, identify fingers, recognize the meanings of gestures, and display the meanings and pictures on screen. The accuracy of the HGR system is from 84% to 99% with single-hand gestures, and from 90% to 100% if both hands perform the same gesture at the same time. Because the depth sensor of the Kinect is an infrared camera, lighting conditions, signers' skin colors and clothing, and background have little impact on the performance of the system. The accuracy and robustness make this system a versatile component that can be integrated into a variety of applications in daily life.
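
    The thesis's own pipeline is not reproduced here. As a hedged Python sketch of a common depth-based approach with a Kinect-style depth frame, the snippet below assumes the hand is the closest object, thresholds a thin depth band around the minimum depth, and estimates extended fingers from convexity defects of the hand contour (OpenCV 4.x return signatures assumed); the synthetic frame stands in for real Kinect output.

        # Hedged sketch: depth-band hand segmentation plus finger counting
        # via convexity defects. The depth frame is synthetic; in practice it
        # would come from the Kinect SDK as a 16-bit image in millimetres.
        import numpy as np
        import cv2

        def count_fingers(depth_mm: np.ndarray, band_mm: int = 80) -> int:
            """Estimate extended fingers from a 16-bit depth image (millimetres)."""
            valid = depth_mm[depth_mm > 0]
            if valid.size == 0:
                return 0
            near = int(valid.min())
            # Keep only pixels in a thin depth band around the closest object (the hand).
            mask = ((depth_mm > 0) & (depth_mm < near + band_mm)).astype(np.uint8) * 255
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            if not contours:
                return 0
            hand = max(contours, key=cv2.contourArea)
            hull = cv2.convexHull(hand, returnPoints=False)
            if hull is None or len(hull) < 4:
                return 0
            defects = cv2.convexityDefects(hand, hull)
            if defects is None:
                return 0
            # Deep defects correspond to the valleys between extended fingers.
            deep = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] > 10_000)
            return deep + 1 if deep > 0 else 0

        # Example with a synthetic frame: a flat square "hand" 600 mm away on an
        # empty (zero) background has no deep defects, i.e. a closed fist (0).
        frame = np.zeros((240, 320), dtype=np.uint16)
        frame[80:160, 120:200] = 600
        print(count_fingers(frame))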