
    Game with Hand Gesture Control

    This work focuses on controlling a game with hand gestures. Its core is image segmentation and detection of the hand in the image. Segmentation uses skin detection and background subtraction with an adaptive background model. Mathematical-morphology methods for removing noise from the image are also covered, along with methods for converting a gesture image into characteristic numerical gesture features. As part of the work, a simple car-racing game controlled by hand gestures was created. Finally, testing was carried out to identify the advantages and disadvantages of the image segmentation methods used for hand-gesture recognition. Several gesture sets for controlling the game were also tested; the two most successful sets are usable depending on the quality of hand-gesture recognition.
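    The two segmentation steps the abstract names — skin-colour thresholding and background subtraction with an adaptive background model — can be sketched in a few lines. This is a minimal numpy sketch, not the thesis code; the Cr/Cb thresholds and the learning rate `alpha` are common literature values and are assumptions to be tuned per camera.

```python
import numpy as np

def skin_mask(ycrcb, cr_range=(133, 173), cb_range=(77, 127)):
    """Skin detection: threshold the Cr/Cb channels of a YCrCb image.
    Ranges are typical literature defaults (assumption, tune per camera)."""
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    return ((cr >= cr_range[0]) & (cr <= cr_range[1]) &
            (cb >= cb_range[0]) & (cb <= cb_range[1]))

def background_subtract(frame, bg, alpha=0.05, thresh=25):
    """Background subtraction with an adaptive model: pixels far from the
    background estimate are foreground; background pixels are updated by a
    running average with learning rate alpha."""
    fg = np.abs(frame.astype(int) - bg.astype(int)) > thresh
    bg_new = np.where(fg, bg, (1 - alpha) * bg + alpha * frame).astype(frame.dtype)
    return fg, bg_new
```

    In a real pipeline the two masks would be combined (e.g. AND-ed) and then cleaned with the morphological opening/closing the abstract mentions.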

    The design and evaluation of an ergonomic contactless gesture control system for industrial robots

    In industrial human-robot collaboration, variability in the operating environment and components introduces uncertainty and errors that require frequent manual intervention to rectify. Conventional teach pendants can be physically demanding to use and require user training prior to operation, so a more effective control interface is needed. This paper describes the design and evaluation of a contactless gesture control system using Leap Motion. The design process uses the RULA human-factors analysis tool. Separately, an exploratory usability test compared three usability aspects of the developed gesture control system against an off-the-shelf conventional touchscreen teach pendant. The paper focuses on the user-centred design methodology of the gesture control system. The novelties of this research are the use of human-factors analysis tools in the human-centred development process, and a gesture control design that enables users to control an industrial robot's motion by its joints and tool centre point position. The system has potential as an input device for industrial robot control in human-robot collaboration scenes. It targets system recovery and error correction in flexible manufacturing environments shared between humans and robots, and allows operators to control an industrial robot without significant training.

    Gesture Control of Cyber Physical Systems


    Gesture Recognition and Control Part 3 - WiFi Oriented Gesture Control & its application

    This exploratory survey paper explores the basic principle behind WiFi-oriented gesture control systems. It briefly reviews the literature on this emerging technology, which has many real-time applications in gaming, home automation, assistive medicine for the disabled, and modern electronic gadgets. Researchers from the University of Washington have done milestone work on this technology. The paper anticipates that in the 2020 era, WiFi-based gesture control and recognition systems will replace other man-machine interface methods.

    Nonverbal Social Communication and Gesture Control in Schizophrenia

    Schizophrenia patients are severely impaired in nonverbal communication, including social perception and gesture production. However, the impact of nonverbal social perception on gestural behavior remains unknown, as is the contribution of negative symptoms, working memory, and abnormal motor behavior. Thus, the study tested whether poor nonverbal social perception was related to impaired gesture performance, gestural knowledge, or motor abnormalities. Forty-six patients with schizophrenia (80%), schizophreniform (15%), or schizoaffective disorder (5%) and 44 healthy controls matched for age, gender, and education were included. Participants completed 4 tasks on nonverbal communication: nonverbal social perception, gesture performance, gesture recognition, and tool use. In addition, they underwent comprehensive clinical and motor assessments. Patients presented impaired nonverbal communication in all tasks compared with controls. Furthermore, in contrast to controls, performance in patients was highly correlated between tasks, and was not explained by supramodal cognitive deficits such as working memory. Schizophrenia patients with impaired gesture performance also demonstrated poor nonverbal social perception, gestural knowledge, and tool use. Importantly, motor/frontal abnormalities negatively mediated the strong association between nonverbal social perception and gesture performance. Negative symptoms and antipsychotic dosage were unrelated to the nonverbal tasks. The study confirmed a generalized nonverbal communication deficit in schizophrenia. Specifically, the findings suggest that nonverbal social perception in schizophrenia has a relevant impact on gestural impairment beyond the negative influence of motor/frontal abnormalities.

    Gesture Control of a Mobile Robot using Kinect Sensor

    This paper describes a methodology for gesture control of a custom-developed mobile robot using body gestures and the Microsoft Kinect sensor. The Kinect sensor's ability to track joint positions has been used to develop a software application for gesture recognition and the mapping of gestures into control commands. The proposed methodology has been experimentally evaluated. The results of the experimental evaluation, presented in the paper, show that the proposed methodology is accurate and reliable, and that it could be used for mobile robot control in practical applications.
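    The core idea — mapping tracked joint positions into drive commands — can be illustrated with a small sketch. The joint names, dead zone, and command set below are hypothetical, not taken from the paper; the assumed skeleton convention is x to the right, y up, z away from the sensor, in metres.

```python
def command_from_joints(hand, shoulder, dead_zone=0.15):
    """Map the right-hand position relative to the shoulder to a drive
    command for a mobile robot. dead_zone (metres) suppresses jitter
    around the neutral pose (assumed value)."""
    dx = hand[0] - shoulder[0]  # lateral offset
    dy = hand[1] - shoulder[1]  # vertical offset
    if dy > dead_zone:
        return "forward"      # hand raised above the shoulder
    if dx > dead_zone:
        return "turn_right"   # hand extended to the right
    if dx < -dead_zone:
        return "turn_left"    # hand extended to the left
    return "stop"             # hand near the shoulder: neutral
```

    In practice each recognized command would be published to the robot's motion controller at the sensor's frame rate.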

    Multi-Operator Gesture Control of Robotic Swarms Using Wearable Devices

    The theory and design of effective interfaces for human interaction with multi-robot systems has recently gained significant interest. Robotic swarms are multi-robot systems in which local interactions between robots and the neighbors within their spatial neighborhood generate emergent collective behaviors. Most prior work has studied interfaces for human interaction with remote swarms, but swarms also have great potential in applications working alongside humans, motivating the need for interfaces for local interaction. Given the collective nature of swarms, human interaction may occur at many levels of abstraction, ranging from swarm behavior selection to teleoperation. Wearable gesture control is an intuitive interaction modality that can meet this requirement while usually keeping the operator's hands unencumbered. In this paper, we present an interaction method using a gesture-based wearable device with a limited number of gestures for robust control of a complex system: a robotic swarm. Experiments conducted with a real robot swarm compare performance in single-operator and two-operator conditions. Results show that human operators using our interaction method successfully completed the task in all trials, illustrating the effectiveness of the method, with better performance in the two-operator condition, indicating that separation of function is beneficial for our method. The primary contribution of our work is the development and demonstration of interaction methods that allow robust control of a difficult-to-understand multi-robot system using only the noisy inputs typical of smartphones and other on-body sensor-driven devices.
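    A small gesture vocabulary driving swarm-level behaviors, made robust to noisy on-body classification, can be sketched as a debounced lookup. Both the gesture names and behavior names below are hypothetical, and requiring k identical consecutive recognitions is one simple guard against classifier noise, not the paper's method.

```python
from collections import deque

# Hypothetical mapping from a limited gesture set to swarm behaviors.
BEHAVIORS = {
    "fist": "rendezvous",
    "wave_in": "disperse",
    "wave_out": "follow_leader",
    "spread": "hold_position",
}

class GestureDebouncer:
    """Accept a gesture only after it is recognized k times in a row,
    filtering out one-off misclassifications from a noisy wearable."""
    def __init__(self, k=3):
        self.k = k
        self.recent = deque(maxlen=k)

    def update(self, gesture):
        """Feed one raw classification; return a behavior or None."""
        self.recent.append(gesture)
        if len(self.recent) == self.k and len(set(self.recent)) == 1:
            return BEHAVIORS.get(gesture)
        return None
```

    A two-operator setup could run one debouncer per wearable, with each operator's accepted behavior applied to their assigned sub-swarm.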

    Wearable Capacitive-based Wrist-worn Gesture Sensing System

    Gesture control plays an increasingly significant role in modern human-machine interaction. This paper presents an innovative method of gesture recognition using flexible capacitive pressure sensors attached to the user's wrist, as an alternative to computer-vision and finger-mounted sensing approaches. The method is based on the pressure variations around the wrist as the gesture changes. Flexible, ultrathin capacitive pressure sensors are deployed to capture these variations. Embedding the sensors on a flexible substrate and obtaining the relevant capacitance requires a reliable microcontroller-based approach to measuring small changes in the capacitive sensors. This paper addresses these challenges, collecting and processing the measured capacitance values through a program developed in LabVIEW to reconstruct the gesture on a computer. Compared to conventional approaches, the wrist-worn sensing method offers a low-cost, lightweight, wearable prototype on the user's body. The experimental results show the potential and benefits of this approach and confirm that accuracy and the number of recognizable gestures can be improved by increasing the number of sensors.
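    The final step — turning a vector of per-sensor capacitance changes into a gesture label — can be sketched as a nearest-template classifier. The four-sensor layout, the template values, and the gesture names are all hypothetical calibration data, not the paper's; the paper itself performs this reconstruction in LabVIEW.

```python
import numpy as np

# Hypothetical calibration: mean capacitance change per wrist sensor
# for each gesture (arbitrary units, assumed 4-sensor band).
TEMPLATES = {
    "fist":  np.array([0.8, 0.6, 0.2, 0.1]),
    "open":  np.array([0.1, 0.2, 0.1, 0.1]),
    "pinch": np.array([0.4, 0.1, 0.7, 0.3]),
}

def classify(reading):
    """Nearest-template match: pick the gesture whose calibrated
    capacitance profile is closest (Euclidean distance) to the reading."""
    return min(TEMPLATES, key=lambda g: np.linalg.norm(TEMPLATES[g] - reading))
```

    The abstract's closing observation follows directly from this picture: more sensors mean longer, more separable profile vectors, and therefore more distinguishable gestures.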

    Gaze modulated disambiguation technique for gesture control in 3D virtual objects selection

    © 2017 IEEE. Inputs with multimodal information provide more natural ways to interact with virtual 3D environments. An emerging technique that integrates gaze-modulated pointing with mid-air gesture control enables fast target acquisition and rich control expressions. The performance of this technique relies on eye-tracking accuracy, which is not yet comparable with traditional pointing techniques (e.g., the mouse). This causes trouble when fine-grained interactions are required, such as selection in a dense virtual scene where proximity and occlusion are prone to occur. This paper proposes a coarse-to-fine solution that compensates for the degradation introduced by eye-tracking inaccuracy, using a gaze cone to detect ambiguity and then a gaze probe for decluttering. It was tested in a comparative experiment involving 12 participants and 3240 runs. The results show that the proposed technique enhanced selection accuracy and user experience, although its efficiency can still be improved. This study contributes a robust multimodal interface design supported by both eye tracking and mid-air gesture control.
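    The coarse stage — a gaze cone that flags ambiguity when more than one object falls inside it — reduces to a simple angular test. This is a geometry-only sketch under assumed conventions (a 5° half-angle, targets as labelled points), not the authors' implementation; the fine-grained "gaze probe" decluttering stage is not shown.

```python
import numpy as np

def targets_in_gaze_cone(origin, direction, targets, half_angle_deg=5.0):
    """Return the names of all targets lying inside a cone around the gaze
    ray. More than one hit signals ambiguity for a fine-selection stage to
    resolve. half_angle_deg is an assumed tolerance for eye-tracker error."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    cos_limit = np.cos(np.radians(half_angle_deg))
    hits = []
    for name, pos in targets.items():
        v = np.asarray(pos, float) - np.asarray(origin, float)
        # Inside the cone iff the angle between v and the gaze ray is small.
        if np.dot(v, d) / np.linalg.norm(v) >= cos_limit:
            hits.append(name)
    return hits
```

    When `len(hits) > 1`, the scene is locally too dense for gaze alone, which is exactly the proximity/occlusion case the paper's gaze probe is designed to declutter.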