1,250 research outputs found

    Hand-Gesture Based Programming of Industrial Robot Manipulators

    Nowadays, industrial robot manipulators and manufacturing processes are more closely associated than ever before. Robot manipulators execute repetitive tasks with high accuracy and speed, features necessary for industries that manufacture products in large quantities while reducing production time. Although robot manipulators play a significant role in enhancing productivity within industries, the process of programming them remains an important drawback. Traditional programming methodologies require robot programming experts and are time consuming. This thesis work aims to develop an application for programming industrial robot manipulators without traditional programming methodologies, exploiting instead the intuitiveness of human hand gestures. The development of input devices for intuitive Human-Machine Interaction makes it possible to capture such gestures. Hence, robot manipulator programming experts can be replaced by task experts. In addition, integrating intuitive means of interaction can also reduce programming time. The components used to capture the operators’ hand gestures are a data glove and a precise hand-tracking device. The robot manipulator imitates, in terms of position, the motion that the human operator performs with the hand. Inverse kinematics is applied so that robot manipulators can be programmed independently of their structure and manufacturer, and the possibility of optimizing the programmed robot paths is investigated. Finally, a Human-Machine Interface supports the programming process by presenting important information about the process and the status of the integrated components.
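
    As an illustration of the position-imitation step described above, the following sketch shows a generic damped-least-squares inverse-kinematics update that drives a serial manipulator toward the tracked hand position. The forward_kinematics and jacobian callables stand in for a robot model supplied elsewhere; the names are assumptions, not code from the thesis.

    # Sketch: map tracked hand positions to robot joint motion via numerical IK.
    # forward_kinematics(q) -> 3-vector end-effector position; jacobian(q) -> 3 x n
    # positional Jacobian. Both are placeholders for a concrete robot model.
    import numpy as np

    def ik_step(q, target_pos, forward_kinematics, jacobian, damping=0.05, gain=0.5):
        """One damped-least-squares IK update toward a Cartesian position target."""
        error = target_pos - forward_kinematics(q)     # positional error to the hand
        J = jacobian(q)
        # The damping term keeps the update stable near singular configurations.
        JJt = J @ J.T + (damping ** 2) * np.eye(3)
        dq = J.T @ np.linalg.solve(JJt, gain * error)
        return q + dq

    def follow_hand(q, hand_positions, forward_kinematics, jacobian):
        """Imitate a stream of tracked hand positions, one IK step per sample."""
        path = []
        for p in hand_positions:
            q = ik_step(q, np.asarray(p, dtype=float), forward_kinematics, jacobian)
            path.append(q.copy())
        return path                                    # robot-agnostic joint-space path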

    The development of a human-robot interface for industrial collaborative system

    Industrial robots have been identified as one of the most effective solutions for optimising output and quality within many industries. However, a number of manufacturing applications involve complex tasks and inconsistent components, which prohibit the use of fully automated solutions in the foreseeable future. A breakthrough in robotic technologies and changes in safety legislation have supported the creation of robots that coexist with and assist humans in industrial applications. It has been broadly recognised that human-robot collaborative systems would be a realistic solution as an advanced production system with a wide range of applications and high economic impact. This type of system can utilise the best of both worlds: the robot performs simple tasks that require high repeatability, while the human performs tasks that require judgement and the dexterity of the human hands. Robots in such a system operate as “intelligent assistants”. In a collaborative working environment, robot and human share the same working area and interact with each other. This level of interface requires effective ways of communication and collaboration to avoid unwanted conflicts. This project aims to create a user interface for an industrial collaborative robot system through the integration of current robotic technologies. The robotic system is designed for seamless collaboration with a human in close proximity. The system is capable of communicating with the human via the exchange of gestures, as well as visual signals which operators can observe and comprehend at a glance. The main objective of this PhD is to develop a Human-Robot Interface (HRI) for communication with an industrial collaborative robot during collaboration in close proximity. The system is developed in conjunction with a small-scale collaborative robot system which has been integrated using off-the-shelf components. The system should be capable of receiving input from the human user via an intuitive method as well as indicating its status to the user effectively. The HRI was developed using a combination of hardware integration and software development. The software and the control framework were developed in a way that is applicable to other industrial robots in the future. The developed gesture command system is demonstrated on a heavy-duty industrial robot.
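
    A minimal sketch of how such a gesture-command layer could be structured, assuming a separate recogniser that outputs gesture labels; the class, gesture and action names below are illustrative and are not taken from the thesis implementation.

    # Illustrative gesture-to-command dispatcher with a status signal for the operator.
    from dataclasses import dataclass
    from typing import Callable, Dict

    @dataclass
    class RobotStatus:
        state: str      # e.g. "idle", "moving"
        message: str    # short text shown on the visual status indicator

    class GestureHRI:
        def __init__(self) -> None:
            self._commands: Dict[str, Callable[[], RobotStatus]] = {}

        def register(self, gesture: str, action: Callable[[], RobotStatus]) -> None:
            self._commands[gesture] = action

        def handle(self, gesture: str) -> RobotStatus:
            # Unknown gestures are ignored rather than executed, avoiding unwanted conflicts.
            action = self._commands.get(gesture)
            if action is None:
                return RobotStatus("idle", f"Unrecognised gesture: {gesture}")
            return action()

    # Example wiring with placeholder actions:
    hri = GestureHRI()
    hri.register("thumbs_up", lambda: RobotStatus("moving", "Resuming task"))
    hri.register("open_palm", lambda: RobotStatus("idle", "Paused by operator"))
    print(hri.handle("open_palm").message)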

    Spatial Programming for Industrial Robots through Task Demonstration

    We present an intuitive system for the programming of industrial robots using markerless gesture recognition and mobile augmented reality, following the programming-by-demonstration paradigm. The approach covers gesture-based task definition and adaptation by human demonstration, as well as task evaluation through augmented reality. A 3D motion tracking system and a handheld device establish the basis for the presented spatial programming system. In this publication, we present a prototype toward the programming of an assembly sequence consisting of several pick-and-place tasks. A scene reconstruction provides pose estimation of known objects with the help of the 2D camera of the handheld. The programmer is thereby able to define the program through natural bare-hand manipulation of these objects with the help of direct visual feedback in the augmented reality application. The program can be adapted by gestures and subsequently transmitted to an arbitrary industrial robot controller using a unified interface. Finally, we discuss an application of the presented spatial programming approach toward robot-based welding tasks.
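
    The sketch below illustrates, under assumed names, how object poses estimated before and after the demonstration could be turned into a pick-and-place sequence and handed to a vendor-neutral controller interface; it is not the authors' actual implementation or API.

    # Illustrative: derive pick-and-place tasks from demonstrated object poses.
    from dataclasses import dataclass
    from typing import Dict, List, Tuple

    Pose = Tuple[float, float, float, float, float, float]   # x, y, z, roll, pitch, yaw

    @dataclass
    class PickPlaceTask:
        object_id: str
        pick_pose: Pose     # pose estimated before the operator moved the object
        place_pose: Pose    # pose estimated after the demonstration

    def build_program(before: Dict[str, Pose], after: Dict[str, Pose]) -> List[PickPlaceTask]:
        """One task per known object whose estimated pose changed during the demonstration."""
        return [
            PickPlaceTask(obj, before[obj], after[obj])
            for obj in before
            if obj in after and before[obj] != after[obj]
        ]

    def send_to_controller(program: List[PickPlaceTask], controller) -> None:
        """Emit the program through a unified controller object (placeholder methods)."""
        for task in program:
            controller.move_to(task.pick_pose)
            controller.close_gripper()
            controller.move_to(task.place_pose)
            controller.open_gripper()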

    Multimodal human hand motion sensing and analysis - a review

    A Survey of Applications and Human Motion Recognition with Microsoft Kinect

    Microsoft Kinect, a low-cost motion sensing device, enables users to interact with computers or game consoles naturally through gestures and spoken commands without any other peripheral equipment. As such, it has commanded intense interest in research and development on the Kinect technology. In this paper, we present a comprehensive survey on Kinect applications and the latest research and development on motion recognition using data captured by the Kinect sensor. On the applications front, we review the applications of the Kinect technology in a variety of areas, including healthcare, education and performing arts, robotics, sign language recognition, retail services, workplace safety training, as well as 3D reconstruction. On the technology front, we provide an overview of the main features of both versions of the Kinect sensor together with the depth sensing technologies used, and review the literature on human motion recognition techniques used in Kinect applications. We provide a classification of motion recognition techniques to highlight the different approaches used in human motion recognition. Furthermore, we compile a list of publicly available Kinect datasets. These datasets are valuable resources for researchers to investigate better methods for human motion recognition and lower-level computer vision tasks such as segmentation, object detection and human pose estimation.
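
    As one concrete instance of the template-matching family of motion-recognition techniques that such classifications cover, the sketch below compares a skeleton-joint trajectory against labelled templates using dynamic time warping; it is an illustrative example, not a method from the survey, and is not tied to any specific Kinect SDK.

    # Illustrative DTW-based classification of a (T, D) joint-coordinate sequence.
    from typing import Dict
    import numpy as np

    def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
        """Dynamic-time-warping distance between two joint-coordinate sequences."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame distance
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return float(cost[n, m])

    def classify(sequence: np.ndarray, templates: Dict[str, np.ndarray]) -> str:
        """Return the label of the template gesture nearest to the observed sequence."""
        return min(templates, key=lambda label: dtw_distance(sequence, templates[label]))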

    Real-Time Hand Gesture Tracking for Network Centric Application

    This paper focuses on real-time hand gesture tracking for a network-centric application. Human hand gestures were acquired using a Kinect depth camera. The user posed in front of the camera, about 2 meters away from it, with the camera mounted about 80 cm above ground level. The acquired image is processed to extract the vertical and horizontal coordinates of the right-hand joints, which are transmitted over a network medium. The received information is then classified and assigned to a subroutine that performs a defined task. Keywords: Real-Time; Gesture; Network; Tracking; Human-Machine Interaction (HMI)
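
    A minimal sketch of the transmit-classify-dispatch pipeline described above, assuming UDP transport; the address, thresholds and routine names are placeholders, not taken from the paper.

    # Illustrative: send extracted right-hand joint coordinates over the network and
    # dispatch them to a defined task on the receiving side.
    import json
    import socket

    ADDR = ("127.0.0.1", 9000)      # placeholder host and port

    def send_joint(x: float, y: float) -> None:
        """Transmit one (horizontal, vertical) joint coordinate pair over UDP."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(json.dumps({"x": x, "y": y}).encode(), ADDR)

    def dispatch(x: float, y: float) -> str:
        """Classify the received coordinates and select a subroutine (placeholder rules)."""
        if y < 0.3:
            return "raise_arm_routine"
        if x > 0.7:
            return "move_right_routine"
        return "idle_routine"

    def receive_once() -> str:
        """Receive one coordinate message and return the selected subroutine name."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.bind(ADDR)
            data, _ = sock.recvfrom(1024)
            msg = json.loads(data.decode())
            return dispatch(msg["x"], msg["y"])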

    Development of a methodology for the human-robot interaction based on vision systems for collaborative robotics

    The abstract is provided in the attachment.