
    Spatial Programming for Industrial Robots through Task Demonstration

    We present an intuitive system for programming industrial robots by demonstration, using markerless gesture recognition and mobile augmented reality. The approach covers gesture-based task definition and adaptation through human demonstration, as well as task evaluation through augmented reality. A 3D motion tracking system and a handheld device form the basis of the presented spatial programming system. In this publication, we present a prototype for programming an assembly sequence consisting of several pick-and-place tasks. A scene reconstruction provides pose estimation of known objects using the handheld's 2D camera. The programmer can therefore define the program through natural bare-hand manipulation of these objects, with direct visual feedback in the augmented reality application. The program can be adapted by gestures and subsequently transmitted to an arbitrary industrial robot controller through a unified interface. Finally, we discuss an application of the presented spatial programming approach to robot-based welding tasks.
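    The abstract does not specify how the demonstrated assembly sequence is represented internally; a minimal sketch of one plausible representation, with all names (`PickPlaceTask`, `AssemblyProgram`, `serialize`) being illustrative assumptions rather than the authors' actual interface, might look like this:

    ```python
    from dataclasses import dataclass, field
    from typing import List, Tuple

    # A pose as x, y, z plus roll, pitch, yaw (one common convention; assumed here)
    Pose = Tuple[float, float, float, float, float, float]

    @dataclass
    class PickPlaceTask:
        object_id: str    # identifier of a known, pose-estimated object
        pick_pose: Pose   # pose captured from the user's bare-hand demonstration
        place_pose: Pose  # target pose after the demonstrated manipulation

    @dataclass
    class AssemblyProgram:
        tasks: List[PickPlaceTask] = field(default_factory=list)

        def add(self, task: PickPlaceTask) -> None:
            self.tasks.append(task)

        def serialize(self) -> list:
            """Flatten the sequence to plain dicts, e.g. for transmission
            to a robot controller over a unified interface."""
            return [vars(t) for t in self.tasks]
    ```

    Each demonstrated manipulation would append one task, and the serialized list could then be sent to any controller that understands the interchange format.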

    Human robot collaboration - using Kinect V2 for ISO/TS 15066 speed and separation monitoring

    The use of industrial robots within assembly workstations where human and robot collaborate or even cooperate involves high safety requirements. One of the four possibilities outlined in the current technical specification ISO/TS 15066 for ensuring safety is speed and separation monitoring, in which the robot's motion speed increases or decreases dynamically depending on the distance between operator and robot. This paper introduces an approach to speed and separation monitoring based on time-of-flight sensing. After introducing this safety method, a Microsoft Kinect V2 is used to continuously detect human workers within a shared workspace. With the robot's joint angles from the robot controller, the distances between all robot joints and the human worker can be computed. The shortest of these distances, which is also the critical distance, is determined, and the velocity and acceleration values of the robot are set to safe values according to ISO/TS 15066. As only the human workers, and not the robot itself, need to be detected visually, this approach is very resilient. The setup is then tested with a real human detected in front of a Kinect and a simulated industrial robot (Universal Robots UR5) in the Robot Operating System (ROS). Measurements show that, depending on the position of the worker, the robot's speed adapts to recommended safety values, up to a complete halt if necessary. Finally, all results are discussed and an outlook on possible fields of application is given.
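    The core computation described above (shortest joint-to-human distance, then a speed set according to that distance) can be sketched as follows. The function names, the piecewise-linear scaling, and the numeric thresholds are illustrative assumptions, not the values prescribed by ISO/TS 15066 or used by the authors:

    ```python
    import numpy as np

    def min_human_robot_distance(joint_positions, human_position):
        """Shortest Euclidean distance between any robot joint and the tracked human.
        joint_positions: (N, 3) joint positions in the robot base frame, computed
        from the joint angles via forward kinematics (assumed done upstream).
        human_position: (3,) nearest tracked human point, e.g. from a Kinect V2
        skeleton transformed into the robot frame (hypothetical input)."""
        diffs = np.asarray(joint_positions) - np.asarray(human_position)
        return float(np.linalg.norm(diffs, axis=1).min())

    def safe_speed(distance, stop_dist=0.5, full_speed_dist=2.0, v_max=1.0):
        """Map the critical distance to a commanded speed: full stop inside
        stop_dist, full speed beyond full_speed_dist, linear in between.
        Thresholds in metres and v_max in m/s are placeholder values."""
        if distance <= stop_dist:
            return 0.0
        if distance >= full_speed_dist:
            return v_max
        return v_max * (distance - stop_dist) / (full_speed_dist - stop_dist)
    ```

    In a running system this pair of functions would be evaluated in every control cycle, so the commanded speed tracks the worker's position continuously, including the complete halt when the worker comes too close.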