    Visual Control System for Robotic Welding

    A new strategy for improving vision based tracking accuracy based on utilization of camera calibration information

    Camera calibration is one of the essential components of a vision-based tracking system, where the objective is to extract three-dimensional information from a set of two-dimensional frames. The information extracted from the calibration process is significant for examining the accuracy of the vision sensor, and thus for estimating its effectiveness as a tracking system in real applications. This paper introduces another use for this information, in which the proper location of the camera can be predicted. A new mathematical formula based on the extracted calibration information was used to find the optimum camera location, which provides the best detection accuracy. Moreover, the calibration information was also used for selecting the proper image denoising filter. The results obtained proved the validity of the proposed formula in finding the desired camera location, where the smallest detection errors are produced. The results also showed that proper selection of the filter parameters led to a considerable enhancement in the overall accuracy of the camera, reducing the overall detection error by 0.2 mm.
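
    The abstract does not give the paper's camera-placement formula, but the calibration information it refers to can be obtained with a standard chessboard calibration. The Python sketch below (using OpenCV) estimates the intrinsics and reports per-view reprojection error, the kind of accuracy measure that could then feed a placement or filter-selection criterion; the image folder, board size and square size are illustrative assumptions, not values from the paper.

```python
# Minimal calibration sketch: estimate camera intrinsics from chessboard
# views and report per-view reprojection error as a proxy for detection
# accuracy. File names and board dimensions are illustrative assumptions.
import glob
import cv2
import numpy as np

BOARD = (9, 6)        # assumed inner-corner layout of the chessboard (cols, rows)
SQUARE_MM = 25.0      # assumed square size in millimeters

# 3D object points of one board pose, reused for every accepted view.
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):          # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_pts.append(objp)
    img_pts.append(corners)
    img_size = gray.shape[::-1]

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, img_size, None, None)

# Per-view reprojection error: the kind of calibration information that can
# be compared across candidate camera placements or filter settings.
for i, (rvec, tvec) in enumerate(zip(rvecs, tvecs)):
    proj, _ = cv2.projectPoints(obj_pts[i], rvec, tvec, K, dist)
    err = cv2.norm(img_pts[i], proj, cv2.NORM_L2) / len(proj)
    print(f"view {i}: mean reprojection error {err:.3f} px")
print(f"overall RMS error: {rms:.3f} px")
```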

    Spatial Programming for Industrial Robots through Task Demonstration

    We present an intuitive system for programming industrial robots using markerless gesture recognition and mobile augmented reality, following the programming-by-demonstration paradigm. The approach covers gesture-based task definition and adaptation by human demonstration, as well as task evaluation through augmented reality. A 3D motion tracking system and a handheld device form the basis of the presented spatial programming system. In this publication, we present a prototype for programming an assembly sequence consisting of several pick-and-place tasks. A scene reconstruction provides pose estimation of known objects with the help of the handheld's 2D camera. The programmer can therefore define the program through natural bare-hand manipulation of these objects, with direct visual feedback in the augmented reality application. The program can be adapted by gestures and subsequently transmitted to an arbitrary industrial robot controller through a unified interface. Finally, we discuss an application of the presented spatial programming approach to robot-based welding tasks.
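
    As a rough illustration of the pose-estimation step mentioned above, the sketch below recovers the pose of a known object from 2D-3D point correspondences with a PnP solve in OpenCV. How the markerless correspondences are actually obtained is outside the sketch, and the intrinsics and point coordinates are illustrative assumptions rather than values from the paper.

```python
# Pose of a known object from the handheld's 2D camera, assuming the
# object's 3D model points and their 2D detections are already matched.
import cv2
import numpy as np

# Assumed handheld camera intrinsics (fx, fy, cx, cy in pixels).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

# Known 3D corner points of the object in its own frame (millimeters) ...
object_pts = np.array([[0, 0, 0], [60, 0, 0], [60, 40, 0], [0, 40, 0]],
                      dtype=np.float64)
# ... and their detected 2D locations in the current camera frame (pixels).
image_pts = np.array([[310, 250], [402, 246], [405, 312], [313, 318]],
                     dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
if ok:
    R, _ = cv2.Rodrigues(rvec)   # rotation matrix of the object pose
    print("object position in camera frame (mm):", tvec.ravel())
    # This camera-frame pose is what an AR overlay and, after a hand-eye
    # transform, a robot program could consume.
```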

    Local threshold identification and gray level classification of butt joint welding imperfections using robot vision system

    This research was carried out to automatically identify the joint position and classify the quality level of imperfections for butt welding joints based on background subtraction, local thresholding and gray-level approaches, without any prior knowledge of the joint shapes. The background subtraction and local thresholding approaches consist of image pre-processing, noise reduction and butt welding joint representation algorithms. The approaches can automatically recognize and locate the starting, middle, auxiliary and ending points of the butt joint for three different joint shapes: straight-line, saw-tooth and curved joints. The welding process was carried out by implementing an automatic coordinate conversion between camera coordinates (pixels) and KUKA welding robot coordinates (millimeters), based on the camera-to-robot coordinate ratio. The ratio was determined from three reference points (origin, x-direction and y-direction) captured by the camera around the workpiece. The quality level of imperfections for the butt welding joint was then classified using Gaussian Mixture Model (GMM), Multi-Layer Perceptron (MLP) and Support Vector Machine (SVM) classifiers according to the imperfection categories (good weld, excess weld, insufficient weld and no weld) for each joint shape. The classifiers used 72 characteristic feature values of gray pixels taken from the co-occurrence matrix. The features consist of energy, correlation, homogeneity and contrast, combined with the gray absolute histogram of edge amplitude and additional features computed on the image scaled by a factor of 0.5. The proposed approaches were validated through experiments with a KUKA welding robot in a realistic workshop environment. The results show that the approaches introduced in this research can detect, identify, recognize and locate the welding position and classify the quality level of imperfections for butt welding joints automatically, without any prior knowledge of the joint shapes.
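
    The pixel-to-millimeter conversion described above (camera pixels to KUKA base coordinates via three reference points around the workpiece) can be sketched as a small affine mapping. The Python sketch below uses illustrative coordinates rather than values from the paper.

```python
# Pixel-to-robot coordinate conversion from three reference points:
# the origin, a point along the robot +X axis, and a point along the
# robot +Y axis, each located both in the image and in the KUKA base
# frame. All numeric values here are illustrative assumptions.
import numpy as np

# Reference points in image coordinates (pixels) ...
px = np.array([[112.0, 418.0],    # origin
               [598.0, 422.0],    # point on the robot +X direction
               [108.0,  96.0]])   # point on the robot +Y direction
# ... and the same three points in robot base coordinates (millimeters).
mm = np.array([[0.0,   0.0],
               [300.0, 0.0],
               [0.0,   200.0]])

# Solve the 2D affine transform  mm = [u, v, 1] @ A  (exactly determined
# by three non-collinear points).
ones = np.ones((3, 1))
A, *_ = np.linalg.lstsq(np.hstack([px, ones]), mm, rcond=None)

def pixel_to_robot(u, v):
    """Convert a detected joint point from pixels to robot millimeters."""
    return np.array([u, v, 1.0]) @ A

# Example: a joint point detected at pixel (355, 260).
print(pixel_to_robot(355, 260))   # -> approx. [x_mm, y_mm] on the workpiece
```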