30 research outputs found

    Representation and characterization of the temporal response of second-order systems

    [EN] This teaching article defines second-order systems without zeros in the time domain and in the Laplace domain. By the end of the article, the student will be able to define and mathematically derive the temporal characteristics of this type of system. The student will also learn the main properties of such systems through examples and interactive questions throughout the article.
    Solanes Galbis, JE.; Gracia Calandin, LI. (2022). Representación y caracterización de la respuesta temporal de los sistemas de segundo orden. http://hdl.handle.net/10251/18481
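The temporal characteristics such an article typically derives (peak time, overshoot, settling time) follow from the standard underdamped second-order model without zeros; a minimal Python sketch, with illustrative parameter values:

```python
import math

def second_order_step_response_specs(zeta, wn):
    """Classic time-domain characteristics of an underdamped (0 < zeta < 1)
    second-order system with no zeros: G(s) = wn^2 / (s^2 + 2*zeta*wn*s + wn^2)."""
    assert 0 < zeta < 1, "these formulas hold for the underdamped case only"
    wd = wn * math.sqrt(1 - zeta**2)                          # damped natural frequency
    tp = math.pi / wd                                         # peak time
    Mp = math.exp(-math.pi * zeta / math.sqrt(1 - zeta**2))   # overshoot (fraction of final value)
    ts = 4 / (zeta * wn)                                      # 2% settling time (approximation)
    return {"peak_time": tp, "overshoot": Mp, "settling_time": ts}

specs = second_order_step_response_specs(zeta=0.5, wn=2.0)
```

For zeta = 0.5 the overshoot is about 16%, independent of wn, since Mp depends only on the damping ratio.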

    Detecting dings and dents on specular car body surfaces based on optical flow

    [EN] This paper introduces a new approach to detecting defects catalogued as dings and dents on car body surfaces, currently one of the most important issues facing quality control in the automotive industry. Using well-known optical flow algorithms and the deflectometry principle, the method proposed in this work is able to detect all kinds of anomalies on specular surfaces. The method consists of two main steps. First, in the pre-processing step, light patterns projected on the body surface uniformly sweep the inspection area, while a new image fusion law, based on optical flow, produces a fused image holding the information of all variations undergone by the projected patterns during the sweep, indicating the presence of anomalies. Second, a new post-processing step is proposed that avoids the need for pre-computed reference backgrounds to differentiate defects from other body features such as style lines. To that end, the background of the fused image is first estimated through a method based on blurring the image according to the direction of each pixel; the estimated background is then used in a new subtraction law through which defects are clearly differentiated from other surface deformations, allowing detection of defects in the entire illuminated area. In addition, since our approach, together with the system used, computes defects in less than 15 s, it satisfies assembly plant cycle-time requirements. Experimental results presented in this paper are obtained from the industrial automatic quality control system QEyeTunnel employed in the production line at the Mercedes-Benz factory in Vitoria, Spain. A complete analysis of the algorithm's performance is presented, together with several tests proving the robustness and reliability of our proposal.
    This work is supported by the VALi+d (APOSTD/2016/044) and PROMETEO (PROMETEOII/2014/044) Programs, both from the Conselleria d'Educacio, Generalitat Valenciana.
    Arnal-Benedicto, L.; Solanes Galbis, JE.; Molina, J.; Tornero Montserrat, J. (2017). Detecting dings and dents on specular car body surfaces based on optical flow. Journal of Manufacturing Systems. 45:306-321. https://doi.org/10.1016/j.jmsy.2017.07.006
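The post-processing pipeline the abstract describes (estimate a background by blurring the fused image, then apply a subtraction law to isolate defects) can be illustrated with a toy sketch. A plain box blur stands in for the paper's direction-dependent blurring, and the window size and threshold are made-up values, not the paper's parameters:

```python
import numpy as np

def detect_defects(fused, blur=7, k=4.0):
    """Illustrative defect detection on a fused inspection image:
    estimate the background by local-mean blurring, subtract it, and
    flag pixels deviating strongly from the local background.
    `blur` (window size) and `k` (threshold in std units) are invented."""
    h, w = fused.shape
    pad = blur // 2
    padded = np.pad(fused.astype(float), pad, mode="edge")
    # Box blur as a simple stand-in for directional blurring.
    background = np.zeros((h, w))
    for dy in range(blur):
        for dx in range(blur):
            background += padded[dy:dy + h, dx:dx + w]
    background /= blur * blur
    residual = fused - background          # the "subtraction law"
    mask = np.abs(residual) > k * residual.std()
    return mask

img = np.zeros((64, 64))
img[30:32, 30:32] = 10.0                   # synthetic "ding" on a flat panel
mask = detect_defects(img)
```

Because the background estimate smears the defect over the blur window, the defect pixels stand far above the local background while smooth body features cancel out in the subtraction.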

    On the detection of defects on specular car body surfaces

    [EN] The automatic detection of small defects (of up to 0.2 mm in diameter) on car body surfaces following the painting process is currently one of the greatest issues facing quality control in the automotive industry. Although several systems have been developed during the last decade to address this problem, they have, to the best of our knowledge, focused solely on flat surfaces and have been unable to inspect other parts of the surfaces, namely style lines, edges and corners as well as deep concavities. This paper introduces a novel approach using deflectometry- and vision-based technologies to overcome this problem and ensure that the whole area is inspected. Moreover, since our approach, together with the system used, computes defects in less than 15 s, it satisfies cycle-time production requirements (usually around 30 s per car). Hence, a two-step algorithm is presented: in the first step, a new pre-processing step (an image fusion algorithm) is introduced to enhance the contrast between pixels with a low level of intensity (indicating the presence of defects) and those with a high level of intensity (indicating the absence of defects); in the second step, we present a novel post-processing step with an image background extraction approach based on a local directional blurring method and a modified image contrast enhancement, which enables detection of defects in the entire illuminated area. In addition, the post-processing step is executed several times in a multi-level structure, with computed image backgrounds of different resolutions. In doing so, it is possible to detect larger defects, given that each level identifies defects of a different size. Experimental results presented in this paper are obtained from the industrial automatic quality control system QEyeTunnel employed in the production line at the Mercedes-Benz factory in Vitoria, Spain. A complete analysis of the algorithm's performance is presented, together with several tests proving the robustness and reliability of our proposal.
    This work is supported by the VALi+d (APOSTD/2016/044) and PROMETEO (PROMETEOII/2014/044) Programs, both from the Conselleria d'Educacio, Generalitat Valenciana.
    Molina, J.; Solanes Galbis, JE.; Arnal-Benedicto, L.; Tornero Montserrat, J. (2017). On the detection of defects on specular car body surfaces. Robotics and Computer-Integrated Manufacturing. 48:263-278. https://doi.org/10.1016/j.rcim.2017.04.009
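The multi-level structure described in this abstract (re-running background subtraction with backgrounds of different resolutions so that each level catches defects of a different size) can be sketched as follows; the blur is a simple local mean rather than the paper's directional method, and `sizes` and `k` are invented values:

```python
import numpy as np

def blur(img, size):
    """Local-mean blur used as a simple background estimator
    (a stand-in for the paper's directional blurring)."""
    h, w = img.shape
    pad = size // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros((h, w))
    for dy in range(size):
        for dx in range(size):
            out += p[dy:dy + h, dx:dx + w]
    return out / (size * size)

def multilevel_detect(img, sizes=(5, 11, 21), k=4.0):
    """Illustrative multi-level scheme: each background resolution targets
    defects of a different size; the final mask is the union over levels."""
    mask = np.zeros(img.shape, dtype=bool)
    for s in sizes:
        residual = img - blur(img, s)
        mask |= np.abs(residual) > k * residual.std()
    return mask

img = np.zeros((64, 64))
img[20, 20] = 10.0               # small synthetic defect
mask = multilevel_detect(img)
```

A coarse background (large `size`) barely follows small defects, so small anomalies survive at every level, while larger defects only produce a strong residual against the coarser backgrounds.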

    Robust auto tool change for industrial robots using visual servoing

    This is an Author's Accepted Manuscript of an article published as: Muñoz-Benavent, Pau; Solanes Galbis, Juan Ernesto; Gracia Calandin, Luis Ignacio; Tornero Montserrat, Josep. (2019). Robust auto tool change for industrial robots using visual servoing. International Journal of Systems Science, 50(2), 432-449. © Taylor & Francis, available online at: http://doi.org/10.1080/00207721.2018.1562129
    [EN] This work presents an automated solution for tool changing in industrial robots using visual servoing and sliding mode control. The robustness of the proposed method is due to the control law of the visual servoing, which uses the information acquired by a vision system to close a feedback control loop. Furthermore, sliding mode control is simultaneously used at a prioritised level to satisfy the constraints typically present in a robot system: joint range limits, maximum joint speeds and the allowed workspace. Thus, the global control accurately places the tool in the warehouse while satisfying the robot constraints. The feasibility and effectiveness of the proposed approach are substantiated by simulation results for a complex 3D case study. Moreover, real experimentation with a 6R industrial manipulator is also presented to demonstrate the applicability of the method to tool changing.
    This work was supported in part by the Ministerio de Economia, Industria y Competitividad, Gobierno de Espana under Grant BES-2010-038486 and Project DPI2017-87656-C2-1-R.
    Muñoz-Benavent, P.; Solanes Galbis, JE.; Gracia Calandin, LI.; Tornero Montserrat, J. (2019). Robust auto tool change for industrial robots using visual servoing. International Journal of Systems Science. 50(2):432-449. https://doi.org/10.1080/00207721.2018.1562129
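The layered control this abstract describes, with a tracking law overridden by a higher-priority sliding mode term when a constraint boundary is reached, can be illustrated with a toy 1-DOF simulation; all gains, the margin and the target are made-up values, not the paper's:

```python
def constrained_step(q, q_des, q_max, dt=0.01, kp=2.0, K=1.5, margin=0.1):
    """One control cycle of a toy 1-DOF tracking law with a higher-priority
    sliding mode term enforcing the joint limit q <= q_max.
    kp, K, margin and dt are invented illustration values."""
    u = kp * (q_des - q)          # low-priority tracking (servoing-like) velocity
    s = q - (q_max - margin)      # sliding surface: s >= 0 means boundary zone
    if s >= 0:
        u = min(u, -K)            # constraint term overrides the tracking command
    return q + dt * u             # integrate joint position

q = 0.0
for _ in range(2000):             # target deliberately placed beyond the limit
    q = constrained_step(q, q_des=2.0, q_max=1.0)
```

Even though the tracking law alone would drive the joint to 2.0, the discontinuous term keeps it chattering in a thin band just inside the limit, which is the characteristic sliding mode behaviour.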

    Robust fulfillment of constraints in robot visual servoing

    [EN] In this work, an approach based on sliding mode ideas is proposed to satisfy constraints in robot visual servoing. In particular, different types of constraints are defined in order to: fulfil the visibility constraints (camera field-of-view and occlusions) for the image features of the detected object; avoid exceeding the joint range limits and maximum joint speeds; and avoid forbidden areas in the robot workspace. Moreover, another task with low priority is considered to track the target object. The main advantages of the proposed approach are low computational cost, robustness and full utilization of the allowed space for the constraints. The applicability and effectiveness of the proposed approach are demonstrated by simulation results for a simple 2D case and a complex 3D case study. Furthermore, the feasibility and robustness of the proposed approach are substantiated by experimental results using a conventional 6R industrial manipulator.
    This work was supported in part by the Spanish Government under Grant BES-2010-038486 and Project DPI2013-42302-R, and the Generalitat Valenciana under Grants VALi+d APOSTD/2016/044 and BEST/2017/029.
    Muñoz-Benavent, P.; Gracia Calandin, LI.; Solanes Galbis, JE.; Esparza Peidro, A.; Tornero Montserrat, J. (2018). Robust fulfillment of constraints in robot visual servoing. Control Engineering Practice. 71(1):79-95. https://doi.org/10.1016/j.conengprac.2017.10.017
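The constraint types listed in this abstract can all be encoded as inequalities phi <= 0, with a discontinuous sliding-mode-style correction that activates only at the boundary, so the full allowed space remains usable, as the abstract notes. A minimal sketch with invented names and margins (not the paper's notation):

```python
def fov_constraint(u, v, width, height, margin):
    """Visibility constraint as a list of inequalities phi <= 0: the image
    feature (u, v) must stay `margin` pixels inside the camera frame."""
    return [margin - u, u - (width - margin),
            margin - v, v - (height - margin)]

def joint_limit_constraint(q, q_min, q_max, margin):
    """Joint-range constraint in the same phi <= 0 form."""
    return [(q_min + margin) - q, q - (q_max - margin)]

def sliding_mode_correction(phi, K=1.0):
    """Discontinuous correction in the spirit of sliding mode control:
    zero strictly inside the allowed region, -K once a boundary is hit."""
    return [-K if p >= 0 else 0.0 for p in phi]

# Feature too close to the left image edge: only the first inequality fires.
phi = fov_constraint(10, 240, width=640, height=480, margin=20)
corr = sliding_mode_correction(phi)
```

Because the correction is exactly zero inside the region, the low-priority tracking task is undisturbed until a constraint boundary is actually reached.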

    Camera 3D positioning mixed reality-based interface to improve worker safety, ergonomics and productivity

    [EN] This research develops a new mixed reality-based worker interface for industrial camera 3D positioning, which is intuitive and easy to manage, in order to enhance worker safety, ergonomics and productivity. An experimental prototype for use in car body quality control is developed in the paper. The benefits and drawbacks of the proposed interface are discussed throughout the paper and supported by several usability tests conducted with users both familiar and unfamiliar with mixed reality devices. Furthermore, the feasibility of the proposed approach is demonstrated by tests conducted in an industrial environment with skilled workers from the Alfatec Sistemas company.
    This work was supported in part by the Spanish Government under the Project DPI2017-87656-C2-1-R.
    Muñoz García, A.; Martí Testón, A.; Mahiques, X.; Gracia Calandin, LI.; Solanes Galbis, JE.; Tornero Montserrat, J. (2020). Camera 3D positioning mixed reality-based interface to improve worker safety, ergonomics and productivity. CIRP Journal of Manufacturing Science and Technology (Online). 28:24-37. https://doi.org/10.1016/j.cirpj.2020.01.004

    Human-robot collaboration for safe object transportation using force feedback

    [EN] This work presents an approach based on multi-task, non-conventional sliding mode control and admittance control for human-robot collaboration aimed at handling applications using force feedback. The proposed robot controller is based on three tasks with different priority levels in order to cooperatively perform the safe transportation of an object with a human operator. In particular, a high-priority task is developed using non-conventional sliding mode control to guarantee safe reference parameters imposed by the task, e.g., keeping a load at a desired orientation (to prevent spilling in the case of liquids, or to reduce undue stresses that may compromise fragile items). Moreover, a second task based on a hybrid admittance control algorithm allows the human operator to guide the robot by means of a force sensor located at the robot tool. Finally, a third, low-priority task is considered for redundant robots in order to use the remaining degrees of freedom to achieve a pre-set secondary goal (e.g., singularity avoidance, or remaining close to a homing configuration for increased safety) by means of the gradient projection method. The main advantages of the proposed method are robustness and low computational cost. The applicability and effectiveness of the proposed approach are substantiated by experimental results using a redundant 7R manipulator: the Sawyer collaborative robot.
    This work was supported in part by the Spanish Government under Project DPI2017-87656-C2-1-R, and the Generalitat Valenciana under Grants VALi+d APOSTD/2016/044 and BEST/2017/029.
    Solanes Galbis, JE.; Gracia Calandin, LI.; Muñoz-Benavent, P.; Valls Miro, J.; Carmichael, MG.; Tornero Montserrat, J. (2018). Human-robot collaboration for safe object transportation using force feedback. Robotics and Autonomous Systems. 107:196-208. https://doi.org/10.1016/j.robot.2018.06.003
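The admittance layer this abstract describes, through which the operator guides the robot via a measured force, can be illustrated with a 1-DOF sketch of the classic admittance law m*dv/dt + d*v = f_ext; the virtual inertia and damping values are invented, not the paper's:

```python
def admittance_step(v, f_ext, dt=0.01, m=2.0, d=8.0):
    """One integration step of a 1-DOF admittance law m*dv/dt + d*v = f_ext:
    the robot yields to the operator's measured force f_ext by generating a
    velocity reference. m (virtual inertia) and d (virtual damping) are
    illustration values."""
    dv = (f_ext - d * v) / m      # acceleration from the virtual dynamics
    return v + dt * dv            # explicit Euler integration

v = 0.0
for _ in range(1000):             # operator pushes with a constant 4 N
    v = admittance_step(v, f_ext=4.0)
# steady-state velocity tends to f_ext / d = 0.5
```

Larger damping makes the robot feel "heavier" to the operator (lower steady-state velocity per newton), while larger virtual inertia slows the response without changing the steady state.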