
    Model-based vs. model-free visual servoing: A performance evaluation in microsystems

    In this paper, model-based and model-free image-based visual servoing (VS) approaches are implemented on a microassembly workstation, and their regulation and tracking performances are evaluated. Precise image-based VS relies on computation of the image Jacobian. In model-based visual servoing, the image Jacobian is computed by calibrating the optical system; a precisely calibrated model-based VS promises better positioning and tracking performance than the model-free approach. In the model-free approach, however, optical system calibration is not required because the Jacobian is estimated dynamically, which gives it the advantage of adapting to different operating modes.
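    To make the image Jacobian's role concrete, the following sketch implements the classic proportional IBVS law v = -λ J⁺ e for point features. This is a textbook construction, not the paper's implementation; the function names, the constant-gain choice, and the known-depth assumption are all illustrative.

```python
import numpy as np

def point_interaction_matrix(x, y, Z):
    """Classic 2x6 interaction (image Jacobian) matrix for one normalized
    image point (x, y) observed at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """One step of the proportional IBVS law v = -lambda * J^+ * e,
    stacking one 2x6 block per point feature and returning the
    6-vector camera velocity (v, omega)."""
    J = np.vstack([point_interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (np.asarray(features, float) - np.asarray(desired, float)).ravel()
    return -gain * np.linalg.pinv(J) @ e
```

    A model-based scheme computes the depths Z from calibration, whereas a model-free scheme would replace the analytic Jacobian by an online estimate.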

    Unfalsified visual servoing for simultaneous object recognition and pose tracking

    In a complex environment, simultaneous object recognition and tracking has been one of the most challenging topics in computer vision and robotics. Current approaches are usually fragile due to spurious feature matching and local convergence in pose determination, and once a failure happens they lack a mechanism to recover automatically. In this paper, data-driven unfalsified control is proposed to solve this problem in visual servoing. The method recognizes a target by matching image features with a 3-D model and then tracks them through dynamic visual servoing. The features can be falsified or unfalsified by a supervisory mechanism according to their tracking performance. Supervisory visual servoing is repeated until a consensus between the model and the selected features is reached, so that model recognition and object tracking are accomplished. Experiments show the effectiveness and robustness of the proposed algorithm in dealing with matching and tracking failures caused by various disturbances, such as fast motion, occlusions, and illumination variation.
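    The supervisory falsification step described above can be sketched as a simple filter over candidate feature matches. This is a toy illustration of the idea only; the fixed residual threshold is an assumption, not the paper's actual falsification criterion.

```python
def unfalsified_select(features, residuals, threshold):
    """Supervisory falsification sketch: keep only the feature matches
    whose tracking residual stays below a performance threshold;
    falsified features are dropped and servoing repeats on the
    survivors until a consensus set remains."""
    return [f for f, r in zip(features, residuals) if r < threshold]
```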

    Visual Servoing using the Sum of Conditional Variance

    In this paper we propose a new way to achieve direct visual servoing. The novelty is the use of the sum of conditional variance in the optimization process of a positioning task. This measure, previously used successfully in visual tracking, has been shown to be invariant to nonlinear illumination variations and inexpensive to compute. Compared with other direct visual servoing approaches, it is a good compromise between techniques based on pixel intensities, which are computationally inexpensive but not robust to illumination variations, and approaches based on mutual information, which are more expensive to compute but more robust to variations of the scene. The result is a direct visual servoing task that is easy and fast to compute and robust to nonlinear illumination variations. This paper describes a visual servoing task based on the sum of conditional variance performed with a Levenberg-Marquardt optimization process. The results are demonstrated through experimental validation and compared with both photometric-based and entropy-based techniques.
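    For reference, the sum of conditional variance between a current image I and a reference image I* measures, for each intensity bin of I*, the spread of the corresponding intensities in I; it is zero whenever I is any per-bin function of I*, which is the source of its illumination invariance. A minimal sketch follows (the bin count and quantization scheme are illustrative choices, not those of the paper):

```python
import numpy as np

def sum_conditional_variance(cur, ref, bins=32):
    """Sum of conditional variance between a current and a reference
    grayscale image: quantize the reference into intensity bins, then
    sum the squared deviations of the current intensities around their
    conditional mean within each bin."""
    cur = np.asarray(cur, dtype=float).ravel()
    q = (np.asarray(ref, dtype=float).ravel() * bins / 256.0).astype(int)
    q = np.clip(q, 0, bins - 1)
    scv = 0.0
    for j in range(bins):
        vals = cur[q == j]
        if vals.size:
            scv += np.sum((vals - vals.mean()) ** 2)
    return scv
```

    A global intensity inversion or gain change leaves the measure at zero, whereas geometric misalignment raises it, which is what the positioning task minimizes.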

    Sliding mode control for robust and smooth reference tracking in robot visual servoing

    An approach based on sliding mode control is proposed in this work for reference tracking in robot visual servoing. In particular, two sliding mode controls are obtained, depending on whether joint accelerations or joint jerks are taken as the discontinuous control action. Both sliding mode controls are extensively compared in a 3D simulated environment with their equivalent, well-known continuous controls from the literature, to highlight their similarities and differences. The main advantages of the proposed method are smoothness, robustness, and low computational cost. The applicability and robustness of the proposed approach are substantiated by experimental results using a conventional 6R industrial manipulator (KUKA KR 6 R900 sixx [AGILUS]) for positioning and tracking tasks.
    Funding: Spanish Government, Grant BES-2010-038486; Generalitat Valenciana, Grants BEST/2017/029 and APOSTD/2016/044.
    Muñoz-Benavent, P.; Gracia, L.; Solanes, J. E.; Esparza, A.; Tornero, J. (2018). Sliding mode control for robust and smooth reference tracking in robot visual servoing. International Journal of Robust and Nonlinear Control, 28(5), 1728-1756. https://doi.org/10.1002/rnc.3981
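    The discontinuous switching action at the heart of sliding mode control, and the boundary-layer smoothing commonly used against chattering, can be sketched for a single degree of freedom. This is a generic textbook first-order law, not the acceleration- or jerk-level controllers derived in the paper; the gains and the tanh boundary layer are illustrative choices.

```python
import numpy as np

def smc_step(e, e_dot, lam=2.0, K=5.0, phi=0.05):
    """One step of a basic sliding mode law: drive the sliding variable
    s = e_dot + lam*e to zero with a switching term. A tanh boundary
    layer of width phi replaces the hard sign function to limit
    chattering, at the cost of a small residual error."""
    s = e_dot + lam * e
    return -K * np.tanh(s / phi)
```

    Once the state reaches the surface s = 0, the error obeys ė = -λe and decays exponentially, which is what gives the method its robustness to matched disturbances.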

    Visual Servoing For Robotic Positioning And Tracking Systems

    Visual servoing is a robot control method in which camera sensors are used inside the control loop and visual feedback is introduced into the robot control loop to enhance the robot control performance in accomplishing tasks in unstructured environments. In general, visual servoing can be categorized into image-based visual servoing (IBVS), position-based visual servoing (PBVS), and hybrid approaches. To improve the performance and robustness of visual servoing systems, the research on IBVS for robotic positioning and tracking systems mainly focuses on aspects of camera configuration, image features, pose estimation, and depth determination. In the first part of this research, two novel multiple-camera configurations of visual servoing systems are proposed for robotic manufacturing systems for positioning large-scale workpieces. The main advantage of these two multiple-camera configurations is that the depths of target objects or target features are constant or can be determined precisely using computer vision. Hence the accuracy of the interaction matrix is guaranteed, and thus the positioning performance of visual servoing systems can be improved remarkably. The simulation results show that the proposed multiple-camera configurations of visual servoing for large-scale manufacturing systems can satisfy the demand for high-precision positioning and assembly in the aerospace industry. In the second part of this research, two improved image features for planar, centrally symmetric objects are proposed based on image moment invariants, which can represent the pose of target objects with respect to the camera frame. A visual servoing controller based on the proposed image moment features is designed, and thus the control performance of the robotic tracking system is improved compared with the method based on the commonly used image moment features. Experimental results on a 6-DOF robot visual servoing system demonstrate the efficiency of the proposed method.
    Lastly, to address the challenge of choosing proper image features for planar objects that yield a maximally decoupled structure of the interaction matrix, a neural network (NN) is applied as the estimator of target object poses with respect to the camera frame, based on the image moment invariants. Compared with previous methods, this scheme avoids image interaction matrix singularity and image local minima in IBVS. Furthermore, the analytical form of depth computation is given using classical geometrical primitives and image moment invariants. A visual servoing controller is designed and the tracking performance is enhanced for robotic tracking systems. Experimental results on a 6-DOF robot system are provided to illustrate the effectiveness of the proposed scheme.
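    Moment-based features of the kind discussed above start from the raw and central image moments. A minimal sketch computing the centroid and orientation of a grayscale patch follows; this is the standard construction, not the improved invariant features proposed in the work.

```python
import numpy as np

def moment_features(img):
    """Centroid (xg, yg) and orientation theta of a grayscale or binary
    image patch, from the raw moment m00 and the second-order central
    moments mu20, mu02, mu11 -- a common starting point for
    moment-based visual servoing features."""
    img = np.asarray(img, dtype=float)
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    m00 = img.sum()
    xg = (xs * img).sum() / m00
    yg = (ys * img).sum() / m00
    mu20 = ((xs - xg) ** 2 * img).sum()
    mu02 = ((ys - yg) ** 2 * img).sum()
    mu11 = ((xs - xg) * (ys - yg) * img).sum()
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return xg, yg, theta
```

    The appeal of such features for servoing is that translations move the centroid, rotations move theta, and scale changes move m00, giving a loosely decoupled interaction matrix.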

    Visual Servoing in Robotics

    Visual servoing is a well-known approach to guiding robots using visual information. Image processing, robotics, and control theory are combined in order to control the motion of a robot based on the visual information extracted from the images captured by one or several cameras. On the vision side, a number of issues are currently being addressed by ongoing research, such as the use of different types of image features (or different types of cameras, such as RGBD cameras), image processing at high velocity, and convergence properties. As shown in this book, the use of new control schemes allows the system to behave more robustly, efficiently, or compliantly, with fewer delays. Related issues such as optimal and robust approaches, direct control, path tracking, and sensor fusion are also addressed. Additionally, visual servoing systems are currently being applied in a number of different domains. This book considers various aspects of visual servoing systems, such as the design of new strategies for their application to parallel robots, mobile manipulators, and teleoperation, and the application of this type of control system in new areas.

    Dynamic visual servoing from sequential regions of interest acquisition

    One of the main unsolved drawbacks of vision-based control is poor dynamic performance, caused by the low acquisition frequency of vision systems and the time latency due to processing. In this paper we take up the challenge of designing a high-performance dynamic visual servo control scheme. Two versatile control laws are developed: a position-based dynamic visual servoing and an image-based dynamic visual servoing. Both control laws are designed to compute the control torques exclusively from a sequential acquisition of regions of interest containing the visual features, in order to achieve accurate trajectory tracking. The presented experiments on vision-based dynamic control of a high-speed parallel robot show that the proposed control schemes can outperform joint-based computed torque control.

    Positioning and trajectory following tasks in microsystems using model free visual servoing

    In this paper, we explore model-free visual servoing algorithms by experimentally evaluating their performance on various tasks carried out on a microassembly workstation developed in our lab. Model-free, or so-called uncalibrated, visual servoing requires neither calibration of the system (microscope-camera-micromanipulator) nor a model of the observed scene, and it is robust to parameter changes and disturbances. We tested its performance in point-to-point positioning and various trajectory-following tasks. Experimental results validate the utility of model-free visual servoing in microassembly tasks.
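    The dynamic Jacobian estimation that makes calibration unnecessary is commonly realized with a Broyden-style secant update, which corrects the current Jacobian estimate so it maps the latest joint step onto the observed feature change. The sketch below shows the standard update rule from the uncalibrated-servoing literature; the function name and the step-size parameter are illustrative, not the authors' code.

```python
import numpy as np

def broyden_update(J, dq, ds, alpha=1.0):
    """Broyden secant update of an estimated image Jacobian J:
    J_new = J + alpha * (ds - J @ dq) dq^T / (dq^T dq),
    so that J_new @ dq == ds when alpha == 1 (the secant condition).
    dq is the latest joint displacement, ds the observed feature change."""
    dq = np.asarray(dq, dtype=float).reshape(-1, 1)
    ds = np.asarray(ds, dtype=float).reshape(-1, 1)
    return J + alpha * (ds - J @ dq) @ dq.T / float(dq.T @ dq)
```

    In a servo loop, the updated estimate is then used in place of the calibrated analytic Jacobian to compute the next joint command.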