    Enhanced Image-Based Visual Servoing Dealing with Uncertainties

    Nowadays, the applications of robots in industrial automation have increased considerably. There is growing demand for dexterous and intelligent robots that can work in unstructured environments. Visual servoing has been developed to meet this need by integrating vision sensors into robotic systems. Although visual servoing has advanced significantly, some challenges remain before it is fully functional in industrial environments. The nonlinear nature of visual servoing and system uncertainties are among the problems affecting its control performance. The projection of the 3D scene onto a 2D image, which occurs in the camera, creates one source of uncertainty in the system. Another source of uncertainty lies in the parameters of the camera and the robot manipulator. Moreover, the camera's limited field of view (FOV) is another issue influencing the control performance. There are two main types of visual servoing: position-based and image-based. This project aims to develop a series of new image-based visual servoing (IBVS) methods that address the nonlinearity and uncertainty issues and improve the visual servoing performance of industrial robots. The first method is an adaptive switch IBVS controller for industrial robots in which the adaptive law deals with the uncertainties of the monocular camera in an eye-in-hand configuration. The proposed switch control algorithm decouples the rotational and translational camera motions and decomposes the IBVS control into three separate stages with different gains. This method increases the system response speed and improves the tracking performance of IBVS while dealing with camera uncertainties. The second method is an image feature reconstruction algorithm based on the Kalman filter, proposed to handle the situation where the image features leave the camera's FOV. The combination of the switch controller and the feature reconstruction algorithm not only improves the system response speed and tracking performance of IBVS, but also ensures the success of servoing in the case of feature loss. Next, in order to deal with external disturbances and uncertainties due to the depth of the features, a third control method is designed that combines proportional derivative (PD) control with sliding mode control (SMC) on a 6-DOF manipulator. The properly tuned PD controller ensures fast tracking performance, and the SMC deals with the external disturbances and depth uncertainties. In the last stage of the thesis, a fourth, semi-off-line trajectory planning method is developed to perform IBVS tasks for a 6-DOF robotic manipulator system. In this method, the camera's velocity screw is parametrized using time-based profiles. The parameters of the velocity profile are then determined such that the velocity profile takes the robot to its desired position. This is done by minimizing the error between the initial and desired features. The algorithm for planning the orientation of the robot is decoupled from the position planning, which yields a convex optimization problem and leads to a faster and more efficient algorithm. The merit of the proposed method is that it respects all of the system constraints and also considers the limitation caused by the camera's FOV. All of the algorithms developed in the thesis are validated via tests on a 6-DOF Denso robot in an eye-in-hand configuration.
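    The methods above all build on the standard IBVS velocity law, in which the camera twist is computed from the stacked point-feature interaction matrix. As a point of reference, here is a minimal sketch of that classic law (not the thesis's adaptive switch controller); the gain, the four-point target, and the rough depth estimate are illustrative assumptions.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of a normalized image point (x, y)
    at estimated depth Z, mapping the camera twist to the feature velocity."""
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,       -(1.0 + x**2),  y],
        [0.0,     -1.0 / Z,  y / Z, 1.0 + y**2,  -x * y,        -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Classic IBVS law: v = -lambda * L^+ * (s - s*).
    features, desired: (N, 2) arrays of normalized image points; depths: (N,)."""
    error = (features - desired).reshape(-1)            # stacked feature error
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    return -gain * np.linalg.pinv(L) @ error             # 6-vector camera twist

# toy usage: four points with a rough 0.5 m depth estimate
s  = np.array([[0.1, 0.1], [-0.1, 0.1], [-0.1, -0.1], [0.1, -0.1]])
sd = s * 0.8                                             # desired (closer) configuration
print(ibvs_velocity(s, sd, np.full(4, 0.5)))
```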

    Image space trajectory tracking of 6-DOF robot manipulator in assisting visual servoing

    As vision is a versatile sensor, vision-based control of robots is becoming more important in industrial applications. The control signal generated by traditional control algorithms leads to undesirable movement of the end-effector during the positioning task. This movement may sometimes cause task failure due to visibility loss. In this paper, a sliding mode controller (SMC) is designed to track 2D image features in an image-based visual servoing task. The feature trajectory tracking helps keep the image features in the camera's field of view at all times and thereby ensures the shortest trajectory of the end-effector. SMC is well suited to handle the depth uncertainties associated with translational motion. Stability of the closed-loop system with the proposed controller is proved by the Lyapunov method. Three feature trajectories are generated to test the efficacy of the proposed method. Simulation tests are conducted, and the superiority of the proposed method over a Proportional Derivative-Sliding Mode Controller (PD-SMC), in terms of settling time and distance travelled by the end-effector, is established in the presence and absence of depth uncertainties. The proposed controller is also tested in real time by integrating the visual servoing system with a 6-DOF industrial robot manipulator, ABB IRB 1200.
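    As a rough illustration of the image-space tracking idea, the sketch below shows one common sliding-mode style law for following a reference feature trajectory. It is a hypothetical reconstruction, not the paper's exact controller; the sliding surface, the saturation-based switching term, the gains (`lam`, `k`, `phi`), and the single shared depth guess are all assumptions.

```python
import numpy as np

def stacked_interaction(features, depths):
    """Stack the 2x6 interaction matrices of normalized image points into a 2N x 6 matrix."""
    rows = []
    for (x, y), Z in zip(features, depths):
        rows.append([-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y])
        rows.append([0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x])
    return np.array(rows)

def smc_track(features, ref, ref_rate, depth_guess, lam=1.0, k=0.2, phi=0.05):
    """Hypothetical sliding-mode tracking law for a reference image trajectory:
    v = L^+ ( s_dot_ref - lam*e - k*sat(e/phi) ),  with e = s - s_ref.
    A boundary-layer sat() replaces sign() to limit chattering; the switching
    gain k is meant to dominate the error caused by the crude depth guess."""
    e = (features - ref).reshape(-1)
    L = stacked_interaction(features, np.full(len(features), depth_guess))
    switch = k * np.clip(e / phi, -1.0, 1.0)
    return np.linalg.pinv(L) @ (ref_rate.reshape(-1) - lam * e - switch)
```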

    Robust fulfillment of constraints in robot visual servoing

    In this work, an approach based on sliding mode ideas is proposed to satisfy constraints in robot visual servoing. In particular, different types of constraints are defined in order to: fulfill the visibility constraints (camera field-of-view and occlusions) for the image features of the detected object; avoid exceeding the joint range limits and maximum joint speeds; and avoid forbidden areas in the robot workspace. Moreover, another task with low priority is considered to track the target object. The main advantages of the proposed approach are low computational cost, robustness, and full utilization of the space allowed by the constraints. The applicability and effectiveness of the proposed approach are demonstrated by simulation results for a simple 2D case and a complex 3D case study. Furthermore, the feasibility and robustness of the proposed approach are substantiated by experimental results using a conventional 6R industrial manipulator.
    This work was supported in part by the Spanish Government under grants BES-2010-038486 and Project DPI2013-42302-R, and by the Generalitat Valenciana under grants VALi+d APOSTD/2016/044 and BEST/2017/029.
    Muñoz-Benavent, P.; Gracia Calandin, L. I.; Solanes Galbis, J. E.; Esparza Peidro, A.; Tornero Montserrat, J. (2018). Robust fulfillment of constraints in robot visual servoing. Control Engineering Practice, 71(1), 79-95. https://doi.org/10.1016/j.conengprac.2017.10.017
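    One way to picture the constraint-handling idea is the sketch below: each constraint is written as a scalar function that must stay non-positive, and when it becomes active a discontinuous (sliding-mode style) correction pushes back while the low-priority tracking velocity is projected away from the violating direction. This is a hypothetical simplification, not the paper's scheme; the gain `k_sm`, the gradient-projection step, and the function names are assumptions.

```python
import numpy as np

def constrained_velocity(q_dot_task, phis, grads, k_sm=0.5):
    """Hypothetical sketch of sliding-mode constraint handling on top of a
    low-priority tracking velocity q_dot_task (joint space, length n).
    phis:  scalar constraint values phi_i(q); the constraint holds when phi_i <= 0
           (e.g. feature distance to the image border, joint-limit margin).
    grads: gradients d(phi_i)/dq as length-n arrays.
    An active constraint injects a discontinuous correction along -grad(phi_i),
    and the task velocity is projected so it cannot push the constraint further."""
    n = len(q_dot_task)
    correction = np.zeros(n)
    P = np.eye(n)                                  # projector onto still-allowed directions
    for phi, g in zip(phis, grads):
        if phi > 0.0:                              # constraint active / violated
            g = np.asarray(g, dtype=float)
            correction -= k_sm * g / (np.linalg.norm(g) + 1e-9)
            P -= np.outer(g, g) / (g @ g + 1e-9)   # drop the violating direction
    return correction + P @ q_dot_task

# toy usage: 3-joint arm with one active joint-limit constraint on joint 2
print(constrained_velocity(np.array([0.1, 0.2, -0.1]),
                           phis=[0.05], grads=[np.array([0.0, 1.0, 0.0])]))
```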

    Deep Drone Racing: From Simulation to Reality with Domain Randomization

    Dynamically changing environments, unreliable state estimation, and operation under severe resource constraints are fundamental challenges that limit the deployment of small autonomous drones. We address these challenges in the context of autonomous, vision-based drone racing in dynamic environments. A racing drone must traverse a track with possibly moving gates at high speed. We enable this functionality by combining the performance of a state-of-the-art planning and control system with the perceptual awareness of a convolutional neural network (CNN). The resulting modular system is both platform- and domain-independent: it is trained in simulation and deployed on a physical quadrotor without any fine-tuning. The abundance of simulated data, generated via domain randomization, makes our system robust to changes of illumination and gate appearance. To the best of our knowledge, our approach is the first to demonstrate zero-shot sim-to-real transfer on the task of agile drone flight. We extensively test the precision and robustness of our system, both in simulation and on a physical platform, and show significant improvements over the state of the art.
    Comment: Accepted as a Regular Paper to the IEEE Transactions on Robotics Journal. arXiv admin note: substantial text overlap with arXiv:1806.0854
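    As a loose illustration of the modular structure described above (a perception CNN feeding a planner/controller), the toy sketch below converts a CNN output, assumed here to be a normalized gate direction in the image plane plus a desired speed, into a short camera-frame waypoint. The interface, the pinhole-style back-projection, and the one-second horizon are all assumptions, not the paper's actual implementation.

```python
import numpy as np

def plan_from_cnn_output(goal_dir_img, desired_speed, horizon=1.0):
    """Hypothetical glue between a perception CNN and a local planner:
    goal_dir_img is (u, v) in [-1, 1]^2, a normalized image-plane direction
    toward the next gate; the planner turns it into a camera-frame waypoint
    at a distance proportional to the commanded speed and planning horizon."""
    u, v = goal_dir_img
    ray = np.array([u, v, 1.0])          # back-project onto a camera-frame ray
    ray /= np.linalg.norm(ray)
    return desired_speed * horizon * ray  # waypoint handed to the low-level controller

print(plan_from_cnn_output((0.2, -0.1), desired_speed=3.0))
```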

    Sliding Mode Control (SMC) of Image‐Based Visual Servoing for a 6DOF Manipulator

    Accuracy and stability are two fundamental concerns of a visual servoing control system. This chapter presents a sliding mode controller for image-based visual servoing (IBVS) which can increase the accuracy of a 6DOF robotic system with guaranteed stability. The proposed controller combines proportional derivative (PD) control with sliding mode control (SMC) for a 6DOF manipulator. Compared with a conventional proportional or SMC controller, this approach achieves faster convergence and better disturbance rejection. Both simulation and experimental results show that the proposed controller can increase the accuracy and robustness of a 6DOF robotic system.
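    A minimal sketch of the PD-plus-SMC combination described above is given below, assuming the stacked feature error, its rate, and the interaction-matrix pseudo-inverse are computed elsewhere; the gains (`kp`, `kd`, `k_sm`) and the boundary-layer saturation are illustrative choices, not the chapter's tuned controller.

```python
import numpy as np

def pd_smc_velocity(error, error_rate, L_pinv, kp=0.8, kd=0.1, k_sm=0.2, phi=0.05):
    """Hypothetical PD + sliding-mode IBVS law in the spirit of the chapter:
    the PD term drives fast convergence of the feature error, while a bounded
    switching term (sat instead of sign, to limit chattering) rejects external
    disturbances and depth-estimation errors.
    error, error_rate: stacked feature error and its rate, shape (2N,);
    L_pinv: pseudo-inverse of the estimated interaction matrix, shape (6, 2N)."""
    switching = k_sm * np.clip(error / phi, -1.0, 1.0)
    return -L_pinv @ (kp * error + kd * error_rate + switching)
```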