
    A Comparative Review of Hand-Eye Calibration Techniques for Vision Guided Robots

    Hand-eye calibration enables proper perception of the environment in which a vision-guided robot operates, and it enables the mapping of the scene into the robot's frame. Proper hand-eye calibration is crucial when sub-millimetre perceptual accuracy is needed: in robot-assisted surgery, for example, a poorly calibrated robot could damage surrounding vital tissues and organs, endangering the life of a patient. Considerable research has gone into methods for accurately calibrating the hand-eye system of a robot, with different levels of success, challenges, resource requirements and complexity. As such, academics and industrial practitioners face the challenge of choosing which algorithm meets their implementation requirements under the identified constraints. This review gives a general overview of the strengths and weaknesses of the hand-eye calibration algorithms available to academics and industrial practitioners, to support informed design decisions, and suggests possible areas of research based on the identified challenges. We also discuss calibration targets, an important part of the calibration process that is often overlooked in the design process.
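The role of the calibrated hand-eye transform can be sketched in a few lines: once the gripper-to-camera transform X is known, a camera observation is mapped into the robot base frame through the kinematic chain. All poses and variable names below are illustrative assumptions, not values from any real robot.

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 rigid transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed example poses (identity rotations keep the arithmetic readable):
T_base_gripper = to_homogeneous(np.eye(3), [0.4, 0.0, 0.5])  # forward kinematics
X = to_homogeneous(np.eye(3), [0.0, 0.05, 0.1])              # calibrated hand-eye

# A point observed 20 cm in front of the camera, in homogeneous coordinates,
# is mapped into the robot base frame by chaining the transforms:
#   p_base = T_base_gripper @ X @ p_cam
p_cam = np.array([0.0, 0.0, 0.2, 1.0])
p_base = T_base_gripper @ X @ p_cam
print(p_base[:3])   # -> [0.4  0.05 0.8]
```

An error in X propagates directly into p_base, which is why sub-millimetre applications demand an accurate calibration.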

    Hand-eye calibration with a remote centre of motion

    In the eye-in-hand robot configuration, hand-eye calibration plays a vital role in completing the link between the robot and camera coordinate systems. Calibration algorithms are mature and provide accurate transformation estimates for an effective camera-robot link, but they rely on a sufficiently wide range of calibration data to avoid errors and degenerate configurations. This can be difficult in the context of keyhole surgical robots because they are mechanically constrained to move around a remote centre of motion (RCM) located at the trocar port. The trocar limits the range of feasible calibration poses that can be obtained and results in ill-conditioned hand-eye constraints. In this letter, we propose a new approach that deals with this problem by incorporating the RCM constraints into the hand-eye formulation. We show that this not only avoids ill-conditioned constraints but is also more accurate than classic hand-eye calibration with free 6DoF motion, because it solves simpler equations that take advantage of the reduced DoF. We validate our method in simulation to test numerical stability, and in a physical implementation on an RCM-constrained KUKA LBR iiwa 14 R820 equipped with a NanEye stereo camera.

    Extrinsic Calibration and Ego-Motion Estimation for Mobile Multi-Sensor Systems

    Autonomous robots and vehicles are often equipped with multiple sensors to perform vital tasks such as localization and mapping. A joint system of sensors with different sensing modalities can often provide better localization or mapping results than each individual sensor alone, in terms of accuracy or completeness. However, to enable this improved performance, two important challenges have to be addressed when dealing with multi-sensor systems. Firstly, how can the spatial relationship between the individual sensors on the robot be accurately determined? This vital task is known as extrinsic calibration; without this calibration information, measurements from different sensors cannot be fused. Secondly, how can data from multiple sensors be combined to correct for the deficiencies of each sensor and thus provide better estimates? This is the equally important task of data fusion. The core of this thesis is to provide answers to these two questions. The first part of the thesis covers aspects related to improving extrinsic calibration accuracy; the second part presents novel data fusion algorithms designed to address the ego-motion estimation problem using data from a laser scanner and a monocular camera. In the extrinsic calibration part, we contribute by revealing and quantifying the relative calibration accuracies of three common types of calibration methods, offering insight into choosing the best calibration method when multiple options are available. Following that, we propose an optimization approach for solving common motion-based calibration problems; by exploiting the Gauss-Helmert model, it is more accurate and robust than the classical least squares model. In the data fusion part, we focus on camera-laser data fusion and contribute two new ego-motion estimation algorithms that combine complementary information from a laser scanner and a monocular camera.
The first algorithm uses camera image information to guide the laser scan-matching. It provides accurate motion estimates and works in general conditions, requiring neither a field-of-view overlap between the camera and laser scanner nor an initial guess of the motion parameters. The second algorithm combines the camera and laser scanner information directly, assuming a substantial field-of-view overlap between the sensors. By maximizing the information usage of both the sparse laser point cloud and the dense image, the second algorithm achieves state-of-the-art estimation accuracy. Experimental results confirm that both algorithms offer excellent alternatives to state-of-the-art camera-laser ego-motion estimation algorithms.

    Hand-eye calibration, constraints and source synchronisation for robotic-assisted minimally invasive surgery

    In robotic-assisted minimally invasive surgery (RMIS), the robotic system allows surgeons to remotely control articulated instruments to perform surgical interventions, and it introduces the potential to implement computer-assisted interventions (CAI). However, information in the camera frame must be correctly transformed into the robot coordinate frame, since instrument movement is controlled through the robot kinematics. Determining the rigid transformation connecting the two coordinate frames is therefore necessary; this process is called hand-eye calibration. One of the challenges in solving the hand-eye problem in an RMIS setup is data asynchronicity, which occurs when tracking equipment is integrated into the robotic system and creates temporal misalignment between the data streams. In the calibration itself, noise in the robot and camera motions propagates into the calibrated result, and because the motion range is limited, this error cannot be fully suppressed. Finally, the calibration procedure must be adaptive and simple so that disruption to the surgical workflow is minimal, since any change in the setup may require another calibration. We propose solutions to the asynchronicity, noise sensitivity and limited motion range, and we explore the potential of using a surgical instrument as the calibration target to reduce the complexity of the calibration procedure. The proposed algorithms are validated through extensive experiments with synthetic and real data from the da Vinci Research Kit and KUKA robot arms, and their calibration performance compares promisingly with existing hand-eye algorithms. Although calibration using a surgical instrument as the target still requires further development, the results indicate that the proposed methods improve calibration performance and contribute to finding an optimal solution to the hand-eye problem in robotic surgery.
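One common way to handle the data-asynchronicity problem described above is to estimate a constant temporal offset between the streams. A minimal sketch, assuming both streams can be resampled onto a common clock and share an observable motion signal (e.g. a speed magnitude), is to cross-correlate the two signals; everything below is synthetic stand-in data, not the thesis's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 10.0, 0.01)            # 100 Hz common timeline
motion = np.sin(2 * np.pi * 0.5 * t)      # shared underlying motion signal
robot_speed = motion + 0.005 * rng.standard_normal(t.size)

true_lag = 37                             # camera stream delayed by 0.37 s
camera_speed = np.roll(motion, true_lag) + 0.005 * rng.standard_normal(t.size)

# Full cross-correlation; the peak index, measured relative to the zero-lag
# position at N - 1, gives the sample shift that best aligns the streams.
corr = np.correlate(camera_speed, robot_speed, mode="full")
est_lag = int(np.argmax(corr)) - (t.size - 1)
print(est_lag)   # close to the true 37-sample (0.37 s) offset
```

Once the offset is known, the camera timestamps can be shifted before pairing robot and camera motions for the hand-eye solve.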

    Probabilistic Calibration and Catheter Tracking with Robotic Systems

    A significant boost in robotics technology has been observed in recent years, and more and more tasks, such as robotic surgery, autonomous driving and package delivery, are being automated by robots. Not only has the precision of robots improved, but the number of robots involved in a single task has also grown in many scenarios. An important part of an automated robotic task is the relative pose estimation among objects, and this often boils down to calibration and tracking. The dissertation begins with a robotic catheter tracking system and then focuses on the calibration of robotic systems. It first introduces a novel robotic catheter tracking system that uses an active piezoelectric element embedded at the tip of the catheter. Catheter intervention procedures are usually performed under X-ray, while ultrasound is an alternative, radiation-free modality. However, the catheter tip is very small and hard to distinguish from human tissue in an ultrasound image, and a sonographer must hold the ultrasound probe throughout a procedure that can easily last over an hour. The proposed system tackles these issues using a robot arm and the active echo signal, and is, to the best knowledge of the author, the first robotic catheter tracking system using ultrasound. Both simulation and experiments demonstrate that a robotic arm holding the ultrasound probe can track the catheter tip without image input. To better assist the tracking process, other procedures such as catheter insertion and phantom localization can also be automated. All of these require introducing an extra robot and a precise calibration between the robots and the targets of interest. Among the many calibration approaches, the most classical is the hand-eye calibration problem, formulated as AX = XB, which takes data from sensors in different locations and solves for an unknown rigid-body transformation.
A generalization of this problem is the AX = YB robot-world and hand-eye calibration, where two unknowns must be recovered simultaneously. These two formulations mainly address the calibration of a single-robot system. For multi-robot systems, a problem cast as AXB = YCZ arises, where three unknowns must be solved from three sensor data streams. The second portion of the dissertation investigates probabilistic approaches to all three problems. Methods based on probability theory on Lie groups are developed and shown to outperform their non-probabilistic equivalents when there is only partial knowledge of the correspondence among sensor data.
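The rotation part of the classic AX = XB problem can be sketched in a few lines, in the spirit of axis-alignment solutions such as Park and Martin's: the rotation-vector (log-map) axes of corresponding robot and camera relative motions are related by the unknown hand-eye rotation, which an orthogonal Procrustes step recovers. The motions below are synthetic stand-ins, not data from any of the systems above.

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Ground-truth hand-eye rotation (to be "forgotten" and re-estimated):
R_X_true = Rotation.from_euler("xyz", [20, -35, 50], degrees=True)

# Synthetic robot relative motions A_i; the camera sees B_i = X^-1 A_i X,
# so their rotation axes satisfy a_i = R_X b_i.
motions = [Rotation.from_rotvec(v) for v in
           ([0.8, 0.1, 0.2], [0.1, 0.9, -0.3], [-0.2, 0.3, 0.7],
            [0.5, -0.5, 0.5], [0.3, 0.6, 0.1])]
a_axes, b_axes = [], []
for A in motions:
    B = R_X_true.inv() * A * R_X_true
    a_axes.append(A.as_rotvec())
    b_axes.append(B.as_rotvec())

# Orthogonal Procrustes / Kabsch step: R_X = argmin sum ||a_i - R b_i||^2.
M = np.asarray(a_axes).T @ np.asarray(b_axes)     # sum of a_i b_i^T
U, _, Vt = np.linalg.svd(M)
D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])    # guard against reflections
R_X_est = U @ D @ Vt

print(np.allclose(R_X_est, R_X_true.as_matrix(), atol=1e-8))  # True
```

With noisy real motions the same alignment becomes a least-squares fit, and the translation part is then solved from the remaining linear constraint; the probabilistic methods in the dissertation generalize this to the case where the A/B pairing itself is uncertain.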

    Calibration of spatial relationships between multiple robots and sensors

    Classic hand-eye calibration methods have been limited to single robots and sensors. Recently, a new calibration formulation for multiple robots has been proposed that solves for the extrinsic calibration parameters of all robots simultaneously rather than sequentially. Existing solutions to this new problem required data with correspondence, but Ma, Goh and Chirikjian (MGC) proposed a probabilistic method that eliminates the need for correspondence. In this thesis, the literature on the various robot-sensor calibration problems and solutions is surveyed, and the MGC method is reviewed in detail. Lastly, comparisons with other methods are carried out using numerical simulations to draw some conclusions.

    Quantization, Calibration and Planning for Euclidean Motions in Robotic Systems

    The properties of Euclidean motions are fundamental to all areas of robotics research. Throughout the past several decades, investigations into low-level tasks such as parameterizing specific movements and generating effective motion plans have fostered high-level operations in autonomous robotic systems. In typical applications, before robot motions are executed, a proper quantization of basic motion primitives can simplify online computation, and a precise calibration of sensor readings can improve the accuracy of system control. Of particular importance to the overall autonomous task, a safe and efficient motion planning framework lets the whole system operate in a well-organized and effective way. Each of these modules involves fundamental open problems, such as the uniformity of quantization on non-Euclidean manifolds, calibration errors on unknown rigid transformations caused by the lack of data correspondence and by noise, and the narrow-passage and curse-of-dimensionality bottlenecks in motion planning algorithms. The goal of this dissertation is therefore to tackle these challenges in quantization, calibration and planning for Euclidean motions.