
    Extrinsic Calibration of a Camera-Arm System Through Rotation Identification

    Determining extrinsic calibration parameters is a necessity in any robotic system composed of actuators and cameras. Once a system is outside the lab environment, parameters must be determined without relying on outside artifacts such as calibration targets. We propose a method that relies on structured motion of an observed arm to recover extrinsic calibration parameters. Our method combines known arm kinematics with observations of conics in the image plane to calculate maximum-likelihood estimates for calibration extrinsics. This method is validated in simulation and tested against a real-world model, yielding results consistent with ruler-based estimates. Our method shows promise for estimating the pose of a camera relative to an articulated arm's end effector without requiring tedious measurements or external artifacts. Index Terms: robotics, hand-eye problem, self-calibration, structure from motion.
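
    As an illustration of one ingredient mentioned in this abstract, the conic observations in the image plane, the sketch below fits a general conic to 2D image points by homogeneous least squares. It is not the authors' estimator; the synthetic end-effector trace and noise level are assumptions for demonstration only.

```python
import numpy as np

def fit_conic(points_2d):
    """Fit a general conic ax^2 + bxy + cy^2 + dx + ey + f = 0 to 2D image
    points by homogeneous least squares (smallest right singular vector)."""
    x, y = points_2d[:, 0], points_2d[:, 1]
    # Design matrix: each row is [x^2, xy, y^2, x, y, 1].
    D = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
    # The coefficients (up to scale) are the singular vector with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]  # (a, b, c, d, e, f)

# Example: points on an ellipse traced by a rotating end effector as seen
# in the image plane (synthesized here, with mild pixel noise).
theta = np.linspace(0.0, 2.0 * np.pi, 50)
pts = np.column_stack([320 + 80 * np.cos(theta), 240 + 50 * np.sin(theta)])
pts += np.random.normal(scale=0.5, size=pts.shape)
coeffs = fit_conic(pts)
print("conic coefficients (up to scale):", coeffs / np.linalg.norm(coeffs))
```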

    Hand-eye calibration, constraints and source synchronisation for robotic-assisted minimally invasive surgery

    In robotic-assisted minimally invasive surgery (RMIS), the robotic system allows surgeons to remotely control articulated instruments to perform surgical interventions and opens the possibility of implementing computer-assisted interventions (CAI). However, the information in the camera must be correctly transformed into the robot coordinate frame, since the instruments' movement is governed by the robot kinematics. Determining the rigid transformation connecting the two coordinate frames is therefore necessary; this process is called hand-eye calibration. One of the challenges in solving the hand-eye problem in the RMIS setup is data asynchronicity, which occurs when tracking equipment is integrated into a robotic system and creates temporal misalignment. For the calibration itself, noise in the robot and camera motions can propagate into the calibrated result and, because of the limited motion range, this error cannot be fully suppressed. Finally, the calibration procedure must be adaptive and simple so that disruption to the surgical workflow is minimal, since any change in the setup may require another calibration. We propose solutions to deal with the asynchronicity, noise sensitivity, and limited motion range. We also propose using a surgical instrument as the calibration target to reduce the complexity of the calibration procedure. The proposed algorithms are validated through extensive experiments with synthetic and real data from the da Vinci Research Kit and KUKA robot arms, and the calibration performance is compared with existing hand-eye algorithms, showing promising results. Although calibration with a surgical instrument as the target still requires further development, the results indicate that the proposed methods improve calibration performance and contribute to finding an optimal solution to the hand-eye problem in robotic surgery.
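
    For context, the sketch below shows the classic AX = XB hand-eye formulation (a Park/Martin-style rotation-then-translation solve) that this line of work builds on. It does not include the paper's handling of asynchronicity, noise sensitivity, or limited motion range; the function name and inputs are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.transform import Rotation as Rot

def hand_eye_AX_XB(A_list, B_list):
    """Classic AX = XB hand-eye solve.

    A_list: relative gripper motions in the robot base frame (4x4 each).
    B_list: corresponding relative camera motions (4x4 each).
    Returns X, the 4x4 camera-to-gripper transform.
    """
    # 1) Rotation: the rotation axes of A and B are related by R_X,
    #    so solve an orthogonal Procrustes problem on the rotation vectors.
    alphas = np.array([Rot.from_matrix(A[:3, :3]).as_rotvec() for A in A_list])
    betas = np.array([Rot.from_matrix(B[:3, :3]).as_rotvec() for B in B_list])
    H = betas.T @ alphas                       # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    R_X = Vt.T @ np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)]) @ U.T

    # 2) Translation: (R_A - I) t_X = R_X t_B - t_A, stacked over all motions.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    d = np.concatenate([R_X @ B[:3, 3] - A[:3, 3]
                        for A, B in zip(A_list, B_list)])
    t_X, *_ = np.linalg.lstsq(C, d, rcond=None)

    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X
```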

    Real-time 3D Tracking of Articulated Tools for Robotic Surgery

    In robotic surgery, tool tracking is important for providing safe tool-tissue interaction and facilitating surgical skills assessment. Despite recent advances in tool tracking, existing approaches face major difficulties in real-time tracking of articulated tools: most algorithms are tailored for offline processing with pre-recorded videos. In this paper, we propose a real-time 3D tracking method for articulated tools in robotic surgery. The proposed method is based on the CAD model of the tools as well as robot kinematics to generate online part-based templates for efficient 2D matching and 3D pose estimation. A robust verification approach is incorporated to reject outliers in the 2D detections, which are then fused with robot kinematic readings for 3D pose estimation of the tool. The proposed method has been validated with phantom data, as well as ex vivo and in vivo experiments. The results clearly demonstrate the performance advantage of the proposed method when compared to the state of the art. Comment: This paper was presented at the MICCAI 2016 conference, and a DOI has been linked to the publisher's version.
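
    To make the "reject outliers, then fuse with kinematics" step concrete, here is a hedged sketch (not the paper's algorithm): detections whose reprojection error under the kinematics-predicted pose exceeds a gate are discarded, and the 6-DoF pose is then refined on the inliers with nonlinear least squares. The function names, the pinhole model, and the gate threshold are all assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as Rot

def project(points_3d, rvec, tvec, K):
    """Pinhole projection of 3D tool-part points under pose (rvec, tvec)."""
    P = points_3d @ Rot.from_rotvec(rvec).as_matrix().T + tvec
    uv = P @ K.T
    return uv[:, :2] / uv[:, 2:3]

def refine_pose_with_gating(pts_3d, det_2d, rvec0, tvec0, K, gate_px=8.0):
    """Reject 2D detections far from the kinematics-predicted reprojection,
    then refine the 6-DoF pose on the remaining inliers."""
    residual = np.linalg.norm(project(pts_3d, rvec0, tvec0, K) - det_2d, axis=1)
    inliers = residual < gate_px                  # simple reprojection gate

    def cost(x):
        return (project(pts_3d[inliers], x[:3], x[3:], K)
                - det_2d[inliers]).ravel()

    sol = least_squares(cost, np.hstack([rvec0, tvec0]), method="lm")
    return sol.x[:3], sol.x[3:], inliers
```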

    Human Perambulation as a Self Calibrating Biometric

    This paper introduces a novel method of single-camera gait reconstruction which is independent of the walking direction and of the camera parameters. Recognizing people by gait has unique advantages with respect to other biometric techniques: the identification of the walking subject is completely unobtrusive and can be achieved at a distance. Recently, much research has been conducted into the recognition of frontoparallel gait. The proposed method relies on the very nature of walking to achieve independence from the walking direction. Three major assumptions are made: human gait is cyclic; the distances between the bone joints are invariant during the execution of the movement; and the articulated leg motion is approximately planar, since almost all of the perceived motion is contained within a single limb-swing plane. The method has been tested on several subjects walking freely along six different directions in a small enclosed area. The results show that recognition can be achieved without calibration and without dependence on view direction. The obtained results are particularly encouraging for future system development and for its application in real surveillance scenarios.
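
    The planarity assumption can be checked with a standard least-squares plane fit. The short sketch below is not part of the paper; it simply fits a limb-swing plane to 3D joint positions and reports the RMS out-of-plane distance.

```python
import numpy as np

def fit_swing_plane(joints_3d):
    """Least-squares plane fit to 3D joint positions.
    Returns (centroid, unit normal); the normal is the direction of smallest
    variance, i.e. the last right singular vector of the centered data."""
    centroid = joints_3d.mean(axis=0)
    _, _, Vt = np.linalg.svd(joints_3d - centroid)
    normal = Vt[-1]
    return centroid, normal / np.linalg.norm(normal)

def planarity_residual(joints_3d):
    """RMS out-of-plane distance, a quick check of how well the articulated
    leg motion satisfies the single limb-swing-plane assumption."""
    c, n = fit_swing_plane(joints_3d)
    return np.sqrt(np.mean(((joints_3d - c) @ n) ** 2))
```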

    A Factor Graph Approach to Multi-Camera Extrinsic Calibration on Legged Robots

    Legged robots are becoming popular not only in research but also in industry, where they can demonstrate their superiority over wheeled machines in a variety of applications. Whether acting as mobile manipulators or just as all-terrain ground vehicles, these machines need to precisely track the desired base and end-effector trajectories, perform Simultaneous Localization and Mapping (SLAM), and move in challenging environments, all while keeping balance. A crucial aspect of these tasks is that all onboard sensors must be properly calibrated and synchronized to provide consistent signals for all the software modules they feed. In this paper, we focus on the problem of calibrating the relative pose between a set of cameras and the base link of a quadruped robot. This pose is fundamental to successfully perform sensor fusion, state estimation, mapping, and any other task requiring visual feedback. To solve this problem, we propose an approach based on factor graphs that jointly optimizes the mutual position of the cameras and the robot base using kinematics and fiducial markers. We also quantitatively compare its performance with other state-of-the-art methods on the hydraulic quadruped robot HyQ. The proposed approach is simple, modular, and independent from external devices other than the fiducial marker. Comment: To appear in "The Third IEEE International Conference on Robotic Computing (IEEE IRC 2019)".
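
    As a rough illustration of a factor-graph formulation (not the paper's implementation), the sketch below uses GTSAM to jointly estimate two base-to-camera extrinsics from kinematics/marker-derived priors and a camera-to-camera relative pose. All poses, noise sigmas, and variable names are placeholder assumptions.

```python
import numpy as np
import gtsam

# Unknowns: base-to-camera extrinsics T_base_cam_i for each camera i.
# Hypothetical inputs (would come from kinematics + fiducial detections):
#   prior_i : per-camera estimate of T_base_cam_i from the kinematic base
#             pose and a detected marker.
#   rel_01  : relative pose cam_0 -> cam_1 from a marker seen by both cameras.
graph = gtsam.NonlinearFactorGraph()
# Sigmas: rotation (rad) then translation (m), following GTSAM's Pose3 order.
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05] * 3 + [0.02] * 3))
rel_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02] * 3 + [0.01] * 3))

keys = [gtsam.symbol('c', i) for i in range(2)]

# Kinematics/marker-derived priors on each extrinsic (placeholder values).
prior_0 = gtsam.Pose3(gtsam.Rot3.RzRyRx(0.0, 0.0, 0.0), gtsam.Point3(0.3, 0.1, 0.2))
prior_1 = gtsam.Pose3(gtsam.Rot3.RzRyRx(0.0, 0.0, 1.57), gtsam.Point3(0.3, -0.1, 0.2))
graph.add(gtsam.PriorFactorPose3(keys[0], prior_0, prior_noise))
graph.add(gtsam.PriorFactorPose3(keys[1], prior_1, prior_noise))

# Relative-pose factor between the two cameras from a shared marker detection.
rel_01 = prior_0.inverse().compose(prior_1)    # placeholder measurement
graph.add(gtsam.BetweenFactorPose3(keys[0], keys[1], rel_01, rel_noise))

# Optimize from a rough initial guess.
initial = gtsam.Values()
initial.insert(keys[0], gtsam.Pose3())
initial.insert(keys[1], gtsam.Pose3())
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(keys[0]), result.atPose3(keys[1]))
```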

    Markerless View Independent Gait Analysis with Self-camera Calibration

    We present a new method for viewpoint-independent markerless gait analysis. The system uses a single camera, does not require camera calibration, and works with a wide range of walking directions. These properties make the proposed method particularly suitable for identification by gait, where the advantages of complete unobtrusiveness, remoteness, and covertness of the biometric system preclude the availability of camera information and the use of marker-based technology. Tests on more than 200 video sequences with subjects walking freely along different walking directions have been performed. The obtained results show that markerless gait analysis can be achieved without any knowledge of internal or external camera parameters and that the obtained data can be used for gait biometric purposes. The performance of the proposed method is particularly encouraging for its application in surveillance scenarios.