
    Extrinsic Calibration of a Camera and Laser Range Finder

    We describe theoretical and experimental results for the extrinsic calibration of a sensor platform consisting of a camera and a laser range finder. The proposed technique requires the system to observe a planar pattern in several poses; the constraints are based upon data captured simultaneously from the camera and the laser range finder. Because the planar pattern surface and the laser scanline on the pattern are related, these data constrain the relative position and orientation of the camera and laser range finder. The calibration procedure starts with a closed-form solution, which provides initial conditions for a subsequent nonlinear refinement. We present results from both computer-simulated data and an implementation on a B21r mobile robot from iRobot Corporation, using a Sony FireWire digital camera and a SICK PLS laser scanner.
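The plane-to-scanline constraint described above can be set up as a linear system for a closed-form initial estimate. The following is a minimal sketch (not the authors' code), assuming the calibration plane is known in the camera frame as a unit normal `n` and offset `d` (so `n . x = d` on the plane), and that each 2D laser hit `p` on that plane must satisfy `n . (R p + t) = d`:

```python
import numpy as np

def calibrate_camera_lrf(planes, laser_points):
    """Closed-form estimate of the laser-to-camera transform (R, t).

    planes: list of (n, d) pairs -- unit normal n (3,) and offset d of the
        calibration plane in the camera frame (recoverable from the
        camera's view of the planar pattern).
    laser_points: list of (k, 2) arrays of laser hits on that plane,
        given as (x, y) in the 2D scanner's own scan plane (z = 0).

    Each hit must satisfy n . (R p + t) = d, which is linear in the
    first two columns of R and in t.
    """
    A, b = [], []
    for (n, d), pts in zip(planes, laser_points):
        for x, y in pts:
            # unknowns stacked as [col1(R), col2(R), t]; since p = (x, y, 0),
            # n . (R p + t) = x (n . c1) + y (n . c2) + n . t
            A.append(np.concatenate([x * n, y * n, n]))
            b.append(d)
    sol, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    c1, c2, t = sol[0:3], sol[3:6], sol[6:9]
    # re-orthonormalize the two recovered columns, then complete R
    c1 = c1 / np.linalg.norm(c1)
    c2 = c2 - c1 * (c1 @ c2)
    c2 = c2 / np.linalg.norm(c2)
    return np.column_stack([c1, c2, np.cross(c1, c2)]), t
```

Because a 2D scanner only observes points with z = 0 in its own frame, the third column of R never appears in the constraint; it is completed from the first two by the cross product. In practice this linear estimate would then seed the nonlinear refinement the abstract describes.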

    Comprehensive Extrinsic Calibration of a Camera and a 2D Laser Scanner for a Ground Vehicle

    Cameras and laser scanners are two important kinds of perceptive sensors, and both are becoming increasingly common on intelligent ground vehicles; the calibration of these sensors is a fundamental task. A new method is proposed to perform comprehensive extrinsic calibration of a single camera-2D laser scanner pair, i.e. the process of revealing all the spatial relationships among the camera coordinate system, the laser scanner coordinate system, the ground coordinate system, and the vehicle coordinate system. The proposed method is mainly based on the convenient and widely used chessboard calibration practice and can be easily implemented. It has been tested in experiments on both synthetic and real data, which validate its effectiveness.
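Once the pairwise relationships among the four frames are calibrated, any cross-frame quantity follows by composing homogeneous transforms. A minimal sketch, with hypothetical frame names (camera, laser, ground, vehicle) chosen only to match the abstract:

```python
import numpy as np

def se3(R, t):
    """Pack a rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def apply(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return (T @ np.append(p, 1.0))[:3]

# With the pairwise calibrations known, e.g. laser -> vehicle is just:
# T_vehicle_laser = T_vehicle_ground @ T_ground_camera @ T_camera_laser
```

The composition order matters: each transform maps points from the frame named on its right into the frame named on its left, so the chain above carries a laser point through the camera and ground frames into the vehicle frame.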

    Reflectance Intensity Assisted Automatic and Accurate Extrinsic Calibration of 3D LiDAR and Panoramic Camera Using a Printed Chessboard

    This paper presents a novel method for fully automatic and convenient extrinsic calibration of a 3D LiDAR and a panoramic camera with a normally printed chessboard. The proposed method is based on the 3D corner estimation of the chessboard from the sparse point cloud generated by one frame scan of the LiDAR. To estimate the corners, we formulate a full-scale model of the chessboard and fit it to the segmented 3D points of the chessboard. The model is fitted by optimizing the cost function under constraints of correlation between the reflectance intensity of the laser and the color of the chessboard's patterns. Powell's method is introduced to resolve the discontinuity problem in the optimization. The corners of the fitted model are considered as the 3D corners of the chessboard. Once the corners of the chessboard in the 3D point cloud are estimated, the extrinsic calibration of the two sensors is converted to a 3D-2D matching problem. The corresponding 3D-2D points are used to calculate the absolute pose of the two sensors with Unified Perspective-n-Point (UPnP). Further, the calculated parameters are regarded as initial values and are refined using the Levenberg-Marquardt method. The performance of the proposed corner detection method from the 3D point cloud is evaluated using simulations. The results of experiments, conducted on a Velodyne HDL-32e LiDAR and a Ladybug3 camera under the proposed re-projection error metric, qualitatively and quantitatively demonstrate the accuracy and stability of the final extrinsic calibration parameters.
    Comment: 20 pages, submitted to the journal Remote Sensing
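The final refinement step described above — starting from an initial UPnP pose and minimizing reprojection error with Levenberg-Marquardt — can be sketched as follows. This is an illustrative implementation using SciPy, not the authors' code; the pinhole intrinsic matrix `K` and the axis-angle pose parameterization are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_pose(rvec0, t0, pts3d, pts2d, K):
    """Levenberg-Marquardt refinement of an initial pose (e.g. from UPnP)
    by minimizing the reprojection error of matched 3D-2D corners.

    rvec0, t0: initial axis-angle rotation (3,) and translation (3,)
    pts3d: (N, 3) chessboard corners in the LiDAR/world frame
    pts2d: (N, 2) corresponding pixel observations
    K: (3, 3) pinhole intrinsic matrix
    """
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        p_cam = pts3d @ R.T + x[3:6]          # transform into camera frame
        proj = p_cam @ K.T                     # project with the pinhole model
        proj = proj[:, :2] / proj[:, 2:3]      # perspective division
        return (proj - pts2d).ravel()
    x = least_squares(residuals, np.concatenate([rvec0, t0]), method="lm").x
    return x[:3], x[3:6]
```

The `method="lm"` option selects the Levenberg-Marquardt solver; in the paper's pipeline the residual would instead be the proposed re-projection error metric for the panoramic camera model.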

    Calibration of a Rotating Laser Range Finder using Intensity Features

    © 2018 IEEE. This paper presents an algorithm for calibrating a '3D range sensor' constructed using a two-dimensional laser range finder (LRF) that is rotated about an axis by a motor to obtain a three-dimensional point cloud. The sensor assembly is modelled as a two-degree-of-freedom open kinematic chain, with one joint corresponding to the axis of the internal mirror in the LRF and the other joint set along the axis of the motor used to rotate the body of the LRF. In the application described in this paper, the sensor unit is mounted on a robot arm used for infrastructure inspection. The objective of the calibration process is to obtain the coordinate transform required to compute the locations of the 3D points with respect to the robot coordinate frame. The proposed strategy uses observations of a set of markers arbitrarily placed in the environment. Distances between these markers are measured, and metric multidimensional scaling is used to obtain the coordinates of the markers with respect to a local coordinate frame. The intensity associated with each beam point of a laser scan is used to locate the reflective markers in the 3D point cloud, and a least-squares problem is formulated to compute the relationship between the robot coordinate frame, the LRF coordinate frame, and the marker coordinate frame. Results from experiments using the robot-LRF combination to map a cavity inside a steel bridge structure are presented to demonstrate the effectiveness of the calibration process.
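The metric multidimensional scaling step — recovering marker coordinates (up to a rigid transform) from the measured pairwise distances — can be sketched with classical MDS via double centering. This is a generic illustration of the technique, not the authors' implementation:

```python
import numpy as np

def classical_mds(D, dim=3):
    """Recover point coordinates from a matrix of pairwise Euclidean
    distances D (n x n), up to rotation, reflection and translation."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                  # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:dim]           # keep the largest `dim`
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

The recovered coordinates define the local marker frame; the least-squares stage described in the abstract would then relate this frame to the robot and LRF frames using the marker detections in the intensity-filtered point cloud.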