311 research outputs found

    Simple and efficient method for calibration of a camera and 2D laser rangefinder

    Get PDF
    In recent years, the integration of cameras and laser rangefinders has been applied to much research in robotics, notably autonomous navigation vehicles and intelligent transportation systems. A system based on multiple devices usually requires the relative pose between the devices for processing, so calibrating a camera against a laser device is a very important task. This paper presents a calibration method for determining the relative position and orientation of a camera with respect to a laser rangefinder. The method exploits depth discontinuities of the calibration pattern, which highlight the laser beams and allow the positions at which laser scans strike the calibration pattern to be estimated automatically. The laser range scans are also used to estimate the corresponding 3D points in camera coordinates. Finally, the relative parameters between the camera and the laser device are recovered from these corresponding 3D points
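    The final step of such a method, recovering a rigid transform from corresponding points, can be sketched in the 2D scan plane. This is a minimal illustration, not the paper's implementation: it assumes point pairs are already matched and estimates the rotation angle from the dot and cross products of the centered point sets.

    ```python
    import math

    def fit_rigid_2d(src, dst):
        """Estimate rotation theta and translation (tx, ty) mapping src -> dst.

        src, dst: equal-length lists of (x, y) pairs assumed to correspond
        point-by-point, as in a 3D-3D matching step reduced to the scan plane.
        """
        n = len(src)
        # Centroids of both point sets.
        csx = sum(p[0] for p in src) / n
        csy = sum(p[1] for p in src) / n
        cdx = sum(p[0] for p in dst) / n
        cdy = sum(p[1] for p in dst) / n
        # Accumulate dot and cross terms of the centered points; the
        # least-squares rotation angle is atan2(cross, dot).
        dot = cross = 0.0
        for (sx, sy), (dx, dy) in zip(src, dst):
            ax, ay = sx - csx, sy - csy
            bx, by = dx - cdx, dy - cdy
            dot += ax * bx + ay * by
            cross += ax * by - ay * bx
        theta = math.atan2(cross, dot)
        c, s = math.cos(theta), math.sin(theta)
        # Translation maps the rotated source centroid onto the target centroid.
        tx = cdx - (c * csx - s * csy)
        ty = cdy - (s * csx + c * csy)
        return theta, tx, ty

    # Points rotated by 90 degrees and shifted by (1, 2):
    src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
    dst = [(1.0, 2.0), (1.0, 3.0), (0.0, 2.0)]
    theta, tx, ty = fit_rigid_2d(src, dst)
    ```

    The same centroid-plus-rotation decomposition generalizes to 3D (Kabsch algorithm), where the rotation is obtained from an SVD instead of a single angle.
    
    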

    Mutual information based sensor registration and calibration

    Full text link
    Knowledge of calibration, which defines the location of sensors relative to each other, and of registration, which relates sensor responses due to the same physical phenomena, is essential for fusing information from multiple sensors. In this paper, a Mutual Information (MI) based approach for automatic sensor registration and calibration is presented. Unsupervised learning of a nonparametric sensing model by maximizing mutual information between signal streams is used to relate information from different sensors, allowing unknown sensor registration and calibration to be determined. Experiments conducted in an office environment illustrate the effectiveness of the proposed technique. Two laser sensors capture people moving arbitrarily in the environment, and the MI of several attributes of the motion is used to relate the signal streams from the sensors. The sensor registration and calibration are thus achieved without using artificial patterns or pre-specified motions. © 2006 IEEE
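    The MI objective used by such methods is the standard histogram estimate of mutual information between two discretized streams. A minimal sketch (not the paper's nonparametric model):

    ```python
    import math
    from collections import Counter

    def mutual_information(xs, ys):
        """MI in bits between two equal-length discrete signal streams,
        estimated from empirical marginal and joint histograms."""
        n = len(xs)
        px = Counter(xs)
        py = Counter(ys)
        pxy = Counter(zip(xs, ys))
        mi = 0.0
        for (x, y), c in pxy.items():
            pj = c / n  # joint probability of the pair (x, y)
            mi += pj * math.log2(pj / ((px[x] / n) * (py[y] / n)))
        return mi

    # Identical streams: MI equals the stream's entropy (1 bit here).
    a = [0, 1, 0, 1, 0, 1, 0, 1]
    # A stream with no statistical dependence on `a` gives MI of 0.
    b = [0, 0, 1, 1, 0, 0, 1, 1]
    ```

    Maximizing this quantity over candidate registrations rewards whichever alignment makes one stream most predictable from the other.
    
    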

    Continuous Online Extrinsic Calibration of Fisheye Camera and LiDAR

    Full text link
    Automated driving systems use multi-modal sensor suites, for example camera and LiDAR, to ensure reliable, redundant and robust perception of the operating domain. An accurate extrinsic calibration is required to fuse the camera and LiDAR data into the common spatial reference frame required by high-level perception functions. Over the life of the vehicle, the value of the extrinsic calibration can change due to physical disturbances, introducing an error into the high-level perception functions. There is therefore a need for continuous online extrinsic calibration algorithms which can automatically update the camera-LiDAR calibration during the life of the vehicle using only sensor data. We propose using the mutual information between the camera image's depth estimate, provided by commonly available monocular depth estimation networks, and the LiDAR point cloud's geometric distance as an optimization metric for extrinsic calibration. Our method requires no calibration target, no ground-truth training data and no expensive offline optimization. We demonstrate our algorithm's accuracy, precision, speed and self-diagnosis capability on the KITTI-360 data set.Comment: 4 pages
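    The search over extrinsic parameters that maximizes MI can be illustrated with a toy 1D stand-in: here the "camera depth" stream is the "LiDAR range" stream cyclically shifted by 3 samples, and scanning candidate shifts for the MI maximum recovers that offset. This is a hypothetical analogue of the real 6-DoF optimization, with synthetic data in place of depth maps and point clouds.

    ```python
    import math
    import random
    from collections import Counter

    def mi_bits(xs, ys):
        """Histogram estimate of mutual information in bits."""
        n = len(xs)
        px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
        return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
                   for (x, y), c in pxy.items())

    # Synthetic quantized ranges; the "camera" stream is the "LiDAR"
    # stream shifted by a true offset of 3 samples.
    random.seed(0)
    lidar = [random.randrange(4) for _ in range(200)]
    camera = lidar[3:] + lidar[:3]

    # Grid search: the shift that maximizes MI is the calibration estimate.
    best_shift = max(range(10),
                     key=lambda s: mi_bits(camera, lidar[s:] + lidar[:s]))
    ```

    At the correct shift the two streams coincide, so MI attains the stream entropy (about 2 bits), while misaligned shifts compare nearly independent samples and score close to zero.
    
    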

    Millimeter-Precision Laser Rangefinder Using a Low-Cost Photon Counter

    Get PDF
    In this book we demonstrate a millimeter-precision laser rangefinder using a low-cost photon counter. An application-specific integrated circuit (ASIC) comprises the timing circuitry and single-photon avalanche diodes (SPADs) as the photodetectors. For the timing circuitry, a novel binning architecture for sampling the received signal is proposed, which mitigates non-idealities that are inherent to a system with SPADs and timing circuitry on one chip
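    The underlying time-of-flight principle can be sketched independently of the ASIC details: the range follows from the round-trip time of the histogram's peak timing bin, r = c·t/2. The histogram values and 10 ps bin width below are illustrative, not taken from the book.

    ```python
    # Speed of light in m/s.
    C = 299_792_458.0

    def range_from_histogram(hist, bin_width_s):
        """Range from the peak bin of a photon-timing histogram.

        hist: photon counts per timing bin; bin_width_s: bin width in
        seconds. The peak bin index times the bin width approximates the
        round-trip time t, so the range is c * t / 2. Millimeter precision
        at the bin level would need ~6.7 ps bins (or sub-bin interpolation).
        """
        peak = max(range(len(hist)), key=lambda i: hist[i])
        t = peak * bin_width_s
        return C * t / 2.0

    # Synthetic histogram: uniform dark counts plus a return pulse at bin 667.
    hist = [1] * 1000
    hist[667] = 50
    r = range_from_histogram(hist, 10e-12)  # 10 ps bins -> roughly 1.0 m
    ```

    Real SPAD histograms also require corrections for pile-up and timing-circuit non-idealities, which is precisely what the proposed binning architecture addresses.
    
    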

    Reflectance Intensity Assisted Automatic and Accurate Extrinsic Calibration of 3D LiDAR and Panoramic Camera Using a Printed Chessboard

    Full text link
    This paper presents a novel method for fully automatic and convenient extrinsic calibration of a 3D LiDAR and a panoramic camera with a normally printed chessboard. The proposed method is based on 3D corner estimation of the chessboard from the sparse point cloud generated by one frame scan of the LiDAR. To estimate the corners, we formulate a full-scale model of the chessboard and fit it to the segmented 3D points of the chessboard. The model is fitted by optimizing a cost function under constraints of correlation between the laser's reflectance intensity and the color of the chessboard's patterns. Powell's method is introduced to resolve the discontinuity problem in the optimization. The corners of the fitted model are taken as the 3D corners of the chessboard. Once the corners of the chessboard in the 3D point cloud are estimated, the extrinsic calibration of the two sensors is converted into a 3D-2D matching problem. The corresponding 3D-2D points are used to calculate the absolute pose of the two sensors with Unified Perspective-n-Point (UPnP). The calculated parameters are then taken as initial values and refined using the Levenberg-Marquardt method. The performance of the proposed corner detection method on the 3D point cloud is evaluated using simulations. The results of experiments, conducted on a Velodyne HDL-32e LiDAR and a Ladybug3 camera under the proposed re-projection error metric, qualitatively and quantitatively demonstrate the accuracy and stability of the final extrinsic calibration parameters.Comment: 20 pages, submitted to the journal Remote Sensing
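    The re-projection error metric that such refinement minimizes can be sketched for a plain pinhole model (the paper uses a panoramic camera, so this is a simplified stand-in): project each 3D corner through the candidate pose and intrinsics, then take the RMS pixel distance to the detected 2D corners.

    ```python
    import math

    def project(point, pose, fx, fy, cx, cy):
        """Project a 3D point into the image with pose (R, t) and pinhole
        intrinsics. R is a 3x3 row-major list of lists, t a 3-vector,
        mapping LiDAR coordinates into camera coordinates."""
        R, t = pose
        xc = [sum(R[r][k] * point[k] for k in range(3)) + t[r]
              for r in range(3)]
        return (fx * xc[0] / xc[2] + cx, fy * xc[1] / xc[2] + cy)

    def reprojection_rmse(points3d, points2d, pose, fx, fy, cx, cy):
        """Root-mean-square pixel distance between projected 3D corners
        and their detected 2D counterparts."""
        se = 0.0
        for p3, p2 in zip(points3d, points2d):
            u, v = project(p3, pose, fx, fy, cx, cy)
            se += (u - p2[0]) ** 2 + (v - p2[1]) ** 2
        return math.sqrt(se / len(points3d))

    # With an identity pose and perfectly consistent correspondences,
    # the error is zero; Levenberg-Marquardt would drive a perturbed
    # pose back toward this minimum.
    R_id = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    pts3 = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)]
    pts2 = [(320.0, 320.0), (570.0, 320.0)]
    err = reprojection_rmse(pts3, pts2, (R_id, [0, 0, 0]), 500, 500, 320, 320)
    ```
    
    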

    Multi-FEAT: Multi-Feature Edge AlignmenT for Targetless Camera-LiDAR Calibration

    Full text link
    The accurate environment perception of automobiles and UAVs (Unmanned Aerial Vehicles) relies on the precision of onboard sensors, which require reliable in-field calibration. This paper introduces a novel approach for targetless camera-LiDAR extrinsic calibration called Multi-FEAT (Multi-Feature Edge AlignmenT). Multi-FEAT uses the cylindrical projection model to transform the 2D (camera) - 3D (LiDAR) calibration problem into a 2D-2D calibration problem, and exploits various LiDAR feature information to supplement the sparse LiDAR point cloud boundaries. In addition, a feature matching function with a precision factor is designed to improve the smoothness of the solution space. The performance of the proposed Multi-FEAT algorithm is evaluated using the KITTI dataset, and our approach shows more reliable results compared with several existing targetless calibration methods. We summarize our results and present potential directions for future work
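    The cylindrical projection that turns the 2D-3D problem into a 2D-2D one maps each LiDAR point's azimuth to an image column and its elevation to a row. A minimal sketch, with an assumed vertical field of view (roughly -25 to +10 degrees, not a value from the paper):

    ```python
    import math

    def cylindrical_project(x, y, z, width, height,
                            v_fov=(-0.4363, 0.1745)):
        """Map a 3D LiDAR point to 2D cylindrical image coordinates.

        The azimuth atan2(y, x) becomes the column u and the elevation
        becomes the row v, scaled into a width x height image. v_fov is
        the assumed vertical field of view in radians.
        """
        azimuth = math.atan2(y, x)                       # in [-pi, pi]
        elevation = math.atan2(z, math.hypot(x, y))
        # Wrap azimuth across the image width; forward (azimuth 0) maps
        # to the image center column.
        u = (0.5 - azimuth / (2 * math.pi)) * width
        # Top of the image corresponds to the upper elevation bound.
        v = (v_fov[1] - elevation) / (v_fov[1] - v_fov[0]) * height
        return u, v

    # A point straight ahead of the sensor lands at the center column.
    u, v = cylindrical_project(1.0, 0.0, 0.0, width=1024, height=64)
    ```

    Once both the camera image and the LiDAR cloud live in this shared 2D parameterization, edge features from the two modalities can be aligned directly.
    
    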

    Comprehensive Extrinsic Calibration of a Camera and a 2D Laser Scanner for a Ground Vehicle

    Get PDF
    Cameras and laser scanners are two important kinds of perceptive sensors, and both are becoming more and more common on intelligent ground vehicles; the calibration of these sensors is a fundamental task. A new method is proposed to perform comprehensive extrinsic calibration of a single camera-2D laser scanner pair, i.e. the process of revealing all the spatial relationships among the camera coordinate system, the laser scanner coordinate system, the ground coordinate system, and the vehicle coordinate system. The proposed method is mainly based on the convenient and widely used chessboard calibration practice and can be conveniently implemented. The proposed method has been tested in experiments on both synthetic and real data, which validate its effectiveness
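    Relating all four coordinate systems comes down to chaining homogeneous transforms. A minimal sketch with hypothetical translation-only transforms (the values are illustrative, not from the report): if T_cam_laser maps laser coordinates into the camera frame and T_veh_cam maps camera coordinates into the vehicle frame, their product maps the laser directly into the vehicle frame.

    ```python
    def matmul4(A, B):
        """Multiply two 4x4 homogeneous transforms (row-major lists)."""
        return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
                for i in range(4)]

    # Hypothetical calibration results: identity rotations, pure offsets.
    T_cam_laser = [[1, 0, 0, 0.2],   # laser sits 0.2 m along the camera x-axis
                   [0, 1, 0, 0.0],
                   [0, 0, 1, 0.1],
                   [0, 0, 0, 1]]
    T_veh_cam = [[1, 0, 0, 1.5],     # camera sits 1.5 m ahead, 1.2 m up
                 [0, 1, 0, 0.0],
                 [0, 0, 1, 1.2],
                 [0, 0, 0, 1]]

    # Composition order: points go laser -> camera -> vehicle.
    T_veh_laser = matmul4(T_veh_cam, T_cam_laser)
    ```

    The translation column of the composed transform is the sum of the two offsets, which is exactly the "comprehensive" chain of relationships the method recovers.
    
    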