    A profile measurement system for rail quality assessment during manufacturing

    Steel rails used in the transport sector and in industry are designed and manufactured to support the high stress levels generated by high-speed and heavily loaded modern trains. In the rail manufacturing process, one of the key stages is rolling, where fast, accurate and repeatable rail profile measurement is a major challenge. In this paper, a rail profile measurement system for rail rolling mills based on four conventional, inexpensive laser range finders is proposed. The range finders are calibrated against a common reference so that the point clouds generated by each range finder are properly expressed in the world coordinate system. The alignment of the point clouds to the rail model is performed by means of an efficient and robust registration method. Experiments carried out in a rail rolling mill demonstrate the accuracy and repeatability of the system; the maximum error is below 0.12%. All parallelizable tasks were designed and developed to be executed concurrently, achieving an acquisition rate of up to 210 fps.
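The alignment of a measured profile to a rail model is a rigid registration problem. As an illustration only (the abstract does not specify the authors' registration method), the sketch below solves the simplified 2D case in closed form, assuming known point correspondences between the measured profile and the model:

```python
import math

def rigid_align_2d(src, dst):
    """Closed-form 2D rigid registration (rotation + translation) that
    best maps src points onto dst points, assuming known point
    correspondences (a simplification of full profile registration)."""
    n = len(src)
    # Centroids of both point sets.
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    # Cross-covariance terms of the centred sets.
    sxx = sxy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        ax, ay = x - csx, y - csy
        bx, by = u - cdx, v - cdy
        sxx += ax * bx + ay * by     # "dot" term
        sxy += ax * by - ay * bx     # "cross" term
    theta = math.atan2(sxy, sxx)     # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    # Translation maps the rotated source centroid onto the target centroid.
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

# Toy check: dst is src rotated by 30 degrees and shifted by (3, -1).
ang = math.radians(30.0)
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0), (1.5, 1.0)]
dst = [(math.cos(ang) * x - math.sin(ang) * y + 3.0,
        math.sin(ang) * x + math.cos(ang) * y - 1.0) for x, y in src]
theta, tx, ty = rigid_align_2d(src, dst)
print(round(math.degrees(theta), 3), round(tx, 3), round(ty, 3))  # → 30.0 3.0 -1.0
```

In practice a real system would first establish correspondences (e.g. by nearest-neighbour search, as in ICP) and iterate; the closed-form step above is the inner building block of such schemes.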

    Simple and accurate empirical absolute volume calibration of a multi-sensor fringe projection system

    This paper proposes a novel absolute empirical calibration method for a multi-sensor fringe projection system. The optical setup of the projector-camera sensor can be arbitrary. The term absolute calibration here means that the origin of the resultant calibrated volume coincides with a preset origin of the three-dimensional real-world coordinate system. The use of a zero-phase fringe marking spot is proposed to increase depth calibration accuracy, where the spot centre is determined with sub-pixel accuracy. A new method is also proposed for transversal calibration. The depth and transversal calibration methods have been tested using both single-sensor and three-sensor fringe projection systems. The standard deviation of the error produced by the system is 0.25 mm, and the calibrated volume produced by this method is 400 mm×400 mm×140 mm.
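Locating a marker spot with sub-pixel accuracy is commonly done with an intensity-weighted centroid. The sketch below illustrates that general idea on a toy image patch; it is a standard technique, not the paper's specific spot-detection procedure:

```python
def subpixel_centroid(image):
    """Intensity-weighted centroid of a grey-level image patch, a common
    way to locate a bright marker spot with sub-pixel accuracy.
    image is a list of rows; returns (x, y) in pixel coordinates."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            total += val
            sx += x * val
            sy += y * val
    if total == 0:
        raise ValueError("empty image patch")
    return sx / total, sy / total

# A small synthetic spot whose brightness peak lies between pixels.
patch = [
    [0, 0, 0, 0, 0],
    [0, 1, 2, 1, 0],
    [0, 2, 4, 4, 0],
    [0, 1, 2, 2, 0],
    [0, 0, 0, 0, 0],
]
cx, cy = subpixel_centroid(patch)
print(round(cx, 3), round(cy, 3))  # → 2.158 2.053
```

Because the estimate averages over all bright pixels, its precision is a fraction of a pixel even on noisy images, which is what makes the zero-phase marking spot useful as a depth-calibration anchor.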

    A Full Scale Camera Calibration Technique with Automatic Model Selection – Extension and Validation

    This thesis presents work on the testing and development of a complete camera calibration approach that can be applied to a wide range of cameras equipped with normal, wide-angle, fish-eye, or telephoto lenses. The full-scale calibration approach estimates all of the intrinsic and extrinsic parameters. The calibration procedure is simple and does not require prior knowledge of any parameters. The method uses a simple planar calibration pattern. Closed-form estimates for the intrinsic and extrinsic parameters are computed, followed by nonlinear optimization. Polynomial functions are used to describe the lens projection instead of the commonly used radial model. Statistical information criteria are used to automatically determine the complexity of the lens distortion model. In the first stage, experiments were performed to verify and compare the performance of the calibration method on a wide range of lenses. Synthetic data was used to simulate real data and validate the performance, including the performance of the distortion model selection, which uses the Akaike Information Criterion (AIC) to automatically select the complexity of the distortion model. In the second stage, work was done to develop an improved calibration procedure which addresses shortcomings of the previously developed method. Experiments on the previous method revealed that the estimation of the principal point during calibration was erroneous for lenses with a large focal length. To address this issue, the calibration method was modified to include additional steps that accurately estimate the principal point in the initial stages of the calibration procedure. The modified procedure can now be used to calibrate a wide spectrum of imaging systems, including telephoto and varifocal lenses. A survey of current work revealed a vast amount of research concentrating on calibrating only the distortion of the camera: researchers calibrate only the distortion parameters and suggest using other popular methods to find the remaining camera parameters. Following this methodology, we separate the estimation of the distortion parameters in our method, and we compare the results with the original method on a wide range of imaging systems.
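The AIC-based selection of distortion-model complexity can be shown in miniature: fit polynomials of increasing degree and pick the degree minimising AIC = n·ln(RSS/n) + 2k, where k is the number of parameters. The synthetic data and fitting routine below are illustrative assumptions, not the thesis's actual pipeline:

```python
import math, random

def polyfit_rss(xs, ys, degree):
    """Least-squares polynomial fit via normal equations; returns the
    residual sum of squares. Dense Gaussian elimination with partial
    pivoting is adequate for the low degrees considered here."""
    m = degree + 1
    # Normal equations A c = b for the monomial basis.
    A = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c2 in range(col, m):
                A[r][c2] -= f * A[col][c2]
            b[r] -= f * b[col]
    coeffs = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = b[r] - sum(A[r][c2] * coeffs[c2] for c2 in range(r + 1, m))
        coeffs[r] = s / A[r][r]
    return sum((y - sum(c * x ** i for i, c in enumerate(coeffs))) ** 2
               for x, y in zip(xs, ys))

def aic(rss, n, k):
    """Akaike Information Criterion for a Gaussian-error least-squares fit."""
    return n * math.log(rss / n) + 2 * k

# Synthetic "distortion curve": a cubic plus noise. AIC should reject the
# under-fitted low degrees and penalise needless extra parameters.
random.seed(0)
xs = [i / 20.0 for i in range(40)]
ys = [0.5 * x + 0.2 * x ** 3 + random.gauss(0, 0.01) for x in xs]
scores = {d: aic(polyfit_rss(xs, ys, d), len(xs), d + 1) for d in range(1, 7)}
best = min(scores, key=scores.get)
print("selected degree:", best)
```

The 2k penalty is what stops the selection from always preferring the highest degree: adding a parameter is only accepted if it lowers n·ln(RSS/n) by more than 2.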

    Calibrating a Robot Camera

    This paper addresses the problem of calibrating a camera mounted on a robot arm. The objective is to estimate the camera's intrinsic and extrinsic parameters. These include the relative position and orientation of the camera with respect to the robot base, as well as the relative position and orientation of the camera with respect to a pre-defined world frame. A calibration object with a known 3D shape is used together with two known movements of the robot. A method is presented to find the calibration parameters within an optimisation framework. This method differs from existing methods in that 1) it fully exploits information from different displacements of the camera to produce an optimal calibration estimate, and 2) it uses an evolutionary algorithm to attain the optimal solution. Experimental results on both synthetic and real data are presented.
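The evolutionary-optimisation idea can be sketched with a minimal (1+1) evolution strategy applied to a toy 2D pose-recovery problem. The cost function and parameterisation here are illustrative stand-ins, not the paper's actual robot-camera formulation:

```python
import math, random

def pose_error(pose, src, dst):
    """Sum of squared distances after applying a 2D rigid pose
    (theta, tx, ty) to src; a stand-in for a full calibration cost."""
    theta, tx, ty = pose
    c, s = math.cos(theta), math.sin(theta)
    return sum((c * x - s * y + tx - u) ** 2 + (s * x + c * y + ty - v) ** 2
               for (x, y), (u, v) in zip(src, dst))

def one_plus_one_es(cost, x0, sigma=0.5, iters=2000, seed=1):
    """Minimal (1+1) evolution strategy: mutate the current solution,
    keep the mutant if it improves, and adapt the step size."""
    random.seed(seed)
    best = list(x0)
    best_cost = cost(best)
    for _ in range(iters):
        cand = [v + random.gauss(0, sigma) for v in best]
        c = cost(cand)
        if c < best_cost:
            best, best_cost = cand, c
            sigma *= 1.22   # expand the step after a success
        else:
            sigma *= 0.95   # shrink it after a failure
    return best, best_cost

# Toy problem: recover a known rotation/translation between point sets.
ang = math.radians(40.0)
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 1.5)]
dst = [(math.cos(ang) * x - math.sin(ang) * y + 1.0,
        math.sin(ang) * x + math.cos(ang) * y + 2.0) for x, y in src]
pose, err = one_plus_one_es(lambda p: pose_error(p, src, dst), [0.0, 0.0, 0.0])
print("residual cost:", round(err, 6))
```

The appeal of an evolutionary search in this setting is that it needs only cost evaluations, no gradients, so the same loop works for cost functions built from arbitrary camera and robot models.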

    Continuous Online Extrinsic Calibration of Fisheye Camera and LiDAR

    Automated driving systems use multi-modal sensor suites, for example camera and LiDAR, to ensure reliable, redundant and robust perception of the operating domain. An accurate extrinsic calibration is required to fuse the camera and LiDAR data into the common spatial reference frame required by high-level perception functions. Over the life of the vehicle, the value of the extrinsic calibration can change due to physical disturbances, introducing an error into the high-level perception functions. Therefore, there is a need for continuous online extrinsic calibration algorithms that can automatically update the camera-LiDAR calibration during the life of the vehicle using only sensor data. We propose using the mutual information between the camera image's depth estimate, provided by commonly available monocular depth estimation networks, and the LiDAR point cloud's geometric distance as an optimization metric for extrinsic calibration. Our method requires no calibration target, no ground-truth training data and no expensive offline optimization. We demonstrate our algorithm's accuracy, precision, speed and self-diagnosis capability on the KITTI-360 data set.
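A histogram-based mutual-information estimate of the kind usable as such an alignment metric can be sketched as follows; the binning scheme and toy data are assumptions for illustration, not the paper's implementation:

```python
import math, random

def mutual_information(a, b, bins=8):
    """Mutual information (in nats) between two equally long value
    sequences, estimated from a joint histogram. As an alignment metric,
    it peaks when one signal is a consistent function of the other."""
    lo_a, hi_a = min(a), max(a)
    lo_b, hi_b = min(b), max(b)
    def idx(v, lo, hi):
        if hi == lo:
            return 0
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)
    joint = [[0] * bins for _ in range(bins)]
    for x, y in zip(a, b):
        joint[idx(x, lo_a, hi_a)][idx(y, lo_b, hi_b)] += 1
    n = len(a)
    pa = [sum(row) / n for row in joint]                           # marginal of a
    pb = [sum(joint[i][j] for i in range(bins)) / n for j in range(bins)]
    mi = 0.0
    for i in range(bins):
        for j in range(bins):
            p = joint[i][j] / n
            if p > 0:
                mi += p * math.log(p / (pa[i] * pb[j]))
    return mi

# MI is high when monocular depth and LiDAR range are paired consistently
# and drops when the pairing is scrambled (i.e. the calibration is wrong).
depth = [i / 99.0 for i in range(100)]
lidar = [2.0 * d + 0.5 for d in depth]       # perfectly correlated pairing
random.seed(2)
shuffled = lidar[:]
random.shuffle(shuffled)                      # destroys the correspondence
print(mutual_information(depth, lidar) > mutual_information(depth, shuffled))
```

An online calibration loop would perturb the extrinsic transform, recompute the depth/range pairing it induces, and keep the transform that maximises this score.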