13 research outputs found

    Simultaneous Parameter Calibration, Localization, and Mapping

    The calibration parameters of a mobile robot play a substantial role in navigation tasks. Often these parameters are subject to variations that depend either on changes in the environment or on the load of the robot. In this paper, we propose an approach to simultaneously estimate a map of the environment, the position of the robot's on-board sensors, and its kinematic parameters. Our method requires no prior knowledge about the environment and relies only on a rough initial guess of the parameters of the platform. The proposed approach estimates the parameters online and is able to adapt to non-stationary changes of the configuration. We tested our approach in simulated environments and on a wide range of real-world data using different types of robotic platforms. (C) 2012 Taylor & Francis and The Robotics Society of Japan
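    As an illustrative sketch of the kind of kinematic calibration described above (not the paper's actual algorithm), the following assumes a hypothetical differential-drive log of per-interval encoder ticks together with reference motion from, say, scan matching, and recovers per-wheel metres-per-tick factors and the wheel baseline by least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth platform parameters (what calibration should recover):
k_l_true, k_r_true, b_true = 5.0e-4, 5.1e-4, 0.30  # m/tick, m/tick, m

# Hypothetical encoder log and the per-interval motion that a
# reference system (e.g. scan matching) would report.
ticks_l = rng.uniform(800, 1200, 50)
ticks_r = rng.uniform(800, 1200, 50)
delta_d = 0.5 * (k_l_true * ticks_l + k_r_true * ticks_r)   # forward motion
delta_th = (k_r_true * ticks_r - k_l_true * ticks_l) / b_true  # heading change

# Step 1: wheel factors from the translation model
#   delta_d = (k_l * ticks_l + k_r * ticks_r) / 2
A = 0.5 * np.column_stack([ticks_l, ticks_r])
k_l, k_r = np.linalg.lstsq(A, delta_d, rcond=None)[0]

# Step 2: baseline from the rotation model
#   delta_th = (k_r * ticks_r - k_l * ticks_l) / b
# (least-squares fit of the scalar 1/b, then inverted)
num = k_r * ticks_r - k_l * ticks_l
b = float(num @ num / (num @ delta_th))
```

Running such a fit repeatedly over a sliding window of recent intervals is one simple way to track the non-stationary parameter changes the abstract mentions.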

    Automatic Calibration of Multiple Coplanar Sensors

    This paper describes an algorithm for recovering the rigid 3-DOF transformation (offset and rotation) between pairs of sensors mounted rigidly in a common plane on a mobile robot. The algorithm requires only a set of sensor observations made as the robot moves along a suitable path. Our method does not require synchronized sensors; nor does it require complete metrical reconstruction of the environment or the sensor path. We show that incremental pose measurements alone are sufficient to recover sensor calibration through nonlinear least squares estimation. We use the Fisher Information Matrix to compute a Cramer-Rao lower bound (CRLB) for the resulting calibration. Applying the algorithm in practice requires a non-degenerate motion path, a principled procedure for estimating per-sensor pose displacements and their covariances, a way to temporally resample asynchronous sensor data, and a way to assess the quality of the recovered calibration. We give constructive methods for each step. We demonstrate and validate the end-to-end calibration procedure for both simulated and real LIDAR and inertial data, achieving CRLBs, and corresponding calibrations, accurate to millimeters and milliradians. Source code is available from http://rvsn.csail.mit.edu/calibration
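    The Fisher-Information-based bound mentioned above can be sketched as follows. The 3x3 Jacobians here are random stand-ins for the true derivatives of each predicted sensor increment with respect to the planar calibration parameters (tx, ty, theta); only the accumulation pattern is the point:

```python
import numpy as np

rng = np.random.default_rng(1)
n_meas = 200

# J[i]: Jacobian of the i-th predicted sensor pose increment w.r.t.
# the calibration parameters (tx, ty, theta) -- stand-in values here.
J = rng.normal(size=(n_meas, 3, 3))

# Assumed per-measurement covariance: 1 cm in x/y, 0.5 deg in heading.
sigma = np.diag([0.01**2, 0.01**2, (0.5 * np.pi / 180)**2])
sigma_inv = np.linalg.inv(sigma)

# Fisher Information accumulates over independent measurements.
fim = sum(Ji.T @ sigma_inv @ Ji for Ji in J)

# CRLB: no unbiased estimator can have smaller covariance than this.
crlb = np.linalg.inv(fim)
bounds = np.sqrt(np.diag(crlb))  # 1-sigma lower bound per parameter
```

The bound shrinks as informative measurements accumulate, which is why the non-degenerate motion path the abstract requires matters: degenerate motion leaves the FIM rank-deficient and the bound unbounded along some direction.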

    Comprehensive Extrinsic Calibration of a Camera and a 2D Laser Scanner for a Ground Vehicle

    Cameras and laser scanners are two important kinds of perceptive sensors, and both are becoming increasingly common on intelligent ground vehicles; the calibration of these sensors is a fundamental task. A new method is proposed to perform comprehensive extrinsic calibration of a single camera-2D laser scanner pair, i.e. to recover all the spatial relationships among the camera coordinate system, the laser scanner coordinate system, the ground coordinate system, and the vehicle coordinate system. The proposed method is mainly based on the convenient and widely used chessboard calibration practice and can be conveniently implemented. It has been tested in experiments on both synthetic data and real data, which validate its effectiveness.
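    One way to picture "comprehensive" calibration is that once the camera-to-ground and laser-to-ground transforms are known (e.g. from chessboard observations), the remaining camera-to-laser relationship follows by composing homogeneous transforms. A minimal sketch with hypothetical transform values, not the paper's method:

```python
import numpy as np

def make_transform(theta_deg, t):
    """4x4 homogeneous transform: rotation about z by theta, translation t."""
    th = np.deg2rad(theta_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(th), -np.sin(th), 0.0],
                 [np.sin(th),  np.cos(th), 0.0],
                 [0.0,         0.0,        1.0]]
    T[:3, 3] = t
    return T

# Hypothetical results of two chessboard-based calibration steps:
T_cam_ground = make_transform(10.0, [0.2, 0.0, 1.2])    # camera -> ground
T_laser_ground = make_transform(-5.0, [0.5, 0.1, 0.4])  # laser  -> ground

# The camera -> laser relationship follows by composition:
# (ground -> laser) applied after (camera -> ground).
T_cam_laser = np.linalg.inv(T_laser_ground) @ T_cam_ground
```

Chaining through a shared frame like this is what lets a single calibration procedure expose all pairwise relationships among the four coordinate systems.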

    Range Information Characterization of the Hokuyo UST-20LX LIDAR Sensor

    This paper presents a study of the range measurements produced by the Hokuyo UST-20LX laser rangefinder, which together form an overall characterization of the LIDAR sensor in indoor environments. The range measurements, beam divergence, angular resolution, the error due to some common painted and wooden surfaces, and the error due to target surface orientation are analyzed. It was shown that using a statistical average of sensor measurements provides a more accurate range measurement. It was also shown that the major source of error for the Hokuyo UST-20LX sensor was caused by something that will be referred to as “mixed pixels”. Additional error sources are the target surface material and the range relative to the sensor. The purpose of this paper was twofold: (1) to describe a series of tests that can be performed to characterize various aspects of a LIDAR system from a user perspective, and (2) to present a detailed characterization of the commonly used Hokuyo UST-20LX LIDAR sensor
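    The benefit of statistically averaging repeated range measurements can be illustrated with simulated single-shot returns; the noise level below is an assumed stand-in, not the sensor's measured specification:

```python
import numpy as np

rng = np.random.default_rng(2)

true_range = 5.000   # metres; hypothetical static target distance
noise_sigma = 0.03   # assumed single-shot range noise (1-sigma)

# Simulated repeated single-point measurements of the same target.
samples = true_range + rng.normal(0.0, noise_sigma, size=1000)

single_error = abs(samples[0] - true_range)   # one raw reading
avg_error = abs(samples.mean() - true_range)  # statistical average
# The standard error of the mean shrinks as noise_sigma / sqrt(N),
# so averaging 1000 readings cuts the random error by roughly 30x.
```

Averaging only suppresses zero-mean random noise; systematic effects like mixed pixels or surface-dependent bias survive it, which is why the paper treats them separately.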

    3D Perception Based Lifelong Navigation of Service Robots in Dynamic Environments

    Lifelong navigation of mobile robots is the ability to reliably operate over extended periods of time in dynamically changing environments. Historically, computational capacity and sensor capability have been the constraining factors on the richness of the internal representation of the environment that a mobile robot could use for navigation tasks. With affordable contemporary sensing technology available that provides rich 3D information of the environment, and with increased computational power, we can increasingly make use of more semantic environmental information in navigation-related tasks. A navigation system has many subsystems that must operate in real time while competing for computational resources, such as the perception, localization, and path planning systems. The main thesis proposed in this work is that we can utilize 3D information from the environment to increase navigational robustness without making trade-offs in any of the real-time subsystems. To support these claims, this dissertation presents robust, real-world, 3D-perception-based navigation systems in the domains of indoor doorway detection and traversal, sidewalk-level outdoor navigation in urban environments, and global localization in large-scale indoor warehouse environments. The discussion of these systems includes methods of 3D point cloud based object detection that find the objects of semantic interest for the given navigation tasks, as well as the use of 3D information in the navigation systems for purposes such as localization and dynamic obstacle avoidance. Experimental results for each of these applications demonstrate the effectiveness of the techniques for robust long-term autonomous operation
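    A toy example of the kind of 3D point cloud segmentation such systems build on: splitting a synthetic cloud into traversable ground and potential obstacles with a simple height threshold. The dissertation's actual detectors are far richer; the cloud and threshold here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic point cloud: a flat, slightly noisy ground plane at z = 0
# plus one box-like obstacle standing on it.
ground = np.column_stack([rng.uniform(-5.0, 5.0, 500),
                          rng.uniform(-5.0, 5.0, 500),
                          rng.normal(0.0, 0.01, 500)])
obstacle = np.column_stack([rng.uniform(1.0, 1.5, 100),
                            rng.uniform(-0.3, 0.3, 100),
                            rng.uniform(0.1, 0.8, 100)])
cloud = np.vstack([ground, obstacle])

# Simple semantic split: points near z = 0 are traversable ground;
# anything above a clearance threshold is a potential obstacle.
z_thresh = 0.05
obstacles = cloud[cloud[:, 2] > z_thresh]
traversable = cloud[cloud[:, 2] <= z_thresh]
```

Real systems replace the fixed threshold with plane fitting or learned semantics so that ramps and uneven terrain are handled, but the core idea of carving the 3D cloud into navigation-relevant classes is the same.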