
    Joint on-manifold self-calibration of odometry model and sensor extrinsics using pre-integration

    This paper describes a self-calibration procedure that jointly estimates the extrinsic parameters of an exteroceptive sensor able to observe ego-motion and the intrinsic parameters of an odometry motion model, consisting of the wheel radii and the wheel separation. We use iterative nonlinear on-manifold optimization with a graphical representation of the state, and adapt the pre-integration theory, originally developed for the IMU motion sensor, to the differential-drive motion model. To this end, we describe the construction of a pre-integrated factor for the differential-drive motion model, which includes the motion increment, its covariance, and a first-order approximation of its dependence on the calibration parameters. Since the calibration parameters change at each solver iteration, this allows a posteriori correction of the factor without re-integrating the motion data. We validate our proposal in simulation and on a real robot, and show that the calibration converges towards the true parameter values. The method is then tested online in simulation and shown to accommodate variations in the calibration parameters when the vehicle undergoes physical changes such as loading and unloading freight.
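    A minimal sketch of the pre-integration idea for a differential drive is given below. It is hypothetical illustration code, not the authors' implementation: it integrates left/right wheel rates into a planar pose increment for assumed calibration parameters (left radius r_l, right radius r_r, wheel separation b) and returns the Jacobian of the increment with respect to those parameters, here obtained by finite differences where the paper derives it analytically. With this Jacobian, an updated calibration estimate can correct the pre-integrated increment to first order, avoiding re-integration of the wheel data at every solver iteration.

```python
import numpy as np

def preintegrate_diff_drive(wheel_rates, dt, r_l, r_r, b):
    """Integrate left/right wheel angular rates (rad/s) over steps of length
    dt into a planar pose increment (dx, dy, dtheta), and return the Jacobian
    of the increment w.r.t. the calibration parameters (r_l, r_r, b).
    Hypothetical sketch: the Jacobian is computed by central differences,
    whereas the paper derives it analytically during integration."""
    def integrate(params):
        rl, rr, sep = params
        x = y = th = 0.0
        for w_l, w_r in wheel_rates:
            v = 0.5 * (rr * w_r + rl * w_l)   # linear velocity
            w = (rr * w_r - rl * w_l) / sep   # angular velocity
            x += v * np.cos(th) * dt
            y += v * np.sin(th) * dt
            th += w * dt
        return np.array([x, y, th])

    params0 = np.array([r_l, r_r, b])
    delta = integrate(params0)
    J = np.zeros((3, 3))
    eps = 1e-6
    for i in range(3):
        dp = np.zeros(3)
        dp[i] = eps
        J[:, i] = (integrate(params0 + dp) - integrate(params0 - dp)) / (2 * eps)
    return delta, J

# First-order a posteriori correction when the solver updates the calibration:
#   delta_corrected ≈ delta + J @ (params_new - params0)
# so the raw wheel data need not be re-integrated at each iteration.
```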

    Simultaneous Parameter Calibration, Localization, and Mapping

    The calibration parameters of a mobile robot play a substantial role in navigation tasks. Often these parameters are subject to variations that depend either on changes in the environment or on the load of the robot. In this paper, we propose an approach to simultaneously estimate a map of the environment, the position of the on-board sensors of the robot, and its kinematic parameters. Our method requires no prior knowledge about the environment and relies only on a rough initial guess of the parameters of the platform. The proposed approach estimates the parameters online and is able to adapt to non-stationary changes of the configuration. We tested our approach in simulated environments and on a wide range of real-world data using different types of robotic platforms.
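    One way to picture the online, non-stationary aspect is sketched below: recent calibration-dependent residual terms are kept in a sliding window, and a small Gauss-Newton problem is re-solved for the kinematic parameters as new data arrives. This is an assumption-laden illustration, not the paper's algorithm (which embeds the calibration variables in a full SLAM graph); the class name OnlineCalibration and the residual_fn interface are hypothetical.

```python
from collections import deque
import numpy as np

class OnlineCalibration:
    """Sliding-window re-estimation of kinematic parameters (hypothetical
    sketch; the paper couples these variables with the SLAM graph instead)."""

    def __init__(self, k0, window=200):
        self.k = np.asarray(k0, dtype=float)  # e.g. (r_left, r_right, baseline)
        self.window = deque(maxlen=window)    # recent residual callables

    def add_measurement(self, residual_fn):
        # residual_fn(k) -> residual vector comparing the odometry predicted
        # with calibration k against an independent motion estimate
        # (e.g. a scan-matching result)
        self.window.append(residual_fn)

    def update(self, iters=5, eps=1e-6):
        """Gauss-Newton over the windowed residuals, Jacobian by forward
        differences; returns the refreshed calibration estimate."""
        if not self.window:
            return self.k
        for _ in range(iters):
            r = np.concatenate([f(self.k) for f in self.window])
            J = np.zeros((r.size, self.k.size))
            for i in range(self.k.size):
                dk = np.zeros_like(self.k)
                dk[i] = eps
                r_pert = np.concatenate([f(self.k + dk) for f in self.window])
                J[:, i] = (r_pert - r) / eps
            self.k = self.k - np.linalg.lstsq(J, r, rcond=None)[0]
        return self.k
```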

    Simultaneous maximum-likelihood calibration of odometry and sensor parameters

    For a differential-drive mobile robot equipped with an on-board range sensor, there are six parameters to calibrate: three for the odometry (the two wheel radii and the distance between the wheels), and three for the pose of the sensor with respect to the robot frame. This paper describes a method for calibrating all six parameters at the same time, without the need for external sensors or devices. Moreover, it is not necessary to drive the robot along particular trajectories. The available data are the measured angular velocities of the wheels and the range sensor readings. The maximum-likelihood calibration solution is found in closed form.
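    The measurement model that such a calibration exploits can be stated compactly: the displacement observed by the range sensor (for example via scan matching) equals the odometric displacement conjugated by the sensor extrinsics. The sketch below illustrates that relation on SE(2); it is not the paper's closed-form estimator, and the helper names se2 and predicted_sensor_motion are hypothetical.

```python
import numpy as np

def se2(x, y, th):
    """Homogeneous 2D rigid transform."""
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def predicted_sensor_motion(odom_disp, sensor_extrinsics):
    """Displacement seen by the sensor, given the robot's odometric
    displacement r = (x, y, theta) and the sensor pose l = (lx, ly, ltheta)
    in the robot frame: q = l^{-1} * r * l. Comparing q against scan-matching
    results yields the residuals used to calibrate the six parameters."""
    L = se2(*sensor_extrinsics)
    R = se2(*odom_disp)
    Q = np.linalg.inv(L) @ R @ L
    return np.array([Q[0, 2], Q[1, 2], np.arctan2(Q[1, 0], Q[0, 0])])
```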

    Dynamic Motion Modelling for Legged Robots

    An accurate motion model is an important component in modern-day robotic systems, but building such a model for a complex system often requires an appreciable amount of manual effort. In this paper, we present a motion model representation, the Dynamic Gaussian Mixture Model (DGMM), that alleviates the need to manually design the form of a motion model and provides a direct means of incorporating auxiliary sensory data into the model. This representation and its accompanying algorithms are validated experimentally using an 8-legged, kinematically complex robot, as well as a standard benchmark dataset. The presented method not only learns the robot's motion model, but also improves the model's accuracy by incorporating information about the terrain surrounding the robot.
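    The DGMM itself is specific to the paper, but the general flavor of mixture-based motion modelling can be illustrated with ordinary Gaussian mixture regression: fit a mixture over joint (command, displacement) samples and condition on the command to predict the displacement, optionally appending auxiliary sensor features to the command vector. The sketch below uses scikit-learn and SciPy as assumed dependencies and is an approximation of the idea, not the paper's DGMM algorithm.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.stats import multivariate_normal

def fit_motion_gmm(commands, displacements, n_components=5):
    """Fit a Gaussian mixture over joint (command, displacement) samples.
    commands: (N, d_u) array, displacements: (N, d_x) array."""
    data = np.hstack([commands, displacements])
    gmm = GaussianMixture(n_components=n_components, covariance_type="full")
    gmm.fit(data)
    return gmm

def predict_displacement(gmm, u, d_u):
    """Gaussian mixture regression: condition the joint mixture on the
    command u (first d_u dimensions) and return the expected displacement."""
    resp, cond_means = [], []
    for k in range(gmm.n_components):
        mu_u = gmm.means_[k, :d_u]
        mu_x = gmm.means_[k, d_u:]
        S_uu = gmm.covariances_[k][:d_u, :d_u]
        S_xu = gmm.covariances_[k][d_u:, :d_u]
        resp.append(gmm.weights_[k] * multivariate_normal.pdf(u, mu_u, S_uu))
        cond_means.append(mu_x + S_xu @ np.linalg.solve(S_uu, u - mu_u))
    resp = np.array(resp) / np.sum(resp)
    return np.sum(resp[:, None] * np.array(cond_means), axis=0)
```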