
    Systematic Odometry Error Evaluation and Correction in a Human-Sized Three-Wheeled Omnidirectional Mobile Robot Using Flower-Shaped Calibration Trajectories

    Odometry is a simple and practical method that provides a periodic real-time estimate of the relative displacement of a mobile robot based on the measured angular rotational speed of its wheels. The main disadvantage of odometry is its unbounded accumulation of errors, which reduces the accuracy of the estimated absolute position and orientation of the mobile robot. This paper proposes a general procedure to evaluate and correct the systematic odometry errors of a human-sized three-wheeled omnidirectional mobile robot designed as a versatile personal assistant tool. The correction procedure is based on the definition of 36 individual calibration trajectories that together depict a flower-shaped figure, on the measurement of the odometry and ground-truth trajectories for each calibration trajectory, and on the application of several strategies that iteratively adjust the effective values of the kinematic parameters of the mobile robot so that the final positions estimated from these two trajectories match. The results show an average improvement of 82.14% in the estimation of the final position and orientation of the mobile robot. These results can therefore be used for odometry calibration during the manufacturing of human-sized three-wheeled omnidirectional mobile robots.
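    As a rough illustration of the idea described above, the sketch below shows how a three-wheeled omnidirectional odometry model can be integrated from wheel speeds and how one effective kinematic parameter can be adjusted until the odometry endpoint matches a ground-truth endpoint. The function names, the 120° wheel layout, and the naive gain-based update that only rescales the wheel radius are illustrative assumptions, not the paper's actual calibration strategies.

```python
# Minimal sketch, not the paper's procedure: forward odometry for a
# three-wheeled omnidirectional robot plus a naive endpoint-matching
# adjustment of the effective wheel radius.
import numpy as np

def body_twist(wheel_speeds, wheel_radius, wheel_offset,
               wheel_angles=(np.pi / 2, np.pi / 2 + 2 * np.pi / 3, np.pi / 2 + 4 * np.pi / 3)):
    """Map measured wheel angular speeds [rad/s] to a body twist (vx, vy, w)."""
    # Each row relates (vx, vy, w) to one wheel's rim speed r * omega_i.
    J = np.array([[-np.sin(a), np.cos(a), wheel_offset] for a in wheel_angles])
    rim_speeds = wheel_radius * np.asarray(wheel_speeds)
    return np.linalg.solve(J, rim_speeds)          # (vx, vy, w) in the body frame

def integrate_odometry(wheel_speed_log, dt, wheel_radius, wheel_offset):
    """Dead-reckon the planar pose (x, y, theta) from a log of wheel speeds."""
    x = y = theta = 0.0
    for speeds in wheel_speed_log:
        vx, vy, w = body_twist(speeds, wheel_radius, wheel_offset)
        # Rotate the body-frame velocity into the world frame and integrate.
        x += (vx * np.cos(theta) - vy * np.sin(theta)) * dt
        y += (vx * np.sin(theta) + vy * np.cos(theta)) * dt
        theta += w * dt
    return np.array([x, y, theta])

def calibrate_radius(wheel_speed_log, dt, ground_truth_xy, wheel_radius, wheel_offset,
                     iterations=20, gain=0.5):
    """Naively rescale the effective wheel radius until the odometry endpoint
    distance matches the ground-truth endpoint of one calibration trajectory."""
    true_dist = np.linalg.norm(ground_truth_xy)
    for _ in range(iterations):
        end = integrate_odometry(wheel_speed_log, dt, wheel_radius, wheel_offset)
        est_dist = np.linalg.norm(end[:2])
        if est_dist < 1e-9:
            break
        # If odometry overestimates the travelled distance, shrink the radius, and vice versa.
        wheel_radius *= 1.0 + gain * (true_dist / est_dist - 1.0)
    return wheel_radius
```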

    Simultaneous maximum-likelihood calibration of odometry and sensor parameters

    For a differential-drive mobile robot equipped with an on-board range sensor, there are six parameters to calibrate: three for the odometry (the wheel radii and the distance between the wheels) and three for the pose of the sensor with respect to the robot frame. This paper describes a method for calibrating all six parameters at the same time, without the need for external sensors or devices. Moreover, it is not necessary to drive the robot along particular trajectories. The available data are the measured angular velocities of the wheels and the range sensor readings. The maximum-likelihood calibration solution is found in closed form.
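    For reference, the three odometry parameters mentioned above enter the standard differential-drive model roughly as sketched below. The function name and the simple Euler integration are illustrative assumptions; the paper's closed-form maximum-likelihood estimator is not reproduced here.

```python
# Minimal sketch of the differential-drive forward model whose parameters
# (left wheel radius, right wheel radius, axle length) are being calibrated.
import math

def diff_drive_step(pose, omega_l, omega_r, dt, r_l, r_r, axle):
    """Propagate pose (x, y, theta) one step from wheel angular velocities [rad/s]."""
    x, y, theta = pose
    v = 0.5 * (r_r * omega_r + r_l * omega_l)      # forward speed [m/s]
    w = (r_r * omega_r - r_l * omega_l) / axle     # yaw rate [rad/s]
    # First-order (Euler) integration of the unicycle model.
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + w * dt)
```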

    Construction and Calibration of a Low-Cost 3D Laser Scanner with 360° Field of View for Mobile Robots

    Navigation of many mobile robots relies on environmental information obtained from three-dimensional (3D) laser scanners. This paper presents a new 360° field-of-view 3D laser scanner for mobile robots that avoids the high cost of commercial devices. The 3D scanner is based on spinning a Hokuyo UTM-30LX-EX two-dimensional (2D) rangefinder around its optical center. The proposed design profits from lessons learned during the development of a previous 3D scanner with pitching motion. Intrinsic calibration of the new device has been performed to obtain both temporal and geometric parameters. The paper also shows the integration of the 3D device in the outdoor mobile robot Andabata.
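    The core geometric idea of such a device, lifting each 2D range/bearing measurement into 3D using the current spin angle, can be sketched as follows. The axis conventions and the idealized zero-offset mounting are assumptions; in practice the paper's intrinsic calibration estimates the actual temporal and geometric parameters of the device.

```python
# Minimal sketch, assuming the 2D scan plane contains the vertical spin axis:
# each (range, bearing) pair is placed in that plane and the plane is rotated
# about z by the spin angle, sweeping a full 360° horizontal field of view.
import numpy as np

def scan_to_points(ranges, bearings, spin_angle):
    """Project one 2D scan into 3D sensor-frame coordinates at a given spin angle."""
    r = np.asarray(ranges)
    b = np.asarray(bearings)           # bearing measured from the horizontal, in the scan plane
    horiz = r * np.cos(b)              # horizontal distance from the spin axis
    z = r * np.sin(b)                  # height above the optical center
    x = horiz * np.cos(spin_angle)     # rotate the scan plane about the z axis
    y = horiz * np.sin(spin_angle)
    return np.column_stack([x, y, z])
```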

    Simultaneous Parameter Calibration, Localization, and Mapping

    The calibration parameters of a mobile robot play a substantial role in navigation tasks. These parameters are often subject to variations that depend either on changes in the environment or on the load of the robot. In this paper, we propose an approach to simultaneously estimate a map of the environment, the position of the on-board sensors of the robot, and its kinematic parameters. Our method requires no prior knowledge about the environment and relies only on a rough initial guess of the parameters of the platform. The proposed approach estimates the parameters online and is able to adapt to non-stationary changes of the configuration. We tested our approach in simulated environments and on a wide range of real-world data using different types of robotic platforms.
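    A drastically simplified sketch of the underlying idea, augmenting the estimation state with calibration variables so that they are solved jointly with the map and the trajectory, is given below. It reduces the problem to a 1-D world with a single landmark and a single wheel-radius scale, uses scipy.optimize.least_squares rather than the paper's online graph-based estimator, and omits the sensor extrinsics; all names and numbers are illustrative.

```python
# Toy sketch: robot poses, one landmark, and a wheel-radius scale factor are
# estimated together in one nonlinear least-squares problem.
import numpy as np
from scipy.optimize import least_squares

# Simulated data: the robot commands 1 m per step, but the true odometry scale
# is 1.05; a single landmark sits at 5 m and is ranged at every step.
odom_ticks = np.array([1.0, 1.0, 1.0])          # raw odometry increments [m]
poses_true = np.cumsum(1.05 * odom_ticks)
ranges = 5.0 - poses_true                        # measured robot-to-landmark ranges

def residuals(state):
    scale, landmark = state[0], state[1]
    poses = state[2:]
    res = []
    prev = 0.0
    for i, tick in enumerate(odom_ticks):
        # Odometry factor: the pose increment should match the calibrated tick.
        res.append(poses[i] - prev - scale * tick)
        prev = poses[i]
        # Map factor: predicted range to the landmark versus the measurement.
        res.append(landmark - poses[i] - ranges[i])
    return res

x0 = np.concatenate([[1.0, 4.0], np.cumsum(odom_ticks)])   # rough initial guess
sol = least_squares(residuals, x0)
print("estimated odometry scale:", sol.x[0])                # converges to ~1.05
print("estimated landmark position:", sol.x[1])             # converges to ~5.0
```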

    Joint on-manifold self-calibration of odometry model and sensor extrinsics using pre-integration

    This paper describes a self-calibration procedure that jointly estimates the extrinsic parameters of an exteroceptive sensor able to observe ego-motion and the intrinsic parameters of an odometry motion model, consisting of the wheel radii and the wheel separation. We use iterative nonlinear on-manifold optimization with a graphical representation of the state, and adapt the pre-integration theory, initially developed for the IMU motion sensor, to the differential-drive motion model. For this, we describe the construction of a pre-integrated factor for the differential-drive motion model, which includes the motion increment, its covariance, and a first-order approximation of its dependence on the calibration parameters. As the calibration parameters change at each solver iteration, this allows a posteriori factor correction without the need to re-integrate the motion data. We validate our proposal in simulations and on a real robot and show the convergence of the calibration towards the true values of the parameters. The method is then tested online in simulation and is shown to accommodate variations in the calibration parameters when the vehicle is subject to physical changes such as loading and unloading freight.
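    A simplified, Euclidean (not on-manifold) sketch of the pre-integration idea for a differential-drive model is shown below: the wheel readings are integrated once into a relative motion increment together with its Jacobian with respect to the calibration parameters, so that a first-order correction can later account for updated parameters without re-integrating the raw data. Covariance propagation and the paper's manifold machinery are omitted, and all names are assumptions.

```python
# Sketch of pre-integration with calibration Jacobians for a differential drive.
import numpy as np

def preintegrate(wheel_log, dt, r_l, r_r, axle):
    """Return the accumulated increment delta = (dx, dy, dtheta) in the frame at
    the start of the interval, and d(delta)/d(r_l, r_r, axle)."""
    delta = np.zeros(3)                 # (dx, dy, dtheta)
    J = np.zeros((3, 3))                # Jacobian of delta w.r.t. (r_l, r_r, axle)
    for omega_l, omega_r in wheel_log:
        theta = delta[2]
        v = 0.5 * (r_r * omega_r + r_l * omega_l)
        w = (r_r * omega_r - r_l * omega_l) / axle
        # Partial derivatives of v and w w.r.t. the calibration parameters.
        dv = np.array([0.5 * omega_l, 0.5 * omega_r, 0.0])
        dw = np.array([-omega_l / axle, omega_r / axle, -w / axle])
        c, s = np.cos(theta), np.sin(theta)
        # Chain rule: the planar increment also depends on theta, whose own
        # dependence on the parameters is the current J[2] row.
        J[0] += dt * (c * dv - v * s * J[2])
        J[1] += dt * (s * dv + v * c * J[2])
        J[2] += dt * dw
        # Euler integration of the increment itself.
        delta += dt * np.array([v * c, v * s, w])
    return delta, J

def correct(delta, J, d_params):
    """First-order correction of the pre-integrated increment for a small
    calibration update d_params = (d_r_l, d_r_r, d_axle)."""
    return delta + J @ np.asarray(d_params)
```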

    Position Estimation of Robotic Mobile Nodes in Wireless Testbed using GENI

    We present a low-complexity experimental RF-based indoor localization system that collects WiFi RSSI measurements and processes them with an RSS-based multi-lateration algorithm to determine a robotic mobile node's location. We use a real indoor wireless testbed, called w-iLab.t, deployed in Zwijnaarde, Ghent, Belgium. One of the unique attributes of this testbed is that it provides tools and interfaces based on the Global Environment for Network Innovations (GENI) project to easily create reproducible wireless network experiments in a controlled environment. We provide a low-complexity algorithm to estimate the location of the mobile robots in the indoor environment. In addition, we compare some of our collected measurements and their corresponding location estimates with the actual robot locations. The comparison shows an accuracy between 0.65 and 5 meters.
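    A minimal sketch of the kind of RSS-based pipeline described above, converting RSSI readings to distances with a log-distance path-loss model and then multi-laterating with nonlinear least squares, is given below. The path-loss constants, anchor layout, and RSSI values are made-up illustrative numbers, not the testbed's.

```python
# Sketch: RSSI -> distance via a log-distance path-loss model, then
# position estimation by least-squares multi-lateration.
import numpy as np
from scipy.optimize import least_squares

def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exponent=2.5):
    """Invert the model RSSI = RSSI_1m - 10 * n * log10(d) for the distance d [m]."""
    return 10.0 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exponent))

def multilaterate(anchors, distances, initial_guess):
    """Least-squares position estimate from anchor positions and ranged distances."""
    anchors = np.asarray(anchors, dtype=float)
    distances = np.asarray(distances, dtype=float)

    def residuals(p):
        # Difference between geometric distance to each anchor and the ranged distance.
        return np.linalg.norm(anchors - p, axis=1) - distances

    return least_squares(residuals, np.asarray(initial_guess, dtype=float)).x

# Example: three fixed WiFi anchors at known positions (metres) and the RSSI
# each one reports for the mobile node; the initial guess is the area's center.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
rssi = [-55.0, -62.0, -60.0]
position = multilaterate(anchors, [rssi_to_distance(r) for r in rssi], (5.0, 5.0))
print("estimated position:", position)
```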

    A Factor Graph Approach to Multi-Camera Extrinsic Calibration on Legged Robots

    Legged robots are becoming popular not only in research but also in industry, where they can demonstrate their superiority over wheeled machines in a variety of applications. Whether acting as mobile manipulators or as all-terrain ground vehicles, these machines need to precisely track desired base and end-effector trajectories, perform Simultaneous Localization and Mapping (SLAM), and move in challenging environments, all while keeping balance. A crucial aspect of these tasks is that all onboard sensors must be properly calibrated and synchronized to provide consistent signals to the software modules they feed. In this paper, we focus on the problem of calibrating the relative pose between a set of cameras and the base link of a quadruped robot. This pose is fundamental to successfully performing sensor fusion, state estimation, mapping, and any other task requiring visual feedback. To solve this problem, we propose an approach based on factor graphs that jointly optimizes the mutual position of the cameras and the robot base using kinematics and fiducial markers. We also quantitatively compare its performance with other state-of-the-art methods on the hydraulic quadruped robot HyQ. The proposed approach is simple, modular, and independent of external devices other than the fiducial marker. To appear in the Third IEEE International Conference on Robotic Computing (IEEE IRC 2019).
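    The kind of per-detection error term such a factor graph might minimize can be sketched as below: the marker pose measured by a camera is compared with the pose predicted from the kinematics-derived base pose, the candidate base-to-camera extrinsic, and the known marker pose. The homogeneous-transform representation and the small-angle rotation error are simplifying assumptions; the paper's actual on-manifold formulation is not reproduced.

```python
# Sketch of a single camera-extrinsic residual built from a fiducial detection.
import numpy as np

def inv(T):
    """Invert a rigid 4x4 transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def extrinsic_residual(T_world_base, T_base_cam, T_world_marker, T_cam_marker_meas):
    """6-vector error between the predicted and measured marker pose in the camera frame."""
    T_cam_marker_pred = inv(T_base_cam) @ inv(T_world_base) @ T_world_marker
    E = inv(T_cam_marker_meas) @ T_cam_marker_pred     # identity if everything is consistent
    rot_err = E[:3, :3]
    # Small-angle approximation of the rotation error (skew-symmetric part).
    w = 0.5 * np.array([rot_err[2, 1] - rot_err[1, 2],
                        rot_err[0, 2] - rot_err[2, 0],
                        rot_err[1, 0] - rot_err[0, 1]])
    return np.concatenate([E[:3, 3], w])
```

    A solver would stack this residual over all cameras, markers, and robot configurations and minimize it jointly over the base-to-camera extrinsics.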