81 research outputs found

    Methods for Wheel Slip and Sinkage Estimation in Mobile Robots

    Future outdoor mobile robots will have to explore larger and larger areas, performing difficult tasks while preserving their safety. This will primarily require advanced sensing and perception capabilities. Video sensors supply contact-free, precise measurements and are flexible devices that can be easily integrated with multi-sensor robotic platforms. Hence, they represent a potential answer to the need for new and improved perception capabilities for autonomous vehicles. One of the main applications of vision in mobile robotics is localization. For mobile robots operating on rough terrain, conventional dead-reckoning techniques are not well suited, since wheel slippage, sinkage, and sensor drift may cause localization errors that accumulate without bound during the vehicle's travel. Conversely, video sensors are exteroceptive devices, that is, they acquire information from the robot's environment; therefore, vision-based motion estimates are independent of any knowledge of terrain properties and wheel-terrain interaction. Admittedly, like dead reckoning, vision can also lead to an accumulation of errors; however, it has been shown to produce more accurate results than dead reckoning and can be considered a promising solution to the problem of robust robot positioning in high-slip environments. As a consequence, several vision-based localization methods have been developed in recent years. Among them, visual odometry algorithms, based on the tracking of visual features over subsequent images, have proved particularly effective. Accurate and reliable methods to sense slippage and sinkage are also desirable, since these effects compromise the vehicle's traction performance, increase energy consumption, and lead to gradual deviation of the robot from the intended path, possibly resulting in large drift and poor performance of localization and control systems. For example, conventional dead reckoning is largely compromised, since it is based on the assumption that wheel revolutions can be translated into corresponding linear displacements. Thus, if one wheel slips, the associated encoder will register revolutions even though these revolutions do not correspond to a linear displacement of the wheel; conversely, if one wheel skids, fewer encoder pulses will be counted. Slippage and sinkage measurements are also valuable for terrain identification according to classical terramechanics theory. This chapter investigates vision-based onboard technology to improve the mobility of robots on natural terrain. A visual odometry algorithm and two methods for online measurement of vehicle slip angle and wheel sinkage, respectively, are discussed. Test results are presented showing the performance of the proposed approaches on an all-terrain rover moving across uneven terrain.
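    As a concrete illustration of the slip effect described above, the following is a minimal sketch (not the chapter's algorithm) of the classical longitudinal slip ratio, computed from an encoder-derived wheel speed and an exteroceptive ground-speed estimate such as visual odometry; the function and variable names are illustrative.

```python
# Minimal sketch: longitudinal slip ratio from an encoder-derived wheel speed
# versus an exteroceptive ground-speed estimate (e.g., from visual odometry).
# Names and numbers are illustrative, not taken from the chapter.

def slip_ratio(omega: float, wheel_radius: float, ground_speed: float) -> float:
    """Classical terramechanics slip ratio.

    omega        -- wheel angular rate from the encoder [rad/s]
    wheel_radius -- effective rolling radius [m]
    ground_speed -- actual forward speed of the wheel center [m/s]
    """
    v_wheel = omega * wheel_radius          # speed implied by encoder counts
    if v_wheel >= ground_speed:             # driving: wheel spins faster than it travels (slip)
        return (v_wheel - ground_speed) / max(v_wheel, 1e-6)
    return (v_wheel - ground_speed) / max(ground_speed, 1e-6)  # skidding: negative value

# Example: encoder implies 2.0 rad/s on a 0.1 m wheel (0.2 m/s of travel),
# but visual odometry measures only 0.15 m/s -> slip ratio = 0.25.
print(slip_ratio(2.0, 0.1, 0.15))
```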

    Planetary Rover Inertial Navigation Applications: Pseudo Measurements and Wheel Terrain Interactions

    Accurate localization is a critical component of any robotic system. During planetary missions, these systems are often limited by energy sources and slow spacecraft computers. Proprioceptive localization (e.g., using an inertial measurement unit and wheel encoders) without external aiding is insufficient for accurate localization, mainly because of the integrated, unbounded errors of the inertial navigation solution and the drift in wheel-encoder position information caused by wheel slippage. For this reason, planetary rovers often utilize exteroceptive (e.g., vision-based) sensors. On the one hand, localization with proprioceptive sensors is straightforward, computationally efficient, and continuous. On the other hand, using exteroceptive sensors for localization slows the rover's driving speed, reduces its traversal rate, and makes the solution sensitive to terrain features. Given the advantages and disadvantages of both methods, this thesis focuses on two objectives: first, improving proprioceptive localization performance without significant changes to rover operations; second, enabling an adaptive traversal rate based on wheel-terrain interactions while keeping the localization reliable. To achieve the first objective, zero-velocity updates, zero-angular-rate updates, and the non-holonomicity of the rover are exploited to improve localization performance in a computationally efficient way, even with limited sensor usage. Pseudo-measurements generated from proprioceptive sensors while the rover is stationary, together with non-holonomic constraints while it is traversing, can be used to improve localization performance without any significant changes to rover operations. Through this work, a substantial improvement in localization performance is observed without the aid of additional exteroceptive sensor information. To achieve the second objective, the relationship between the estimated localization uncertainty and wheel-terrain interactions, expressed through the slip ratio, was investigated. This relationship was captured with a time-series Gaussian process using slippage estimates obtained while the rover is moving. The predicted slippage is then mapped into a prediction of localization uncertainty to decide when to switch from moving to stationary conditions. Instead of a periodic stopping framework, the method introduced in this work is a slip-aware localization method that makes the rover stop more frequently on high-slip terrain and less frequently on low-slip terrain, while keeping the proprioceptive localization reliable.
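    The zero-velocity pseudo-measurement idea can be sketched with a generic Kalman measurement update: when the rover is known to be stationary, a "velocity equals zero" observation with very small noise pulls the velocity states back toward zero. The state layout and noise values below are assumptions for illustration, not the thesis' exact filter.

```python
# Hedged sketch of a zero-velocity pseudo-measurement update (ZUPT).
# This is a generic Kalman update; the state layout and noise values are assumptions.
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Assumed state: [px, py, vx, vy]; the filter has accumulated a small spurious velocity.
x = np.array([10.0, 5.0, 0.08, -0.05])
P = np.eye(4) * 0.1
H = np.array([[0, 0, 1, 0],            # observe velocity components only
              [0, 0, 0, 1.0]])
R = np.eye(2) * 1e-4                   # pseudo-measurement is trusted strongly
z = np.zeros(2)                        # zero-velocity pseudo-measurement while stationary
x, P = kalman_update(x, P, z, H, R)
print(x)                               # velocity states pulled toward zero
```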

    Adaptive Localization and Mapping for Planetary Rovers

    Future rovers will be equipped with substantial onboard autonomy as space agencies and industry proceed with mission studies and technology development in preparation for the next planetary exploration missions. Simultaneous Localization and Mapping (SLAM) is a fundamental part of autonomous capabilities and has close connections to robot perception, planning and control. SLAM positively affects rover operations and mission success. The SLAM community has made great progress in the last decade by enabling real-world solutions in terrestrial applications and is nowadays addressing important challenges in robust performance, scalability, high-level understanding, resource awareness and domain adaptation. In this thesis, an adaptive SLAM system is proposed in order to improve rover navigation performance and adapt to the navigation demands. This research presents a novel localization and mapping solution following a bottom-up approach. It starts with an Attitude and Heading Reference System (AHRS), continues with a 3D odometry dead-reckoning solution and builds up to a full graph optimization scheme which uses visual odometry and takes into account rover traction performance, bringing scalability to modern SLAM solutions. A design procedure is presented in order to incorporate inertial sensors into the AHRS. The procedure follows three steps: error characterization, model derivation and filter design. A complete kinematics model of the rover locomotion subsystem is developed in order to improve the wheel odometry solution. The parametric model predicts delta poses by solving a system of equations with weighted least squares. In addition, an odometry error model is learned using Gaussian processes (GPs) in order to predict non-systematic errors induced by poor traction of the rover on the terrain. The odometry error model complements the parametric solution by adding an estimate of the error. The gained information serves to adapt the localization and mapping solution to the current navigation demands (domain adaptation). The adaptivity strategy is designed to adjust the visual odometry computational load (active perception) and to influence the optimization back-end by including highly informative keyframes in the graph (adaptive information gain). Following this strategy, the solution is adapted to the navigation demands, providing an adaptive SLAM system driven by the navigation performance and the conditions of the interaction with the terrain. The proposed methodology is experimentally verified on a representative planetary rover under realistic field test scenarios. This thesis introduces a modern SLAM system which adapts the estimated pose and map to the predicted error. The system maintains accuracy with fewer nodes, taking the best of both wheel and visual methods in a consistent graph-based smoothing approach.
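    A hedged sketch of the Gaussian-process odometry error model idea follows: a GP is trained to map traction-related inputs to non-systematic odometry error, and its prediction (with uncertainty) can drive the adaptation of the visual front-end and keyframe selection. The features, synthetic data, kernel and threshold are illustrative assumptions, not the thesis' configuration.

```python
# Hedged sketch of a GP odometry error model: predict non-systematic wheel-odometry
# error from traction-related inputs, then adapt when the predicted error is large.
# Features, data, kernel and threshold are illustrative, not the thesis' values.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic training data: [commanded speed, traction proxy] -> odometry error [m]
rng = np.random.default_rng(0)
X = rng.uniform([0.0, 0.0], [0.1, 1.0], size=(200, 2))
y = 0.5 * X[:, 1] ** 2 + 0.05 * X[:, 0] + 0.01 * rng.standard_normal(200)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)

mean, std = gp.predict(np.array([[0.05, 0.8]]), return_std=True)

# A simple adaptation rule in the spirit of the thesis: increase reliance on
# visual odometry / insert an informative keyframe when the predicted error is high.
if mean[0] + 2.0 * std[0] > 0.2:
    print("high predicted odometry error -> rely on visual odometry / add keyframe")
```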

    Vision-based Estimation of Slip Angle for Mobile Robots and Planetary Rovers

    2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008

    Encoderless position estimation and error correction techniques for miniature mobile robots

    This paper presents an encoderless position estimation technique for miniature-sized mobile robots. Odometry techniques, which rely on dedicated hardware components, are commonly used for calculating the geometric location of mobile robots, so the robot must normally be equipped with an appropriate sensor to measure its motion. However, due to the hardware limitations of some robots, employing extra hardware is impossible. Moreover, in swarm robotics research, which uses a large number of mobile robots, equipping the robots with motion sensors might be costly. In this study, the trajectory of the robot is divided into several small displacements over short spans of time, and the position of the robot is calculated within each short period using the speed equations of the robot's wheels. In addition, an error correction function is proposed that estimates the motion errors using a current-monitoring technique. The experiments illustrate the feasibility of the proposed position estimation and error correction techniques for use in miniature-sized mobile robots without requiring an additional sensor.
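    The encoderless idea can be illustrated with a short dead-reckoning sketch that integrates differential-drive kinematics over small time steps using wheel speeds derived from the motor commands (or a current-based model) rather than encoder feedback; the names and numbers are illustrative assumptions, and the paper's current-based error correction is omitted.

```python
# Minimal sketch: encoderless dead reckoning for a differential-drive robot,
# integrating the wheel speed equations over short time steps. Wheel speeds are
# assumed to come from motor commands or a current-based model, not encoders.
import math

def integrate_pose(x, y, theta, v_left, v_right, wheel_base, dt):
    """One short-time-step pose update from left/right wheel speeds [m/s]."""
    v = 0.5 * (v_left + v_right)             # linear speed of the robot center
    omega = (v_right - v_left) / wheel_base  # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Example: 2 s of motion in 10 ms steps with slightly unequal wheel speeds.
x = y = theta = 0.0
for _ in range(200):
    x, y, theta = integrate_pose(x, y, theta, 0.05, 0.06, 0.09, 0.01)
print(x, y, theta)
```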

    A Study of Slippage Effects in Mobile Robot Navigation When Moving over Heterogeneous Surfaces

    This paper presents a study of mobile robot wheel slippage in a navigation task while moving over a heterogeneous surface. Experiments were carried out to obtain data on the motion of the robot over surfaces with different properties. Based on the experimental data, models of the dependence of the wheel slip coefficient on motor current, taking into account the normalized angular velocity of the wheels, were constructed. A Kalman filter was tuned to filter the noise in the motor current measurements. Finally, the wheel slippage estimation system of the mobile robot was tested.
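    A minimal sketch of the processing chain described above: a scalar Kalman filter smooths the noisy motor-current signal, and the filtered value is fed to a current-to-slip model. The model coefficients and noise parameters below are placeholders, not the paper's fitted values.

```python
# Hedged sketch: scalar Kalman filtering of motor current followed by an
# illustrative current-to-slip model. Coefficients and noise values are placeholders.
def scalar_kf(measurements, q=1e-4, r=0.05):
    """1-D Kalman filter with a constant-value process model."""
    x, p = measurements[0], 1.0
    filtered = []
    for z in measurements:
        p += q                         # predict
        k = p / (p + r)                # gain
        x += k * (z - x)               # update
        p *= (1.0 - k)
        filtered.append(x)
    return filtered

def slip_from_current(current, omega_norm, a=0.15, b=-0.05):
    """Illustrative linear slip model in filtered current and normalized wheel rate."""
    return a * current + b * omega_norm

noisy_current = [0.9, 1.1, 1.0, 1.3, 0.8, 1.2, 1.05]   # [A], assumed samples
smooth = scalar_kf(noisy_current)
print(slip_from_current(smooth[-1], omega_norm=0.8))
```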

    Planetary rovers and data fusion

    This research investigates the problem of position estimation for planetary rovers. Diverse algorithmic filters are available for collecting input data and transforming that data into information useful for the position estimation process. The terrain has sandy soil, which may cause slipping of the robot, and small stones and pebbles that can affect the trajectory. The Kalman filter, a state estimation algorithm, was used to fuse the sensor data and improve the position measurement of the rover. For the rover application, the locomotion errors accumulated by the rover are compensated by the Kalman filter. Moving a rover over rough terrain is challenging, especially with limited sensors to tackle the problem. Thus, an initiative was taken to test drive the rover during the field trial and expose the mobile platform to hard ground and soft ground (sand). It was found that the LSV system produced speckle images and values which proved invaluable for further research and for the implementation of data fusion. During the field trial, it was also discovered that on a flat, hard surface the problem of steering the rover is minimal; however, on soft sand the rover tended to drift away and struggled to navigate. This research introduces laser speckle velocimetry (LSV) as an alternative to odometric measurement. LSV data were gathered during the field trial for further simulation in MATLAB, a computational/mathematical programming environment used to simulate the rover trajectory. The wheel encoders came with associated errors during the position measurement process, as was also observed during the earlier field trials. It was also discovered that the laser speckle velocimetry measurement was able to measure position accurately, but the sensitivity of the optics produced noise that needed to be addressed as an error source. Though such rough terrain is found on Mars, this work is also applicable to terrestrial robots on Earth: there are regions on Earth with rough terrain that are hard to measure with encoders, especially icy places like Antarctica and Greenland. The proposed implementation for the development of the locomotion system is to model a system for position estimation through simulation and by collecting data using the LSV. Two simulations are performed: one is the differential drive of a two-wheel robot, and the second involves the fusion of the differential-drive robot data with the LSV data collected from the rover testbed. The results have been positive. The expected contributions of this research include the design of an LSV system to aid the locomotion measurement system. Simulation results show the effect of different sensors and the velocity of the robot. The Kalman filter improves the position estimation process.
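    The fusion idea can be sketched with a small constant-velocity Kalman filter whose velocity state is corrected by both the wheel-odometry speed (down-weighted, since it is optimistic under slip) and the LSV speed over ground; the one-dimensional state and the noise levels are simplifying assumptions rather than the thesis' tuned values.

```python
# Hedged sketch: fusing wheel-odometry and LSV velocity in a constant-velocity
# Kalman filter. The 1-D state and noise levels are simplifying assumptions.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # state: [position, velocity]
Q = np.diag([1e-5, 1e-3])                 # process noise
H = np.array([[0.0, 1.0]])                # both sensors observe velocity
R_odo = np.array([[5e-2]])                # wheel odometry: less trusted (slip)
R_lsv = np.array([[5e-4]])                # LSV: more trusted

def update(x, P, z, R):
    """One Kalman measurement update with a scalar velocity observation z."""
    y = np.array([z]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(2) - K @ H) @ P

x, P = np.zeros(2), np.eye(2)
for _ in range(50):
    x, P = F @ x, F @ P @ F.T + Q         # predict
    x, P = update(x, P, 0.20, R_odo)      # wheel-odometry speed [m/s], assumed
    x, P = update(x, P, 0.17, R_lsv)      # LSV speed over ground [m/s], assumed
print(x)                                  # fused position/velocity estimate
```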

    Vision-based estimation of slip angle for mobile robots and planetary rovers

    Abstract — For a mobile robot it is critical to detect and compensate for slippage, especially when driving in rough-terrain environments. Due to its highly unpredictable nature, drift largely affects the accuracy of localization and control systems, even leading, in extreme cases, to the danger of vehicle entrapment with consequent mission failure. This paper presents a novel method for lateral slip estimation based on visually observing the trace produced by the wheels of the robot during traverse of soft, deformable terrain, such as that expected for lunar and planetary rovers. The proposed algorithm uses a robust Hough transform enhanced by fuzzy reasoning to estimate the angle of inclination of the wheel trace with respect to the vehicle reference frame. Any deviation of the wheel trace from the planned path of the robot suggests the occurrence of sideslip that can be detected and, more interestingly, measured. This allows one to estimate the actual heading angle of the robot, usually referred to as the slip angle. The details of the various steps of the visual algorithm are presented, and the results of experimental tests performed in the field with an all-terrain rover are shown, proving the method to be effective and robust.
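    The core geometric step can be sketched with a plain Hough transform: detect the dominant line (the wheel trace) in an image and read off its inclination. The paper's fuzzy-reasoning enhancement and the camera-to-vehicle frame transform are omitted here, and the image name is a placeholder.

```python
# Hedged sketch of wheel-trace inclination estimation with a standard Hough
# transform (not the paper's fuzzy-enhanced version). "trace.png" is a placeholder.
import cv2
import numpy as np

img = cv2.imread("trace.png", cv2.IMREAD_GRAYSCALE)
if img is None:
    raise SystemExit("placeholder image not found")

edges = cv2.Canny(img, 50, 150)

# rho resolution 1 px, theta resolution 1 degree, accumulator threshold 100 votes
lines = cv2.HoughLines(edges, 1, np.pi / 180.0, 100)

if lines is not None:
    rho, theta = lines[0][0]                       # strongest line in the accumulator
    # Hough's theta is the line normal direction; the line itself is inclined at theta - 90 deg.
    trace_angle_deg = np.degrees(theta) - 90.0
    print("estimated wheel-trace inclination [deg]:", trace_angle_deg)
```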