4 research outputs found

    Unevenness Point Descriptor for Terrain Analysis in Mobile Robot Applications

    In recent years, the use of imaging sensors that produce a three-dimensional representation of the environment has become an efficient solution to increase the degree of perception of autonomous mobile robots. Accurate and dense 3D point clouds can be generated from traditional stereo systems and laser scanners or from the new generation of RGB-D cameras, representing a versatile, reliable and cost-effective solution that is rapidly gaining interest within the robotics community. For autonomous mobile robots, it is critical to assess the traversability of the surrounding environment, especially when driving across natural terrain. In this paper, a novel approach to detect traversable and non-traversable regions of the environment from a depth image is presented that could enhance mobility and safety through integration with localization, control and planning methods. The proposed algorithm is based on the analysis of the surface normal vector obtained through Principal Component Analysis, and it leads to the definition of a novel descriptor, the Unevenness Point Descriptor. Experimental results, obtained with vehicles operating in indoor and outdoor environments, are presented to validate this approach.
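    The paper defines its own Unevenness Point Descriptor; the sketch below only illustrates the general idea stated in the abstract, namely PCA-based normal estimation over local point-cloud neighborhoods followed by a roughness score computed from how much neighboring normals disagree. The neighborhood size, the scoring rule, and the traversability threshold are assumptions for illustration, not the paper's formulation.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=20):
    """Estimate a unit normal for each 3D point via PCA of its k-neighborhood."""
    tree = cKDTree(points)
    normals = np.empty((len(points), 3))
    for i, p in enumerate(points):
        _, idx = tree.query(p, k=k)
        cov = np.cov(points[idx].T)               # 3x3 covariance of the neighborhood
        _, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
        normals[i] = eigvecs[:, 0]                # direction of least variance
    normals[normals[:, 2] < 0] *= -1.0            # orient toward +z (arbitrary convention)
    return normals

def unevenness_score(points, normals, k=20):
    """Hypothetical unevenness measure: 1 - |mean of neighboring unit normals|.
    Near 0 on flat patches (normals agree), larger on rough or uneven patches."""
    tree = cKDTree(points)
    scores = np.empty(len(points))
    for i, p in enumerate(points):
        _, idx = tree.query(p, k=k)
        scores[i] = 1.0 - np.linalg.norm(normals[idx].mean(axis=0))
    return scores

# Usage (threshold is an assumption):
# pts = ...  # N x 3 array reconstructed from the depth image
# s = unevenness_score(pts, estimate_normals(pts))
# traversable = s < 0.05
```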

    Methods for Wheel Slip and Sinkage Estimation in Mobile Robots

    Future outdoor mobile robots will have to explore larger and larger areas, performing difficult tasks, while preserving, at the same time, their safety. This will primarily require advanced sensing and perception capabilities. Video sensors supply contact-free, precise measurements and are flexible devices that can be easily integrated with multi-sensor robotic platforms. Hence, they represent a potential answer to the need for new and improved perception capabilities for autonomous vehicles. One of the main applications of vision in mobile robotics is localization. For mobile robots operating on rough terrain, conventional dead reckoning techniques are not well suited, since wheel slippage, sinkage, and sensor drift may cause localization errors that accumulate without bound during the vehicle’s travel. Conversely, video sensors are exteroceptive devices, that is, they acquire information from the robot’s environment; therefore, vision-based motion estimates are independent of terrain properties and wheel-terrain interaction. Admittedly, like dead reckoning, vision can lead to an accumulation of errors; however, it has been shown to produce more accurate results than dead reckoning and can be considered a promising solution to the problem of robust robot positioning in high-slip environments. As a consequence, in the last few years, several localization methods using vision have been developed. Among them, visual odometry algorithms, based on the tracking of visual features over subsequent images, have proved particularly effective. Accurate and reliable methods to sense slippage and sinkage are also desirable, since these effects degrade the vehicle’s traction performance, increase energy consumption, and lead to gradual deviation of the robot from the intended path, possibly resulting in large drift and poor performance of localization and control systems. For example, the use of conventional dead-reckoning techniques is largely compromised, since they are based on the assumption that wheel revolutions can be translated into corresponding linear displacements. Thus, if one wheel slips, the associated encoder will register revolutions even though these revolutions do not correspond to a linear displacement of the wheel. Conversely, if one wheel skids, fewer encoder pulses will be counted. Slippage and sinkage measurements are also valuable for terrain identification according to classical terramechanics theory. This chapter investigates vision-based onboard technology to improve the mobility of robots on natural terrain. A visual odometry algorithm and two methods for online measurement of vehicle slip angle and wheel sinkage, respectively, are discussed. Test results are presented showing the performance of the proposed approaches using an all-terrain rover moving across uneven terrain.
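    As an illustration of the slip/skid bookkeeping described above (not the chapter's visual odometry algorithm), the following sketch compares the displacement claimed by a wheel encoder under a pure-rolling assumption with an exteroceptive displacement estimate such as one from visual odometry; the wheel geometry, variable names, and slip-ratio convention are assumptions.

```python
import numpy as np

def encoder_displacement(encoder_ticks, ticks_per_rev, wheel_radius):
    """Displacement the encoder *claims*, assuming pure rolling (no slip)."""
    revolutions = encoder_ticks / ticks_per_rev
    return 2.0 * np.pi * wheel_radius * revolutions

def slip_ratio(d_encoder, d_visual, eps=1e-6):
    """Slip ratio in [-1, 1]:
       > 0 : wheel slips (spins faster than the vehicle actually moves)
       < 0 : wheel skids (vehicle moves faster than the wheel rolls)."""
    denom = max(d_encoder, d_visual, eps)
    return (d_encoder - d_visual) / denom

# Example: three wheel revolutions counted, but only 0.8 m of actual travel
d_enc = encoder_displacement(encoder_ticks=3 * 2048, ticks_per_rev=2048,
                             wheel_radius=0.10)       # ~1.88 m claimed
d_vis = 0.80                                          # from visual odometry
print(slip_ratio(d_enc, d_vis))                       # ~0.58 -> heavy slip
```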

    Study of Slippage Effects in Mobile Robot Navigation When Moving over Heterogeneous Surfaces

    This paper presents a study of the wheel slippage of a mobile robot in a navigation task while moving over heterogeneous surfaces. Experiments were carried out to collect data on the motion of the robot over surfaces with different properties. Based on the experimental data, models of the relationship between the wheel slippage coefficient and motor current were constructed, taking into account the normalized angular velocity of the wheels. A Kalman filter was tuned to filter the noise in the motor current measurements. Finally, the wheel slippage estimation system of the mobile robot was experimentally validated.
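    A minimal sketch of the processing chain described in the abstract, assuming a scalar random-walk Kalman filter for the motor-current signal and a placeholder current-to-slip model; the noise variances and model coefficients are illustrative assumptions, not the values identified in the paper.

```python
import numpy as np

class ScalarKalman:
    """1D Kalman filter with a random-walk state model x_k = x_{k-1} + w."""
    def __init__(self, q=1e-3, r=0.25, x0=0.0, p0=1.0):
        self.q, self.r = q, r      # process / measurement noise variances (assumed)
        self.x, self.p = x0, p0    # state estimate and its variance

    def update(self, z):
        self.p += self.q                    # predict (random walk: state unchanged)
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (z - self.x)          # correct with measurement z
        self.p *= (1.0 - k)
        return self.x

def slip_from_current(current, omega_norm, coeffs=(0.02, 0.15)):
    """Placeholder slip model: slip = a + b * current, scaled by the normalized
    wheel angular velocity (coefficients are assumptions, not fitted values)."""
    a, b = coeffs
    return (a + b * current) * omega_norm

kf = ScalarKalman()
noisy_currents = 1.2 + 0.3 * np.random.randn(200)     # simulated amperes
filtered = [kf.update(z) for z in noisy_currents]
print(slip_from_current(filtered[-1], omega_norm=0.9))
```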

    Odometry Correction Using Visual Slip Angle Estimation for Planetary Exploration Rovers

    This paper introduces a novel method for slip angle estimation based on visually observing the traces produced by the wheels of a robot on soft, deformable terrain. The proposed algorithm uses a robust Hough transform enhanced by fuzzy reasoning to estimate the angle of inclination of the wheel trace with respect to the vehicle reference frame. Any deviation of the wheel track from the planned path of the robot suggests the occurrence of sideslip, which can be detected and, more interestingly, measured. In turn, knowledge of the slip angle allows encoder readings affected by wheel slip to be adjusted and the accuracy of the position estimation system to be improved, based on an integrated longitudinal and lateral wheel–terrain slip model. The description of the visual algorithm and the odometry correction method is presented, and a comprehensive set of experimental results is included to validate this approach.
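    The sketch below illustrates the overall idea with a plain Hough transform only (the paper's fuzzy-reasoning enhancement and its integrated slip model are not reproduced): estimate the inclination of the wheel trace in the image, treat it as the slip angle, and redistribute the encoder-derived displacement accordingly. The frame conventions and OpenCV parameters are assumptions.

```python
import numpy as np
import cv2

def trace_slip_angle(trace_image):
    """Inclination (rad) of the dominant trace line w.r.t. the image's vertical
    axis, taken here as the vehicle's longitudinal direction (an assumption)."""
    edges = cv2.Canny(trace_image, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 80)   # (rho, theta) pairs
    if lines is None:
        return None                                     # no visible trace
    _, theta = lines[0][0]                              # first line, treated as dominant
    # theta is the angle of the line's normal; wrap to get inclination from vertical
    return theta if theta <= np.pi / 2 else theta - np.pi

def correct_odometry(d_long, slip_angle):
    """Redistribute an encoder-derived displacement (assumed purely longitudinal)
    along the actual travel direction rotated by the slip angle."""
    return np.array([d_long * np.cos(slip_angle),    # forward component
                     d_long * np.sin(slip_angle)])   # lateral drift component

# img = cv2.imread("wheel_trace.png", cv2.IMREAD_GRAYSCALE)
# beta = trace_slip_angle(img)
# if beta is not None:
#     print(correct_odometry(0.05, beta))
```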