Robust Legged Robot State Estimation Using Factor Graph Optimization
Legged robots, specifically quadrupeds, are becoming increasingly attractive
for industrial applications such as inspection. However, leaving the
laboratory and becoming useful to an end user requires reliability in harsh
conditions. From the perspective of state estimation, it is essential to be
able to accurately estimate the robot's state despite challenges such as uneven
or slippery terrain, textureless and reflective scenes, as well as dynamic
camera occlusions. We are motivated to reduce the dependency on foot contact
classification, which fails when a foot slips, and to reduce position drift during
dynamic motions such as trotting. To this end, we present a factor graph
optimization method for state estimation which tightly fuses and smooths
inertial navigation, leg odometry and visual odometry. The effectiveness of the
approach is demonstrated using the ANYmal quadruped robot navigating in a
realistic outdoor industrial environment. This experiment included trotting,
walking, crossing obstacles and ascending a staircase. The proposed approach
decreased the relative position error by up to 55% and the absolute position
error by 76% compared to kinematic-inertial odometry.
Comment: 8 pages, 12 figures. Accepted to RA-L + IROS 2019, July 2019.
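To make the fusion concrete, below is a minimal sketch of the factor graph structure using GTSAM's Python bindings (pip install gtsam). It is an illustration only, not the paper's implementation: the paper tightly fuses preintegrated inertial measurements with leg and visual odometry, whereas here both odometry sources are simplified to relative-pose (between) factors on a common pose chain, and all measurements and noise values are invented placeholders.

import numpy as np
import gtsam
from gtsam.symbol_shorthand import X  # pose keys x0, x1, ...

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# Anchor the first pose with a prior factor.
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.01] * 6))
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), prior_noise))
initial.insert(X(0), gtsam.Pose3())

# Placeholder relative-pose measurements between consecutive keyframes.
# Leg odometry is modelled as noisier (it degrades when feet slip);
# visual odometry as more accurate. Both constrain the same edges, so
# the smoother can absorb failures of either source.
leg_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05] * 6))
vo_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02] * 6))

for i in range(3):
    step = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(0.2, 0.0, 0.0))
    graph.add(gtsam.BetweenFactorPose3(X(i), X(i + 1), step, leg_noise))
    graph.add(gtsam.BetweenFactorPose3(X(i), X(i + 1), step, vo_noise))
    initial.insert(X(i + 1), gtsam.Pose3())  # deliberately rough initial guess

# Batch smoothing over the whole trajectory.
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(X(3)).translation())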
Preintegrated Velocity Bias Estimation to Overcome Contact Nonlinearities in Legged Robot Odometry
In this paper, we present a novel factor graph formulation to estimate the
pose and velocity of a quadruped robot on slippery and deformable terrain. The
factor graph introduces a preintegrated velocity factor that incorporates
velocity inputs from leg odometry and also estimates the related biases. In our
experiments we have found it difficult to model uncertainties at the
contact point, such as slip or deforming terrain, as well as leg flexibility. To
account for these effects and to minimize leg odometry drift, we extend the
robot's state vector with a bias term for this preintegrated velocity factor.
The bias term can be accurately estimated thanks to the tight fusion of the
preintegrated velocity factor with stereo vision and IMU factors, without which
it would be unobservable. The system has been validated in several scenarios
involving dynamic motions of the ANYmal robot on loose rocks, slopes and
muddy ground. We demonstrate a 26% improvement in relative pose error compared
to our previous work and a 52% improvement compared to a state-of-the-art
proprioceptive state estimator.
Comment: Accepted to ICRA 2020. Video: youtu.be/w1Sx6dIqgQ
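As a hedged sketch of the underlying idea (the notation here is assumed for illustration, not reproduced from the paper): leg odometry yields body-velocity measurements \tilde{\mathbf{v}}_k corrupted by a slowly varying bias \mathbf{b}^{v} that absorbs slip, terrain deformation and leg flexibility. Between keyframes i and j, these measurements are preintegrated into a relative displacement of roughly the form

\[
\Delta\tilde{\mathbf{p}}_{ij} \;=\; \sum_{k=i}^{j-1} \Delta\mathbf{R}_{ik}\,\bigl(\tilde{\mathbf{v}}_k - \mathbf{b}^{v}_{i}\bigr)\,\Delta t,
\]

where \Delta\mathbf{R}_{ik} is the preintegrated rotation between times i and k. The factor's residual compares this quantity with the displacement \mathbf{R}_i^{\top}(\mathbf{p}_j - \mathbf{p}_i) implied by the estimated poses, while \mathbf{b}^{v} enters the state vector alongside the usual IMU biases. As the abstract notes, \mathbf{b}^{v} is observable only because stereo-vision and IMU factors constrain the same poses; without them the optimizer could not separate the bias from true motion.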