Accurately estimating and tracking the motion of surrounding dynamic objects is an important task for the autonomy of a quadruped manipulator. However, with only an onboard RGB camera, it remains challenging for a quadruped manipulator to track the motion of a dynamic object moving with an unknown and changing velocity. To address this problem, this manuscript proposes a novel
image-based visual servoing (IBVS) approach consisting of three elements: a
spherical projection model, a robust super-twisting observer, and a model
predictive controller (MPC). The spherical projection model decouples the
visual error of the dynamic target into linear and angular components. Then,
based on this visual error, the robustness of the observer is exploited to
estimate the unknown and changing velocity of the dynamic target without
depth estimation. Finally, the estimated velocity is fed into the MPC to
generate joint torques for the quadruped manipulator to track the motion of
the dynamic target. The proposed approach
is validated through hardware experiments, and the experimental results
demonstrate its effectiveness in improving the autonomy of the quadruped
manipulator.
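As a hedged illustration of the observer element, the sketch below shows a minimal one-dimensional super-twisting observer estimating an unknown, time-varying velocity from a position-like error signal alone (no depth information). The gains `k1`, `k2` and the simulated target dynamics are illustrative assumptions, not values from the manuscript:

```python
import math

def sgn(z):
    """Sign function used by the sliding-mode correction terms."""
    return (z > 0) - (z < 0)

def estimate_velocity(T=5.0, dt=1e-4, k1=8.0, k2=30.0):
    """Run a 1-D super-twisting observer against a target whose
    velocity is unknown and time-varying; only the error between
    the observer's estimate and the measured signal drives it.
    Gains k1, k2 are illustrative, not taken from the paper."""
    x = 0.0                   # measured target signal (true state)
    x_hat, v_hat = 0.0, 0.0   # observer state: signal and velocity estimates
    steps = int(T / dt)
    for i in range(steps):
        t = i * dt
        v_true = 1.0 + 0.5 * math.sin(2.0 * t)  # unknown, changing velocity
        s = x_hat - x                           # sliding variable (measurable error)
        # Super-twisting law: continuous sqrt-scaled term plus integrated sign term
        x_hat += (v_hat - k1 * math.sqrt(abs(s)) * sgn(s)) * dt
        v_hat += -k2 * sgn(s) * dt
        x += v_true * dt                        # target evolves independently
    return v_hat, 1.0 + 0.5 * math.sin(2.0 * T)
```

Because the correction acts only on the measurable error `s`, the velocity estimate converges in finite time as long as the target's acceleration is bounded, which is what allows velocity estimation without reconstructing depth.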