AcinoSet: A 3D Pose Estimation Dataset and Baseline Models for Cheetahs in the Wild
Animals are capable of extreme agility, yet understanding their complex
dynamics, which have ecological, biomechanical and evolutionary implications,
remains challenging. Being able to study this incredible agility will be
critical for the development of next-generation autonomous legged robots. In
particular, the cheetah (Acinonyx jubatus) is supremely fast and maneuverable,
yet quantifying its whole-body 3D kinematic data during locomotion in the wild
remains a challenge, even with new deep learning-based methods. In this work we
present an extensive dataset of free-running cheetahs in the wild, called
AcinoSet, that contains 119,490 frames of multi-view synchronized high-speed
video footage, camera calibration files and 7,588 human-annotated frames. We
utilize markerless animal pose estimation to provide 2D keypoints. Then, we use
three methods that serve as strong baselines for 3D pose estimation tool
development: traditional sparse bundle adjustment, an Extended Kalman Filter,
and a trajectory optimization-based method we call Full Trajectory Estimation.
The resulting 3D trajectories, human-checked 3D ground truth, and an
interactive tool to inspect the data are also provided. We believe this dataset
will be useful for a diverse range of fields such as ecology, neuroscience,
robotics, biomechanics, and computer vision.

Comment: Code and data can be found at:
https://github.com/African-Robotics-Unit/AcinoSet
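As a rough illustration of the kind of multi-view lifting the abstract describes (2D keypoints from calibrated, synchronized cameras combined into 3D joint positions), the sketch below shows plain linear triangulation via the direct linear transform. It is not the authors' pipeline; the function name, inputs, and the use of NumPy are assumptions standing in for the AcinoSet calibration files and 2D pose-estimation output, and the dataset's baselines (sparse bundle adjustment, EKF, Full Trajectory Estimation) go well beyond this single-point step.

```python
# Minimal DLT triangulation sketch (hypothetical helper, not AcinoSet code).
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Triangulate one 3D point from two or more calibrated views.

    proj_mats : list of (3, 4) camera projection matrices
    points_2d : list of (u, v) pixel coordinates of the same joint, one per view
    """
    A = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the homogeneous point X:
        #   u * (P[2] @ X) - (P[0] @ X) = 0
        #   v * (P[2] @ X) - (P[1] @ X) = 0
        A.append(u * P[2] - P[0])
        A.append(v * P[2] - P[1])
    A = np.asarray(A)
    # Least-squares solution: right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize to (x, y, z)
```

Repeating this per joint and per frame yields raw 3D trajectories; the baselines named in the abstract then refine such estimates by enforcing temporal and kinematic consistency.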