This paper proposes a novel data-driven approach for inertial navigation,
which learns to estimate trajectories of natural human motions using only the
inertial measurement unit (IMU) found in every smartphone. The key observation is
that human motions are repetitive and consist of a few major modes (e.g.,
standing, walking, or turning). Our algorithm regresses a velocity vector from
the history of linear accelerations and angular velocities, then corrects
low-frequency bias in the linear accelerations, which are integrated twice to
estimate positions. We have acquired training data with ground-truth motions
across multiple human subjects and multiple phone placements (e.g., in a bag or
a hand). Our qualitative and quantitative evaluations demonstrate that,
surprisingly, the algorithm produces results comparable to those of full
Visual-Inertial navigation. To our knowledge, this paper is the first to integrate
sophisticated machine learning techniques with inertial navigation, potentially
opening up a new line of research on data-driven inertial
navigation. We will publicly share our code and data to facilitate further
research.