Localization of pedestrians becomes a difficult task in situations where no measurements with respect to an established reference system, such as that provided by GPS satellites, are available. One possible approach to this problem is to attach a suitable sensor to the pedestrian and run a simultaneous localization and mapping (SLAM) algorithm to localize the sensor. A combination of an inertial measurement unit (IMU) and a monocular camera is a promising choice of sensors for an indoor pedestrian localization system, since these sensors provide complementary measurements. This paper discusses two approaches to integrating visual and inertial information, which differ mainly in the choice of reference coordinate system. A detailed description of both approaches is given, and they are compared with respect to their performance on simulated and real measurement data.