Software and Hardware Infrastructure for Visual-Inertial SLAM

Abstract

One of the challenges faced by researchers in robot localization and mapping is finding reliable infrastructure to test their ideas. That infrastructure could be a simulation platform, suitable hardware, or a sensor interface. A useful simulation platform needs to capture the dynamics and sensor modalities that the researchers require. Suitable hardware needs to be able to navigate, sense the environment, and run the intended software on onboard computers. A sensor interface allows algorithms to be adapted to and tested on novel sensors.

In this research, we develop essential hardware and software infrastructure to aid the development and testing of visual-inertial Simultaneous Localization and Mapping (SLAM) systems. SLAM is a fundamental problem in robot navigation: it consists of constructing or updating a representation (map) of an environment using sensors on board a robot while concurrently using that representation to localize the robot itself. In visual-inertial SLAM, the onboard sensors are cameras (monocular or stereo) and an inertial measurement unit (IMU).

The contribution of this M.Eng thesis is threefold. First, we develop a hardware platform consisting of a real drone capable of running state-of-the-art metric-semantic SLAM; this infrastructure allows us to test advanced SLAM algorithms with real sensors and real robot dynamics. Second, we develop a multi-robot simulation platform that includes dynamically accurate, photo-realistic drones; this platform lets us extend our tests to multi-robot SLAM systems. Finally, we develop a new sensor interface; in particular, we integrate and test an omnidirectional stereo frontend in Kimera, an open-source visual-inertial SLAM pipeline. The thesis presents the design, implementation, and testing of each contribution.
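As background for the visual-inertial SLAM problem described above, the following is a standard sketch of the underlying estimation problem, not a formulation taken from this thesis. Pipelines such as Kimera typically pose visual-inertial SLAM as maximum-a-posteriori inference over a factor graph; the symbols below (state vector $\mathcal{X}$, residuals $r$, covariances $\Sigma$) are generic placeholders rather than the thesis's notation:

\[
\hat{\mathcal{X}} \;=\; \arg\min_{\mathcal{X}} \;\; \sum_{i} \left\| r_{\mathrm{IMU},\,i}(\mathcal{X}) \right\|^{2}_{\Sigma_{\mathrm{IMU},\,i}} \;+\; \sum_{j} \left\| r_{\mathrm{cam},\,j}(\mathcal{X}) \right\|^{2}_{\Sigma_{\mathrm{cam},\,j}}
\]

Here $\mathcal{X}$ collects the robot poses, velocities, IMU biases, and landmark positions; the first sum contains IMU preintegration residuals, and the second contains visual reprojection residuals supplied by the camera frontend, whether monocular, stereo, or omnidirectional stereo as in the interface integrated here.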
