8 research outputs found

    Sparse local submap joining filter for building large-scale maps

    This paper presents a novel local submap joining algorithm for building large-scale feature-based maps: the sparse local submap joining filter (SLSJF). The input to the filter is a sequence of local submaps. Each local submap is represented in a coordinate frame defined by the robot pose at which the map is initiated. The local submap state vector consists of the positions of all the local features and the final robot pose within the submap. The output of the filter is a global map containing the global positions of all the features as well as all the robot start/end poses of the local submaps. Use of an extended information filter (EIF) for fusing submaps makes the information matrix associated with SLSJF exactly sparse. The sparse structure, together with a novel state vector and covariance submatrix recovery technique, makes the SLSJF computationally very efficient. The SLSJF is a canonical and efficient submap joining solution for large-scale simultaneous localization and mapping (SLAM) problems that makes use of consistent local submaps generated by any reliable SLAM algorithm. The effectiveness and efficiency of the new algorithm are verified through computer simulations and experiments. © 2008 IEEE
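
    The information-form update at the heart of an EIF-based joining step can be sketched roughly as follows. This is only an illustrative outline under assumed names (fuse_submap, recover_state), not the SLSJF implementation: each local submap contributes one linearized constraint to the global information matrix and vector, and the global map is recovered by solving the resulting linear system.

        # Illustrative sketch (assumed names, not the SLSJF source) of the
        # information-form fusion behind EIF-based submap joining.
        import numpy as np

        def fuse_submap(info_mat, info_vec, H, z, R_inv):
            """Fold one linearized submap constraint z ~ H x into the global
            information matrix/vector (standard EIF measurement update).
            Each submap only involves its own features and start/end poses,
            so H has few non-zero columns and info_mat stays exactly sparse."""
            info_mat = info_mat + H.T @ R_inv @ H
            info_vec = info_vec + H.T @ R_inv @ z
            return info_mat, info_vec

        def recover_state(info_mat, info_vec):
            """Recover the global map estimate by solving info_mat * x = info_vec;
            SLSJF exploits the sparsity with a sparse factorization rather than
            this dense solve."""
            return np.linalg.solve(info_mat, info_vec)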

    Generic Node Removal for Factor-Graph SLAM

    Long-Term Simultaneous Localization and Mapping in Dynamic Environments.

    One of the core competencies required for autonomous mobile robotics is the ability to use sensors to perceive the environment. From this noisy sensor data, the robot must build a representation of the environment and localize itself within this representation. This process, known as simultaneous localization and mapping (SLAM), is a prerequisite for almost all higher-level autonomous behavior in mobile robotics. By associating the robot's sensory observations as it moves through the environment, and by observing the robot's ego-motion through proprioceptive sensors, constraints are placed on the trajectory of the robot and the configuration of the environment. This results in a probabilistic optimization problem to find the most likely robot trajectory and environment configuration given all of the robot's previous sensory experience. SLAM has been well studied under the assumptions that the robot operates for a relatively short time period and that the environment is essentially static during operation. However, performing SLAM over long time periods while modeling the dynamic changes in the environment remains a challenge. The goal of this thesis is to extend the capabilities of SLAM to enable long-term autonomous operation in dynamic environments. The contribution of this thesis has three main components: First, we propose a framework for controlling the computational complexity of the SLAM optimization problem so that it does not grow unbounded with exploration time. Second, we present a method to learn visual feature descriptors that are more robust to changes in lighting, allowing for improved data association in dynamic environments. Finally, we use the proposed tools in SLAM systems that explicitly model the dynamics of the environment in the map by representing each location as a set of example views that capture how the location changes with time. We experimentally demonstrate that the proposed methods enable long-term SLAM in dynamic environments using a large, real-world vision and LIDAR dataset collected over the course of more than a year. This dataset captures a wide variety of dynamics: from short-term scene changes including moving people, cars, changing lighting, and weather conditions; to long-term dynamics including seasonal conditions and structural changes caused by construction.
    PhD, Electrical Engineering: Systems, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/111538/1/carlevar_1.pd
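
    The probabilistic optimization mentioned above is commonly written as covariance-weighted nonlinear least squares over the trajectory and map. The sketch below is a generic statement of that objective with assumed names (negative_log_likelihood, residual_fn), not the thesis implementation.

        # Generic statement (not the thesis code) of SLAM as covariance-weighted
        # nonlinear least squares: the most likely trajectory and map minimize
        # the summed Mahalanobis error of all odometry and observation factors.
        import numpy as np

        def negative_log_likelihood(x, constraints):
            """x: stacked state (poses + map).  constraints: iterable of
            (residual_fn, info) pairs, one per factor, where residual_fn(x)
            returns h_i(x) - z_i and info is the factor's information matrix."""
            total = 0.0
            for residual_fn, info in constraints:
                r = residual_fn(x)
                total += r @ info @ r
            return 0.5 * total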

    Improving visual SLAM by filtering outliers with the aid of optical flow

    Ankara: The Department of Computer Engineering and the Graduate School of Engineering and Science of Bilkent University, 2011. Thesis (Master's) -- Bilkent University, 2011. Includes bibliographical references, leaves 77-81.
    Simultaneous Localization and Mapping (SLAM) for mobile robots has been one of the challenging problems for the robotics community. Extensive study of this problem in recent years has largely saturated the theoretical and practical background on the topic. Within the last few years, research on SLAM has moved towards Visual SLAM (VSLAM), in which a camera is used as the primary sensor. Unlike many SLAM applications run with planar robots, VSLAM allows us to estimate the 3D model of the environment and the 6-DOF pose of the robot. Having been applied to robotics only recently, VSLAM still has a lot of room for improvement. In particular, a common issue in both standard and Visual SLAM algorithms is the data association problem. Wrong data association either disturbs stability or results in divergence of the SLAM process. In this study, we propose two outlier elimination methods which use the predicted feature location error and the optical flow field. The former method requires the estimated landmark projection and its measured location to be close. The latter takes the optical flow field as a reference, compares it with the vector formed by consecutive matched feature locations, and eliminates matches that contradict the local optical flow vector field. We have shown these two methods to save VSLAM from divergence and to improve its overall performance. We have also described our new modular SLAM library, SLAM++.
    Özaslan, Tolga. M.S.
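
    The two outlier tests can be sketched roughly as follows; the thresholds, function names, and the representation of the local flow vector are assumptions for illustration, not the thesis code.

        # Hedged sketch of the two outlier tests; thresholds, names, and the
        # local-flow representation are assumptions, not the thesis code.
        import numpy as np

        def innovation_ok(predicted_px, measured_px, max_dist=10.0):
            """Test 1: the projected landmark and its measurement should be close."""
            diff = np.asarray(predicted_px, float) - np.asarray(measured_px, float)
            return np.linalg.norm(diff) < max_dist

        def flow_consistent(prev_px, curr_px, local_flow, max_angle_deg=30.0, max_ratio=2.0):
            """Test 2: the vector between consecutive matched feature locations
            should agree with the local optical-flow vector in angle and length."""
            match_vec = np.asarray(curr_px, float) - np.asarray(prev_px, float)
            flow_vec = np.asarray(local_flow, float)
            denom = np.linalg.norm(match_vec) * np.linalg.norm(flow_vec) + 1e-9
            cos_angle = match_vec @ flow_vec / denom
            ratio = np.linalg.norm(match_vec) / (np.linalg.norm(flow_vec) + 1e-9)
            return cos_angle > np.cos(np.deg2rad(max_angle_deg)) and 1.0 / max_ratio < ratio < max_ratio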

    Toward autonomous harbor surveillance

    Thesis (S.M.) -- Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2010. Includes bibliographical references (p. 105-113).
    In this thesis we address the problem of drift-free navigation for underwater vehicles performing harbor surveillance and ship hull inspection. Maintaining accurate localization for the duration of a mission is important for a variety of tasks, such as planning the vehicle trajectory and ensuring coverage of the area to be inspected. Our approach uses only onboard sensors in a simultaneous localization and mapping setting and removes the need for any external infrastructure such as acoustic beacons. We extract dense features from a forward-looking imaging sonar and apply pair-wise registration between sonar frames. The registrations are combined with onboard velocity, attitude and acceleration sensors to obtain an improved estimate of the vehicle trajectory. In addition, an architecture for persistent mapping is proposed, with the intention of handling long-term operations and repetitive surveillance tasks. The proposed architecture is flexible and supports different types of vehicles and mapping methods. The design of the system is demonstrated with an implementation of some of its key features. In addition, methods for re-localization are considered. Finally, results from several experiments that demonstrate drift-free navigation in various underwater environments are presented.
    by Hordur Johannsson. S.M.
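
    As a rough illustration of how a pair-wise sonar registration can be folded into the trajectory estimate, the sketch below compares the dead-reckoned relative pose between two frames with the registered one and returns the residual that a SLAM back end would weight as a single constraint. The names (se2, registration_residual) and the planar SE(2) simplification are assumptions, not the thesis code.

        # Illustrative sketch (assumed names, planar SE(2) simplification) of
        # turning a pair-wise sonar registration into one relative-pose residual
        # that a SLAM back end would weight against dead reckoning.
        import numpy as np

        def se2(x, y, theta):
            """Homogeneous transform for a planar pose."""
            c, s = np.cos(theta), np.sin(theta)
            return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

        def registration_residual(T_dead_reckoned, T_registered):
            """Difference between the dead-reckoned relative pose between two
            sonar frames and the relative pose recovered by registration."""
            T_err = np.linalg.inv(T_dead_reckoned) @ T_registered
            dx, dy = T_err[0, 2], T_err[1, 2]
            dtheta = np.arctan2(T_err[1, 0], T_err[0, 0])
            return np.array([dx, dy, dtheta])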

    Efficient 6-DOF SLAM with Treemap as a Generic Backend

    Treemap is a generic SLAM algorithm that has been successfully used to estimate extremely large 2D maps, closing a loop over a million landmarks in 442 ms. We are currently working on an open-source implementation that can handle most variants of SLAM. In this paper we discuss the generic part of the algorithm constituting the treemap backend and the variant-specific parts acting as a driver. We present their interplay from a software-engineering point of view and show results for the case of 6-DOF feature-based SLAM, closing a simulated loop over 106657 3D features in 209 ms.
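
    The backend/driver split described here can be illustrated roughly as below. Treemap itself is a C++ library with its own interface; this Python sketch, with assumed names (SlamDriver, GenericBackend), only conveys the idea that the generic estimator never depends on the SLAM variant directly.

        # Rough Python sketch of the back-end/driver split; treemap itself is a
        # C++ library with a different interface, so all names here are assumed.
        from abc import ABC, abstractmethod

        class SlamDriver(ABC):
            """Variant-specific part: knows the state parametrization and how to
            linearize raw measurements (2D features, 6-DOF landmarks, poses, ...)."""

            @abstractmethod
            def linearize(self, measurement, estimate):
                """Return (jacobian, residual, information) for one measurement."""

        class GenericBackend:
            """Generic part: maintains the estimate from linearized constraints
            only, without knowing which SLAM variant produced them."""

            def __init__(self, driver: SlamDriver):
                self.driver = driver
                self.constraints = []

            def add_measurement(self, measurement, estimate):
                self.constraints.append(self.driver.linearize(measurement, estimate))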