
    Implementation issues and experimental evaluation of D-SLAM

    Full text link
    © Springer-Verlag Berlin Heidelberg 2006. The D-SLAM algorithm, first described in [1], allows SLAM to be decoupled into solving a non-linear static estimation problem for mapping and a three-dimensional estimation problem for localization. This paper presents a new version of the D-SLAM algorithm that uses an absolute map instead of the relative map presented in [1]. One of the significant advantages of the D-SLAM algorithm is its O(N) computational cost, where N is the total number of features (landmarks). The theoretical foundations of D-SLAM, together with implementation issues including data association, state recovery, and computational complexity, are addressed in detail. The D-SLAM algorithm is evaluated using both real experimental data and simulations.
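
    Since data association is mentioned only at the level of the abstract here, the following is a minimal, generic sketch of nearest-neighbour data association with a chi-square gate, of the kind commonly used in EKF/EIF SLAM front-ends; the function name, the NumPy implementation, and the 95% gate value are illustrative assumptions, not the authors' method.

        import numpy as np

        def nearest_neighbour_gate(innovations, innovation_covs, gate=5.99):
            """Pick the map feature whose measurement innovation has the smallest
            Mahalanobis distance, accepting it only if it passes the chi-square
            gate (5.99 is roughly the 95% bound for a 2-D measurement).
            Returns the chosen feature index, or None if nothing passes the gate."""
            best_idx, best_d2 = None, gate
            for idx, (nu, S) in enumerate(zip(innovations, innovation_covs)):
                d2 = float(nu @ np.linalg.solve(S, nu))  # squared Mahalanobis distance
                if d2 <= best_d2:
                    best_idx, best_d2 = idx, d2
            return best_idx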

    SLAM with Corner Features Based on a Relative Map

    Get PDF
    This paper presents a solution to the Simultaneous Localization and Mapping (SLAM) problem in the stochastic map framework for a mobile robot navigating in an indoor environment. The approach is based on the concept of the relative map. The idea is to introduce a map state that contains only quantities invariant under translation and rotation. This decouples the landmark estimation from the robot motion, so that the landmark estimation does not rely on the unmodeled error sources of the robot motion. The case of the corner feature is considered here. The relative state estimated through the Kalman filter contains the distances and the relative orientations among the corners observed at the same time; this state is therefore invariant with respect to the robot configuration (translation and rotation). Finally, an environment containing structures consisting of several corners is also investigated. Real experiments carried out with a mobile robot equipped with a 360 deg laser range finder show the performance of the approach.
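
    As a rough illustration of the relative-map idea, the sketch below forms translation- and rotation-invariant quantities (pairwise distances and relative corner orientations) from corners detected in a single scan, and checks numerically that they are unchanged by a rigid transform; the data layout and function name are assumptions made for the example, not the authors' implementation.

        import numpy as np

        def relative_map_entries(corners_xy, corner_orientations):
            """Translation/rotation-invariant quantities from corners seen in one
            scan: pairwise distances and differences of corner orientations."""
            corners_xy = np.asarray(corners_xy, dtype=float)
            entries = []
            n = len(corners_xy)
            for i in range(n):
                for j in range(i + 1, n):
                    d = np.linalg.norm(corners_xy[i] - corners_xy[j])        # distance: invariant
                    dtheta = corner_orientations[i] - corner_orientations[j]  # relative orientation
                    dtheta = (dtheta + np.pi) % (2 * np.pi) - np.pi           # wrap to [-pi, pi)
                    entries.append((i, j, d, dtheta))
            return entries

        # Quick check that the quantities do not change under a rigid transform
        pts = np.array([[0.0, 0.0], [2.0, 0.5], [1.0, 3.0]])
        angs = [0.1, 1.2, -0.7]
        R = np.array([[np.cos(0.8), -np.sin(0.8)], [np.sin(0.8), np.cos(0.8)]])
        moved = pts @ R.T + np.array([5.0, -2.0])
        moved_angs = [a + 0.8 for a in angs]
        for e_orig, e_moved in zip(relative_map_entries(pts, angs),
                                   relative_map_entries(moved, moved_angs)):
            print(np.round(e_orig, 6), np.round(e_moved, 6))  # identical up to rounding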

    Open Challenges in SLAM: An Optimal Solution Based on Shift and Rotation Invariants

    Get PDF
    This paper starts with a discussion of the open challenges in the SLAM problem. In our opinion they can be grouped into two main and distinct areas: convergence of the built map and the computational requirements of real-world application. To address these problems, a solution in the stochastic map framework based on the concept of the relative map is proposed. The idea is to introduce a map state that contains only quantities invariant under shift and rotation, and to carry out the estimation of this relative map in an optimal way. This is one way to decouple the landmark estimation from the robot motion, so that the landmark estimation does not rely on the unmodeled error sources of the robot motion. Moreover, the proposed solution scales linearly with the number of landmarks, allowing real-time application. Experimental results, carried out on a real platform, show the better performance of this method with respect to the joint vehicle-landmark approach (absolute map filter) when the odometry is affected by undetected systematic errors or by large or unmodeled non-systematic errors.

    EVOLIN Benchmark: Evaluation of Line Detection and Association

    Full text link
    Lines are interesting geometrical features commonly seen in indoor and urban environments. A complete benchmark is missing in which lines from a sequential stream of images can be evaluated at all stages: line detection, line association, and pose error. To fill this gap, we present a complete and exhaustive benchmark for visual lines in a SLAM front-end, both for RGB and RGB-D, providing a plethora of complementary metrics. We have also labelled data from well-known SLAM datasets so that poses and accurately annotated lines are available in one place. In particular, we have evaluated 17 line detection algorithms, 5 line association methods, and the resulting pose error for aligning a pair of frames with several detector-association combinations. We have packaged all methods and evaluation metrics and made them publicly available at https://prime-slam.github.io/evolin/
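
    For context, the sketch below shows one simple way a line-detection stage could be scored against annotated segments (greedy endpoint matching followed by precision and recall); the distance definition, tolerance, and matching rule are generic assumptions, not the metrics actually defined by EVOLIN.

        import numpy as np

        def segment_distance(seg_a, seg_b):
            """Symmetric endpoint distance between two 2D segments given as
            ((x1, y1), (x2, y2)); uses the smaller of the two endpoint pairings."""
            a, b = np.asarray(seg_a, float), np.asarray(seg_b, float)
            d_same = np.linalg.norm(a[0] - b[0]) + np.linalg.norm(a[1] - b[1])
            d_swap = np.linalg.norm(a[0] - b[1]) + np.linalg.norm(a[1] - b[0])
            return 0.5 * min(d_same, d_swap)

        def precision_recall(detected, annotated, tol_px=5.0):
            """Greedy one-to-one matching of detections to annotations within tol_px."""
            used, tp = set(), 0
            for det in detected:
                best_j, best_d = None, tol_px
                for j, gt in enumerate(annotated):
                    if j in used:
                        continue
                    d = segment_distance(det, gt)
                    if d <= best_d:
                        best_j, best_d = j, d
                if best_j is not None:
                    used.add(best_j)
                    tp += 1
            precision = tp / len(detected) if detected else 0.0
            recall = tp / len(annotated) if annotated else 0.0
            return precision, recall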

    D-SLAM: Decoupled localization and mapping for autonomous robots

    Full text link
    The main contribution of this paper is the reformulation of the simultaneous localization and mapping (SLAM) problem for mobile robots such that mapping and localization can be treated as two concurrent yet separated processes: D-SLAM (decoupled SLAM). It is shown that SLAM can be decoupled into solving a non-linear static estimation problem for mapping and a low-dimensional dynamic estimation problem for localization. The mapping problem can be solved using an Extended Information Filter in which the information matrix is shown to be exactly sparse. A significant saving in computational effort can be achieved for large-scale problems by exploiting the special properties of sparse matrices. An important feature of D-SLAM is that the correlations among landmarks are still kept, and it is demonstrated that the uncertainty of the map landmarks monotonically decreases. The algorithm is illustrated through computer simulations and experiments.
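
    To make the "exactly sparse information matrix" property concrete, here is a minimal sketch of how a single feature-to-feature distance observation would be fused in information form, touching only the blocks of the two landmarks involved; the state layout, noise handling, and function name are assumptions for illustration, not the paper's implementation.

        import numpy as np

        def eif_add_distance_observation(Y, y, state, i, j, z, sigma):
            """Fuse one observed distance z between landmarks i and j into the
            information matrix Y and information vector y of an Extended
            Information Filter. The Jacobian is non-zero only in the entries of
            landmarks i and j, so the update touches only those blocks and the
            information matrix stays exactly sparse (linearization about the
            current state estimate; illustrative only)."""
            pi = state[2 * i:2 * i + 2]
            pj = state[2 * j:2 * j + 2]
            diff = pi - pj
            dist = np.linalg.norm(diff)
            H = np.zeros((1, state.size))
            H[0, 2 * i:2 * i + 2] = diff / dist
            H[0, 2 * j:2 * j + 2] = -diff / dist
            r_inv = 1.0 / sigma ** 2
            innovation = z - dist + float(H @ state)   # linearized EIF measurement
            Y += r_inv * (H.T @ H)                     # fills only blocks (i,i), (i,j), (j,i), (j,j)
            y += r_inv * H.ravel() * innovation
            return Y, y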

    From Feature Detection in Truncated Signed Distance Fields to Sparse Stable Scene Graphs

    Full text link

    Have I seen this place before? A fast and robust loop detection and correction method for 3D Lidar SLAM

    Get PDF
    In this paper, we present a complete loop detection and correction system developed for data originating from lidar scanners. Regarding detection, we propose a combination of a global point cloud matcher with a novel registration algorithm to determine loop candidates in a highly effective way. The registration method can deal with point clouds that deviate largely in orientation, while improving efficiency over existing techniques. In addition, we accelerated the computation of the global point cloud matcher by a factor of 2–4, exploiting the GPU to its maximum. Experiments demonstrate that our combined approach detects loops in lidar data more reliably than other point cloud matchers, as it leads to better precision–recall trade-offs: for nearly 100% recall, we gain up to 7% in precision. Finally, we present a novel loop correction algorithm that improves the average and median pose error by a factor of 2, while requiring only a handful of seconds to complete.
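
    The sketch below illustrates the general shape of such a pipeline: a compact global descriptor per scan, and a search over earlier scans for loop candidates, which would then be verified by a registration step (e.g. ICP) before a loop closure is accepted. The range-histogram descriptor, thresholds, and function names are placeholder assumptions, not the matcher or registration method of the paper.

        import numpy as np

        def scan_descriptor(points, n_bins=32, max_range=50.0):
            """Rotation-invariant global descriptor for one lidar scan:
            a normalized histogram of point ranges (illustrative placeholder)."""
            ranges = np.linalg.norm(np.asarray(points, float)[:, :2], axis=1)
            hist, _ = np.histogram(ranges, bins=n_bins, range=(0.0, max_range))
            return hist / max(hist.sum(), 1)

        def find_loop_candidates(descriptors, current_idx, min_gap=50, max_dist=0.05):
            """Return indices of earlier scans whose descriptors are close to the
            current one; these candidates would then be verified by registration
            before being accepted as loop closures."""
            current = descriptors[current_idx]
            candidates = []
            for k in range(current_idx - min_gap):       # skip the most recent scans
                d = np.linalg.norm(current - descriptors[k])
                if d < max_dist:
                    candidates.append((k, d))
            return sorted(candidates, key=lambda kd: kd[1])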

    SLAM based on quantities invariant of the robot's configuration

    Get PDF
    This paper presents a solution to the Simultaneous Localization and Mapping (SLAM) problem in the stochastic map framework for a mobile robot navigating in an indoor environment. The approach is based on the concept of the relative map. The idea is to introduce a map state that contains only quantities invariant under translation and rotation. In this way the landmark estimation is decoupled from the robot motion, and therefore the estimation does not rely on the unmodeled error sources of the robot motion. A new landmark is introduced by considering the intersection point between two lines; only landmarks whose position error is small are considered, so that the intersection point is the natural extension of the corner feature. The relative state estimated through a Kalman filter contains the distances among the intersection points observed at the same time. Real experiments carried out with a mobile robot equipped with a 360° laser range finder show the performance of the approach.
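
    As an illustration of the intersection-point landmark, the sketch below computes the intersection of two fitted 2D lines and rejects nearly parallel pairs, for which the intersection position error would be large; the line parametrization and the angle threshold are assumptions made for the example, not the paper's criterion.

        import numpy as np

        def line_intersection(l1, l2, min_angle_deg=20.0):
            """Intersection of two 2D lines given as (a, b, c) with a*x + b*y = c.
            Returns None when the lines are close to parallel, i.e. when the
            intersection point would have a large position error."""
            (a1, b1, c1), (a2, b2, c2) = l1, l2
            A = np.array([[a1, b1], [a2, b2]], float)
            # Acute angle between the line normals: reject ill-conditioned intersections
            cosang = abs(A[0] @ A[1]) / (np.linalg.norm(A[0]) * np.linalg.norm(A[1]))
            if np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))) < min_angle_deg:
                return None
            return np.linalg.solve(A, np.array([c1, c2], float))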

    D-SLAM: A decoupled solution to simultaneous localization and mapping

    Full text link
    The main contribution of this paper is the reformulation of the simultaneous localization and mapping (SLAM) problem for mobile robots such that mapping and localization can be treated as two concurrent yet separated processes: D-SLAM (decoupled SLAM). It is shown that SLAM with a range and bearing sensor in an environment populated with point features can be decoupled into solving a nonlinear static estimation problem for mapping and a low-dimensional dynamic estimation problem for localization. This is achieved by transforming the measurement vector into two parts: one containing information relating features in the map, and another containing information relating the map and the robot. It is shown that the new formulation results in an exactly sparse information matrix for mapping when it is solved using an Extended Information Filter (EIF). Thus a significant saving in computational effort can be achieved for large-scale problems by exploiting the special properties of sparse matrices. An important feature of D-SLAM is that the correlations among features in the map are still kept, and it is demonstrated that the uncertainty of the feature estimates monotonically decreases. The algorithm is illustrated and evaluated through computer simulations and experiments. © 2007 SAGE Publications.
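
    A toy illustration of the measurement recasting described above: from range-bearing observations of two features taken at the same robot pose, a feature-to-feature distance can be formed in which the robot's position and heading cancel, leaving a quantity that constrains only the map. The exact set of transformed measurements used by D-SLAM is richer than this single distance; the code is a hedged sketch, not the paper's formulation.

        import numpy as np

        def feature_to_feature_distance(r1, th1, r2, th2):
            """Distance between two features computed purely from their range-bearing
            observations (r, theta) taken from the same (unknown) robot pose.
            The robot's position and heading cancel out, so this transformed
            measurement carries information about the map only."""
            p1 = np.array([r1 * np.cos(th1), r1 * np.sin(th1)])
            p2 = np.array([r2 * np.cos(th2), r2 * np.sin(th2)])
            return float(np.linalg.norm(p1 - p2))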

    Map-Based Localization for Unmanned Aerial Vehicle Navigation

    Get PDF
    Unmanned Aerial Vehicles (UAVs) require precise pose estimation when navigating in indoor and GNSS-denied / GNSS-degraded outdoor environments. The possibility of crashing in these environments is high, as spaces are confined, with many moving obstacles. There are many solutions for localization in GNSS-denied environments, and many different technologies are used. Common solutions involve setting up or using existing infrastructure, such as beacons, Wi-Fi, or surveyed targets. These solutions were avoided because the cost should be proportional to the number of users, not the coverage area. Heavy and expensive sensors, for example a high-end IMU, were also avoided. Given these requirements, a camera-based localization solution was selected for the sensor pose estimation. Several camera-based localization approaches were investigated. Map-based localization methods were shown to be the most efficient because they close loops using a pre-existing map; thus the amount of data and the amount of time spent collecting data are reduced, as there is no need to re-observe the same areas multiple times. This dissertation proposes a solution to the task of fully localizing a monocular camera onboard a UAV with respect to a known environment (i.e., it is assumed that a 3D model of the environment is available) for the purpose of UAV navigation in structured environments. Incremental map-based localization involves tracking a map through an image sequence. When the map is a 3D model, this task is referred to as model-based tracking. A by-product of the tracker is the relative 3D pose (position and orientation) between the camera and the object being tracked. State-of-the-art solutions advocate that tracking geometry is more robust than tracking image texture because edges are more invariant to changes in object appearance and lighting. However, model-based trackers have been limited to tracking small, simple objects in small environments. An assessment was performed of tracking larger, more complex building models in larger environments. A state-of-the-art model-based tracker called ViSP (Visual Servoing Platform) was applied to tracking outdoor and indoor buildings using a UAV's low-cost camera. The assessment revealed weaknesses at large scales. Specifically, ViSP failed when tracking was lost and needed to be manually re-initialized. Failure occurred when there was a lack of model features in the camera's field of view, and because of rapid camera motion. Experiments revealed that ViSP achieved positional accuracies similar to single-point positioning solutions obtained from single-frequency (L1) GPS observations, with standard deviations around 10 metres. These errors were considered large, given that the geometric accuracy of the 3D model used in the experiments was 10 to 40 cm. The first contribution of this dissertation proposes to increase the performance of the localization system by combining ViSP with map-building incremental localization, also referred to as simultaneous localization and mapping (SLAM). Experimental results in both indoor and outdoor environments show that sub-metre positional accuracies were achieved, while reducing the number of tracking losses throughout the image sequence. It is shown that by integrating model-based tracking with SLAM, not only does SLAM improve model tracking performance, but the model-based tracker also alleviates the computational expense of SLAM's loop-closing procedure, improving runtime performance.
Experiments also revealed that ViSP was unable to handle occlusions when a complete 3D building model was used, resulting in large errors in its pose estimates. The second contribution of this dissertation is a novel map-based incremental localization algorithm that improves tracking performance and increases pose estimation accuracy relative to ViSP. The novelty of this algorithm is an efficient matching process that identifies corresponding linear features from the UAV's RGB image data and a large, complex, and untextured 3D model. The proposed model-based tracker improved positional accuracies from 10 m (obtained with ViSP) to 46 cm in outdoor environments, and from an unattainable result using ViSP to 2 cm positional accuracies in large indoor environments. The main disadvantage of any incremental algorithm is that it requires the camera pose of the first frame, and initialization is often a manual process. The third contribution of this dissertation is a map-based absolute localization algorithm that automatically estimates the camera pose when no prior pose information is available. The method uses vertical line matching to register the reference model views with a set of initial input images via geometric hashing. Results demonstrate that sub-metre positional accuracies were achieved, and a proposed enhancement of conventional geometric hashing produced more correct matches: 75% of the correct matches were identified, compared to 11%. Further, the number of incorrect matches was reduced by 80%.
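
    As background for the geometric-hashing step of the third contribution, here is a compact, textbook-style sketch of the voting scheme applied to 2D points (for example the footprints of matched vertical lines); the basis construction, quantization step, and the choice of a single scene basis are simplifying assumptions, not the dissertation's enhanced variant.

        import numpy as np
        from collections import defaultdict
        from itertools import permutations

        def _basis_coords(p, q, x):
            """Coordinates of point x in the basis defined by the ordered pair (p, q)."""
            o = (p + q) / 2.0
            u = (q - p) / 2.0
            v = np.array([-u[1], u[0]])                    # u rotated by 90 degrees
            return np.linalg.solve(np.column_stack([u, v]), x - o)

        def build_hash_table(model_pts, step=0.25):
            """Offline stage: for every ordered basis pair of model points, store the
            quantized basis coordinates of all other model points."""
            table = defaultdict(list)
            pts = [np.asarray(p, float) for p in model_pts]
            for i, j in permutations(range(len(pts)), 2):
                for k, x in enumerate(pts):
                    if k in (i, j):
                        continue
                    key = tuple(np.round(_basis_coords(pts[i], pts[j], x) / step).astype(int))
                    table[key].append((i, j))
            return table

        def vote(table, scene_pts, step=0.25):
            """Online stage: pick one scene basis pair and vote for model bases that
            explain the remaining scene points; the top-voted basis gives the match.
            A robust implementation would try several scene bases."""
            pts = [np.asarray(p, float) for p in scene_pts]
            votes = defaultdict(int)
            p, q = pts[0], pts[1]
            for x in pts[2:]:
                key = tuple(np.round(_basis_coords(p, q, x) / step).astype(int))
                for basis in table.get(key, []):
                    votes[basis] += 1
            return max(votes.items(), key=lambda kv: kv[1]) if votes else None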