
    Stereo Visual SLAM Based on Unscented Dual Quaternion Filtering


    Comparing LiDAR and IMU-based SLAM approaches for 3D robotic mapping

    In this paper, we present a comparison of open-source LiDAR- and Inertial Measurement Unit (IMU)-based Simultaneous Localization and Mapping (SLAM) approaches for 3D robotic mapping. The analyzed algorithms are often exploited in mobile robotics for autonomous navigation but have not yet been evaluated in terms of 3D reconstruction. Experimental tests are carried out using two different autonomous mobile platforms in three test cases, comprising both indoor and outdoor scenarios. The 3D models obtained with the different SLAM algorithms are then compared in terms of density, accuracy, and noise of the point clouds to analyze the performance of the evaluated approaches. The experimental results indicate which SLAM methods are more suitable for 3D mapping in terms of reconstruction quality and highlight the feasibility of mobile robotics in the field of autonomous mapping.
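
    The abstract does not state which tools compute the density, accuracy, and noise metrics. The sketch below is a minimal illustration of that kind of point-cloud comparison, assuming Open3D and hypothetical file names (slam_output.pcd, reference.pcd); it is not the paper's evaluation pipeline.

```python
# Hedged sketch: density and cloud-to-cloud error metrics for comparing a SLAM
# reconstruction against a reference model. File names are placeholders; the
# paper's actual evaluation pipeline is not specified in the abstract.
import numpy as np
import open3d as o3d

slam_cloud = o3d.io.read_point_cloud("slam_output.pcd")   # hypothetical path
reference = o3d.io.read_point_cloud("reference.pcd")      # hypothetical path

# Density: points per cubic metre of the cloud's axis-aligned bounding volume.
extent = slam_cloud.get_axis_aligned_bounding_box().get_extent()
density = len(slam_cloud.points) / float(np.prod(extent))

# Accuracy/noise: nearest-neighbour distance from each SLAM point to the reference.
dists = np.asarray(slam_cloud.compute_point_cloud_distance(reference))
print(f"density: {density:.1f} pts/m^3")
print(f"mean error: {dists.mean():.3f} m, std (noise proxy): {dists.std():.3f} m")
```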

    Dual Quaternion Sample Reduction for SE(2) Estimation

    We present a novel sample reduction scheme for random variables belonging to the SE(2) group by means of Dirac mixture approximation. For this, dual quaternions are employed to represent uncertain planar transformations. The Cramér–von Mises distance is modified as a smooth metric to measure the statistical distance between Dirac mixtures on the manifold of planar dual quaternions. Samples of reduced size are then obtained by minimizing the probability divergence via Riemannian optimization while accounting for the correlation between rotation and translation. We further deploy the proposed scheme for nonparametric modeling of estimates in nonlinear SE(2) estimation. Simulations show superior tracking performance of the sample reduction-based filter compared with Monte Carlo-based as well as parametric model-based planar dual quaternion filters.
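
    For background (not taken from the paper itself): a rigid-body transform with rotation quaternion q_r and translation t is commonly encoded as a unit dual quaternion, and the planar SE(2) case used in the paper restricts this representation to in-plane rotations and translations.

```latex
% Standard unit dual quaternion encoding of a rigid transform (general background,
% not the paper's specific SE(2) parameterization).
\[
  \underline{\mathbf{q}} \;=\; \mathbf{q}_r + \varepsilon\,\mathbf{q}_d,
  \qquad
  \mathbf{q}_d \;=\; \tfrac{1}{2}\,\mathbf{t}\otimes\mathbf{q}_r,
  \qquad
  \varepsilon^2 = 0,
\]
\[
  \text{subject to } \|\mathbf{q}_r\| = 1
  \quad\text{and}\quad
  \mathbf{q}_r^{\top}\mathbf{q}_d = 0 .
\]
```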

    Multiple Integrated Navigation Sensors for Improving Occupancy Grid FastSLAM

    An autonomous vehicle must accurately observe its location within the environment to interact with objects and accomplish its mission. When its environment is unknown, the vehicle must construct a map detailing its surroundings while using it to maintain an accurate location. Such a vehicle is faced with the circularly defined Simultaneous Localization and Mapping (SLAM) problem. However difficult, SLAM is a critical component of autonomous vehicle exploration with applications to search and rescue. To current knowledge, this research presents the first SLAM solution to integrate stereo cameras, inertial measurements, and vehicle odometry into a Multiple Integrated Navigation Sensor (MINS) path. The implementation combines the MINS path with LIDAR to observe and map the environment using the FastSLAM algorithm. In real-world tests, a mobile ground vehicle equipped with these sensors completed a 140-meter loop around indoor hallways. This SLAM solution produces a path that closes the loop and remains within 1 meter of truth, reducing the error by 92% compared with an image-inertial navigation system and by 79% compared with odometry FastSLAM.
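
    The abstract gives no implementation details, so the following is only a toy sketch of the generic occupancy-grid FastSLAM step it refers to: sample each particle's pose from a noisy odometry model, weight the particle by how well the scan matches its own grid, update the grid, and resample. The grid size, noise levels, and likelihood are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of a generic occupancy-grid FastSLAM step (not the paper's code):
# each particle carries its own pose and log-odds grid.
import numpy as np

GRID = 50          # 50 x 50 cell grid, 1 m cells (illustrative values)
rng = np.random.default_rng(0)

def motion_sample(pose, odom):
    """Add odometry (dx, dy) with Gaussian noise -- a stand-in motion model."""
    return pose + odom + rng.normal(0.0, 0.1, size=2)

def scan_cells(pose, scan):
    """Map scan endpoints (hits relative to the robot) into grid indices."""
    cells = np.clip(np.round(pose + scan).astype(int), 0, GRID - 1)
    return cells[:, 0], cells[:, 1]

def fastslam_step(particles, odom, scan):
    weights = np.empty(len(particles))
    for i, p in enumerate(particles):
        p["pose"] = motion_sample(p["pose"], odom)
        rows, cols = scan_cells(p["pose"], scan)
        # Toy likelihood: cells previously marked occupied should be hit again.
        weights[i] = np.exp(p["grid"][rows, cols].sum())
        p["grid"][rows, cols] += 0.9          # log-odds update for observed cells
    weights /= weights.sum()
    # Resampling keeps particles whose maps explain the current scan well.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return [dict(pose=particles[j]["pose"].copy(),
                 grid=particles[j]["grid"].copy()) for j in idx]

# Toy usage: 30 particles, a short "wall" of hits 3 m ahead of the robot.
particles = [dict(pose=np.array([25.0, 25.0]), grid=np.zeros((GRID, GRID)))
             for _ in range(30)]
scan = np.array([[3.0, dy] for dy in range(-2, 3)], dtype=float)
particles = fastslam_step(particles, odom=np.array([0.5, 0.0]), scan=scan)
```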

    On the Adaptivity of Unscented Particle Filter for GNSS/INS Tightly-Integrated Navigation Unit in Urban Environment

    Tight integration algorithms fusing Global Navigation Satellite System (GNSS) and Inertial Navigation System (INS) measurements have become popular in many high-accuracy positioning and navigation applications. Despite their reliability, common integration architectures can still run into accuracy drops under challenging navigation settings. The growing computational power of low-cost embedded systems has allowed for the exploitation of several advanced Bayesian state estimation algorithms, such as the Particle Filter (PF) and its hybrid variants, e.g. the Unscented Particle Filter (UPF). Although sophisticated, these architectures are not immune to multipath scattering and Non-Line-of-Sight (NLOS) signal receptions, which frequently corrupt satellite measurements and jeopardise GNSS/INS solutions. Hence, a certain level of modelling adaptivity should be granted to avoid severe drifts in the estimated states. Given these premises, the paper presents a novel Adaptive Unscented Particle Filter (AUPF) architecture leveraging two cascading stages to cope with disruptive, biased GNSS input observables in harsh conditions. An INS-based signal processing block is implemented upstream of a Redundant Measurement Noise Covariance Estimation (RMNCE) stage to strengthen the adaptation of the observables' statistics and improve the state estimation. An experimental assessment of the proposed robust AUPF demonstrates a 10% average reduction of the horizontal position error above the 75th percentile. In addition, a comparative analysis with both previous adaptive architectures and a plain UPF is carried out to highlight the improved performance of the proposed methodology.
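
    The RMNCE algorithm itself is not reproduced here. As general background, adaptive GNSS/INS filters of this kind commonly estimate the measurement noise covariance online from a window of recent innovations; the sketch below shows that generic idea with an illustrative window length, not the paper's method.

```python
# Hedged sketch: innovation-based estimation of the measurement noise covariance R
# over a sliding window -- the generic idea behind adaptive GNSS/INS filters.
# Illustrative background only, not the paper's RMNCE algorithm.
import numpy as np
from collections import deque

class AdaptiveR:
    def __init__(self, window=30):
        self.innovations = deque(maxlen=window)   # recent (z - H x_pred) vectors

    def update(self, innovation, HPHt):
        """innovation: z - H x_pred;  HPHt: projected state covariance H P- H^T."""
        self.innovations.append(np.asarray(innovation, dtype=float))
        C = np.mean([np.outer(v, v) for v in self.innovations], axis=0)
        # Innovation covariance approx.:  C = H P- H^T + R  =>  R ~= C - H P- H^T.
        R_hat = C - HPHt
        # Keep the estimate positive semi-definite by flooring its eigenvalues.
        w, V = np.linalg.eigh(R_hat)
        return (V * np.maximum(w, 1e-9)) @ V.T

# Toy usage with 2-D measurements and a known projected covariance.
est = AdaptiveR()
HPHt = 0.01 * np.eye(2)
rng = np.random.default_rng(1)
for _ in range(50):
    R_hat = est.update(rng.normal(0.0, 0.5, size=2), HPHt)
print(R_hat)  # approaches roughly diag(0.25, 0.25) minus HPHt
```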

    A Comprehensive Introduction of Visual-Inertial Navigation

    In this article, a tutorial introduction to visual-inertial navigation (VIN) is presented. Visual and inertial perception are two complementary sensing modalities; cameras and inertial measurement units (IMUs) are the corresponding sensors. The low cost and light weight of camera-IMU sensor combinations make them ubiquitous in robotic navigation. Visual-inertial navigation is a state estimation problem that estimates the ego-motion of the sensor platform and its local environment. This paper presents visual-inertial navigation in the classical state estimation framework, first illustrating the estimation problem in terms of state variables and system models, including the representations (parameterizations) of the related quantities, the IMU dynamic and camera measurement models, and the corresponding probabilistic graphical models (factor graphs). Secondly, we investigate existing model-based estimation methodologies, which involve filter-based and optimization-based frameworks and related on-manifold operations. We also discuss the calibration of relevant parameters and the initialization of the states of interest in optimization-based frameworks. Then the evaluation and improvement of VIN in terms of accuracy, efficiency, and robustness are discussed. Finally, we briefly mention the recent development of learning-based methods that may become alternatives to traditional model-based methods. (35 pages, 10 figures)
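
    As background for the "IMU dynamic and camera measurement models" the tutorial covers, a commonly used continuous-time IMU kinematic model and perspective camera measurement are shown below (standard textbook formulations, not quoted from the article).

```latex
% Standard continuous-time IMU kinematics (world frame W, body frame B) and a
% perspective camera measurement of landmark f -- common textbook formulations.
\begin{align}
  \dot{\mathbf{p}}_{WB} &= \mathbf{v}_{WB}, &
  \dot{\mathbf{v}}_{WB} &= \mathbf{R}_{WB}\,(\mathbf{a}_m - \mathbf{b}_a - \mathbf{n}_a) + \mathbf{g}_W, \\
  \dot{\mathbf{R}}_{WB} &= \mathbf{R}_{WB}\,[\boldsymbol{\omega}_m - \mathbf{b}_g - \mathbf{n}_g]_{\times}, &
  \dot{\mathbf{b}}_a &= \mathbf{n}_{b_a}, \quad \dot{\mathbf{b}}_g = \mathbf{n}_{b_g},
\end{align}
\begin{equation}
  \mathbf{z}_f = \pi\!\left(\mathbf{R}_{CB}\,\mathbf{R}_{WB}^{\top}(\mathbf{p}_f - \mathbf{p}_{WB}) + \mathbf{p}_{CB}\right) + \mathbf{n}_c,
  \qquad
  \pi([x,\,y,\,z]^{\top}) = \frac{1}{z}\begin{bmatrix} x \\ y \end{bmatrix}.
\end{equation}
```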

    Particle filter theory and practice with positioning applications


    Inertial navigation aided by simultaneous localization and mapping

    Unmanned aerial vehicle technologies are getting smaller and cheaper to use, and the payload limitations of unmanned aerial vehicles are being overcome. Integrated navigation system design requires selecting a set of sensors and computational resources that provides reliable and accurate navigation parameters (position, velocity and attitude) with high update rates and bandwidth in a small and cost-effective manner. Many of today's operational unmanned aerial vehicle navigation systems rely on inertial sensors as the primary measurement source. Inertial Navigation alone, however, suffers from slow divergence with time. This divergence is often compensated for by employing some additional source of navigation information external to Inertial Navigation. From the 1990s to the present day, the Global Positioning System has been the dominant navigation aid for Inertial Navigation. In a number of scenarios, Global Positioning System measurements may be completely unavailable, or they simply may not be precise (or reliable) enough to adequately update the Inertial Navigation; hence alternative methods have received great attention. Aiding Inertial Navigation with vision sensors has been the favoured solution over the past several years. Inertial and vision sensors, with their complementary characteristics, have the potential to meet the requirements for reliable and accurate navigation parameters. In this thesis we address Inertial Navigation position divergence. The information for updating the position comes from a combination of vision and motion. When using such a combination, many of the difficulties of vision sensors (relative depth, geometry and size of objects, image blur, etc.) can be circumvented. Motion provides the vision sensors with many cues that help acquire better information about the environment, for instance creating a precise map of the environment and localizing within it. We propose changes to the Simultaneous Localization and Mapping augmented state vector in order to take repeated measurements of a map point. We show that these repeated measurements, combined with certain manoeuvres (motion) around or by the map point, are crucial for constraining the Inertial Navigation position divergence (bounded estimation error) while manoeuvring in the vicinity of the map point. This eliminates some of the uncertainty of the map point estimates, i.e. it reduces the covariance of the map point estimates. This concept brings a different parameterization (feature initialisation) of the map points in Simultaneous Localization and Mapping, and we refer to it as the concept of aiding Inertial Navigation by Simultaneous Localization and Mapping. We show that building such an integrated navigation system requires coordination with the guidance and control measurements and with the vehicle task itself, in order to perform the required vehicle manoeuvres (motion) and achieve better navigation accuracy. This fact brings new challenges to the practical design of these modern jam-proof, Global Positioning System-free autonomous navigation systems. Further to the concept of aiding Inertial Navigation by Simultaneous Localization and Mapping, we have investigated how a bearing-only sensor such as a single camera can be used for aiding Inertial Navigation, building on the results of the concept of Inertial Navigation aided by Simultaneous Localization and Mapping. A new parameterization of the map point in Bearing Only Simultaneous Localization and Mapping is proposed. Because of the number of significant problems that appear when implementing the Extended Kalman Filter in Inertial Navigation aided by Bearing Only Simultaneous Localization and Mapping, other algorithms such as the Iterated Extended Kalman Filter, Unscented Kalman Filter and Particle Filters were implemented. From the results obtained, the conclusion can be drawn that nonlinear filters should be the estimators of choice for this application.
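
    As general background on why bearing-only aiding is a hard nonlinear estimation problem (a standard formulation, not the thesis's proposed parameterization): a single camera observing a map point m from position p with attitude R provides only direction information, so range is unobservable from one view and must be recovered through motion.

```latex
% Standard bearing-only (direction-only) measurement of a map point m from a
% platform at position p with attitude R; range is unobservable from a single view.
\[
  \mathbf{z} \;=\; \pi\!\left(\mathbf{R}^{\top}(\mathbf{m} - \mathbf{p})\right) + \mathbf{v},
  \qquad
  \pi([x,\,y,\,z]^{\top}) = \frac{1}{z}\begin{bmatrix} x \\ y \end{bmatrix},
  \qquad
  \mathbf{v}\sim\mathcal{N}(\mathbf{0},\mathbf{R}_v).
\]
```

    The strong nonlinearity of this projection at low parallax is what makes plain EKF linearization fragile, which is consistent with the thesis's preference for the Iterated Extended Kalman Filter, Unscented Kalman Filter and Particle Filters.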

    A SLAM Algorithm Based on Adaptive Cubature Kalman Filter

    Traditional simultaneous localization and mapping (SLAM) solutions require a predefined mathematical model of the system and a priori knowledge of the noise statistics. However, in many practical applications, the prior statistics of the noise are unknown or time-varying, which leads to large estimation errors or even divergence. In order to solve this problem, an innovative cubature Kalman filter-based SLAM (CKF-SLAM) algorithm built on an adaptive cubature Kalman filter (ACKF) is established in this paper. The novel algorithm estimates the statistical parameters of the unknown system noise by introducing the Sage-Husa noise statistic estimator. Combining the advantages of CKF-SLAM and the adaptive estimator, the new ACKF-SLAM algorithm can significantly reduce the state estimation error and effectively improve the navigation accuracy of the SLAM system. The performance of the new algorithm has been examined through numerical simulations in different scenarios. The results show that the position error can be effectively reduced with the new adaptive CKF-SLAM algorithm. Compared with other traditional SLAM methods, the accuracy of the nonlinear SLAM system is significantly improved, verifying that the proposed ACKF-SLAM algorithm is valid and feasible.
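
    For background on the Sage-Husa noise statistic estimator mentioned in the abstract, a commonly cited recursive form for the measurement noise covariance uses a forgetting factor; the sketch below shows that recursion in isolation and is illustrative only, not the paper's full ACKF-SLAM implementation.

```python
# Hedged sketch of the Sage-Husa recursive estimator for the measurement noise
# covariance R, as commonly cited in adaptive Kalman filtering literature.
import numpy as np

class SageHusaR:
    def __init__(self, R0, b=0.98):
        self.R = np.array(R0, dtype=float)  # current estimate of R
        self.b = b                          # forgetting factor, 0 < b < 1
        self.k = 0                          # time step counter

    def update(self, innovation, HPHt):
        """innovation: z_k - H x_k^-;  HPHt: H P_k^- H^T from the filter."""
        d = (1.0 - self.b) / (1.0 - self.b ** (self.k + 1))
        e = np.atleast_1d(np.asarray(innovation, dtype=float))
        # R_k = (1 - d_k) R_{k-1} + d_k (e_k e_k^T - H P_k^- H^T)
        self.R = (1.0 - d) * self.R + d * (np.outer(e, e) - HPHt)
        self.k += 1
        return self.R

# Toy usage: the estimate drifts toward the true R of a 1-D measurement stream.
rng = np.random.default_rng(2)
est = SageHusaR(R0=[[1.0]])
for _ in range(200):
    z_noise = rng.normal(0.0, 0.3)              # true R = 0.09
    est.update(innovation=[z_noise], HPHt=np.array([[0.0]]))
print(est.R)   # approaches ~0.09 when the predicted covariance term is zero
```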