
    CES-515 Towards Localization and Mapping of Autonomous Underwater Vehicles: A Survey

    Autonomous Underwater Vehicles (AUVs) have been used for a wide range of commercial, military and research tasks, and the fundamental capability of a successful AUV is its ability to localize itself and map its surroundings. This report reviews the relevant elements of localization and mapping for AUVs. First, a brief introduction to the concept and historical development of AUVs is given; then a relatively detailed description of the sensor systems used for AUV navigation is provided. As the main part of the report, a comprehensive investigation of simultaneous localization and mapping (SLAM) for AUVs is conducted, including application examples. Finally, a brief conclusion is given.

    A review of sensor technology and sensor fusion methods for map-based localization of service robot

    Service robots are currently gaining traction, particularly in the hospitality, geriatric care and healthcare industries. The navigation of service robots requires high adaptability, flexibility and reliability. Hence, map-based navigation is suitable for service robots because of the ease of updating changes in the environment and the flexibility of determining a new optimal path. For map-based navigation to be robust, an accurate and precise localization method is necessary. The localization problem can be defined as recognizing the robot’s own position in a given environment and is a crucial step in any navigation process. Major difficulties of localization include dynamic changes in the real world, uncertainties and limited sensor information. This paper presents a comparative review of sensor technologies and sensor fusion methods suitable for map-based localization, focusing on service robot applications.
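    A core idea behind the sensor fusion methods this review compares is combining redundant measurements weighted by their uncertainty. The sketch below is an illustrative minimum-variance fusion of two range readings; the sensor names and all numeric values are made-up examples, not taken from the paper.

```python
# Variance-weighted fusion of two independent estimates of the same
# quantity, e.g. a landmark range from a LiDAR and from an ultrasonic
# sensor. All numbers are illustration values, not from the paper.

def fuse(z1, var1, z2, var2):
    """Fuse two noisy measurements of the same quantity.

    The minimum-variance linear combination weights each measurement
    by the inverse of its variance.
    """
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Example: LiDAR says 2.00 m (var 0.01), ultrasonic says 2.10 m (var 0.04).
z, var = fuse(2.00, 0.01, 2.10, 0.04)
# The fused estimate lies closer to the more certain (LiDAR) reading,
# and its variance is smaller than either input's.
```

    The same weighting is what a Kalman filter performs recursively over time, which is why Kalman-style filters dominate map-based localization pipelines.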

    Visual SLAM from image sequences acquired by unmanned aerial vehicles

    This thesis shows that Kalman filter based approaches are sufficient for the task of simultaneous localization and mapping from image sequences acquired by unmanned aerial vehicles. Solving the problem of simultaneous localization and mapping (SLAM) using solely direction measurements is an important part of autonomous systems. Because of the need for real-time capable systems, recursive estimation techniques, in particular Kalman filter based approaches, are the main focus of interest. Unfortunately, the non-linearity of the triangulation using the direction measurements causes a decrease in the accuracy and consistency of the results. The first contribution of this work is a general derivation of the recursive update of the Kalman filter. This derivation is based on implicit measurement equations and has the classical iterative non-linear as well as the non-iterative and linear Kalman filter as specializations. Second, a new formulation of linear motion models for the single-camera state model and the sliding-window camera state model is given, which makes it possible to compute the prediction in a fully linear manner. The third major contribution is a novel method for the initialization of new object points in the Kalman filter. Empirical studies using synthetic and real data from an image sequence of a photogrammetric strip demonstrate and compare the influence of the initialization methods for new object points in the Kalman filter and the accuracies achievable in these scenarios. Fourth, the accuracy potential of monoscopic image sequences from unmanned aerial vehicles for autonomous localization and mapping is analyzed theoretically, which can be used for planning purposes.
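    The non-iterative linear Kalman filter that the thesis recovers as a special case of its general derivation has the familiar textbook update. The sketch below shows that standard update step; it is a generic illustration, not the thesis's own implementation, and the example state and matrices are invented.

```python
import numpy as np

# Minimal linear Kalman filter measurement update: x is the state,
# P its covariance, z the measurement, H the measurement matrix,
# R the measurement noise covariance.

def kf_update(x, P, z, H, R):
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Example: 2-D state, direct observation of the first component only.
x = np.array([0.0, 0.0])
P = np.eye(2)
z = np.array([1.0])
H = np.array([[1.0, 0.0]])
R = np.array([[1.0]])
x, P = kf_update(x, P, z, H, R)
# With equal prior and measurement variance, x[0] moves halfway
# toward the measurement and its variance halves.
```

    The iterative non-linear variants relinearize a non-linear (here, implicit triangulation) measurement model around the updated estimate and repeat this step until convergence.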

    Theory, Design, and Implementation of Landmark Promotion Cooperative Simultaneous Localization and Mapping

    Simultaneous Localization and Mapping (SLAM) is a challenging problem in practice, and the use of multiple robots and inexpensive sensors places even more demands on the designer. Cooperative SLAM poses specific challenges in the areas of computational efficiency, software/network performance, and robustness to errors. New methods in image processing, recursive filtering, and SLAM have been developed to implement practical algorithms for cooperative SLAM on a set of inexpensive robots. The Consolidated Unscented Mixed Recursive Filter (CUMRF) is designed to handle non-linear systems with non-Gaussian noise. This is accomplished using the Unscented Transform combined with Gaussian Mixture Models. The Robust Kalman Filter is an extension of the Kalman Filter algorithm that improves the ability to remove erroneous observations using Principal Component Analysis (PCA) and the X84 outlier rejection rule. Forgetful SLAM is a local SLAM technique that runs in nearly constant time relative to the number of visible landmarks and improves poorly performing sensors through sensor fusion and outlier rejection. Forgetful SLAM correlates all measured observations but stops the state from growing over time. Hierarchical Active Ripple SLAM (HAR-SLAM) is a new SLAM architecture that breaks the traditional state space of SLAM into a chain of smaller state spaces, allowing multiple robots, multiple sensors, and multiple updates to occur in linear time with linear storage with respect to the number of robots, landmarks, and robot poses. This dissertation presents explicit methods for closing the loop, joining multiple robots, and active updates. Landmark Promotion SLAM is a hierarchy of new SLAM methods built on the Robust Kalman Filter, Forgetful SLAM, and HAR-SLAM. Practical aspects of SLAM are a focus of this dissertation. LK-SURF is a new image processing technique that combines Lucas-Kanade feature tracking with Speeded-Up Robust Features to perform spatial and temporal tracking.
    Typical stereo correspondence techniques fail at providing descriptors for features, or fail at temporal tracking. Several calibration and modeling techniques are also covered, including calibrating stereo cameras, aligning stereo cameras to an inertial system, and building neural network system models. These methods are important for improving the quality of the data and images acquired for the SLAM process.
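    The X84 outlier rejection rule mentioned above is a simple robust-statistics test: reject any sample farther than a fixed number of median absolute deviations (MAD) from the median. A minimal sketch, with an invented example data set:

```python
import statistics

# X84 rule: keep samples within k MADs of the median. A threshold of
# 5.2 MADs corresponds to roughly 3.5 standard deviations for
# Gaussian-distributed data.

def x84_inliers(samples, k=5.2):
    med = statistics.median(samples)
    mad = statistics.median(abs(s - med) for s in samples)
    return [s for s in samples if abs(s - med) <= k * mad]

# Example: one grossly erroneous observation among consistent readings.
readings = [1.0, 1.1, 0.9, 1.05, 0.95, 8.0]
inliers = x84_inliers(readings)
# The 8.0 reading is rejected; the rest survive.
```

    Because it uses the median rather than the mean, the rule stays reliable even when a large fraction of observations are corrupted, which is what makes it attractive inside a Robust Kalman Filter.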

    Inertial navigation aided by simultaneous localization and mapping

    Unmanned aerial vehicle technologies are getting smaller and cheaper to use, and the payload limitations of unmanned aerial vehicles are being overcome. Integrated navigation system design requires selecting a set of sensors and computation power that provides reliable and accurate navigation parameters (position, velocity and attitude) with high update rates and bandwidth in a small and cost-effective package. Many of today’s operational unmanned aerial vehicle navigation systems rely on inertial sensors as a primary measurement source. Inertial Navigation alone, however, suffers from slow divergence with time. This divergence is often compensated for by employing some additional source of navigation information external to Inertial Navigation. From the 1990s to the present day, the Global Positioning System has been the dominant navigation aid for Inertial Navigation. In a number of scenarios, Global Positioning System measurements may be completely unavailable, or they simply may not be precise (or reliable) enough to adequately update the Inertial Navigation; hence alternative methods have seen great attention. Aiding Inertial Navigation with vision sensors has been the favoured solution over the past several years. Inertial and vision sensors, with their complementary characteristics, have the potential to meet the requirements for reliable and accurate navigation parameters. In this thesis we address Inertial Navigation position divergence. The information for updating the position comes from a combination of vision and motion. When using such a combination, many of the difficulties of vision sensors (relative depth, geometry and size of objects, image blur, etc.) can be circumvented. Motion grants the vision sensors many cues that help acquire information about the environment, for instance creating a precise map of the environment and localizing within it.
    We propose changes to the Simultaneous Localization and Mapping augmented state vector in order to take repeated measurements of a map point. We show that these repeated measurements, with certain manoeuvres (motion) around or past the map point, are crucial for constraining the Inertial Navigation position divergence (bounded estimation error) while manoeuvring in the vicinity of the map point. This eliminates some of the uncertainty of the map point estimates, i.e. it reduces the covariance of the map point estimates. This concept introduces a different parameterization (feature initialisation) of the map points in Simultaneous Localization and Mapping, and we refer to it as aiding Inertial Navigation by Simultaneous Localization and Mapping. We show that building such an integrated navigation system requires coordination with the guidance and control measurements and the vehicle task itself in order to perform the required vehicle manoeuvres (motion) and achieve better navigation accuracy. This fact brings new challenges to the practical design of these modern, jam-proof, Global Positioning System free autonomous navigation systems. Building on this concept, we have investigated how a bearing-only sensor such as a single camera can be used for aiding Inertial Navigation. A new parameterization of the map point in Bearing Only Simultaneous Localization and Mapping is proposed. Because of the number of significant problems that appear when implementing the Extended Kalman Filter in Inertial Navigation aided by Bearing Only Simultaneous Localization and Mapping, other algorithms such as the Iterated Extended Kalman Filter, the Unscented Kalman Filter and Particle Filters were implemented.
    From the results obtained, the conclusion can be drawn that nonlinear filters should be the estimators of choice for this application.
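    The reason repeated bearing measurements under motion constrain position is triangulation: a single camera observes only the angle to a map point, so two observations from different vehicle positions are needed to pin down its location. A planar sketch of that geometry, with illustrative names and a 2-D simplification that is not the thesis's own formulation:

```python
import math

# Bearing-only observation in 2-D: only the angle to the map point is
# measured, so a single observation leaves depth unknown.

def bearing(vehicle_xy, vehicle_heading, point_xy):
    dx = point_xy[0] - vehicle_xy[0]
    dy = point_xy[1] - vehicle_xy[1]
    # Angle to the point expressed in the vehicle frame.
    return math.atan2(dy, dx) - vehicle_heading

def triangulate(p1, b1, p2, b2):
    """Intersect two world-frame bearing rays from positions p1 and p2.

    Solves p1 + t1*[cos b1, sin b1] = p2 + t2*[cos b2, sin b2]
    for t1 via Cramer's rule and returns the intersection point.
    """
    d1 = (math.cos(b1), math.sin(b1))
    d2 = (math.cos(b2), math.sin(b2))
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Example: the point (1, 1) seen from (0, 0) and from (2, 0).
pt = triangulate((0.0, 0.0), math.atan2(1, 1), (2.0, 0.0), math.atan2(1, -1))
```

    When the two vantage points are close together (a short baseline), the rays are nearly parallel and the intersection is ill-conditioned, which is exactly why deliberate manoeuvres around the map point tighten its covariance.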

    Simultaneous localisation and mapping: A stereo vision based approach

    With their limited dynamic range and poor noise performance, cameras still pose considerable challenges when applied as range sensors for robotic navigation, especially in the implementation of Simultaneous Localisation and Mapping (SLAM) with sparse features. This paper presents a combination of methods for solving the SLAM problem in a constricted indoor environment using small-baseline stereo vision. The main contributions include a feature selection and tracking algorithm, a stereo noise filter, a robust feature validation algorithm and a multiple-hypotheses adaptive window positioning method for 'closing the loop'. These methods take a novel approach in that information from the image processing and robotic navigation domains is used in tandem to augment each other. Experimental results, including a real-time implementation in an office-like environment, are also presented. © 2006 IEEE
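    The ranging principle behind stereo vision is that depth is inversely proportional to disparity, which also explains why a small baseline makes stereo SLAM noise-sensitive. A minimal sketch; focal length, baseline and disparity values below are illustrative, not from the paper:

```python
# Pinhole stereo depth: z = f * b / d, with f the focal length in
# pixels, b the baseline in metres, d the disparity in pixels.

def stereo_depth(f_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity: point at or beyond infinity")
    return f_px * baseline_m / disparity_px

# Example with an assumed 700 px focal length and 9 cm baseline:
z_near = stereo_depth(700.0, 0.09, 21.0)   # 3.0 m
z_far = stereo_depth(700.0, 0.09, 3.0)     # 21.0 m
```

    At 3 px of disparity, a single pixel of matching error shifts the depth estimate by several metres, which is why the paper pairs stereo ranging with a noise filter and robust feature validation.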

    Optimization of a Simultaneous Localization and Mapping (SLAM) System for an Autonomous Vehicle Using a 2-Dimensional Light Detection and Ranging Sensor (LiDAR) by Sensor Fusion

    Fully autonomous vehicles must accurately estimate the extent of their environment as well as their relative location in that environment. A popular approach to organizing such information is creating a map of a given physical environment and defining a point in this map representing the vehicle’s location. Simultaneous Localization and Mapping (SLAM) is a computing algorithm that takes inputs from a Light Detection and Ranging (LiDAR) sensor to construct a map of the vehicle’s physical environment and simultaneously determine its location in this map based on feature recognition. Two fundamental requirements enable an accurate SLAM method: accurate distance measurements and an accurate assessment of location. Methods are investigated in which a 2D LiDAR sensor system with laser range finders, ultrasonic sensors and stereo camera vision is optimized for distance measurement accuracy, particularly a method using recurrent neural networks. Sensor fusion techniques with infrared, camera and ultrasonic sensors are implemented to investigate their effects on distance measurement accuracy. It was found that using a recurrent neural network to fuse data from a 2D LiDAR with laser range finders and ultrasonic sensors outperforms raw sensor data in accuracy (46.6% error reduced to 3.0% error) and precision (0.62 m std. deviation reduced to 0.0015 m std. deviation). These results demonstrate the effectiveness of machine learning based fusion algorithms for noise reduction, measurement accuracy improvement, and outlier measurement removal, which would give SLAM vehicles more robust performance
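    The dataflow of recurrent fusion can be sketched with a minimal Elman-style cell: per time step, the readings from each sensor channel enter together, and the hidden state carries measurement history forward. Everything below is an assumption for illustration; the abstract does not specify the network's architecture, sizes, or training, and the weights here are untrained random placeholders.

```python
import numpy as np

# Toy Elman-style recurrent cell fusing three sensor channels
# (e.g. LiDAR, laser range finder, ultrasonic) into one corrected
# distance per sequence. Untrained; only the dataflow is the point.

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 8
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden
W_hy = rng.normal(scale=0.1, size=(1, n_hidden))      # hidden -> output

def rnn_fuse(sensor_seq):
    """sensor_seq: iterable of length-3 arrays, one per time step."""
    h = np.zeros(n_hidden)
    for x in sensor_seq:
        h = np.tanh(W_xh @ x + W_hh @ h)  # hidden state accumulates history
    return (W_hy @ h).item()              # fused distance estimate

# Example: five time steps of noisy readings around 2 m.
est = rnn_fuse([np.array([2.0, 2.1, 1.9])] * 5)
```

    Training such a cell against ground-truth distances is what lets it learn sensor-specific noise characteristics and suppress outlier channels, the behaviour the reported error reduction reflects.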

    Towards autonomous localization and mapping of AUVs: a survey

    Purpose The main purpose of this paper is to investigate two key elements of localization and mapping for Autonomous Underwater Vehicles (AUVs), i.e. to review the various sensors and algorithms used for underwater localization and mapping, and to make suggestions for future research. Design/methodology/approach The authors first review the various sensors and algorithms used for AUVs in terms of basic working principles, characteristics, and their advantages and disadvantages. A statistical analysis is carried out by studying 35 AUV platforms according to the application circumstances of their sensors and algorithms. Findings As real-world applications have different requirements and specifications, it is necessary to select the most appropriate solution by balancing factors such as accuracy, cost and size. Although highly accurate localization and mapping in an underwater environment is very difficult, increasingly accurate and robust navigation solutions will be achieved with the development of both sensors and algorithms. Research limitations/implications This paper provides an overview of state-of-the-art underwater localisation and mapping algorithms and systems. No experiments are conducted for verification. Practical implications The paper gives readers a clear guideline for finding suitable underwater localisation and mapping algorithms and systems for the practical applications at hand. Social implications A wide range of audiences will benefit from reading this comprehensive survey of autonomous localisation and mapping of AUVs. Originality/value The paper provides useful information and suggestions to research students, engineers and scientists who work in the field of autonomous underwater vehicles