
    Stereo Visual SLAM Based on Unscented Dual Quaternion Filtering


    CES-515 Towards Localization and Mapping of Autonomous Underwater Vehicles: A Survey

    Autonomous Underwater Vehicles (AUVs) have been used for a wide range of commercial, military, and research tasks, and the ability to localize and map is fundamental to a successful AUV. This report reviews the relevant elements of localization and mapping for AUVs. First, a brief introduction to the concept and historical development of AUVs is given; then a relatively detailed description of the sensor systems used for AUV navigation is provided. As the main part of the report, a comprehensive investigation of simultaneous localization and mapping (SLAM) for AUVs is conducted, including application examples. Finally, a brief conclusion is given.

    Theory, Design, and Implementation of Landmark Promotion Cooperative Simultaneous Localization and Mapping

    Simultaneous Localization and Mapping (SLAM) is a challenging problem in practice, and the use of multiple robots and inexpensive sensors places even greater demands on the designer. Cooperative SLAM poses specific challenges in the areas of computational efficiency, software/network performance, and robustness to errors. New methods in image processing, recursive filtering, and SLAM have been developed to implement practical algorithms for cooperative SLAM on a set of inexpensive robots. The Consolidated Unscented Mixed Recursive Filter (CUMRF) is designed to handle non-linear systems with non-Gaussian noise. This is accomplished using the Unscented Transform combined with Gaussian Mixture Models. The Robust Kalman Filter is an extension of the Kalman Filter algorithm that improves the ability to remove erroneous observations using Principal Component Analysis (PCA) and the X84 outlier rejection rule. Forgetful SLAM is a local SLAM technique that runs in nearly constant time relative to the number of visible landmarks and improves poorly performing sensors through sensor fusion and outlier rejection. Forgetful SLAM correlates all measured observations but stops the state from growing over time. Hierarchical Active Ripple SLAM (HAR-SLAM) is a new SLAM architecture that breaks the traditional SLAM state space into a chain of smaller state spaces, allowing multiple robots, multiple sensors, and multiple updates to occur in linear time with linear storage with respect to the number of robots, landmarks, and robot poses. This dissertation presents explicit methods for closing the loop, joining multiple robots, and performing active updates. Landmark Promotion SLAM is a hierarchy of new SLAM methods built on the Robust Kalman Filter, Forgetful SLAM, and HAR-SLAM. Practical aspects of SLAM are a focus of this dissertation. LK-SURF is a new image processing technique that combines Lucas-Kanade feature tracking with Speeded-Up Robust Features to perform spatial and temporal tracking; typical stereo correspondence techniques either fail to provide descriptors for features or fail at temporal tracking. Several calibration and modeling techniques are also covered, including calibrating stereo cameras, aligning stereo cameras to an inertial system, and building neural network system models. These methods are important for improving the quality of the data and images acquired for the SLAM process.
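
    The X84 rule mentioned above is a robust, median-based outlier test; the Python sketch below is a minimal illustration of such a rejection step under the rule's usual formulation (a threshold of 5.2 median absolute deviations, roughly 3.5 standard deviations for Gaussian noise). It is not the dissertation's implementation, and the function name and example values are illustrative only.

    import numpy as np

    def x84_reject(residuals, k=5.2):
        """Flag residuals as outliers using the X84 rule: reject values farther
        than k median absolute deviations (MAD) from the median."""
        r = np.asarray(residuals, dtype=float)
        med = np.median(r)
        mad = np.median(np.abs(r - med))
        if mad == 0.0:                       # degenerate case: all residuals identical
            return np.zeros_like(r, dtype=bool)
        return np.abs(r - med) > k * mad

    # Example: filter innovations with one gross error (hypothetical values).
    innovations = [0.10, -0.20, 0.05, 0.12, 4.80, -0.08]
    print(x84_reject(innovations))           # only the 4.80 entry is flagged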

    A scalable, portable, FPGA-based implementation of the Unscented Kalman Filter

    Sustained technological progress has reached a point where robotic and autonomous systems may soon become ubiquitous. For these systems to be genuinely useful, an increase in autonomous capability is necessary for aerospace and other applications. Greater aerospace autonomy in turn requires high-performance state estimation, but the desire to reduce costs through simplified development processes and compact form factors can limit performance. A hardware-based approach, such as using a Field Programmable Gate Array (FPGA), is common when high performance is required, but hardware approaches tend to have a more complicated development process than traditional software approaches; greater development complexity, in turn, results in higher costs. Leveraging the advantages of both hardware-based and software-based approaches, a hardware/software (HW/SW) codesign of the Unscented Kalman Filter (UKF), based on an FPGA, is presented. The UKF is split into an application-specific part, implemented in software to retain portability, and a non-application-specific part, implemented in hardware as a parameterisable IP core to increase performance. The codesign is provided in three versions (Serial, Parallel and Pipeline) to give flexibility in balancing resources against performance, allowing system designers to simplify the development process. Simulation results for two possible implementations of the design, a nanosatellite application and a Simultaneous Localisation and Mapping (SLAM) application, are presented. These results validate the performance of the HW/SW UKF and demonstrate its portability, particularly to small aerospace systems. Implementation (synthesis, timing, power) details for a variety of situations are presented and analysed to demonstrate how the HW/SW codesign can be scaled for any application.
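
    The application-independent core that the abstract describes moving into the hardware IP, sigma-point generation and the unscented transform, can be summarised in a few lines. The Python sketch below shows that core under a standard scaled-UKF parameterisation; it is a conceptual stand-in, not the paper's HDL design, and the example model and noise values are assumed.

    import numpy as np

    def sigma_points(mean, cov, alpha=0.5, beta=2.0, kappa=0.0):
        """Generate the 2n+1 scaled sigma points and their weights."""
        n = mean.size
        lam = alpha**2 * (n + kappa) - n
        S = np.linalg.cholesky((n + lam) * cov)          # matrix square root
        pts = np.vstack([mean, mean + S.T, mean - S.T])
        wm = np.full(2 * n + 1, 0.5 / (n + lam))
        wc = wm.copy()
        wm[0] = lam / (n + lam)
        wc[0] = wm[0] + (1.0 - alpha**2 + beta)
        return pts, wm, wc

    def unscented_transform(pts, wm, wc, noise_cov):
        """Recombine propagated sigma points into a mean and covariance."""
        mean = wm @ pts
        dev = pts - mean
        return mean, dev.T @ (wc[:, None] * dev) + noise_cov

    # Example: propagate a 2D state through a nonlinear function f.
    f = lambda x: np.array([x[0] + 0.1 * x[1], np.sin(x[1])])
    x, P = np.array([1.0, 0.5]), np.diag([0.01, 0.02])
    pts, wm, wc = sigma_points(x, P)
    x_pred, P_pred = unscented_transform(np.array([f(p) for p in pts]), wm, wc, 1e-4 * np.eye(2))
    print(x_pred)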

    Optimization of a Simultaneous Localization and Mapping (SLAM) System for an Autonomous Vehicle Using a 2-Dimensional Light Detection and Ranging Sensor (LiDAR) by Sensor Fusion

    Fully autonomous vehicles must accurately estimate the extent of their environment as well as their location within it. A popular approach to organizing such information is to create a map of the physical environment and define a point in this map representing the vehicle’s location. Simultaneous Localization and Mapping (SLAM) is a computing algorithm that takes inputs from a Light Detection and Ranging (LiDAR) sensor to simultaneously construct a map of the vehicle’s physical environment and determine its location within that map based on feature recognition. An accurate SLAM method has two fundamental requirements: accurate distance measurements and an accurate assessment of location. Methods are investigated for optimizing the distance measurement accuracy of a 2D LiDAR sensor system combined with laser range finders, ultrasonic sensors, and stereo camera vision, in particular a method using recurrent neural networks. Sensor fusion techniques with infrared, camera, and ultrasonic sensors are implemented to investigate their effects on distance measurement accuracy. It was found that using a recurrent neural network to fuse data from a 2D LiDAR with laser range finders and ultrasonic sensors outperforms raw sensor data in accuracy (46.6% error reduced to 3.0% error) and precision (0.62 m standard deviation reduced to 0.0015 m). These results demonstrate the effectiveness of machine-learning-based fusion algorithms for noise reduction, measurement accuracy improvement, and outlier removal, which would give SLAM vehicles more robust performance.
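
    To make the recurrent fusion idea concrete, the PyTorch sketch below maps a short sequence of per-timestep readings from three range sensors to a single corrected distance. It is only a schematic of this kind of fusion regressor; the layer sizes, sequence length, and synthetic data are assumptions, not the architecture or data reported above.

    import torch
    import torch.nn as nn

    class RangeFusionRNN(nn.Module):
        """Toy recurrent fusion model: sequences of (LiDAR, laser range finder,
        ultrasonic) readings are regressed to one corrected distance."""

        def __init__(self, n_sensors=3, hidden=32):
            super().__init__()
            self.rnn = nn.GRU(input_size=n_sensors, hidden_size=hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)          # fused distance estimate

        def forward(self, x):                         # x: (batch, time, n_sensors)
            out, _ = self.rnn(x)
            return self.head(out[:, -1, :])           # regress from the last hidden state

    # One training step on synthetic data: noisy sensors around a true range.
    model = RangeFusionRNN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    true_range = torch.rand(64, 1) * 5.0                          # metres
    readings = true_range.unsqueeze(1).repeat(1, 10, 3)           # 10 timesteps, 3 sensors
    readings = readings + 0.1 * torch.randn_like(readings)        # additive sensor noise
    loss = nn.functional.mse_loss(model(readings), true_range)
    opt.zero_grad(); loss.backward(); opt.step()
    print(float(loss))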

    Sensitivity analysis of a relative navigation solution for unmanned aerial vehicles in a GNSS-denied environment

    Cooperative navigation between two or more unmanned aerial vehicles (UAVs) is an important enabling technology for missions such as military reconnaissance, disaster response, and search and rescue. In many of these situations, Global Navigation Satellite Systems (GNSS), such as the Global Positioning System (GPS), may be unreliable or unavailable due to structural impedance or malicious signal jamming. Maintaining a reliable relative navigation solution without GNSS is therefore an important need for the aforementioned missions. To meet this need, this thesis focuses on relative navigation between two UAVs operating in a GNSS-denied environment. In particular, the design and sensitivity of a navigation algorithm are presented. The algorithm consists of an Unscented Kalman Filter that fuses multiple on-board sensors to estimate the relative pose between two UAVs. These sensors include strap-down inertial measurement units, ultra-wideband ranging radios, strap-down tri-axial magnetometers, and downward-facing cameras. Through a Monte Carlo simulation study, the algorithm's performance sensitivity to various sensor payload characteristics, flight dynamics, and initial condition errors is evaluated. Additionally, a research platform that will allow a future experimental evaluation of the algorithm has been integrated and tested as part of this work.
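
    As one concrete piece of such a filter, the sketch below performs a single unscented measurement update with an ultra-wideband range reading over a simplified planar relative-position state. It is a stand-in under stated assumptions (2D state, unscaled sigma points, illustrative noise values), not the thesis's full relative-pose filter.

    import numpy as np

    def sigma_pts(x, P, kappa=1.0):
        """Unscaled sigma points for a small-dimension unscented update."""
        n = x.size
        S = np.linalg.cholesky((n + kappa) * P)
        pts = np.vstack([x, x + S.T, x - S.T])
        w = np.full(2 * n + 1, 0.5 / (n + kappa))
        w[0] = kappa / (n + kappa)
        return pts, w

    def ukf_range_update(x, P, z, R):
        """One UKF update with a UWB-style range measurement z = ||x|| + noise,
        where the state x is the planar position of one UAV relative to the other."""
        pts, w = sigma_pts(x, P)
        z_pts = np.linalg.norm(pts, axis=1)              # nonlinear measurement model
        z_hat = w @ z_pts
        Pzz = w @ (z_pts - z_hat) ** 2 + R               # innovation variance (scalar)
        Pxz = (pts - x).T @ (w * (z_pts - z_hat))        # state-measurement cross covariance
        K = Pxz / Pzz                                    # Kalman gain
        return x + K * (z - z_hat), P - np.outer(K, K) * Pzz

    # Example: uncertain prior relative position, one 10 m range reading.
    x0, P0 = np.array([8.0, 5.0]), 4.0 * np.eye(2)
    x1, P1 = ukf_range_update(x0, P0, z=10.0, R=0.05**2)
    print(x1, np.diag(P1))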

    Frequency Modulated Continuous Wave Radar and Video Fusion for Simultaneous Localization and Mapping

    There has recently been a push to develop technology that enables the use of UAVs in GPS-denied environments. As UAVs become smaller, there is a need to reduce the number and size of the sensor systems on board. A video camera on a UAV can serve multiple purposes: it can return imagery for processing by human users, and the highly accurate bearing information it provides makes it a useful tool in a navigation and tracking system. Radars can provide information about the types of objects in a scene and can operate in adverse weather conditions, and the range and velocity measurements they provide make them good tools for navigation. In this work, FMCW radar and color video were fused to perform SLAM in an outdoor environment. A radar SLAM solution provided the basis for the fusion. Correlations between radar returns were used to estimate dead-reckoning parameters and obtain an estimate of the platform location. A new constraint was added to the radar detection process to avoid detecting poorly observable reflectors while maintaining a large number of measurements on highly observable reflectors. The radar measurements were mapped as landmarks, further improving the platform location estimates. As images were received from the video camera, changes in platform orientation were estimated, further improving the platform orientation estimates. The expected locations of radar measurements, whose uncertainty was modeled as Gaussian, were projected onto the images and used to estimate the location of the radar reflector in the image. The colors of the most likely reflector were saved and used to detect the reflector in subsequent images. The azimuth angles obtained from the image detections were used to improve the estimates of the landmarks in the SLAM map over previous estimates in which only the radar was used.
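
    The gating step described above, projecting a landmark and its Gaussian uncertainty into the image to bound the search for the reflector, might look roughly like the following sketch. It assumes a pinhole camera model with made-up intrinsics and covariances; it is not the paper's implementation.

    import numpy as np

    def project_with_uncertainty(p_cam, P_cam, K):
        """Project a 3D landmark (camera frame) and its covariance into the image.
        Returns the pixel location and a 2x2 pixel covariance obtained by
        linearizing the pinhole projection, defining a search ellipse for the reflector."""
        X, Y, Z = p_cam
        u = K[0, 0] * X / Z + K[0, 2]
        v = K[1, 1] * Y / Z + K[1, 2]
        # Jacobian of (u, v) with respect to (X, Y, Z).
        J = np.array([[K[0, 0] / Z, 0.0,         -K[0, 0] * X / Z**2],
                      [0.0,         K[1, 1] / Z, -K[1, 1] * Y / Z**2]])
        return np.array([u, v]), J @ P_cam @ J.T

    # Example: a reflector 40 m ahead with 0.5 m position uncertainty per axis.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])              # assumed camera intrinsics
    uv, P_px = project_with_uncertainty(np.array([2.0, -1.0, 40.0]), 0.25 * np.eye(3), K)
    print(uv, np.sqrt(np.diag(P_px)))            # pixel location and 1-sigma extents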

    Visual SLAM from image sequences acquired by unmanned aerial vehicles

    This thesis shows that Kalman filter based approaches are sufficient for the task of simultaneous localization and mapping from image sequences acquired by unmanned aerial vehicles. Solving the simultaneous localization and mapping (SLAM) problem using direction measurements alone is an important capability of autonomous systems. Because of the need for real-time capable systems, recursive estimation techniques, in particular Kalman filter based approaches, are the main focus of interest. Unfortunately, the non-linearity of the triangulation from direction measurements decreases the accuracy and consistency of the results. The first contribution of this work is a general derivation of the recursive update of the Kalman filter. This derivation is based on implicit measurement equations, with the classical iterative non-linear as well as the non-iterative linear Kalman filter as specializations of this general derivation. Second, new formulations of linear motion models for the single-camera state model and the sliding-window camera state model are given, making it possible to compute the prediction in a fully linear manner. The third major contribution is a novel method for the initialization of new object points in the Kalman filter. Empirical studies using synthetic and real data from an image sequence of a photogrammetric strip demonstrate and compare the influence of the initialization methods for new object points in the Kalman filter. Fourth, the accuracy potential of monoscopic image sequences from unmanned aerial vehicles for autonomous localization and mapping is analyzed theoretically, which can be used for planning purposes.
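
    A fully linear prediction of the kind the second contribution describes can be illustrated with a constant-velocity camera motion model, where the prediction is an exact matrix product and no linearization is needed. The sketch below is illustrative only; the state layout, time step, and process noise are assumptions, and the thesis's sliding-window camera state model is richer than this.

    import numpy as np

    def predict_constant_velocity(x, P, dt, q):
        """Linear Kalman prediction with a constant-velocity motion model.
        State x = [position (3), velocity (3)]; q scales the acceleration noise."""
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)                   # position += velocity * dt
        G = np.vstack([0.5 * dt**2 * np.eye(3),      # maps white acceleration noise
                       dt * np.eye(3)])              # into position and velocity
        Q = q * G @ G.T
        return F @ x, F @ P @ F.T + Q

    # Example: one 0.1 s prediction step for a camera moving at 1 m/s.
    x = np.array([0.0, 0.0, 10.0, 1.0, 0.0, 0.0])
    P = 0.01 * np.eye(6)
    x_pred, P_pred = predict_constant_velocity(x, P, dt=0.1, q=0.5)
    print(x_pred)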

    Integrated WiFi/PDR/Smartphone using an unscented Kalman filter algorithm for 3D indoor localization

    Because of the high computational cost and the poor performance of traditional planar maps when dealing with complicated indoor geographic information, WiFi fingerprint indoor positioning systems have not been widely employed on smartphone platforms. By making full use of the hardware sensors embedded in the smartphone, this study proposes an integrated approach to a three-dimensional (3D) indoor positioning system. First, an improved K-means clustering method is adopted to reduce the fingerprint database retrieval time and enhance positioning efficiency. Next, using the mobile phone’s acceleration sensor, a new step counting method based on auto-correlation analysis is proposed to achieve inertial navigation positioning on the phone. Furthermore, integrating WiFi positioning with Pedestrian Dead Reckoning (PDR) through the Unscented Kalman Filter algorithm yields higher positional accuracy. Finally, a hybrid 3D positioning system based on Unity 3D, which can perform real-time positioning of targets in 3D scenes, is designed to run fluently on mobile terminals.
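
    The step detector described above relies on the periodicity of the walking signal; a minimal auto-correlation based step counter might look like the sketch below. The sampling rate and the bounds on plausible step periods are assumptions for illustration, not the paper's parameters.

    import numpy as np

    def count_steps(acc_norm, fs, min_period=0.4, max_period=1.2):
        """Estimate the step count from the norm of smartphone acceleration.
        The dominant step period is the lag (within a plausible range of step
        durations) that maximizes the auto-correlation of the detrended signal;
        the count is the window length divided by that period."""
        a = np.asarray(acc_norm, dtype=float)
        a = a - a.mean()                                   # remove gravity / bias
        ac = np.correlate(a, a, mode="full")[a.size - 1:]  # one-sided auto-correlation
        lo, hi = int(min_period * fs), int(max_period * fs)
        lag = lo + int(np.argmax(ac[lo:hi]))               # dominant step period in samples
        return int(round(a.size / lag)), lag / fs

    # Example: synthetic 50 Hz accelerometer norm with ~2 steps per second for 10 s.
    fs = 50
    t = np.arange(0, 10, 1 / fs)
    acc = 9.8 + 1.5 * np.sin(2 * np.pi * 2.0 * t) + 0.2 * np.random.randn(t.size)
    steps, period = count_steps(acc, fs)
    print(steps, period)                                   # roughly 20 steps, ~0.5 s period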