Localization from semantic observations via the matrix permanent
Most approaches to robot localization rely on low-level geometric features such as points, lines, and planes. In this paper, we use object recognition to obtain semantic information from the robot’s sensors and consider the task of localizing the robot within a prior map of landmarks, which are annotated with semantic labels. As object recognition algorithms miss detections and produce false alarms, correct data association between the detections and the landmarks on the map is central to the semantic localization problem. Instead of the traditional vector-based representation, we propose a sensor model that encodes the semantic observations via random finite sets and enables a unified treatment of missed detections, false alarms, and data association. Our second contribution is to reduce the problem of computing the likelihood of a set-valued observation to the problem of computing a matrix permanent. It is this crucial transformation that allows us to solve the semantic localization problem with a polynomial-time approximation to the set-based Bayes filter. Finally, we address the active semantic localization problem, in which the observer’s trajectory is planned in order to improve the accuracy and efficiency of the localization process. The performance of our approach is demonstrated in simulation and in real environments using deformable-part-model-based object detectors. Robust global localization from semantic observations is demonstrated for a mobile robot, for the Project Tango phone, and on the KITTI visual odometry dataset. Comparisons are made with the traditional lidar-based geometric Monte Carlo localization.
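The reduction described above turns the set-valued observation likelihood into the permanent of a matrix of detection-to-landmark association scores. As a hedged illustration only (the paper's actual contribution is a polynomial-time approximation, which is not shown here), the permanent of a small score matrix can be evaluated exactly with Ryser's inclusion-exclusion formula:

```python
from itertools import combinations

def permanent(A):
    """Exact matrix permanent via Ryser's inclusion-exclusion formula.

    Runs in O(2^n * n^2) time, so it is only practical for small n;
    the paper's point is to use a polynomial-time approximation instead.
    """
    n = len(A)
    total = 0.0
    # Sum over all non-empty column subsets S, weighted by (-1)^(n-|S|),
    # of the product over rows of the row-sum restricted to S.
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in A:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** (n - r) * prod
    return total

# Toy 2x2 association-score matrix: per(A) = a11*a22 + a12*a21.
A = [[1.0, 2.0],
     [3.0, 4.0]]
print(permanent(A))  # 10.0 = 1*4 + 2*3
```

The exponential cost of exact evaluation is exactly why a polynomial-time approximation is needed once maps contain more than a handful of landmarks.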
Localization in urban environments. A hybrid interval-probabilistic method
Ensuring safety has become a paramount concern with the increasing autonomy of vehicles and the advent of autonomous driving. One of the most fundamental tasks of increased autonomy is localization, which is essential for safe operation. To quantify safety requirements, the concept of integrity has been introduced in aviation, based on the ability of the system to provide timely and correct alerts when the safe operation of the system can no longer be guaranteed. Therefore, it is necessary to assess the localization's uncertainty to determine the system's operability.
In the literature, probability theory and set-membership theory are the two predominant approaches that provide mathematical tools to assess uncertainty. Probabilistic approaches often provide accurate point-valued results but tend to underestimate the uncertainty. Set-membership approaches reliably estimate the uncertainty but can be overly pessimistic, producing inappropriately large uncertainties and no point-valued results. While underestimating the uncertainty can lead to misleading information and dangerous system failures without warning, overly pessimistic uncertainty estimates render the system inoperative for practical purposes because warnings are triggered too frequently.
This doctoral thesis aims to study the symbiotic relationship between set-membership-based and probabilistic localization approaches and combine them into a unified hybrid localization approach. This approach enables safe operation while not being overly pessimistic regarding the uncertainty estimation. In the scope of this work, a novel Hybrid Probabilistic- and Set-Membership-based Coarse and Refined (HyPaSCoRe) Localization method is introduced. This method localizes a robot in a building map in real-time and considers two types of hybridizations. On the one hand, set-membership approaches are used to robustify and control probabilistic approaches. On the other hand, probabilistic approaches are used to reduce the pessimism of set-membership approaches by augmenting them with further probabilistic constraints.
The method consists of three modules: visual odometry, coarse localization, and refined localization. The HyPaSCoRe Localization uses a stereo camera system, a LiDAR sensor, and GNSS data, focusing on localization in urban canyons where GNSS data can be inaccurate. The visual odometry module computes the relative motion of the vehicle, while the coarse localization module uses set-membership approaches to narrow down the feasible set of poses and provides the set of most likely poses inside the feasible set using a probabilistic approach. The refined localization module then reduces the pessimism of the uncertainty estimate by incorporating probabilistic constraints into the set-membership approach.
The experimental evaluation of the HyPaSCoRe shows that it maintains the integrity of the uncertainty estimation while providing accurate, most likely point-valued solutions in real time. This new hybrid localization approach contributes to the development of safe and reliable algorithms in the context of autonomous driving.
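As a toy sketch of the hybridization idea in this abstract (a set-membership enclosure that guarantees integrity, plus a probabilistic point estimate inside it), consider a 1-D robot with a bounded-error range measurement. The function below is purely illustrative; its names and models are assumptions, not the HyPaSCoRe method:

```python
import math

def hybrid_localize_1d(prior_box, z, bound, sigma, n=1000):
    """Toy 1-D hybrid interval-probabilistic localization step.

    Set-membership step: intersect the prior interval with the
    bounded-error measurement set [z - bound, z + bound], giving a
    guaranteed feasible set (or an integrity alert if it is empty).
    Probabilistic step: return the most likely pose inside that set
    under a Gaussian measurement model, as a point-valued estimate.
    """
    lo = max(prior_box[0], z - bound)
    hi = min(prior_box[1], z + bound)
    if lo > hi:
        return None, None  # empty feasible set: raise an integrity alert
    # Grid search for the Gaussian-likelihood maximiser inside [lo, hi].
    best_x, best_w = lo, -1.0
    for i in range(n + 1):
        x = lo + (hi - lo) * i / n
        w = math.exp(-0.5 * ((x - z) / sigma) ** 2)
        if w > best_w:
            best_x, best_w = x, w
    return (lo, hi), best_x

feasible, x_hat = hybrid_localize_1d((0.0, 10.0), z=4.2, bound=0.5, sigma=0.3)
print(feasible, x_hat)  # feasible set about (3.7, 4.7), estimate about 4.2
```

The feasible interval plays the role of the guaranteed (set-membership) result, while the point estimate inside it is the probabilistic, most likely solution the thesis argues a hybrid method should also provide.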
Probabilistic constraint reasoning with Monte Carlo integration to robot localization
This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming, variables range over continuous domains (represented as intervals), constraints (relations between variables) are imposed over them, and the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming “branch-and-prune” algorithms produce safe enclosures of all consistent scenarios. Specially proposed algorithms for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which implies the calculation of an integral over these sets (quadrature). In this work we propose to extend the “branch-and-prune” algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach to cope with this problem and provide this functionality in real time.
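A minimal sketch of the proposed combination, assuming a branch-and-prune solver has already returned a safe enclosing box: sample uniformly inside the box and count the samples satisfying the constraint, yielding a Monte Carlo estimate of the probability mass of the consistent scenarios. Names and the example constraint are illustrative assumptions:

```python
import random

def box_probability(box, constraint, n=100_000, seed=0):
    """Monte Carlo estimate of the probability mass of the consistent set.

    `box` plays the role of a safe enclosure that a branch-and-prune
    solver would return; we sample uniformly inside it, count samples
    satisfying the constraint, and scale by the box volume.
    """
    rng = random.Random(seed)
    (xlo, xhi), (ylo, yhi) = box
    volume = (xhi - xlo) * (yhi - ylo)
    hits = sum(
        constraint(rng.uniform(xlo, xhi), rng.uniform(ylo, yhi))
        for _ in range(n)
    )
    return volume * hits / n

# Consistent scenarios for a range measurement: the distance to a
# beacon at the origin is 1.0 with bounded error 0.1 (an annulus),
# safely enclosed by the box [-1.1, 1.1] x [-1.1, 1.1].
consistent = lambda x, y: abs((x * x + y * y) ** 0.5 - 1.0) <= 0.1
print(box_probability(((-1.1, 1.1), (-1.1, 1.1)), consistent))
# close to the true annulus area pi * (1.1**2 - 0.9**2), about 1.26
```

In the actual method the branch-and-prune step would first shrink the box (or split it into many smaller boxes), so far fewer samples are wasted on infeasible regions.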
Computational intelligence approaches to robotics, automation, and control [Volume guest editors]
No abstract available
CES-515 Towards Localization and Mapping of Autonomous Underwater Vehicles: A Survey
Autonomous Underwater Vehicles (AUVs) have been used for a wide range of commercial, military, and research tasks, and the fundamental capability underlying any successful AUV is its ability to localize itself and map its environment. This report reviews the relevant elements of localization and mapping for AUVs. First, a brief introduction to the concept and historical development of AUVs is given; then a relatively detailed description of the sensor systems used for AUV navigation is provided. As the main part of the report, a comprehensive investigation of simultaneous localization and mapping (SLAM) for AUVs is conducted, including application examples. Finally, a brief conclusion is given.
Sparse Bayesian information filters for localization and mapping
Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy at the Massachusetts Institute of Technology and the Woods Hole Oceanographic Institution, February 2008.

This thesis formulates an estimation framework for Simultaneous Localization and Mapping (SLAM) that addresses the problem of scalability in large environments. We describe an estimation-theoretic algorithm that achieves significant gains in computational efficiency while maintaining consistent estimates for the vehicle pose and the map of the environment.

We specifically address the feature-based SLAM problem in which the robot represents the environment as a collection of landmarks. The thesis takes a Bayesian approach whereby we maintain a joint posterior over the vehicle pose and feature states, conditioned upon measurement data. We model the distribution as Gaussian and parametrize the posterior in the canonical form, in terms of the information (inverse covariance) matrix. When sparse, this representation is amenable to computationally efficient Bayesian SLAM filtering. However, while a large majority of the elements within the normalized information matrix are very small in magnitude, it is fully populated nonetheless. Recent feature-based SLAM filters achieve the scalability benefits of a sparse parametrization by explicitly pruning these weak links in an effort to enforce sparsity. We analyze one such algorithm, the Sparse Extended Information Filter (SEIF), which has laid much of the groundwork concerning the computational benefits of the sparse canonical form. The thesis performs a detailed analysis of the process by which the SEIF approximates the sparsity of the information matrix and reveals key insights into the consequences of different sparsification strategies. We demonstrate that the SEIF yields a sparse approximation to the posterior that is inconsistent, suffering from exaggerated confidence estimates. This overconfidence has detrimental effects on important aspects of the SLAM process and affects the higher-level goal of producing accurate maps for subsequent localization and path planning.

This thesis proposes an alternative scalable filter that maintains sparsity while preserving the consistency of the distribution. We leverage insights into the natural structure of the feature-based canonical parametrization and derive a method that actively maintains an exactly sparse posterior. Our algorithm exploits the structure of the parametrization to achieve gains in efficiency, with a computational cost that scales linearly with the size of the map. Unlike similar techniques that sacrifice consistency for improved scalability, our algorithm performs inference over a posterior that is conservative relative to the nominal Gaussian distribution. Consequently, we preserve the consistency of the pose and map estimates and avoid the effects of an overconfident posterior.

We demonstrate our filter alongside the SEIF and the standard EKF both in simulation and on two real-world datasets. While we maintain the computational advantages of an exactly sparse representation, the results show convincingly that our method yields conservative estimates for the robot pose and map that are nearly identical to those of the original Gaussian distribution as produced by the EKF, but at much less computational expense.

The thesis concludes with an extension of our SLAM filter to a complex underwater environment. We describe a systems-level framework for localization and mapping relative to a ship hull with an Autonomous Underwater Vehicle (AUV) equipped with a forward-looking sonar. The approach utilizes our filter to fuse measurements of vehicle attitude and motion from onboard sensors with data from sonar images of the hull. We employ the system to perform three-dimensional, 6-DOF SLAM on a ship hull.
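The scalability argument in the abstract above rests on the canonical (information) form, where a measurement update is additive and touches only the entries linking the robot to the observed landmark. A generic textbook sketch of that update for a scalar measurement (not the thesis's exactly sparse filter):

```python
def information_update(Lam, eta, h, r_inv, z):
    """Canonical-form (information) update for a scalar measurement.

    For z = h . x + v with v ~ N(0, 1/r_inv), the update is additive:
        Lam += r_inv * h h^T      eta += r_inv * z * h
    Since h is zero except at the robot and the observed landmark,
    only that block of the information matrix is touched; this is the
    structural property that sparse information-form SLAM filters
    exploit. Generic textbook form, not the thesis's algorithm.
    """
    n = len(h)
    for i in range(n):
        for j in range(n):
            Lam[i][j] += r_inv * h[i] * h[j]
        eta[i] += r_inv * z * h[i]
    return Lam, eta

# State: 1-D robot pose plus two 1-D landmarks, weak independent prior.
Lam = [[0.01 if i == j else 0.0 for j in range(3)] for i in range(3)]
eta = [0.0, 0.0, 0.0]
# Relative observation of landmark 1 only: z = m1 - x + noise.
information_update(Lam, eta, h=[-1.0, 1.0, 0.0], r_inv=4.0, z=2.1)
print(Lam[0][2], Lam[1][2])  # links to landmark 2 stay exactly 0.0 0.0
```

In covariance (EKF) form the same observation would correlate every pair of states, which is why the information form is the natural home for sparsity arguments.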
Probabilistic approaches for intelligent AUV localisation
This thesis studies the problem of intelligent localisation for an autonomous underwater vehicle (AUV). After an introduction to robot localisation and the specific issues of the underwater domain, the thesis focuses on passive techniques for AUV localisation, highlighting experimental results and comparisons among different techniques. It then develops active techniques, which require intelligent decisions about the steps the AUV should undertake in order to localise itself. The methodology consisted of three stages: theoretical analysis of the problem, tests in a simulation environment, and integration into the robot architecture with field trials. The conclusions highlight applications and scenarios where the developed techniques have been used successfully or can potentially be used to enhance the results given by current techniques. The main contribution of this thesis is the proposal of an active localisation module, which is able to determine the best set of actions to execute in order to maximise the localisation performance in terms of time and efficiency.
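The active localisation idea in this abstract, choosing actions that maximise localisation efficiency, is commonly formalised as picking the action that minimises the expected posterior entropy of the belief. A hedged sketch with a discrete belief and a binary sensor (all models here are illustrative assumptions, not the thesis's module):

```python
import math

def entropy(belief):
    """Shannon entropy of a discrete belief (in nats)."""
    return -sum(p * math.log(p) for p in belief if p > 0)

def expected_entropy(belief, hit_prob):
    """Expected posterior entropy after one binary sensing action.

    hit_prob[s] is the probability of a 'hit' observation in state s;
    we average the posterior entropy over the two possible outcomes.
    """
    h = 0.0
    for model in (hit_prob, [1.0 - p for p in hit_prob]):
        pz = sum(b * p for b, p in zip(belief, model))
        if pz == 0.0:
            continue
        posterior = [b * p / pz for b, p in zip(belief, model)]
        h += pz * entropy(posterior)
    return h

# Uniform belief over four places. Action A senses a feature present
# only at place 0 (discriminative); action B senses one present
# everywhere (uninformative), so A should minimise expected entropy.
belief = [0.25, 0.25, 0.25, 0.25]
act_a = [0.9, 0.1, 0.1, 0.1]
act_b = [0.9, 0.9, 0.9, 0.9]
best = min((act_a, act_b), key=lambda m: expected_entropy(belief, m))
print(best is act_a)  # True
```

A full active module would also weigh the time and energy cost of each action against its expected information gain, which is the time-versus-efficiency trade-off the abstract refers to.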