517 research outputs found
Probabilistic Self-Localization and Mapping: An Asynchronous Multirate Approach
In this paper, we present a set of robust and
efficient algorithms with O(N) cost for the solution of the
Simultaneous Localization And Mapping (SLAM) problem of
a mobile robot. First, we introduce a novel object detection
method, mainly based on multiple-line fitting for landmark
detection with regularly constrained angles. Second,
a line-based pose estimation method is proposed, based on Least-Squares (LS). This method performs the matching of lines,
providing the global pose estimation under the assumption of known
data association. Finally, we extend the FastSLAM (Factored
Solution to SLAM) algorithm for mobile robot self-localization
and mapping by considering the asynchronous sampling of
sensors and actuators. In this sense, multi-rate asynchronous
holds are used to interface signals with different sampling rates.
Moreover, an asynchronous fusion method to predict and update
mobile robot pose and map is also presented. In addition to
this, FastSLAM 1.0 has also been improved by considering the
estimated pose with the LS-approach to re-allocate each particle
of the posterior distribution of the robot pose. This approach has
a lower computational cost than the original Extended Kalman
Filtering (EKF) approach in FastSLAM 2.0. All these methods
have been combined in order to perform an efficient and robust
self-localization and map building process. Additionally, these
methods have been validated with real experimental data, with a
mobile robot moving in an unknown environment to solve
the SLAM problem.

This work has been supported by the Spanish Government (MCyT) research project BIA2005-09377-C03-02 and by the Italian Government (MIUR) research project PRIN2005097207.

Armesto, L.; Ippoliti, G.; Longhi, S.; Tornero Montserrat, J. (2008). Probabilistic Self-Localization and Mapping: An Asynchronous Multirate Approach. IEEE Robotics & Automation Magazine. 15(2):77-88. https://doi.org/10.1109/M-RA.2007.907355
A review of sensor technology and sensor fusion methods for map-based localization of service robot
Service robots are currently gaining traction, particularly in the hospitality, geriatric care and healthcare industries. The navigation of service robots requires high adaptability, flexibility and reliability. Hence, map-based navigation is suitable for service robots because of the ease of updating changes in the environment and the flexibility of determining a new optimal path. For map-based navigation to be robust, an accurate and precise localization method is necessary. The localization problem can be defined as recognizing the robot's own position in a given environment and is a crucial step in any navigational process. Major difficulties of localization include dynamic changes in the real world, uncertainties and limited sensor information. This paper presents a comparative review of sensor technology and sensor fusion methods suitable for map-based localization, focusing on service robot applications.
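The review surveys many fusion methods; the common building block shared by most of them is variance-weighted fusion of two independent noisy estimates, which can be sketched in one dimension as:

```python
def fuse(x1, var1, x2, var2):
    """Variance-weighted fusion of two independent position estimates:
    the static, one-dimensional core of Kalman-style sensor fusion.
    A generic illustration, not any specific method from the review."""
    k = var1 / (var1 + var2)   # gain: trust the less noisy sensor more
    x = x1 + k * (x2 - x1)     # fused estimate
    var = (1 - k) * var1       # fused variance, <= min(var1, var2)
    return x, var
```

With two equally noisy readings, e.g. `fuse(10.0, 4.0, 12.0, 4.0)`, the result is the midpoint 11.0 with the variance halved to 2.0.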
Split Covariance Intersection Filter Based Visual Localization With Accurate AprilTag Map For Warehouse Robot Navigation
Accurate and efficient localization with a conveniently established map is a
fundamental requirement for mobile robot operation in warehouse environments.
An accurate AprilTag map can be conveniently established with the help of
LiDAR-based SLAM. It is true that a LiDAR-based system is usually not
commercially competitive compared with a vision-based system; fortunately,
for warehouse applications only a single LiDAR-based SLAM system is needed to
establish an accurate AprilTag map, which a large number of visual
localization systems can then share for their own operations. The cost of the
LiDAR-based SLAM system is therefore shared among the visual localization
systems, and turns out to be acceptable, even negligible, for practical
warehouse applications. Once an
accurate AprilTag map is available, visual localization is realized as
recursive estimation that fuses AprilTag measurements (i.e. AprilTag detection
results) and robot motion data. AprilTag measurements may be nonlinear partial
measurements; this can be handled by the well-known extended Kalman filter
(EKF) in the spirit of local linearization. AprilTag measurements tend to have
temporal correlation as well; however, this cannot be reasonably handled by the
EKF. The split covariance intersection filter (Split CIF) is adopted to handle
temporal correlation among AprilTag measurements. The Split CIF (in the spirit
of local linearization) can also handle AprilTag nonlinear partial
measurements. The Split CIF based visual localization system incorporates a
measurement adaptive mechanism to handle outliers in AprilTag measurements and
adopts a dynamic initialization mechanism to address the kidnapping problem. A
comparative study in real warehouse environments demonstrates the potential and
advantage of the Split CIF based visual localization solution.
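The abstract describes the Split CIF only at a high level. In the standard formulation, each covariance is split into a possibly-correlated part and a known-independent part, P = Pd + Pi, and the correlated parts are inflated by a weight ω before an information-form fusion. A minimal sketch under those standard equations (the paper's exact implementation may differ):

```python
import numpy as np

def split_ci_fuse(x1, P1d, P1i, x2, P2d, P2i, n_omega=101):
    """Split Covariance Intersection fusion of two estimates whose
    covariances split as P = Pd + Pi (Pd possibly correlated between the
    sources, Pi known independent). The weight omega is chosen by a simple
    grid search minimizing the fused trace; a sketch, not the paper's code."""
    best = None
    for w in np.linspace(1e-3, 1 - 1e-3, n_omega):
        S1 = P1d / w + P1i          # inflate the correlated parts
        S2 = P2d / (1 - w) + P2i
        I1, I2 = np.linalg.inv(S1), np.linalg.inv(S2)
        P = np.linalg.inv(I1 + I2)  # information-form fusion
        x = P @ (I1 @ x1 + I2 @ x2)
        if best is None or np.trace(P) < best[2]:
            best = (x, P, np.trace(P))
    x, P, _ = best
    return x, P
```

For two symmetric scalar estimates the optimum sits near ω = 0.5 and the fused mean lands between the inputs, while the fused covariance stays consistent (no over-optimistic shrinkage from the correlated parts).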
Probabilistic Surfel Fusion for Dense LiDAR Mapping
With the recent development of high-end LiDARs, more and more systems are
able to continuously map the environment while moving and producing spatially
redundant information. However, none of the previous approaches were able to
effectively exploit this redundancy in a dense LiDAR mapping problem. In this
paper, we present a new approach for dense LiDAR mapping using probabilistic
surfel fusion. The proposed system is capable of reconstructing a high-quality
dense surface element (surfel) map from spatially redundant multiple views.
This is achieved by the proposed probabilistic surfel fusion together with a
geometry-aware data association. The proposed surfel data association
method considers surface resolution as well as the high measurement
uncertainty along the beam direction, which enables the mapping system to
control surface resolution without introducing spatial digitization. The
proposed fusion method successfully suppresses the map noise level by
considering measurement noise caused by the laser beam incidence angle and
depth in a
Bayesian filtering framework. Experimental results with simulated and real data
for the dense surfel mapping prove the ability of the proposed method to
accurately find the canonical form of the environment without further
post-processing.

Comment: Accepted in Multiview Relationships in 3D Data 2017 (IEEE
International Conference on Computer Vision Workshops).
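The abstract does not give the fusion equations; the usual probabilistic surfel update is an inverse-variance (Bayesian) running average in which each return is weighted by its estimated noise, here sketched with an assumed noise model driven by incidence angle and depth (the constants and the model shape are illustrative, not the paper's):

```python
import numpy as np

def surfel_weight(incidence_cos, depth, sigma0=0.01):
    """Heuristic confidence weight for a LiDAR return: head-on hits
    (large cosine of the incidence angle) at short range are trusted
    more. sigma0 is an assumed base range noise, not from the paper."""
    sigma = sigma0 * depth / max(incidence_cos, 1e-3)
    return 1.0 / (sigma * sigma)

def fuse_surfel(pos, w, meas_pos, meas_w):
    """Inverse-variance weighted running fusion of a surfel position
    with a new measurement (the Bayesian-filtering core of surfel maps)."""
    w_new = w + meas_w
    pos_new = (w * pos + meas_w * meas_pos) / w_new
    return pos_new, w_new
```

Fusing two equally weighted observations of a surfel simply averages them while accumulating the weight, so later noisy returns perturb a well-observed surfel less and less.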
Planetary rovers and data fusion
This research investigates the problem of position estimation for planetary rovers.
Diverse algorithmic filters are available for collecting input data and transforming
that data into information useful for the position estimation process. The
terrain has sandy soil, which might cause slipping of the robot, and small stones and
pebbles, which can affect the trajectory.
The Kalman filter, a state estimation algorithm, was used for fusing the sensor data
to improve the position measurement of the rover. For the rover application, the
locomotion errors accumulated by the rover are compensated by the Kalman
filter. The movement of a rover over rough terrain is challenging, especially with
limited sensors to tackle the problem. Thus, an initiative was taken to test drive
the rover during the field trial and expose the mobile platform to hard ground and
soft ground (sand). It was found that the LSV system produced speckle images and
values which proved invaluable for further research and for the implementation of
data fusion.
During the field trial, it was also discovered that on a flat, hard surface the
steering problem of the rover is minimal. However, when the rover was on
soft sand it tended to drift away and struggled to navigate.
This research introduced laser speckle velocimetry (LSV) as an alternative odometric
measurement. LSV data was gathered during the field trial for further simulation under
MATLAB, the computational/mathematical programming software used for
the simulation of the rover trajectory. The wheel encoders came with associated
errors during the position measurement process, as was also observed during the
earlier field trials. It was also discovered that the laser speckle velocimetry
measurement could measure position accurately, but at the same time the
sensitivity of the optics produced noise which needed to be addressed as an
error problem.
Though rough terrain is found on Mars, this paper is also applicable to a terrestrial
robot on Earth. There are regions of Earth with rough terrain, and regions
which are hard to measure with encoders; this is especially true of icy
places like Antarctica and Greenland.
The proposed implementation for the development of the locomotion system is to
model a system for position estimation through simulation and through collecting
data using the LSV. Two simulations are performed: one is the differential
drive of a two-wheel robot, and the second involves the fusion of the differential
drive robot data and the LSV data collected from the rover testbed. The results have
been positive. The expected contributions of the research work include a design
of an LSV system to aid the locomotion measurement system.
Simulation results show the effect of different sensors and of the velocity of the
robot. The Kalman filter improves the position estimation process.
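The thesis does not reproduce its filter equations in the abstract; a minimal one-dimensional sketch of the fusion it describes, predicting position from wheel-odometry velocity and correcting with an LSV-derived displacement, could look as follows (the noise variances q and r are assumptions, not the thesis values):

```python
def kf_step(x, P, u, z_lsv, q=0.05, r=0.01, dt=0.1):
    """One predict/update cycle of a 1D Kalman filter for rover position.

    Predict: dead-reckon with wheel-odometry velocity u (slip inflates P).
    Update: correct with z_lsv, a position measurement obtained by
    integrating LSV velocity. A hedged illustration of the fusion scheme.
    """
    # Predict step: propagate the state and grow the uncertainty.
    x_pred = x + u * dt
    P_pred = P + q
    # Update step: standard Kalman gain and correction.
    K = P_pred / (P_pred + r)
    x_new = x_pred + K * (z_lsv - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new
```

Starting from an uncertain position (P = 1) and fusing one accurate LSV reading pulls the estimate toward the measurement and sharply reduces the variance, which is the improvement the simulations report.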
Modeling and Control for Vision Based Rear Wheel Drive Robot and Solving Indoor SLAM Problem Using LIDAR
abstract: To achieve the ambitious long-term goal of a fleet of cooperating Flexible Autonomous
Machines operating in an uncertain Environment (FAME), this thesis addresses several
critical modeling, design and control objectives for rear-wheel drive ground vehicles.
One central objective of the thesis was to show how to build a low-cost, multi-capability robot platform
that can be used for conducting FAME research.
A TFC-KIT car chassis was augmented to provide a suite of substantive capabilities.
The augmented vehicle (FreeSLAM Robot) costs less than $2,000.
All demonstrations presented involve rear-wheel drive FreeSLAM robot. The following
summarizes the key hardware demonstrations presented and analyzed:
(1) Cruise (v, ) control along a line,
(2) Cruise (v, ) control along a curve,
(3) Planar (x, y) Cartesian Stabilization for rear wheel drive vehicle,
(4) Finish the track with camera pan tilt structure in minimum time,
(5) Finish the track without camera pan tilt structure in minimum time,
(6) Vision based tracking performance with different cruise speed vx,
(7) Vision based tracking performance with different camera fixed look-ahead distance L,
(8) Vision based tracking performance with different delay Td from vision subsystem,
(9) Manually remote controlled robot to perform indoor SLAM,
(10) Autonomously line guided robot to perform indoor SLAM.
For most cases, hardware data is compared with, and corroborated by, model based
simulation data. In short, the thesis uses a low-cost, self-designed rear-wheel
drive robot to demonstrate many capabilities that are critical to reaching the
longer-term FAME goal.

Dissertation/Thesis Defense Presentation. Masters Thesis, Electrical Engineering, 201
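The abstract lists vision-based tracking with a fixed camera look-ahead distance L but does not state the control law. A common choice for that setup is pure-pursuit-style steering from the look-ahead point; a sketch under that assumption (the wheelbase and the controller are illustrative, not the thesis design):

```python
import math

def pure_pursuit_steer(lateral_error, L, wheelbase=0.2):
    """Pure-pursuit-style steering for a rear-wheel drive bicycle model:
    given the path's lateral offset seen by the camera at look-ahead
    distance L, return a steering angle. An assumed controller, offered
    only to illustrate the role of the look-ahead distance L."""
    # Curvature of the circular arc through the look-ahead point.
    kappa = 2.0 * lateral_error / (L * L + lateral_error * lateral_error)
    return math.atan(wheelbase * kappa)
```

A larger L makes the computed curvature, and hence the steering, gentler for the same lateral error, which matches the trade-off explored in demonstrations (6)-(8).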