4 research outputs found

    Simultaneous Localization and Mapping (SLAM) for Autonomous Driving: Concept and Analysis

    The Simultaneous Localization and Mapping (SLAM) technique has achieved astonishing progress over the last few decades and has generated considerable interest in the autonomous driving community. With its conceptual roots in navigation and mapping, SLAM outperforms some traditional positioning and localization techniques, since it can support more reliable and robust localization, planning, and control, meeting key criteria for autonomous driving. In this study, the authors first give an overview of the different SLAM implementation approaches and then discuss the applications of SLAM for autonomous driving with respect to different driving scenarios, vehicle system components, and the characteristics of the SLAM approaches. The authors then discuss some challenging issues and current solutions when applying SLAM to autonomous driving. Quantitative quality-analysis methods for evaluating the characteristics and performance of SLAM systems and for monitoring the risk in SLAM estimation are reviewed. In addition, this study describes a real-world road test to demonstrate a multi-sensor-based, modernized SLAM procedure for autonomous driving. The numerical results show that a high-precision 3D point cloud map can be generated by the SLAM procedure with the integration of Lidar and GNSS/INS, and that an online localization solution with 4-5 cm accuracy can be achieved based on this pre-generated map and online Lidar scan matching with a tightly fused inertial system.
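
    As a rough illustration of the online localization step described above, the sketch below matches an incoming Lidar scan against a pre-generated point cloud map using ICP, seeded with a pose prediction from the GNSS/INS integration. This is not the authors' pipeline; it uses Open3D, and the surrounding helper names (ins.predict, fusion_filter.update) are hypothetical placeholders.

```python
# Minimal sketch (assumed setup, not the paper's implementation): refine a
# GNSS/INS-predicted vehicle pose by ICP scan matching against a prebuilt map.
import open3d as o3d

def localize_scan(map_cloud, scan_cloud, T_predicted, max_corr_dist=1.0):
    """map_cloud/scan_cloud: open3d.geometry.PointCloud; T_predicted: 4x4 prior pose."""
    # Point-to-plane ICP needs normals on the target (map) cloud;
    # in practice these would be precomputed once when the map is loaded.
    map_cloud.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=2.0, max_nn=30))
    result = o3d.pipelines.registration.registration_icp(
        scan_cloud, map_cloud, max_corr_dist, T_predicted,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return result.transformation, result.fitness

# Typical online loop (hypothetical surrounding APIs):
# T_ins = ins.predict(dt)                        # inertial/GNSS pose prediction
# T_lidar, fitness = localize_scan(map_pcd, scan_pcd, T_ins)
# fusion_filter.update(T_lidar)                  # tightly coupled fusion step
```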

    Mobile Robot Manipulator System Design for Localization and Mapping in Cluttered Environments

    In this thesis, a compact mobile robot has been developed to build real-time 3D maps of hazards and cluttered environments inside damaged buildings for rescue tasks, using visual Simultaneous Localization And Mapping (SLAM) algorithms. In order to maximize the survey area in such environments, the mobile robot is designed with four omni-wheels and equipped with a 6 Degree of Freedom (DOF) robotic arm carrying a stereo camera mounted on its end-effector. The aim of this mobile articulated robotic system is to monitor different types of regions within the area of interest, ranging from wide open spaces to smaller and irregular regions behind narrow gaps. In the first part of the thesis, the robot system design is presented in detail, including the kinematic systems of the omni-wheeled mobile platform and the 6-DOF robotic arm, the estimation of biases in the parameters of these kinematic systems, and the sensors and the calibration of their parameters. These parameters are important for the sensor fusion utilized in the next part of the thesis, where two operation modes are proposed to retain the camera pose when the visual SLAM algorithms fail due to the variety of region types. In the second part, an integrated sensor-fusion, odometry, and SLAM scheme is developed, in which the camera poses are estimated using the forward kinematic equations of the robotic arm and fused with the visual SLAM and odometry algorithms. A modified wavefront algorithm with reduced computational complexity is used to find the shortest path to reach the identified goal points. Finally, a dynamic control scheme is developed for path tracking and motion control of the mobile platform and the robot arm, with sub-systems in the form of PD controllers and extended Kalman filters. The overall system design is physically implemented on a prototype integrated mobile robot platform and successfully tested in real time.
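
    The abstract mentions a modified wavefront algorithm for shortest-path planning; its specific modifications are not described here, so the sketch below only illustrates the standard wavefront planner on an occupancy grid (breadth-first expansion from the goal, then descent of the resulting distance field). The grid, start, and goal values are made up for the example.

```python
# Standard wavefront planner sketch (not the thesis's modified variant).
# Grid cells: 0 = free, 1 = obstacle. Returns a list of (row, col) cells.
from collections import deque

def wavefront_path(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    dist[goal[0]][goal[1]] = 0
    queue = deque([goal])
    nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    while queue:                                  # wave expanding from the goal
        r, c = queue.popleft()
        for dr, dc in nbrs:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and dist[nr][nc] is None:
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    if dist[start[0]][start[1]] is None:
        return None                               # goal unreachable from start
    path, cell = [start], start
    while cell != goal:                           # descend the distance field
        r, c = cell
        cell = min(((r + dr, c + dc) for dr, dc in nbrs
                    if 0 <= r + dr < rows and 0 <= c + dc < cols
                    and dist[r + dr][c + dc] is not None),
                   key=lambda p: dist[p[0]][p[1]])
        path.append(cell)
    return path

# Example on a small grid with an L-shaped obstacle:
grid = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 1, 0],
        [1, 1, 0, 1, 0],
        [0, 0, 0, 0, 0]]
print(wavefront_path(grid, start=(0, 0), goal=(4, 4)))
```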

    Analytical simultaneous localization and mapping without linearization

    Thesis: Ph.D., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2017. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 161-173). This thesis solves the classical problem of simultaneous localization and mapping (SLAM) in a fashion that avoids linearized approximations altogether. Based on creating virtual synthetic measurements, the algorithm uses a linear time-varying (LTV) Kalman observer, bypassing the errors and approximations introduced by the linearization process in traditional extended Kalman filter (EKF) SLAM. Convergence rates of the algorithm are established using contraction analysis. Different combinations of sensor information can be exploited, such as bearing measurements, range measurements, optical flow, or time-to-contact. As illustrated in simulations, the proposed algorithm can solve SLAM problems in both 2D and 3D scenarios with guaranteed convergence rates in a fully nonlinear context. A novel distributed algorithm, SLAM-DUNK, is also proposed in the thesis. The algorithm uses virtual vehicles so that information is obtained exclusively from the corresponding landmarks, and its computational complexity is reduced to O(n); simulations on the Victoria Park dataset support the validity of the algorithm. In the final section of the thesis, we propose a general framework for cooperative navigation and mapping. The frameworks developed for three different use cases use the null-space terms of the SLAM problem to guarantee that robots starting with unknown initial conditions converge to a shared consensus coordinate system with estimates reflecting the truth. By Feng Tan. Ph.D.
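
    To make the contrast with EKF-SLAM concrete, the sketch below shows one predict/update cycle of a generic discrete-time linear time-varying (LTV) Kalman observer: because the virtual synthetic measurements are constructed to be linear in the state, the measurement matrix H comes directly from that construction and no Jacobian linearization is needed. The construction of the virtual measurements themselves is the thesis's contribution and is not reproduced here; the matrices A, Q, H, R are assumed given at each time step.

```python
# Generic LTV Kalman observer step (illustrative, not the thesis's exact algorithm).
import numpy as np

def ltv_kalman_step(x, P, A, Q, H, R, z):
    """One predict/update cycle with time-varying process matrix A and measurement matrix H."""
    # Predict through the linear (time-varying) process model.
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update with the linear virtual measurement z = H x + noise.
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # observer (Kalman) gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```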