528 research outputs found

    Active Mapping and Robot Exploration: A Survey

    Simultaneous localization and mapping addresses the problem of building a map of the environment without any prior information, using the data obtained from one or more sensors. In most situations the robot is driven by a human operator, but some systems are capable of navigating autonomously while mapping, which is called active simultaneous localization and mapping. This strategy focuses on actively computing the trajectories needed to explore the environment while building a map with minimum error. In this paper, a comprehensive review of the research work developed in this field is provided, targeting the most relevant contributions in indoor mobile robotics. This research was funded by the ELKARTEK project ELKARBOT KK-2020/00092 of the Basque Government.
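
    As a rough illustration of the core active-SLAM decision described above (choosing where to move next so that exploration reduces map uncertainty), the following sketch selects a next viewpoint on a 2D occupancy grid by trading expected information gain against travel cost. The grid, sensor footprint, and cost weight are illustrative assumptions and do not correspond to any particular surveyed system.

```python
import numpy as np

def cell_entropy(p):
    """Shannon entropy of occupancy probabilities (0 bits = fully known)."""
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def expected_gain(occ_grid, view, radius=5):
    """Total entropy inside a circular sensor footprint centred on `view`."""
    ys, xs = np.ogrid[:occ_grid.shape[0], :occ_grid.shape[1]]
    footprint = (ys - view[0]) ** 2 + (xs - view[1]) ** 2 <= radius ** 2
    return cell_entropy(occ_grid[footprint]).sum()

def next_best_view(occ_grid, robot_pos, candidates, travel_weight=0.5):
    """Greedy utility: expected information gain minus weighted travel distance."""
    def utility(view):
        dist = np.hypot(view[0] - robot_pos[0], view[1] - robot_pos[1])
        return expected_gain(occ_grid, view) - travel_weight * dist
    return max(candidates, key=utility)

if __name__ == "__main__":
    grid = np.full((50, 50), 0.5)   # 0.5 = unknown occupancy everywhere
    grid[:25, :] = 0.05             # upper half already mapped as free space
    print(next_best_view(grid, robot_pos=(10, 10),
                         candidates=[(12, 12), (30, 25), (45, 45)]))
```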

    Autonomous 3D Exploration of Large Structures Using an UAV Equipped with a 2D LIDAR

    This paper addressed the challenge of exploring large, unknown, and unstructured industrial environments with an unmanned aerial vehicle (UAV). The resulting system combined well-known components and techniques with a new manoeuvre that uses a low-cost 2D laser to measure a 3D structure. Our approach combined frontier-based exploration, the Lazy Theta* path planner, and a flyby sampling manoeuvre to create a 3D map of large scenarios. One of the novelties of our system is that all the algorithms relied on the multi-resolution octomap for the world representation. We used a Hardware-in-the-Loop (HitL) simulation environment to collect accurate measurements of the capability of the open-source system to run online and on board the UAV in real time. Our approach is compared against different reference heuristics in this simulation environment, showing better performance with regard to the amount of explored space. With the proposed approach, the UAV is able to explore 93% of the search space in under 30 min, generating a path without repetition that adjusts to the occupied space and covers indoor locations, irregular structures, and suspended obstacles. European Union Marie Sklodowska-Curie 64215. European Union MULTIDRONE (H2020-ICT-731667). European Union HYFLIERS (H2020-ICT-779411).
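
    The pipeline outlined above (frontier detection, Lazy Theta* planning, flyby sampling) operates on a multi-resolution octomap; the following simplified sketch illustrates only the first stage, frontier detection, on a flat 2D grid. The cell codes and the 8-connected neighbourhood are illustrative assumptions, not the paper's actual representation.

```python
import numpy as np

FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def find_frontiers(grid):
    """Return (row, col) of free cells that touch at least one unknown cell."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            neighbours = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (neighbours == UNKNOWN).any():
                frontiers.append((r, c))
    return frontiers

if __name__ == "__main__":
    g = np.full((6, 6), UNKNOWN)
    g[:3, :] = FREE            # explored free space in the upper half
    g[2, 4] = OCCUPIED         # a mapped obstacle on the boundary
    print(find_frontiers(g))   # free cells bordering the unknown lower half
```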

    Marker based Thermal-Inertial Localization for Aerial Robots in Obscurant Filled Environments

    For robotic inspection tasks in known environments, fiducial markers provide a reliable and low-cost solution for robot localization. However, detection of such markers relies on the quality of RGB camera data, which degrades significantly in the presence of visual obscurants such as fog and smoke. The ability to navigate known environments in the presence of obscurants can be critical for inspection tasks, especially in the aftermath of a disaster. Addressing such a scenario, this work proposes a method for the design of fiducial markers to be used with thermal cameras for the pose estimation of aerial robots. Our low-cost markers are designed to work in the long-wave infrared spectrum, which is not affected by the presence of obscurants, and can be affixed to any object that has a measurable temperature difference with respect to its surroundings. Furthermore, the estimated pose from the fiducial markers is fused with inertial measurements in an extended Kalman filter to remove high-frequency noise and error present in the fiducial pose estimates. The proposed markers and the pose estimation method are experimentally evaluated in an obscurant-filled environment using an aerial robot carrying a thermal camera. Comment: 10 pages, 5 figures, published in International Symposium on Visual Computing 201
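
    A minimal one-axis sketch of the fusion step described above: an accelerometer-driven prediction followed by a position correction whenever a thermal-marker pose arrives, which smooths the high-frequency noise in the raw marker estimates. The constant-acceleration model, noise parameters, and update rate are illustrative assumptions rather than the paper's actual filter.

```python
import numpy as np

def predict(x, P, accel, dt, q=0.05):
    """Propagate [position, velocity] using a measured acceleration."""
    F = np.array([[1, dt], [0, 1]])
    B = np.array([0.5 * dt ** 2, dt])
    x = F @ x + B * accel
    P = F @ P @ F.T + q * np.eye(2)
    return x, P

def update(x, P, marker_pos, r=0.1):
    """Correct the state with a (noisy) position from the fiducial marker."""
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + r
    K = P @ H.T / S
    x = x + (K * (marker_pos - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

if __name__ == "__main__":
    x, P = np.zeros(2), np.eye(2)
    for k in range(50):
        x, P = predict(x, P, accel=0.2, dt=0.02)
        if k % 10 == 0:                            # marker detections arrive slower than IMU data
            x, P = update(x, P, marker_pos=0.5 * 0.2 * (0.02 * k) ** 2)
    print("fused position/velocity:", x)
```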

    Perception-aware receding horizon trajectory planning for multicopters with visual-inertial odometry

    Visual-inertial odometry (VIO) is widely used for the state estimation of multicopters, but it may function poorly in environments with few visual features or during overly aggressive flights. In this work, we propose a perception-aware collision-avoidance trajectory planner for multicopters that may be used with any feature-based VIO algorithm. Our approach is able to fly the vehicle to a goal position at high speed, avoiding obstacles in an unknown stationary environment while achieving good VIO state-estimation accuracy. The proposed planner samples a group of minimum-jerk trajectories and finds the collision-free trajectories among them, which are then evaluated based on their speed towards the goal and their perception quality. Both the motion blur of features and their locations are considered in the perception quality. Our novel consideration of feature motion blur enables automatic adaptation of the trajectory's aggressiveness in environments with different light levels. The best trajectory from the evaluation is tracked by the vehicle and updated in a receding-horizon manner when new images are received from the camera. Only generic assumptions about the VIO are made, so the planner may be used with various existing systems. The proposed method can run in real time on a small embedded computer on board. We validated the effectiveness of our proposed approach through experiments in both indoor and outdoor environments. Compared to a perception-agnostic planner, the proposed planner kept more features in the camera's view and made the flight less aggressive, making the VIO more accurate. It also reduced VIO failures, which occurred for the perception-agnostic planner but not for the proposed planner. The ability of the proposed planner to fly through dense obstacles was also validated. The experiment video can be found at https://youtu.be/qO3LZIrpwtQ. Comment: 12 pages
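
    A minimal sketch of the evaluation step described above: candidate trajectories are filtered for collisions, and the survivor with the best combination of progress toward the goal and perception quality is selected, where perception quality rewards features kept in view and penalises motion blur (approximated here as pixel displacement during the exposure). The weights, blur model, and data layout are illustrative assumptions, not the paper's exact cost.

```python
import numpy as np

def blur_penalty(feature_px_velocity, exposure_time):
    """Approximate motion blur as feature pixel displacement during the exposure."""
    return feature_px_velocity * exposure_time

def score(traj, goal, features_in_view, feature_px_velocity,
          exposure_time, w_goal=1.0, w_feat=0.2, w_blur=0.5):
    """Combine progress toward the goal with a perception-quality term."""
    progress = -np.linalg.norm(traj[-1] - goal)        # closer end point is better
    perception = (w_feat * features_in_view
                  - w_blur * blur_penalty(feature_px_velocity, exposure_time))
    return w_goal * progress + perception

def pick_trajectory(candidates, collides, goal, exposure_time):
    """Keep collision-free candidates and return the highest-scoring one."""
    safe = [c for c in candidates if not collides(c["points"])]
    if not safe:
        return None                                     # caller should stop or replan
    return max(safe, key=lambda c: score(c["points"], goal,
                                         c["features_in_view"],
                                         c["feature_px_velocity"],
                                         exposure_time))

if __name__ == "__main__":
    goal = np.array([5.0, 0.0, 1.5])
    cands = [
        {"points": np.array([[0, 0, 1.5], [2, 0, 1.5]]),
         "features_in_view": 40, "feature_px_velocity": 80.0},
        {"points": np.array([[0, 0, 1.5], [3, 0, 1.5]]),
         "features_in_view": 15, "feature_px_velocity": 300.0},
    ]
    best = pick_trajectory(cands, collides=lambda pts: False,
                           goal=goal, exposure_time=0.01)
    print("selected end point:", best["points"][-1])
```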