64 research outputs found

    From Monocular SLAM to Autonomous Drone Exploration

    Full text link
    Micro aerial vehicles (MAVs) are strongly limited in their payload and power capacity. To achieve autonomous navigation, algorithms are therefore needed that use sensory equipment that is as small, lightweight, and low-power as possible. In this paper, we propose a method for autonomous MAV navigation and exploration using a low-cost, consumer-grade quadrocopter equipped with a monocular camera. Our vision-based navigation system builds on LSD-SLAM, which estimates the MAV trajectory and a semi-dense reconstruction of the environment in real time. Since LSD-SLAM only determines depth at high-gradient pixels, texture-less areas are not directly observed, so previous exploration methods that assume dense map information cannot be applied directly. We propose an obstacle mapping and exploration approach that takes the properties of our semi-dense monocular SLAM system into account. In experiments, we demonstrate our vision-based autonomous navigation and exploration system with a Parrot Bebop MAV.
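
    The semi-dense nature of the map is the key constraint here: depth exists only at high-gradient pixels, so texture-less regions stay unobserved. Below is a minimal sketch of one way such measurements could feed a frontier-style exploration map; the 2D grid, Bresenham ray-tracing, and frontier rule are illustrative assumptions, not the paper's actual obstacle-mapping algorithm.

import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def bresenham(c0, c1):
    """Integer grid cells on the segment from c0 to c1, inclusive."""
    (x0, y0), (x1, y1) = c0, c1
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err, cells = dx - dy, []
    while True:
        cells.append((x0, y0))
        if (x0, y0) == (x1, y1):
            return cells
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x0 += sx
        if e2 < dx:
            err += dx
            y0 += sy

def integrate(grid, cam_xy, points_xy, res=0.1):
    """Trace FREE space from the camera to each semi-dense point; the end
    cell becomes OCCUPIED, and cells never touched stay UNKNOWN."""
    to_cell = lambda p: (int(p[0] / res), int(p[1] / res))
    c0 = to_cell(cam_xy)
    for p in points_xy:
        ray = bresenham(c0, to_cell(p))
        for c in ray[:-1]:
            if grid[c] != OCCUPIED:      # never overwrite an obstacle
                grid[c] = FREE
        grid[ray[-1]] = OCCUPIED

def frontiers(grid):
    """FREE cells bordering UNKNOWN cells are candidate exploration goals."""
    goals = []
    for x, y in np.argwhere(grid == FREE):
        if (grid[max(x - 1, 0):x + 2, max(y - 1, 0):y + 2] == UNKNOWN).any():
            goals.append((x, y))
    return goals

grid = np.full((100, 100), UNKNOWN, dtype=int)
integrate(grid, cam_xy=(5.0, 5.0), points_xy=[(7.3, 5.1), (7.1, 6.0)])
print(len(frontiers(grid)), "frontier cells")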

    Semi-dense SLAM on an FPGA SoC

    No full text
    Deploying advanced Simultaneous Localisation and Mapping (SLAM) algorithms in autonomous, low-power robotics will enable emerging applications that require an accurate and information-rich reconstruction of the environment. This has not been achieved so far because accuracy and dense 3D reconstruction come with high computational complexity. This paper discusses a custom hardware design on a novel platform for embedded SLAM, an FPGA-SoC, which combines an embedded CPU and programmable logic on the same chip. The use of programmable logic, tightly integrated with an efficient multicore embedded CPU, stands to provide an effective solution to this problem. In this work, an average framerate of more than 4 frames/second at a resolution of 320×240 has been achieved, with an estimated power of less than 1 Watt for the custom hardware. In comparison to the software-only version running on a dual-core ARM processor, an acceleration of 2× has been achieved for LSD-SLAM without any compromise in the quality of the result.
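
    A quick back-of-envelope reading of the reported figures, assuming the 4 frames/second, 1 Watt, and 2× numbers above are taken at face value:

# Per-frame latency and energy implied by the abstract's reported bounds.
fps_fpga = 4.0                 # frames per second (lower bound reported)
power_w = 1.0                  # watts (upper bound reported)
pixels = 320 * 240             # frame resolution used in the evaluation

latency_ms = 1000.0 / fps_fpga        # <= 250 ms per frame
energy_per_frame = power_w / fps_fpga  # <= 0.25 J per frame
fps_arm_only = fps_fpga / 2.0          # ~2 fps implied for the software-only ARM version

print(f"per-frame latency <= {latency_ms:.0f} ms")
print(f"per-frame energy  <= {energy_per_frame:.2f} J")
print(f"implied ARM-only  ~  {fps_arm_only:.1f} fps for {pixels}-pixel frames")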

    A low-cost vision-based unmanned aerial system for extremely low-light GPS-denied navigation and thermal imaging

    No full text
    This paper presents the design and implementation details of a complete unmanned aerial system (UAS) based on commercial-off-the-shelf (COTS) components, focusing on safety, security, and search-and-rescue scenarios in GPS-denied environments. In particular, the aerial platform is capable of semi-autonomously navigating through extremely low-light, GPS-denied indoor environments using onboard sensors only, including a downward-facing optical-flow camera. In addition, a low-cost payload camera system is developed to stream both infrared video and visible-light video to a ground station in real time, for the purpose of detecting signs of life and hidden humans. The total cost of the complete system is estimated to be $1150, and the effectiveness of the system has been tested and validated in practical scenarios.
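
    As an illustration of the optical-flow component mentioned above, the sketch below converts the mean flow of a downward-facing camera into a metric velocity estimate. The scaling by height-above-ground and focal length is a standard assumption for illustration, not the authors' documented implementation.

import cv2
import numpy as np

def ground_velocity(prev_gray, curr_gray, dt, z, f):
    """Average dense optical flow (pixels/frame) scaled to metres/second,
    assuming a known height-above-ground z and focal length f in pixels."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mean_px = flow.reshape(-1, 2).mean(axis=0)   # mean (du, dv) in pixels
    return mean_px * z / (f * dt)                # (vx, vy) in m/s

# Usage with two synthetic frames 20 ms apart, 1 m above ground, f = 300 px:
a = np.random.randint(0, 255, (120, 160), np.uint8)
b = np.roll(a, 2, axis=1)                        # image content shifted by 2 px
print(ground_velocity(a, b, dt=0.02, z=1.0, f=300.0))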

    Vision-Based Monocular SLAM in Micro Aerial Vehicle

    Get PDF
    Micro Aerial Vehicles (MAVs) are popular for their efficiency, agility, and light weight. They can navigate in dynamic environments that cannot be accessed by humans or traditional aircraft. These MAVs typically rely on GPS, which makes operation difficult in GPS-denied areas where the signal is obstructed by buildings and other obstacles. Simultaneous Localization and Mapping (SLAM) in an unknown environment can solve the aforementioned problems faced by flying robots. ORB-SLAM, a rotation- and scale-invariant vision-based solution built on Oriented FAST and Rotated BRIEF (ORB) features, is one of the best solutions for localization and mapping with monocular vision. In this paper, ORB-SLAM3 is used to localize the Tello micro aerial vehicle and map an unknown environment. The effectiveness of ORB-SLAM3 was tested in a variety of indoor environments. An integrated adaptive controller was used for autonomous flight, relying on the 3D map produced by ORB-SLAM3 and our proposed novel technique for robust initialization of the SLAM system during flight. The results show that ORB-SLAM3 can provide accurate localization and mapping for flying robots, even in challenging scenarios with fast motion, large camera movements, and dynamic environments. Furthermore, our results show that the proposed system is capable of navigating and mapping challenging indoor situations.
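
    The abstract does not detail the adaptive controller or the initialization technique, so the sketch below only illustrates the general idea of closing a position loop on the (up-to-scale) ORB-SLAM3 pose. The slam_pose and send_velocity callables are hypothetical placeholders, and the gains are assumed.

import numpy as np

KP = 0.8      # proportional gain (assumed)
V_MAX = 0.5   # velocity clamp in map units/s (monocular scale is arbitrary)

def position_step(goal_xyz, slam_pose, send_velocity):
    """One control tick: steer the MAV toward goal_xyz in the SLAM map frame."""
    pos = slam_pose()                         # hypothetical: returns np.array([x, y, z])
    err = np.asarray(goal_xyz) - pos
    cmd = np.clip(KP * err, -V_MAX, V_MAX)    # saturated proportional control
    send_velocity(cmd)                        # hypothetical velocity command interface
    return np.linalg.norm(err)

# Example tick with stand-in callables:
dist = position_step([1.0, 0.0, 1.2],
                     slam_pose=lambda: np.array([0.0, 0.0, 1.0]),
                     send_velocity=lambda v: print("cmd", v))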

    Collaborating Low Cost Micro Aerial Vehicles: A Demonstration

    Get PDF
    In this paper we demonstrate our Distributed Collaborative Tracking and Mapping (DCTAM) system for collaborative localisation and mapping with teams of Micro Aerial Vehicles (MAVs). DCTAM uses a distributed architecture which allows us to run both image capture and frame-to-frame tracking on-board the MAV while offloading the more computationally demanding tasks of map creation/refinement to an off-board computer. The low computational cost of the localisation components of our system allows us to run additional software on-board, such as an Extended Kalman Filter (EKF) for full state estimation and a PID-based position controller. This allows us to demonstrate complete cooperative autonomous operation.
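
    The abstract names an EKF for full state estimation but not its design. The following minimal sketch assumes a constant-velocity model with position-only updates from the on-board tracker, which reduces to a linear Kalman filter; all noise parameters are assumed.

import numpy as np

class SimpleKF:
    """Constant-velocity filter on [px, py, pz, vx, vy, vz] with position updates."""
    def __init__(self, dt=0.02):
        self.x = np.zeros(6)
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)               # position integrates velocity
        self.Q = 1e-3 * np.eye(6)                      # process noise (assumed)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.R = 1e-2 * np.eye(3)                      # tracker noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, pos_meas):
        y = pos_meas - self.H @ self.x                 # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

kf = SimpleKF()
kf.predict()
kf.update(np.array([0.1, 0.0, 1.0]))
print(kf.x[:3])   # fused position estimate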

    Low-Cost Multiple-MAV SLAM Using Open Source Software

    Get PDF
    We demonstrate a multiple micro aerial vehicle (MAV) system capable of supporting autonomous exploration and navigation in unknown environments using only a sensor commonly found in low-cost, commercially available MAVs: a front-facing monocular camera. We adapt a popular open-source monocular SLAM library, ORB-SLAM, to support multiple inputs, and present a system capable of effective cross-map alignment that can in principle be generalized for use with other monocular SLAM libraries. Using our system, a single central ground control station is capable of supporting up to five MAVs simultaneously without a loss in mapping quality compared to single-MAV ORB-SLAM. We conduct testing using both benchmark datasets and real-world trials to demonstrate the system's capability and real-time effectiveness.
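
    The abstract does not specify how cross-map alignment is performed. A system like this could build on the standard closed-form similarity alignment (Umeyama's method) between landmarks matched across two MAVs' maps, sketched below on synthetic data as an illustration only.

import numpy as np

def umeyama(src, dst):
    """Closed-form similarity alignment (Umeyama, 1991): returns (s, R, t)
    with dst_i ~= s * R @ src_i + t for matched 3D points src, dst."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                        # guard against a reflection
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_d - s * R @ mu_s
    return s, R, t

# Self-check with a known synthetic transform between two "maps":
rng = np.random.default_rng(0)
map_a = rng.normal(size=(20, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
map_b = 2.0 * map_a @ R_true.T + np.array([1.0, -0.5, 0.2])
s, R, t = umeyama(map_a, map_b)
print(round(s, 3), np.round(t, 3))            # recovers ~2.0 and (1.0, -0.5, 0.2)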