584 research outputs found

    End-to-End Nano-Drone Obstacle Avoidance for Indoor Exploration

    Autonomous navigation of drones using computer vision has achieved promising performance. Nano-sized drones based on edge-computing platforms are lightweight, flexible, and cheap, making them well suited to exploring narrow spaces. However, due to their extremely limited computing power and storage, vision algorithms designed for high-performance GPU platforms cannot be used on nano-drones. To address this issue, this paper presents a lightweight CNN depth-estimation network deployed on nano-drones for obstacle avoidance. Inspired by knowledge distillation (KD), a Channel-Aware Distillation Transformer (CADiT) is proposed to help the small network learn knowledge from a larger one. The proposed method is validated on the KITTI dataset and tested on a Crazyflie nano-drone with an ultra-low-power GAP8 microprocessor. The paper also implements a communication pipeline so that the collected images can be streamed to a laptop through the onboard Wi-Fi module in real time, enabling offline reconstruction of the environment.
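    The paper's CADiT module itself is not reproduced here, but the general recipe it builds on (a small student depth network matching the features of a larger teacher, alongside a supervised depth loss) can be sketched as follows. This is a minimal illustration assuming PyTorch; the function names, the channel-normalized matching term, and the loss weighting are assumptions, not the paper's implementation.

```python
# Minimal sketch of feature-level knowledge distillation for a small
# student depth network; a loose stand-in for the paper's Channel-Aware
# Distillation Transformer, not its actual implementation.
import torch.nn.functional as F

def distillation_loss(student_feat, teacher_feat,
                      student_depth, gt_depth, alpha=0.5):
    """Supervised depth loss plus a channel-normalized feature-matching term.

    student_feat / teacher_feat: (B, C, H, W) feature maps with matching
    shapes (a 1x1 projection would be needed if channel counts differ).
    """
    # Normalize each channel's spatial activation pattern so the student
    # imitates where the teacher responds, not its raw magnitudes.
    s = F.normalize(student_feat.flatten(2), dim=-1)
    t = F.normalize(teacher_feat.flatten(2), dim=-1)
    distill = F.mse_loss(s, t)
    # Plain supervised term on the predicted depth map (L1 is common on KITTI).
    supervised = F.l1_loss(student_depth, gt_depth)
    return alpha * distill + (1.0 - alpha) * supervised
```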

    Bird's Eye View: Cooperative Exploration by UGV and UAV

    This paper proposes a solution to the problem of cooperative exploration using an Unmanned Ground Vehicle (UGV) and an Unmanned Aerial Vehicle (UAV). More specifically, the UGV navigates through the free space, and the UAV provides enhanced situational awareness via its higher vantage point. The motivating application is search and rescue in a damaged building. A camera atop the UGV is used to track a fiducial tag on the underside of the UAV, allowing the UAV to maintain a fixed pose relative to the UGV. Furthermore, the UAV uses its front-facing camera to provide a bird's-eye view to the remote operator, allowing for observation beyond obstacles that obscure the UGV's sensors. The proposed approach has been tested using a TurtleBot 2 equipped with a Hokuyo laser range finder and a Parrot Bebop 2. Experimental results demonstrate the feasibility of this approach. This work is based on several open-source packages, and the generated code will be made available online.
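    As a rough sketch of the tag-tracking step, the relative pose of the UAV can be recovered from the detected corners of a square fiducial with a single PnP solve. The snippet below assumes OpenCV and a tag detector (e.g., AprilTag or ArUco) that already returns corner pixels; the tag size, function names, and the controller remark are illustrative, not taken from the released packages.

```python
# Minimal sketch of keeping a UAV at a fixed pose relative to a UGV by
# tracking a fiducial tag with the UGV's upward-facing camera.
import cv2
import numpy as np

TAG_SIZE = 0.10  # tag edge length in meters (assumed)
# 3-D corners of the tag in its own frame, in the order required by
# OpenCV's IPPE_SQUARE solver (TL, TR, BR, BL).
OBJ_PTS = 0.5 * TAG_SIZE * np.array(
    [[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]], dtype=np.float32)

def tag_pose(corners_px, K, dist):
    """Recover the tag's pose in the camera frame from its 2-D corners.

    corners_px: (4, 2) detected corner pixels; K, dist: camera intrinsics.
    Returns (rvec, tvec), where tvec is the tag position in meters.
    """
    ok, rvec, tvec = cv2.solvePnP(OBJ_PTS, corners_px.astype(np.float32),
                                  K, dist, flags=cv2.SOLVEPNP_IPPE_SQUARE)
    return (rvec, tvec) if ok else (None, None)

# A simple proportional controller could then command UAV velocity from
# the error between tvec and the desired relative offset.
```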

    On-Board Relative Bearing Estimation for Teams of Drones Using Sound

    In a team of autonomous drones, individual knowledge about the relative location of teammates is essential. Existing relative-positioning solutions for teams of small drones mostly rely on external systems, such as motion-tracking cameras or GPS satellites, that might not always be accessible. In this letter, we describe an onboard solution to measure the 3-D relative direction between drones using sound as the main source of information. First, we describe a method to measure the directions of other robots by perceiving their engine sounds in the absence of self-engine noise. We then extend the method to use active acoustic signaling to obtain the relative directions in the presence of self-engine noise, to increase the detection range, and to discriminate the identity of robots. The methods are evaluated in real-world experiments, and a fully autonomous leader-following behavior is demonstrated with two drones using the proposed system.
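    The standard building block behind this kind of acoustic direction finding is a time-difference-of-arrival (TDOA) estimate between microphone pairs, e.g., via GCC-PHAT. The sketch below shows that building block for a single two-microphone pair under a far-field assumption; the paper's onboard pipeline, array geometry, and signaling scheme differ in detail.

```python
# Minimal sketch of estimating a sound source's bearing from the time
# difference of arrival between two microphones via GCC-PHAT.
import numpy as np

def gcc_phat_tdoa(sig_a, sig_b, fs):
    """Return the estimated arrival-time delay between two signals (seconds)."""
    n = len(sig_a) + len(sig_b)
    # Cross-power spectrum with PHAT weighting (phase-only correlation),
    # which sharpens the correlation peak for broadband sources.
    X = np.fft.rfft(sig_a, n) * np.conj(np.fft.rfft(sig_b, n))
    cc = np.fft.irfft(X / (np.abs(X) + 1e-12), n)
    max_shift = n // 2
    # Re-center the circular correlation around zero lag.
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    return (np.argmax(np.abs(cc)) - max_shift) / fs

def bearing_deg(tdoa, mic_spacing, c=343.0):
    """Convert a two-mic TDOA into a bearing angle (degrees off broadside)."""
    return np.degrees(np.arcsin(np.clip(c * tdoa / mic_spacing, -1.0, 1.0)))
```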

    Fully Onboard AI-powered Human-Drone Pose Estimation on Ultra-low Power Autonomous Flying Nano-UAVs

    Many emerging applications of nano-sized unmanned aerial vehicles (UAVs), with a few-cm² form factor, revolve around safely interacting with humans in complex scenarios, for example, monitoring their activities or looking after people needing care. Such sophisticated autonomous functionality must be achieved while dealing with severe constraints in payload, battery, and power budget (~100 mW). In this work, we attack a complex task going from perception to control: to estimate and maintain the nano-UAV's relative 3D pose with respect to a person while they freely move in the environment, a task that, to the best of our knowledge, has never previously been targeted with fully onboard computation on a nano-sized UAV. Our approach is centered around a novel vision-based deep neural network (DNN), called PULP-Frontnet, designed for deployment on a parallel ultra-low-power (PULP) processor aboard a nano-UAV. We present a vertically integrated approach starting from the DNN model design, training, and dataset augmentation down to 8-bit quantization and in-field deployment. PULP-Frontnet can operate in real time (up to 135 frames/s), consuming less than 87 mW for processing at peak throughput and down to 0.43 mJ/frame at the most energy-efficient operating point. Field experiments demonstrate closed-loop, top-notch autonomous navigation capability with a tiny 27-gram Crazyflie 2.1 nano-UAV. Compared against an ideal sensing setup, onboard pose inference yields excellent drone behavior in terms of median absolute errors, such as positional (onboard: 41 cm, ideal: 26 cm) and angular (onboard: 3.7°, ideal: 4.1°). We publicly release videos and the source code of our work.
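    To give a flavor of the 8-bit quantization step in such a deployment pipeline, the sketch below applies plain symmetric per-tensor post-training quantization to a weight tensor. This is an assumption-laden stand-in: the actual GAP8/PULP toolchain performs additional hardware-specific calibration and activation quantization.

```python
# Minimal sketch of symmetric int8 post-training weight quantization,
# a simplified stand-in for the toolchain used in the paper.
import numpy as np

def quantize_int8(w):
    """Map float weights to int8 with a single symmetric scale factor."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights, e.g., for accuracy checks."""
    return q.astype(np.float32) * scale

# Toy convolution weights: 32 filters of shape 3x3x3.
w = np.random.randn(32, 3, 3, 3).astype(np.float32)
q, s = quantize_int8(w)
print("max abs quantization error:", np.abs(dequantize(q, s) - w).max())
```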
