    Land & Localize: An Infrastructure-free and Scalable Nano-Drones Swarm with UWB-based Localization

    Relative localization is a crucial functional block of any robotic swarm. We address it in a fleet of nano-drones characterized by a 10 cm-scale form factor, which makes them highly versatile but also strictly limited in their onboard power envelope. State-of-the-art solutions leverage Ultra-WideBand (UWB) technology, allowing range measurements between peer nano-drones and a stationary infrastructure of multiple UWB anchors. However, this fixed infrastructure constrains where the swarm can operate. Therefore, we propose a UWB-based, infrastructure-free nano-drone swarm in which part of the fleet acts as dynamic anchors, i.e., anchor-drones (ADs), capable of automatic deployment and landing. By varying the ADs' position constraints, we develop three alternative solutions with different trade-offs between flexibility and localization accuracy. In-field results with four flying mission-drones (MDs) show a localization root mean square error (RMSE) ranging from 15.3 cm to at most 27.8 cm across the three solutions. Scaling the number of MDs from 4 to 8 increases the RMSE only marginally, by less than 10 cm. The power consumption of an MD's UWB module amounts to 342 mW. Ultimately, compared to a fixed-infrastructure commercial solution, our infrastructure-free system can be deployed anywhere and rapidly, taking only 5.7 s to self-localize 4 ADs, with a localization RMSE of at most 12.3% in the most challenging case with 8 MDs.
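
    As a rough illustration of how an MD could turn UWB range measurements to the ADs into a position fix, the sketch below solves the classic multilateration problem by linearizing the range equations into a least-squares system. The 2D setup, function name, and noise model are illustrative assumptions; the paper's actual estimator is not specified in this abstract.

        import numpy as np

        def multilaterate(anchors, ranges):
            """Least-squares 2D position fix from UWB ranges to known anchors.

            anchors: (N, 2) array of anchor-drone (AD) positions, N >= 3
            ranges:  (N,) array of measured distances to each anchor
            Subtracting the first range equation from the others removes the
            quadratic unknown terms, leaving a linear system A @ p = b.
            """
            anchors = np.asarray(anchors, dtype=float)
            ranges = np.asarray(ranges, dtype=float)
            A = 2.0 * (anchors[1:] - anchors[0])
            b = (ranges[0] ** 2 - ranges[1:] ** 2
                 + np.sum(anchors[1:] ** 2, axis=1)
                 - np.sum(anchors[0] ** 2))
            pos, *_ = np.linalg.lstsq(A, b, rcond=None)
            return pos

        # Example: 4 ADs at known positions, noisy ranges to an MD at (1.0, 2.0)
        ads = np.array([(0, 0), (4, 0), (0, 4), (4, 4)], dtype=float)
        true_pos = np.array([1.0, 2.0])
        meas = [np.linalg.norm(true_pos - a) + np.random.normal(0, 0.05)
                for a in ads]
        print(multilaterate(ads, meas))  # ~ [1.0, 2.0]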

    An Open Source and Open Hardware Deep Learning-Powered Visual Navigation Engine for Autonomous Nano-UAVs

    Nano-size unmanned aerial vehicles (UAVs), with few centimeters of diameter and sub-10 Watts of total power budget, have so far been considered incapable of running sophisticated visual-based autonomous navigation software without external aid from base-stations, ad-hoc local positioning infrastructure, and powerful external computation servers. In this work, we present what is, to the best of our knowledge, the first 27g nano-UAV system able to run aboard an end-to-end, closed-loop visual pipeline for autonomous navigation based on a state-of-the-art deep-learning algorithm, built upon the open-source CrazyFlie 2.0 nano-quadrotor. Our visual navigation engine is enabled by the combination of an ultra-low power computing device (the GAP8 system-on-chip) with a novel methodology for the deployment of deep convolutional neural networks (CNNs). We enable onboard real-time execution of a state-of-the-art deep CNN at up to 18Hz. Field experiments demonstrate that the system's high responsiveness prevents collisions with unexpected dynamic obstacles up to a flight speed of 1.5m/s. In addition, we also demonstrate the capability of our visual navigation engine of fully autonomous indoor navigation on a 113m previously unseen path. To share our key findings with the embedded and robotics communities and foster further developments in autonomous nano-UAVs, we publicly release all our code, datasets, and trained networks
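
    The closed loop couples the CNN's outputs to the flight controller. The following sketch shows one plausible control-loop iteration for a DroNet-style engine, where the network predicts a steering angle and a collision probability that are low-pass filtered and mapped to yaw rate and forward velocity. The interfaces (cnn.infer, drone.send_setpoint), filter gain, and V_MAX are illustrative assumptions, not the paper's exact code.

        # Hypothetical closed-loop step for a DroNet-style navigation engine.
        V_MAX = 1.5    # m/s, maximum forward speed (illustrative)
        ALPHA = 0.3    # low-pass filter coefficient (illustrative)

        state = {"v": 0.0, "steer": 0.0}

        def control_step(frame, cnn, drone):
            # CNN returns (steering angle in rad, probability of collision)
            steer_raw, p_coll = cnn.infer(frame)
            # Low-pass filter both outputs to avoid jerky commands
            state["steer"] = (1 - ALPHA) * state["steer"] + ALPHA * steer_raw
            state["v"] = (1 - ALPHA) * state["v"] + ALPHA * (1 - p_coll) * V_MAX
            # Forward velocity shrinks as collision probability rises;
            # steering is applied as a yaw-rate setpoint.
            drone.send_setpoint(forward=state["v"], yaw_rate=state["steer"])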

    A 64mW DNN-based Visual Navigation Engine for Autonomous Nano-Drones

    Fully autonomous miniaturized robots (e.g., drones) with artificial intelligence (AI) based visual navigation capabilities are extremely challenging drivers of Internet-of-Things edge intelligence. Visual navigation based on AI approaches, such as deep neural networks (DNNs), is becoming pervasive for standard-size drones, but is considered out of reach for nano-drones with a size of a few cm². In this work, we present the first (to the best of our knowledge) demonstration of a navigation engine for autonomous nano-drones capable of closed-loop, end-to-end DNN-based visual navigation. To achieve this goal, we developed a complete methodology for parallel execution of complex DNNs directly on board resource-constrained, milliwatt-scale nodes. Our system is based on GAP8, a novel parallel ultra-low-power computing platform, and a 27 g commercial, open-source CrazyFlie 2.0 nano-quadrotor. As part of our general methodology, we discuss the software mapping techniques that enable the state-of-the-art deep convolutional neural network presented in [1] to be fully executed on board within a strict 6 fps real-time constraint with no compromise in terms of flight results, while all processing is done with only 64 mW on average. Our navigation engine is flexible and can span a wide performance range: at its peak performance corner, it achieves 18 fps while still consuming on average just 3.5% of the power envelope of the deployed nano-aircraft. (15 pages, 13 figures, 5 tables, 2 listings; accepted for publication in the IEEE Internet of Things Journal, IEEE IoTJ.)
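
    A quick back-of-the-envelope check of the reported operating point: at 64 mW average power and 6 fps, the energy spent per inference follows directly from E = P / f. The millijoule-per-frame value below is derived from the abstract's figures, not quoted from the paper, and the abstract gives no absolute power for the 18 fps peak corner.

        # Energy per CNN inference at the reported 6 fps operating corner.
        def energy_per_frame_mj(power_mw, fps):
            # E [mJ] = P [mW] / f [1/s], since 1 mW * 1 s = 1 mJ
            return power_mw / fps

        print(energy_per_frame_mj(64, 6))  # ~10.7 mJ per frame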

    Real-time interval type-2 fuzzy control of an unmanned aerial vehicle with flexible cable-connected payload

    This study presents the design and real-time application of an Interval Type-2 Fuzzy PID (IT2-FPID) control system on an unmanned aerial vehicle (UAV) with a flexible cable-connected payload, in comparison to its PID and Type-1 Fuzzy PID (T1-FPID) counterparts. The IT2-FPID controller offers significant advantages in stability, disturbance rejection, and response time. To demonstrate these advantages, the DJI Tello, a commercial UAV, is used with a flexible cable-connected payload to test the robustness of the PID, T1-FPID, and IT2-FPID controllers. First, the optimal coefficients of the compared controllers are found using the Big Bang–Big Crunch algorithm on the nonlinear UAV model without the payload. Second, once optimised, the controllers are tested in several scenarios, including disturbances to the payload and coverage of a path-planning area, to examine their robustness. Third, the controller performance is evaluated according to reference achievement and point-based tracking under disturbances. Finally, the superiority of the IT2-FPID controller is shown via simulations and real-time experiments, with lower overshoot, faster settling time, and better disturbance rejection than the PID and T1-FPID controllers.
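
    As a point of reference for the simplest of the three compared controllers, a discrete-time PID update (the baseline against which the fuzzy variants are measured) can be sketched as follows. The gains and sample time are placeholders, not the coefficients found by the Big Bang–Big Crunch optimisation.

        class PID:
            """Minimal discrete PID controller (baseline, not the IT2-FPID)."""
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, setpoint, measurement):
                error = setpoint - measurement
                self.integral += error * self.dt            # accumulate I-term
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return (self.kp * error + self.ki * self.integral
                        + self.kd * derivative)

        # Placeholder gains; the study tunes these with Big Bang-Big Crunch.
        altitude_pid = PID(kp=1.2, ki=0.05, kd=0.4, dt=0.02)
        cmd = altitude_pid.update(setpoint=1.0, measurement=0.8)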

    Target following on nano-scale Unmanned Aerial Vehicles

    Unmanned Aerial Vehicles (UAVs) with high-level autonomous navigation capabilities are a hot topic in both industry and academia due to their numerous applications. However, autonomous navigation algorithms are demanding from the computational standpoint, and it is very challenging to run them on board nano-scale UAVs (i.e., a few centimeters in diameter) because of the limited capabilities of their MCU-based controllers. This work focuses on the object tracking (i.e., target following) capability of such nano-UAVs. We present a lightweight hardware-software solution bringing autonomous navigation to a commercial platform using only onboard computational resources. Furthermore, we evaluate a parallel ultra-low-power (PULP) platform that enables the execution of even more sophisticated algorithms. Experimental results demonstrate the benefits of our solution, achieving accurate target following using an ARM Cortex-M4 microcontroller consuming ≈130 mW. Our evaluation on a PULP architecture shows the proposed solution running at up to 60 frames per second in a power envelope of ≈30 mW, leaving more than 70% of the computational resources free for further onboard processing of more complex algorithms.
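
    A target follower light enough for an MCU can be reduced to proportional visual servoing on the tracked target's bounding box: the horizontal offset of the box drives yaw, while its apparent size drives forward speed. The sketch below is a generic illustration under those assumptions; the tracker interface, gains, and drone API are hypothetical, not the paper's implementation.

        # Hypothetical proportional visual-servoing loop for target following.
        K_YAW = 0.8          # rad/s per unit of normalized horizontal error
        K_FWD = 0.5          # m/s per unit of size error
        TARGET_AREA = 0.15   # desired bounding-box area fraction (sets distance)

        def follow_step(bbox, img_w, img_h, drone):
            x, y, w, h = bbox                       # tracked target box (pixels)
            cx = (x + w / 2) / img_w - 0.5          # horizontal error in [-0.5, 0.5]
            area = (w * h) / (img_w * img_h)        # apparent size ~ inverse distance
            yaw_rate = K_YAW * cx                   # turn toward the target
            forward = K_FWD * (TARGET_AREA - area)  # close in or back off
            drone.send_setpoint(forward=forward, yaw_rate=yaw_rate)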