
    Autonomous navigation of micro aerial vehicles using high-rate and low-cost sensors

    The final publication is available at link.springer.com

    The combination of visual and inertial sensors for state estimation has recently attracted wide interest in the robotics community, especially in the aerial robotics field, due to the lightweight and complementary characteristics of the sensor data. However, most state estimation systems based on visual-inertial sensing impose severe processor requirements, which in many cases make them impractical. In this paper, we propose a simple, low-cost and high-rate method for state estimation that enables autonomous flight of micro aerial vehicles at a low computational burden. The proposed state estimator fuses observations from an inertial measurement unit, an optical-flow smart camera and a time-of-flight range sensor. The smart camera provides optical-flow measurements at rates of up to 200 Hz, sparing the main processor the computational bottleneck of image processing. To the best of our knowledge, this is the first example of extending the use of these smart cameras from hovering-like motions to odometry estimation, producing estimates that remain usable over flight times of several minutes. In order to validate and defend the simplest algorithmic solution, we investigate the performance of two Kalman filters, in the extended and error-state flavors, alongside a large number of algorithm modifications defended in earlier literature on visual-inertial odometry, showing that their impact on filter performance is minimal. To close the control loop, a non-linear controller operating on the special Euclidean group SE(3) drives a quadrotor platform in 3D space based on the estimated vehicle state, guaranteeing the asymptotic stability of 3D position and heading. All the estimation and control tasks are solved on board and in real time on a limited computational unit. The proposed approach is validated through simulations and experimental results, which include comparisons with ground-truth data provided by a motion capture system. For the benefit of the community, we make the source code public.

    Peer reviewed. Postprint (author's final draft).
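
    The abstract names the fused sensors and the two Kalman-filter flavors without giving equations, so the following is a minimal Python sketch of the kind of loosely coupled fusion it describes: IMU-driven prediction, an optical-flow velocity update and a time-of-flight altitude update. The class name, state layout and all noise values are illustrative assumptions, not the paper's implementation.

        import numpy as np

        # Minimal linear Kalman filter sketch for IMU + optical-flow + time-of-flight
        # fusion. State: position (3) and velocity (3) in the world frame.
        class MavStateEstimator:
            def __init__(self, dt=0.005):          # 200 Hz cycle, matching the smart camera rate
                self.dt = dt
                self.x = np.zeros(6)               # [px, py, pz, vx, vy, vz]
                self.P = np.eye(6) * 0.1           # state covariance
                self.F = np.eye(6)
                self.F[0:3, 3:6] = np.eye(3) * dt  # position integrates velocity
                self.Q = np.eye(6) * 1e-4          # process noise (assumed value)

            def predict(self, accel_world):
                """Propagate with gravity-compensated IMU acceleration (m/s^2)."""
                self.x[3:6] += accel_world * self.dt
                self.x = self.F @ self.x
                self.P = self.F @ self.P @ self.F.T + self.Q

            def _update(self, z, H, R):
                """Standard Kalman measurement update."""
                S = H @ self.P @ H.T + R
                K = self.P @ H.T @ np.linalg.inv(S)
                self.x = self.x + K @ (z - H @ self.x)
                self.P = (np.eye(6) - K @ H) @ self.P

            def update_flow(self, flow_rad_s, height):
                """Optical flow (rad/s) scaled by height observes horizontal velocity."""
                H = np.zeros((2, 6)); H[0, 3] = H[1, 4] = 1.0
                self._update(flow_rad_s * height, H, np.eye(2) * 0.05)

            def update_range(self, altitude):
                """Time-of-flight range sensor observes altitude directly."""
                H = np.zeros((1, 6)); H[0, 2] = 1.0
                self._update(np.array([altitude]), H, np.array([[0.01]]))

    An error-state variant, the second flavor the authors compare, would additionally track attitude and filter only the estimation error; the linear sketch above omits attitude entirely for brevity.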

    A 64mW DNN-based Visual Navigation Engine for Autonomous Nano-Drones

    Fully autonomous miniaturized robots (e.g., drones) with artificial intelligence (AI) based visual navigation capabilities are extremely challenging drivers of Internet-of-Things edge intelligence. Visual navigation based on AI approaches, such as deep neural networks (DNNs), is becoming pervasive for standard-size drones, but is considered out of reach for nano-drones with a size of a few cm². In this work, we present the first (to the best of our knowledge) demonstration of a navigation engine for autonomous nano-drones capable of closed-loop, end-to-end DNN-based visual navigation. To achieve this goal we developed a complete methodology for parallel execution of complex DNNs directly on board resource-constrained, milliwatt-scale nodes. Our system is based on GAP8, a novel parallel ultra-low-power computing platform, and a 27 g commercial, open-source CrazyFlie 2.0 nano-quadrotor. As part of our general methodology we discuss the software mapping techniques that enable the state-of-the-art deep convolutional neural network presented in [1] to be fully executed on board within a strict 6 fps real-time constraint, with no compromise in terms of flight results, while all processing is done with only 64 mW on average. Our navigation engine is flexible and can span a wide performance range: at its peak performance corner it achieves 18 fps while still consuming on average just 3.5% of the power envelope of the deployed nano-aircraft.

    Comment: 15 pages, 13 figures, 5 tables, 2 listings; accepted for publication in the IEEE Internet of Things Journal (IEEE IoTJ).
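
    As a hedged illustration of what closing the loop around an on-board DNN under a fixed frame-rate budget can look like, the Python skeleton below runs a placeholder network at the 6 fps constraint quoted above and maps its outputs to flight setpoints. camera, send_setpoint, goal_reached and run_dnn are hypothetical stand-ins; the assumption that the network emits a steering angle and a collision probability follows the DroNet-style architecture cited as [1].

        import time

        TARGET_FPS = 6.0      # real-time constraint from the abstract
        MAX_SPEED = 1.0       # m/s forward-speed cap (illustrative)
        ALPHA = 0.7           # low-pass factor smoothing per-frame DNN outputs (assumed)

        def run_dnn(frame):
            """Placeholder for quantized CNN inference on the on-board accelerator."""
            steer, p_coll = 0.0, 0.0   # would come from the network's two output heads
            return steer, p_coll

        def control_loop(camera, send_setpoint, goal_reached):
            period = 1.0 / TARGET_FPS
            steer_f = p_coll_f = 0.0
            while not goal_reached():
                t0 = time.monotonic()
                steer, p_coll = run_dnn(camera.grab())
                # Smooth outputs so the vehicle does not react to single-frame noise.
                steer_f = ALPHA * steer_f + (1 - ALPHA) * steer
                p_coll_f = ALPHA * p_coll_f + (1 - ALPHA) * p_coll
                # Slow down as collision probability rises; steer via yaw rate.
                send_setpoint(forward=MAX_SPEED * (1.0 - p_coll_f), yaw_rate=steer_f)
                # Sleep out the rest of the frame period to hold the target rate.
                time.sleep(max(0.0, period - (time.monotonic() - t0)))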

    Fast, Autonomous Flight in GPS-Denied and Cluttered Environments

    One of the most challenging tasks for a flying robot is to autonomously navigate between target locations quickly and reliably while avoiding obstacles in its path, with little to no a priori knowledge of the operating environment. This challenge is addressed in the present paper. We describe the system design and software architecture of our proposed solution, and showcase how all the distinct components can be integrated to enable smooth robot operation. We provide critical insight on hardware and software component selection and development, and present results from extensive experimental testing in real-world warehouse environments. The experiments reveal that our proposed solution delivers fast and robust autonomous aerial navigation in cluttered, GPS-denied environments.

    Comment: Pre-peer-reviewed version of the article accepted in the Journal of Field Robotics.
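
    The abstract stays at the level of system design, so as a generic, hedged sketch of the kind of component integration it refers to, here is a minimal Python skeleton of one cycle of a perception, mapping, planning and control pipeline for GPS-denied flight. Every class and return value is an illustrative placeholder, not the authors' architecture.

        # Illustrative skeleton (not the authors' code) of a GPS-denied autonomy stack.
        class StateEstimator:
            def update(self, imu, images):
                return {"position": (0.0, 0.0, 1.0), "yaw": 0.0}     # dummy pose (no GPS)

        class Mapper:
            def integrate(self, depth, pose):
                return set()                                          # dummy occupied cells

        class Planner:
            def plan(self, pose, goal, occupancy):
                return [pose["position"], goal]                       # straight-line stub

        class Controller:
            def command(self, pose, trajectory):
                return {"velocity_setpoint": trajectory[1]}           # dummy tracking command

        def autonomy_tick(sensors, goal, est, mapper, planner, ctrl):
            """One cycle of the loop: estimate state, update map, replan, track."""
            pose = est.update(sensors["imu"], sensors["images"])
            occupancy = mapper.integrate(sensors["depth"], pose)
            trajectory = planner.plan(pose, goal, occupancy)
            return ctrl.command(pose, trajectory)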