
    Vision-Based Monocular SLAM in Micro Aerial Vehicle

    Micro Aerial Vehicles (MAVs) are popular for their efficiency, agility, and light weight. They can navigate dynamic environments that cannot be accessed by humans or traditional aircraft. However, MAVs typically rely on GPS, which fails in GPS-denied areas where the signal is obstructed by buildings and other obstacles. Simultaneous Localization and Mapping (SLAM) in an unknown environment can solve these problems for flying robots. A rotation- and scale-invariant visual solution built on Oriented FAST and Rotated BRIEF features (ORB-SLAM) is one of the best approaches to localization and mapping with monocular vision. In this paper, ORB-SLAM3 is used to localize the Tello micro aerial vehicle and to map an unknown environment. The effectiveness of ORB-SLAM3 was tested in a variety of indoor environments. An integrated adaptive controller was used for autonomous flight, drawing on the 3D map produced by ORB-SLAM3 and on our proposed novel technique for robust initialization of the SLAM system during flight. The results show that ORB-SLAM3 can provide accurate localization and mapping for flying robots, even in challenging scenarios with fast motion, large camera movements, and dynamic environments. Furthermore, our results show that the proposed system is capable of navigating and mapping challenging indoor environments.
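
    The monocular pipeline summarized above rests on multi-view geometry. As a minimal illustration (not the paper's code; camera intrinsics, poses, and the point are all made up), the following numpy sketch shows linear two-view triangulation, the core geometric step such SLAM systems use to turn matched ORB features into map points:

```python
import numpy as np

# Hypothetical pinhole intrinsics and a second camera translated along x.
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # camera 1 at the origin
t = np.array([[-1.0], [0.0], [0.0]])                # 1 m baseline, pure translation
P2 = K @ np.hstack([np.eye(3), t])                  # camera 2

X = np.array([0.5, -0.2, 4.0, 1.0])                 # ground-truth map point (homogeneous)

def project(P, X):
    x = P @ X
    return x[:2] / x[2]                             # pixel coordinates

x1, x2 = project(P1, X), project(P2, X)

# Linear (DLT) triangulation: stack the cross-product constraints
# from both views and take the null-space vector via SVD.
A = np.vstack([
    x1[0] * P1[2] - P1[0],
    x1[1] * P1[2] - P1[1],
    x2[0] * P2[2] - P2[0],
    x2[1] * P2[2] - P2[1],
])
X_hat = np.linalg.svd(A)[2][-1]
X_hat /= X_hat[3]                                   # back to inhomogeneous scale
```

    With noiseless correspondences the reconstructed point matches the ground truth; note that in a real monocular system the baseline (and hence the whole map) is only known up to scale.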

    Comparative Study of Indoor Navigation Systems for Autonomous Flight

    Recently, Unmanned Aerial Vehicles (UAVs) have attracted attention from society and researchers due to their capability to perform in economic, scientific, and emergency scenarios, and they are being employed in a large number of applications, especially in hostile environments. They can operate autonomously in both indoor and outdoor applications, including search and rescue, manufacturing, forest fire tracking, and remote sensing. In both settings, precise localization plays a critical role in achieving high-performance flight and interacting with surrounding objects. However, in indoor areas with degraded or denied Global Navigation Satellite System (GNSS) coverage, it becomes challenging to control a UAV autonomously, especially where obstacles are unidentified. A large number of techniques based on various technologies have been proposed to overcome these limitations. This paper compares such existing solutions and technologies, along with their strengths and limitations. Further, it summarizes the current research status, with unresolved issues and opportunities that can provide research directions for researchers with similar interests.

    Autonomous Navigation System for a Delivery Drone

    The use of delivery services is an increasing trend worldwide, further enhanced by the COVID pandemic. In this context, drone delivery systems are of great interest as they may allow for faster and cheaper deliveries. This paper presents a navigation system that makes the delivery of parcels with autonomous drones feasible. The system generates a path between a start and a final point and controls the drone to follow this path based on its localization obtained through GPS, a 9DoF IMU, and a barometer. In the landing phase, pose estimates from a marker (ArUco) detection technique using a camera, from ultra-wideband (UWB) devices, and from the drone's software estimation are merged by an Extended Kalman Filter algorithm to improve landing precision. A vector-field-based method controls the drone to follow the desired path smoothly, reducing vibrations or harsh movements that could harm the transported parcel. Real experiments validate the delivery strategy and allow us to evaluate the performance of the adopted techniques. Preliminary results indicate the viability of our proposal for autonomous drone delivery. Comment: 12 pages, 15 figures, extended version of a paper published at the XXIII Brazilian Congress of Automatica, entitled "Desenvolvimento de um drone autônomo para tarefas de entrega de carga".
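
    The landing-phase fusion described above can be illustrated with a deliberately simplified, one-dimensional Kalman update (the paper uses a full Extended Kalman Filter; all positions and noise variances below are invented for illustration):

```python
# Minimal 1-D Kalman fusion sketch for a landing phase: fuse a precise
# ArUco position fix and a coarser UWB fix into one estimate.
def kalman_update(x, P, z, R):
    """Scalar measurement update for z = x + noise, noise variance R."""
    K = P / (P + R)                  # Kalman gain: trust z more when R is small
    return x + K * (z - x), (1 - K) * P

x, P = 0.0, 4.0                      # prior: estimated offset 0 m, variance 4 m^2
x, P = kalman_update(x, P, z=0.9, R=0.04)   # ArUco fix (sigma ~ 0.2 m)
x, P = kalman_update(x, P, z=0.5, R=1.0)    # UWB fix (sigma ~ 1.0 m)
```

    After both updates the fused estimate sits close to the more precise ArUco measurement while still shrinking its variance with the UWB information, which is the intended effect of merging the sensors before touchdown.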

    Accurate position tracking with a single UWB anchor

    Accurate localization and tracking are a fundamental requirement for robotic applications. Localization systems such as GPS, optical tracking, and simultaneous localization and mapping (SLAM) are used in daily life, research, and commercial applications. Ultra-wideband (UWB) technology provides another avenue for accurately locating devices both indoors and outdoors. In this paper, we study a localization solution with a single UWB anchor instead of the traditional multi-anchor setup. Besides the challenge of a single UWB ranging source, the only other sensor we require is a low-cost 9 DoF inertial measurement unit (IMU). Under this configuration, we propose continuously monitoring UWB range changes to estimate the robot's speed when it moves on a line. Combining the speed estimate with orientation estimates from the IMU, the system becomes temporally observable. We use an Extended Kalman Filter (EKF) to estimate the pose of the robot. With our solution, we can effectively correct the accumulated error and maintain accurate tracking of a moving robot. Comment: Accepted by ICRA202
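
    The key observation, that range changes from a single anchor reveal speed during straight-line motion, can be checked with a toy numpy example (anchor position, trajectory, and sample rate are invented; the paper's actual estimator is an EKF fusing this with IMU orientation):

```python
import numpy as np

# Toy check: with a single UWB anchor, the finite-difference range rate
# observes the robot's speed when it moves on a straight line. Here the
# motion is purely radial (directly away from the anchor), so the range
# rate equals the full speed.
anchor = np.array([0.0, 0.0])
v_true = 0.8                          # m/s along +x
dt = 0.1                              # ranging period, s
positions = np.array([[2.0 + v_true * dt * k, 0.0] for k in range(20)])

ranges = np.linalg.norm(positions - anchor, axis=1)   # simulated UWB ranges
v_est = np.diff(ranges) / dt                          # range rate ~ radial speed
```

    In general the range rate only gives the radial component of velocity, which is why the paper combines it with IMU orientation to recover the full motion.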

    Vision-based localization methods under GPS-denied conditions

    This paper reviews vision-based localization methods in GPS-denied environments and classifies the mainstream methods into Relative Vision Localization (RVL) and Absolute Vision Localization (AVL). For RVL, we discuss the broad application of optical flow in feature-extraction-based Visual Odometry (VO) solutions and introduce advanced optical flow estimation methods. For AVL, we review recent advances in Visual Simultaneous Localization and Mapping (VSLAM) techniques, from optimization-based methods to Extended Kalman Filter (EKF) based methods. We also introduce the application of offline map registration and lane vision detection schemes to achieve Absolute Visual Localization. This paper compares the performance and applications of mainstream methods for visual localization and provides suggestions for future studies. Comment: 32 pages, 15 figures
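
    As a concrete illustration of the optical-flow building block behind the VO methods this survey discusses, here is a minimal Lucas-Kanade estimate on a synthetic image (the image function and motion are made up, and no surveyed system is reproduced):

```python
import numpy as np

# Minimal Lucas-Kanade sketch: recover a small translation (u, v) from
# the brightness-constancy normal equations Ix*u + Iy*v = -It, solved by
# least squares over a whole patch.
def image(x, y):
    return np.sin(0.3 * x) + np.cos(0.25 * y)   # smooth synthetic scene

u_true, v_true = 0.2, -0.15                     # sub-pixel motion to recover
xs, ys = np.meshgrid(np.arange(20.0), np.arange(20.0))

I1 = image(xs, ys)
I2 = image(xs - u_true, ys - v_true)            # frame 2: scene shifted by (u, v)

Ix = 0.3 * np.cos(0.3 * xs)                     # analytic spatial gradients of I1
Iy = -0.25 * np.sin(0.25 * ys)
It = I2 - I1                                    # temporal difference

A = np.column_stack([Ix.ravel(), Iy.ravel()])
b = -It.ravel()
(u_est, v_est), *_ = np.linalg.lstsq(A, b, rcond=None)
```

    The least-squares solution recovers the small displacement to within the first-order approximation error; real VO front ends apply this per feature window, usually with image pyramids to handle larger motions.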

    A 64mW DNN-based Visual Navigation Engine for Autonomous Nano-Drones

    Fully autonomous miniaturized robots (e.g., drones) with artificial intelligence (AI) based visual navigation capabilities are extremely challenging drivers of Internet-of-Things edge intelligence. Visual navigation based on AI approaches such as deep neural networks (DNNs) is becoming pervasive for standard-size drones, but is considered out of reach for nano-drones with a size of a few cm². In this work, we present the first (to the best of our knowledge) demonstration of a navigation engine for autonomous nano-drones capable of closed-loop, end-to-end, DNN-based visual navigation. To achieve this goal, we developed a complete methodology for parallel execution of complex DNNs directly on board resource-constrained, milliwatt-scale nodes. Our system is based on GAP8, a novel parallel ultra-low-power computing platform, and a 27 g commercial, open-source CrazyFlie 2.0 nano-quadrotor. As part of our general methodology, we discuss the software mapping techniques that enable the state-of-the-art deep convolutional neural network presented in [1] to be fully executed on board within a strict 6 fps real-time constraint with no compromise in terms of flight results, while all processing is done with only 64 mW on average. Our navigation engine is flexible and can be used to span a wide performance range: at its peak performance corner it achieves 18 fps while still consuming on average just 3.5% of the power envelope of the deployed nano-aircraft. Comment: 15 pages, 13 figures, 5 tables, 2 listings, accepted for publication in the IEEE Internet of Things Journal (IEEE IOTJ).
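
    Running DNNs within a milliwatt-scale power budget typically relies on fixed-point arithmetic. The following sketch illustrates the general int8 quantize / int32 accumulate / rescale pattern used on such platforms (values and scales are invented; this is not GAP8 code or the paper's network):

```python
import numpy as np

# Quantize a float tensor to int8 with a per-tensor scale.
def quantize(x, scale):
    return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

w = np.array([0.12, -0.5, 0.33, 0.07])   # a tiny "layer" of weights
a = np.array([1.0, 0.25, -0.75, 0.5])    # input activations

sw, sa = 0.01, 0.01                      # quantization scales (illustrative)
wq = quantize(w, sw)
aq = quantize(a, sa)

# Multiply-accumulate in int32 (as MCU DSP instructions do),
# then rescale back to a float result.
acc = np.sum(wq.astype(np.int32) * aq.astype(np.int32))
y_int8 = float(acc) * sw * sa            # dequantized dot product
y_fp32 = float(np.dot(w, a))             # float reference
```

    With these (deliberately exact) values the int8 result matches the float reference; in practice quantization introduces a small, bounded error in exchange for large savings in memory and energy.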