Autonomous Localization Of A Uav In A 3d Cad Model
This thesis presents a novel method for indoor localization and autonomous navigation of Unmanned Aerial Vehicles (UAVs) within a building, given a prebuilt Computer-Aided Design (CAD) model of the building. The proposed system is novel in that it combines machine learning and traditional computer vision techniques, together with preexisting knowledge of the environment, to localize and navigate a drone autonomously in indoor, GPS-denied environments. The goal of this work is to devise a method that enables a UAV to deduce its current pose within a CAD model quickly and accurately while using resources efficiently. A 3-dimensional CAD model of the building to be navigated is provided as input to the system, along with the required goal position. Initially, the UAV has no knowledge of its location within the building. The system, which uses a stereo camera and an Inertial Measurement Unit (IMU) as its sensors, generates a globally consistent map of its surroundings using a Simultaneous Localization and Mapping (SLAM) algorithm. In addition to the map, it also stores spatially correlated 3D features. These 3D features are used to generate correspondences between the SLAM map and the 3D CAD model, and the correspondences in turn yield a transformation between the two, effectively localizing the UAV in the 3D CAD model. In testing, our method successfully localized the UAV in the test building in an average of 15 seconds across the scenarios evaluated, contingent upon the abundance of target features in the observed data. In the absence of a motion capture system, the results were verified by placing tags at strategically chosen, known locations on the ground and measuring the error between the projection of the current UAV location onto the ground and the corresponding tag.
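Once paired 3D feature correspondences between the SLAM map and the CAD model are available, the rigid transformation between the two frames can be recovered in closed form. The sketch below is illustrative only, not the thesis implementation; it uses the standard Kabsch algorithm, and all names are hypothetical:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ~= src @ R.T + t,
    from paired 3D point correspondences (Kabsch algorithm).

    src, dst: (N, 3) arrays of corresponding points
    (e.g. SLAM-map features and their CAD-model matches)."""
    # Center both point sets on their centroids.
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - src_mean, dst - dst_mean
    # Cross-covariance matrix and its SVD.
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

In practice the correspondences would first be filtered for outliers (e.g. with RANSAC around this closed-form solver) before accepting the SLAM-to-CAD transformation.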
A Systematic Literature Survey of Unmanned Aerial Vehicle Based Structural Health Monitoring
Unmanned Aerial Vehicles (UAVs) are being employed in a multitude of civil applications owing to their ease of use, low maintenance, affordability, high mobility, and ability to hover. UAVs are being utilized for real-time monitoring of road traffic, providing wireless coverage, remote sensing, search and rescue operations, delivery of goods, security and surveillance, precision agriculture, and civil infrastructure inspection. They represent the next big revolution in technology and civil infrastructure, with the UAV market expected to exceed $45 billion in value. This thesis surveys the UAV-assisted Structural Health Monitoring (SHM) literature of the last decade and categorizes UAVs based on their aerodynamics, payload, build design, and applications. Further, the thesis presents the payload product line that facilitates SHM tasks, details the different applications of UAVs exploited in the last decade to support civil structures, and discusses the critical challenges faced in UAV-assisted SHM (UASHM) applications across various domains. Finally, the thesis presents two artificial neural network-based structural damage detection models and conducts a detailed performance evaluation on multiple platforms, including edge computing and cloud computing.
A Comprehensive Review of AI-enabled Unmanned Aerial Vehicle: Trends, Vision, and Challenges
In recent years, the combination of artificial intelligence (AI) and unmanned
aerial vehicles (UAVs) has brought about advancements in various areas. This
comprehensive analysis explores the changing landscape of AI-powered UAVs and
environmentally friendly computing in their applications. It covers emerging
trends, futuristic visions, and the inherent challenges that come with this
relationship. The study examines the role AI plays in enabling navigation,
detecting and tracking objects, monitoring wildlife, enhancing precision
agriculture, facilitating rescue operations, conducting surveillance
activities, and establishing communication among UAVs using environmentally
conscious computing techniques. By delving into the interaction between AI and
UAVs, this analysis highlights the potential for these technologies to
revolutionise industries such as agriculture, surveillance practices, disaster
management strategies, and more. While envisioning possibilities, it also
examines ethical considerations, safety concerns, regulatory frameworks yet to
be established, and the responsible deployment of AI-enhanced UAV systems. By
consolidating insights from research endeavours in this field, this review
provides an understanding of the evolving landscape of AI-powered UAVs while
setting the stage for further exploration in this transformative domain.
Brain over Brawn -- Using a Stereo Camera to Detect, Track and Intercept a Faster UAV by Reconstructing Its Trajectory
The work presented in this paper demonstrates our approach to intercepting a
faster intruder UAV, inspired by the MBZIRC 2020 Challenge 1. By leveraging
knowledge of the shape of the intruder's trajectory, we are able to calculate
the interception point. Target tracking is based on image processing by a
YOLOv3 Tiny convolutional neural network, combined with depth calculation using
a gimbal-mounted ZED Mini stereo camera. We use RGB and depth data from the ZED
Mini to extract the 3D position of the target, for which we devise
histogram-of-depth based processing to reduce noise. The obtained 3D
measurements of the target's position are used to calculate the position,
orientation, and size of a figure-eight shaped trajectory, which we approximate
with a lemniscate of Bernoulli. Once the approximation is deemed sufficiently
precise, as measured by the Hausdorff distance between the measurements and the
approximation, an interception point is calculated to position the intercepting
UAV directly on the path of the target. The proposed method, significantly
improved based on experience gathered during the MBZIRC competition, has been
validated in simulation and through field experiments. The results confirm the
development of an efficient visual perception module that extracts information
about the motion of the target UAV as a basis for the interception. The system
is able to track and intercept a target that is 30% faster than the
interceptor in the majority of simulation experiments. Tests in the
unstructured environment yielded 9 successes out of 12 attempts.

Comment: To be published in Field Robotics. UAV-Eagle dataset available at:
https://github.com/larics/UAV-Eagl
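The fit-quality check described in this abstract can be illustrated with a small sketch: sample a canonical lemniscate of Bernoulli and compare it against measured points via the symmetric Hausdorff distance. This is a hypothetical minimal example, not the authors' code; it omits the estimation of the trajectory's position, orientation, and size, and all names are invented:

```python
import numpy as np

def lemniscate_points(a, n=200):
    """Sample n points on a lemniscate of Bernoulli with half-width a,
    in its canonical planar pose (parametric form x = a*cos(t)/(1+sin^2 t),
    y = a*sin(t)*cos(t)/(1+sin^2 t))."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    denom = 1.0 + np.sin(t) ** 2
    x = a * np.cos(t) / denom
    y = a * np.sin(t) * np.cos(t) / denom
    return np.stack([x, y], axis=1)

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two 2D point sets
    A (N, 2) and B (M, 2): the largest distance from any point
    in one set to its nearest neighbour in the other."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())
```

A tracker could declare the approximation "sufficiently precise" once `hausdorff(measurements, lemniscate_points(a))` drops below a chosen threshold, at which point an interception point on the fitted curve is committed to.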