3,842 research outputs found

    Transfer Learning-Based Crack Detection by Autonomous UAVs

    Full text link
    Unmanned Aerial Vehicles (UAVs) have recently shown great performance in collecting visual data through autonomous exploration and mapping for building inspection. Yet, few studies consider the post-processing of these data and their integration with autonomous UAVs, both of which are needed for full automation of building inspection. In this regard, this work presents a decision-making tool for revisiting tasks in visual building inspection by autonomous UAVs. The tool is an implementation of fine-tuning a pretrained Convolutional Neural Network (CNN) for surface crack detection, and it offers an optional mechanism for planning revisits to pinpoint locations during inspection. It is integrated into a quadrotor UAV system that can autonomously navigate in GPS-denied environments and is equipped with onboard sensors and computers for autonomous localization, mapping, and motion planning. The integrated system is tested through simulations and real-world experiments. The results show that the system achieves crack detection and autonomous navigation in GPS-denied environments for building inspection.
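
    A minimal sketch of the kind of fine-tuning the abstract describes, not the paper's exact pipeline: an ImageNet-pretrained backbone gets a new two-class head for crack / no-crack patches. The dataset path "crack_patches/" and its folder layout are assumptions for illustration.
```python
# Hedged sketch: fine-tune a pretrained CNN for binary crack detection.
# Assumed data layout: crack_patches/{crack,no_crack}/*.jpg
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Standard ImageNet preprocessing so the pretrained weights see familiar statistics.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

train_set = datasets.ImageFolder("crack_patches", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the classifier head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # crack vs. no crack
model = model.to(device)

# Fine-tune the whole network with a small learning rate.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```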

    Bridge Inspection: Human Performance, Unmanned Aerial Systems and Automation

    Get PDF
    Unmanned aerial systems (UASs) have attracted considerable private and commercial interest for a variety of jobs and entertainment over the past 10 years. This paper reviews the state of practice of United States bridge inspection programs and outlines how automated and unmanned bridge inspections can be made suitable for present and future needs. At its best, current technology limits UAS use to an assistive tool that helps the inspector perform a bridge inspection faster, more safely, and without traffic closure. The major challenges for UASs are satisfying restrictive Federal Aviation Administration regulations, control issues in GPS-denied environments, pilot expense and availability, time and cost allocated to tuning, maintenance, post-processing time, and acceptance of the collected data by bridge owners. Using UASs with self-navigation abilities and improving image-processing algorithms to provide near-real-time results could revolutionize the bridge inspection industry by providing accurate, multi-use, autonomous three-dimensional models and damage identification.

    A Systematic Literature Survey of Unmanned Aerial Vehicle Based Structural Health Monitoring

    Get PDF
    Unmanned Aerial Vehicles (UAVs) are being employed in a multitude of civil applications owing to their ease of use, low maintenance, affordability, high mobility, and ability to hover. UAVs are being utilized for real-time monitoring of road traffic, providing wireless coverage, remote sensing, search and rescue operations, delivery of goods, security and surveillance, precision agriculture, and civil infrastructure inspection. They represent the next big revolution in technology and civil infrastructure, with an expected market value of more than $45 billion. This thesis surveys the UAV-assisted Structural Health Monitoring (SHM) literature of the last decade and categorizes UAVs based on their aerodynamics, payload, build design, and applications. Further, the thesis presents the payload product line that facilitates SHM tasks, details the different applications of UAVs exploited in the last decade to support civil structures, and discusses the critical challenges faced in UASHM applications across various domains. Finally, the thesis presents two artificial neural network-based structural damage detection models and conducts a detailed performance evaluation on multiple platforms, such as edge computing and cloud computing.
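
    Purely as an illustration of the last point (the thesis's two models are not specified here): a small feed-forward damage classifier that could be exported and benchmarked on both edge and cloud platforms. The 64-feature input and two-class output are assumptions.
```python
# Hedged sketch: a compact ANN mapping structure-response features to damage / no-damage.
import torch
import torch.nn as nn

class DamageClassifier(nn.Module):
    def __init__(self, n_features: int = 64, n_classes: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = DamageClassifier()
# A lightweight model like this can be serialized (here via TorchScript) and its
# inference latency compared on an edge device versus a cloud instance.
scripted = torch.jit.script(model)
scripted.save("damage_classifier.pt")
```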

    FlightGoggles: A Modular Framework for Photorealistic Camera, Exteroceptive Sensor, and Dynamics Simulation

    Full text link
    FlightGoggles is a photorealistic sensor simulator for perception-driven robotic vehicles. The key contributions of FlightGoggles are twofold. First, FlightGoggles provides photorealistic exteroceptive sensor simulation using graphics assets generated with photogrammetry. Second, it provides the ability to combine (i) synthetic exteroceptive measurements generated in silico in real time and (ii) vehicle dynamics and proprioceptive measurements generated in motio by vehicle(s) in a motion-capture facility. FlightGoggles is capable of simulating a virtual-reality environment around autonomous vehicle(s). While a vehicle is in flight in the FlightGoggles virtual-reality environment, exteroceptive sensors are rendered synthetically in real time while all complex extrinsic dynamics are generated organically through the natural interactions of the vehicle. The FlightGoggles framework allows researchers to accelerate development by circumventing the need to estimate complex and hard-to-model interactions such as aerodynamics, motor mechanics, battery electrochemistry, and the behavior of other agents. The ability to perform vehicle-in-the-loop experiments with photorealistic exteroceptive sensor simulation facilitates novel research directions involving, e.g., fast and agile autonomous flight in obstacle-rich environments, safe human interaction, and flexible sensor selection. FlightGoggles has been utilized as the main test for selecting nine teams that will advance in the AlphaPilot autonomous drone racing challenge. We survey approaches and results from the top AlphaPilot teams, which may be of independent interest.
    Comment: Initial version appeared at IROS 2019. Supplementary material can be found at https://flightgoggles.mit.edu. The revision includes descriptions of new FlightGoggles features, such as a photogrammetric model of the MIT Stata Center, new rendering settings, and a Python API.
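
    The vehicle-in-the-loop pattern the abstract describes can be sketched generically as follows. This is not FlightGoggles' actual API: get_mocap_pose(), render_camera(), and run_perception() are hypothetical placeholders standing in for the motion-capture feed, the photorealistic renderer, and the autonomy stack under test.
```python
# Hedged sketch of a vehicle-in-the-loop simulation loop (placeholder functions only).
import time
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple[float, float, float]
    orientation: tuple[float, float, float, float]  # quaternion (w, x, y, z)

def get_mocap_pose() -> Pose:
    """Placeholder for the real vehicle's pose streamed from motion capture."""
    return Pose((0.0, 0.0, 1.0), (1.0, 0.0, 0.0, 0.0))

def render_camera(pose: Pose) -> bytes:
    """Placeholder for rendering a synthetic camera image at the given pose."""
    return b""  # would be an RGB frame from the photorealistic environment

def run_perception(frame: bytes) -> None:
    """Placeholder for the perception / planning stack being evaluated."""
    pass

# Core loop: real dynamics come from the physical vehicle in the mocap room,
# while exteroceptive sensing is rendered synthetically at the measured pose.
for _ in range(100):
    pose = get_mocap_pose()       # in-motio vehicle state
    frame = render_camera(pose)   # in-silico exteroceptive measurement
    run_perception(frame)         # feeds the autonomy stack under test
    time.sleep(1 / 60)            # ~60 Hz loop rate (assumed)
```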

    Aerial Vehicle Tracking by Adaptive Fusion of Hyperspectral Likelihood Maps

    Full text link
    Hyperspectral cameras can provide unique spectral signatures for consistently distinguishing materials, which can be used to solve surveillance tasks. In this paper, we propose a novel real-time hyperspectral likelihood maps-aided tracking method (HLT) inspired by an adaptive hyperspectral sensor. A moving-object tracking system generally consists of registration, object detection, and tracking modules. We focus on the target detection part and remove the need to build any offline classifiers and tune a large number of hyperparameters, instead learning a generative target model in an online manner for hyperspectral channels ranging from visible to infrared wavelengths. The key idea is that our adaptive fusion method can combine likelihood maps from multiple bands of hyperspectral imagery into a single, more distinctive representation, increasing the margin between the mean values of foreground and background pixels in the fused map. Experimental results show that the HLT not only outperforms all established fusion methods but is on par with the current state-of-the-art hyperspectral target tracking frameworks.
    Comment: Accepted at the International Conference on Computer Vision and Pattern Recognition Workshops, 201
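
    A simplified NumPy sketch of the fusion idea, not the paper's exact HLT algorithm: weight each band's likelihood map by how well it separates the current target region from the background, then fuse with a weighted sum. The foreground mask and the non-negative margin weighting are assumptions made for illustration.
```python
# Hedged sketch: adaptive fusion of per-band likelihood maps.
import numpy as np

def fuse_likelihood_maps(maps: np.ndarray, fg_mask: np.ndarray) -> np.ndarray:
    """maps: (n_bands, H, W) likelihoods in [0, 1]; fg_mask: (H, W) boolean target mask."""
    weights = []
    for band_map in maps:
        # Foreground/background separation achieved by this band alone.
        margin = band_map[fg_mask].mean() - band_map[~fg_mask].mean()
        weights.append(max(margin, 0.0))  # ignore bands that separate poorly
    weights = np.asarray(weights)
    if weights.sum() == 0:
        weights = np.ones(len(maps))
    weights /= weights.sum()
    # Weighted sum keeps bands with a large foreground/background margin dominant.
    return np.tensordot(weights, maps, axes=1)

# Toy usage with random maps, just to show the shapes involved.
rng = np.random.default_rng(0)
maps = rng.random((8, 64, 64))
fg_mask = np.zeros((64, 64), dtype=bool)
fg_mask[28:36, 28:36] = True
fused = fuse_likelihood_maps(maps, fg_mask)
```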

    Comparison of Human Pilot (Remote) Control Systems in Multirotor Unmanned Aerial Vehicle Navigation

    Get PDF
    This paper concerns human pilot (remote) control systems for UAV navigation. Demand for Unmanned Aerial Vehicles (UAVs) is increasing tremendously in the aviation industry and in research. A UAV is a flying machine that operates with no pilot onboard and can be controlled by ground-based operators. In this paper, a comparison is made between different proposed remote control systems and devices for navigating multirotor UAVs, such as hand-held controllers, gesture- and body-posture-based techniques, and vision-based techniques. The reviews discussed in this paper draw on various research sources related to UAVs and their navigation systems. Every method has its pros and cons depending on the situation. At the end of the study, the methods are analyzed and the best method is chosen in terms of accuracy and efficiency.

    Development of Cursor-on-Target Control for Semi-Autonomous Unmanned Aircraft Systems

    Get PDF
    The research presented in this thesis focuses on developing, demonstrating, and evaluating the concept of a Cursor-on-Target control system for semi-autonomous unmanned aircraft systems. The Department of Defense has mapped out a strategy in which unmanned aircraft systems will increasingly replace piloted aircraft. During most phases of flight, autonomous unmanned aircraft control reduces operator workload; however, real-time information exchange often requires an operator to relay decision changes to the unmanned aircraft. The goal of this research is to develop a preliminary Cursor-on-Target control system that enables the operator to guide the unmanned aircraft with minimal workload during high-task phases of flight, and then to evaluate the operator's ability to conduct the mission using that control system. For this research, the problem of Cursor-on-Target control design has multiple components. Initially, a Cursor-on-Target controller is developed in Simulink. Then, this controller is integrated into the Aviator Visual Design Simulator to develop an operator-in-the-loop test platform. Finally, a ground target is simulated and tracked to validate the Cursor-on-Target controller. The Cursor-on-Target control system is then evaluated using a proposed operator rating scale.
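
    The thesis implements its controller in Simulink; the following is only a plain-Python illustration of the underlying Cursor-on-Target idea, where an operator-designated cursor point defines the target and a simple proportional law steers the aircraft toward it. The gain, the 2D simplification, and the yaw-rate command interface are assumptions, not the thesis's design.
```python
# Hedged sketch: steer toward an operator-designated cursor point.
import math

def cursor_on_target_command(ac_x: float, ac_y: float, ac_heading: float,
                             cursor_x: float, cursor_y: float,
                             k_heading: float = 1.5) -> float:
    """Return a yaw-rate command that turns the aircraft toward the cursor point."""
    desired_heading = math.atan2(cursor_y - ac_y, cursor_x - ac_x)
    # Wrap the heading error to [-pi, pi] before applying the proportional gain.
    error = math.atan2(math.sin(desired_heading - ac_heading),
                       math.cos(desired_heading - ac_heading))
    return k_heading * error

# Example: aircraft at the origin heading east; operator clicks a point to the northeast.
yaw_rate = cursor_on_target_command(0.0, 0.0, 0.0, 100.0, 100.0)
```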