
    A Co-optimal Coverage Path Planning Method for Aerial Scanning of Complex Structures

    The utilization of unmanned aerial vehicles (UAVs) for the survey and inspection of civil infrastructure has been growing rapidly. However, computing flight paths that are both efficient and ensure high-quality data acquisition of the complete 3D structure remains a difficult problem. Existing solvers typically prioritize efficient flight paths, coverage, or low computational complexity, but these objectives are not co-optimized holistically. In this work we introduce a co-optimal coverage path planning (CCPP) method that simultaneously optimizes the UAV path, the quality of the captured images, and the computational complexity of the solver, all while adhering to safety and inspection requirements. The result is a highly parallelizable algorithm that produces more efficient paths with improved quality of the useful image data. The path optimization algorithm uses a particle swarm optimization (PSO) framework that iteratively optimizes the coverage paths without discretizing the motion space or simplifying the sensing models, as is done in similar methods. The core of the method consists of a cost function that measures both the quality and the efficiency of a coverage inspection path, and a greedy heuristic that enhances the optimization by aggressively exploring the viewpoint search space. To assess the proposed method, we also present a coverage path quality evaluation method, which can serve as a benchmark for assessing other CPP methods for structural inspection. The effectiveness of the proposed method is demonstrated by comparing its quality and efficiency against the state of the art on both synthetic and real-world scenes. The experiments show that our method yields a significant improvement in coverage inspection quality while preserving path efficiency across different test geometries.
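    As a rough illustration of the PSO framework the abstract describes, the sketch below runs a standard particle swarm update over continuous viewpoint sequences (no motion-space discretization). All names here (path_cost, coverage_quality, path_length) are illustrative placeholders, not the paper's actual cost function, sensing model, or safety constraints.

```python
import numpy as np

def path_length(path):
    pts = path.reshape(-1, 3)                      # assume waypoints are 3D positions
    return np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()

def coverage_quality(path):
    # Placeholder for the paper's image-quality term (view angles, ground
    # resolution, etc.); returns 0 so this stub only minimizes path length.
    return 0.0

def path_cost(path, alpha=0.5):
    # Lower is better: trade off path efficiency against coverage quality.
    return alpha * path_length(path) - (1.0 - alpha) * coverage_quality(path)

def pso_optimize(init_paths, iters=100, w=0.7, c1=1.5, c2=1.5):
    X = np.array(init_paths, dtype=float)          # particles: flattened viewpoint sequences
    V = np.zeros_like(X)                           # velocities
    pbest = X.copy()                               # per-particle best positions
    pbest_cost = np.array([path_cost(p) for p in X])
    gbest = pbest[pbest_cost.argmin()].copy()      # swarm-wide best

    for _ in range(iters):
        r1, r2 = np.random.rand(*X.shape), np.random.rand(*X.shape)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        X = X + V                                  # continuous update: no discretization
        costs = np.array([path_cost(p) for p in X])
        improved = costs < pbest_cost
        pbest[improved] = X[improved]
        pbest_cost[improved] = costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest
```

    Because each particle's cost is evaluated independently, the per-particle loop parallelizes naturally, which is consistent with the abstract's claim of a highly parallelizable algorithm.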

    Prioritized Multi-View Stereo Depth Map Generation Using Confidence Prediction

    In this work, we propose a novel approach to prioritizing the depth map computation of multi-view stereo (MVS) to obtain compact 3D point clouds of high quality and completeness at low computational cost. Our prioritization approach operates before the MVS algorithm is executed and consists of two steps. In the first step, we find a good set of matching partners for each view. In the second step, we rank the resulting view clusters (i.e. key views with their matching partners) according to their impact on the fulfillment of desired quality parameters such as completeness, ground resolution, and accuracy. In addition to geometric analysis, we use a novel machine learning technique to train a confidence predictor. The purpose of this confidence predictor is to estimate the chances of a successful depth reconstruction for each pixel in each image, for one specific MVS algorithm, based on the RGB images and the image constellation. The underlying machine learning technique does not require any ground truth or manually labeled data for training; instead, it adapts ideas from depth map fusion to provide a supervision signal. The trained confidence predictor allows us to evaluate the quality of image constellations and their potential impact on the resulting 3D reconstruction, and thus builds a solid foundation for our prioritization approach. In our experiments, we reach more than 70% of the maximal reachable quality fulfillment using only 5% of the available images as key views. To evaluate our approach within and across different domains, we use two completely different scenarios: cultural heritage preservation and reconstruction of single-family houses.
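    The ranking step described above can be read as a greedy selection over view clusters. The sketch below is a minimal, hypothetical version of that idea: predicted_gain stands in for a marginal-gain score derived from the trained confidence predictor, whose internals are not shown.

```python
def rank_clusters(clusters, predicted_gain, budget):
    """Greedily order view clusters by marginal predicted quality gain.

    clusters       -- iterable of cluster identifiers (key view + partners)
    predicted_gain -- callable (cluster, selected_set) -> float estimate of
                      how much the cluster adds to quality fulfillment
    budget         -- number of key views to select, e.g. 5% of all images
    """
    selected, ranking = set(), []
    clusters = list(clusters)
    for _ in range(min(budget, len(clusters))):
        remaining = [c for c in clusters if c not in selected]
        best = max(remaining, key=lambda c: predicted_gain(c, selected))
        selected.add(best)
        ranking.append(best)
    return ranking
```

    Under this reading, setting budget to 5% of the available images and running MVS only on the top-ranked clusters would correspond to the reported setting in which more than 70% of the maximal reachable quality fulfillment is obtained.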