
    Bundle Adjustment for 3-D Reconstruction: Implementation and Evaluation

    We describe in detail the bundle adjustment algorithm for 3-D reconstruction from multiple images, based on our latest research results. The main focus of this paper is the handling of camera rotations and the efficiency of computation and memory usage when the number of variables is very large; appropriate treatment of these issues is at the core of any bundle adjustment implementation. By computing the fundamental matrix from two views and reconstructing the 3-D structure from multiple views, we evaluate the performance of our algorithm and discuss technical issues of bundle adjustment implementation.
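    The reprojection-error minimization sketched in this abstract can be illustrated with a short, hypothetical Python example (not the authors' implementation). Here each camera is assumed to carry seven parameters (axis-angle rotation, translation, focal length) with a simple pinhole projection, and the sparse structure of the Jacobian is exploited so memory stays manageable when the number of variables is large; the parameterization and solver choice are illustrative assumptions only.

    import numpy as np
    from scipy.optimize import least_squares
    from scipy.sparse import lil_matrix

    def rotate(points, rvecs):
        """Rotate 3-D points by axis-angle vectors (Rodrigues' formula)."""
        theta = np.linalg.norm(rvecs, axis=1)[:, None]
        with np.errstate(invalid='ignore'):
            v = np.nan_to_num(rvecs / theta)      # unit axis; zero rotation handled as zero
        dot = np.sum(points * v, axis=1)[:, None]
        cos, sin = np.cos(theta), np.sin(theta)
        return cos * points + sin * np.cross(v, points) + dot * (1 - cos) * v

    def residuals(params, n_cams, n_pts, cam_idx, pt_idx, obs_2d):
        """Reprojection residuals for all observations, stacked into one vector."""
        cams = params[:n_cams * 7].reshape(n_cams, 7)   # [rvec | t | f] per camera (assumed model)
        pts = params[n_cams * 7:].reshape(n_pts, 3)
        p = rotate(pts[pt_idx], cams[cam_idx, :3]) + cams[cam_idx, 3:6]
        proj = p[:, :2] / p[:, 2, None] * cams[cam_idx, 6, None]   # pinhole, no distortion
        return (proj - obs_2d).ravel()

    def jac_sparsity(n_cams, n_pts, cam_idx, pt_idx):
        """Each residual depends on exactly one camera block and one point block."""
        A = lil_matrix((2 * len(cam_idx), n_cams * 7 + n_pts * 3), dtype=int)
        i = np.arange(len(cam_idx))
        for k in range(7):
            A[2 * i, cam_idx * 7 + k] = 1
            A[2 * i + 1, cam_idx * 7 + k] = 1
        for k in range(3):
            A[2 * i, n_cams * 7 + pt_idx * 3 + k] = 1
            A[2 * i + 1, n_cams * 7 + pt_idx * 3 + k] = 1
        return A

    # res = least_squares(residuals, x0, method='trf',
    #                     jac_sparsity=jac_sparsity(n_cams, n_pts, cam_idx, pt_idx),
    #                     args=(n_cams, n_pts, cam_idx, pt_idx, obs_2d))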

    Evaluating tie points distribution, multiplicity and number on the accuracy of UAV photogrammetry blocks

    Image orientation is a fundamental task in photogrammetric applications: it is performed by extracting keypoints with hand-crafted or learning-based methods, generating tie points among the images and running a bundle adjustment procedure. Nowadays, due to the large number of extracted keypoints, tie point filtering approaches attempt to eliminate redundant tie points in order to increase accuracy and reduce processing time. This paper presents the results of an investigation into the impact of tie points on bundle adjustment results. Simulations and real data are processed in Australis and DBAT to evaluate different influencing factors, including tie point number, location accuracy, distribution and multiplicity. The results show that increasing the number of tie points improves the quality of the bundle adjustment, provided that the tie points are well distributed over the images. Furthermore, bundle adjustment quality improves as the multiplicity of tie points increases and their location uncertainty decreases. Based on the simulation results, some suggestions for accurate tie point filtering in typical UAV photogrammetry blocks are derived.
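    As a concrete illustration of these suggestions, the sketch below shows one possible tie point filtering heuristic in Python: within a coarse grid over each image, keep only the highest-multiplicity tie points, so that the retained points are both well distributed and multi-ray. This is a hypothetical example; the function name, grid size, per-cell quota and data layout are assumptions, not the settings used in Australis or DBAT.

    from collections import defaultdict

    def filter_tie_points(observations, image_size, grid=8, per_cell=5):
        """observations: iterable of (tie_point_id, image_id, x, y, multiplicity)."""
        w, h = image_size
        cells = defaultdict(list)
        for tp_id, img_id, x, y, mult in observations:
            # bucket each observation into a coarse grid cell of its image
            cell = (img_id,
                    min(int(grid * x / w), grid - 1),
                    min(int(grid * y / h), grid - 1))
            cells[cell].append((mult, tp_id))
        keep = set()
        for candidates in cells.values():
            # within each cell, prefer the tie points seen in the most images
            for mult, tp_id in sorted(candidates, reverse=True)[:per_cell]:
                keep.add(tp_id)
        return keep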

    Distributed bundle adjustment with block-based sparse matrix compression for super large scale datasets

    We propose a distributed bundle adjustment (DBA) method using the exact Levenberg-Marquardt (LM) algorithm for super large-scale datasets. Most existing methods partition the global map into small submaps and conduct bundle adjustment within them. In order to fit the parallel framework, they use approximate solutions instead of the LM algorithm; however, such methods often give sub-optimal results. In contrast, we use the exact LM algorithm to conduct global bundle adjustment, where the formation of the reduced camera system (RCS) is parallelized and executed in a distributed way. To store the large RCS, we compress it with a block-based sparse matrix compression format (BSMC), which fully exploits its block structure. The BSMC format also enables distributed storage and updating of the global RCS. The proposed method is extensively evaluated and compared with state-of-the-art pipelines using both synthetic and real datasets. Preliminary results demonstrate the efficient memory usage and vast scalability of the proposed method compared with the baselines. For the first time, we conduct parallel bundle adjustment using the LM algorithm on a real dataset with 1.18 million images and a synthetic dataset with 10 million images (about 500 times that of the state-of-the-art LM-based BA) on a distributed computing system.
    Comment: camera-ready version for ICCV 2023.
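    To make the block-based storage idea more concrete, here is a minimal Python sketch of a block-sparse container for a reduced camera system. It is an illustration in the spirit of BSMC, not the paper's format: the class name, the 6x6 block size and the conversion to SciPy's BSR layout are assumptions. The point of such a structure is that memory scales with the number of covisible camera pairs rather than with the square of the number of cameras.

    import numpy as np
    from scipy.sparse import bsr_matrix

    class BlockSparseRCS:
        """Store only the nonzero camera-camera blocks of the reduced camera system."""

        def __init__(self, n_cameras, block=6):
            self.n, self.b = n_cameras, block
            self.blocks = {}              # (block_row, block_col) -> dense b x b array

        def add(self, i, j, contribution):
            """Accumulate a camera-pair contribution (e.g. from one observed point)."""
            if (i, j) not in self.blocks:
                self.blocks[(i, j)] = np.zeros((self.b, self.b))
            self.blocks[(i, j)] += contribution

        def to_bsr(self):
            """Expand to SciPy's block-sparse-row (BSR) matrix for the linear solve."""
            keys = sorted(self.blocks)    # sorted by (block_row, block_col)
            rows = np.array([k[0] for k in keys])
            cols = np.array([k[1] for k in keys])
            data = np.stack([self.blocks[k] for k in keys])
            indptr = np.searchsorted(rows, np.arange(self.n + 1))
            return bsr_matrix((data, cols, indptr),
                              shape=(self.n * self.b, self.n * self.b))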