
    Flexible Photogrammetric Computations Using Modular Bundle Adjustment: The Chain Rule and the Collinearity Equations

    The main purpose of this paper is to show that photogrammetric bundle adjustment computations can be sequentially organized into modules. Furthermore, the chain rule can be used to simplify the computation of the analytical Jacobians needed by the adjustment. Novel projection models can be flexibly evaluated by inserting, modifying, or swapping the order of selected modules. As a proof of concept, two variants of the pin-hole projection model with Brown lens distortion were implemented in the open-source Damped Bundle Adjustment Toolbox (DBAT) and applied to simulated and calibration data for a non-conventional lens system. The results show a significant difference for the simulated, error-free data but not for the real calibration data. The current flexible implementation incurs a performance loss. However, in cases where flexibility is more important, the modular formulation should be a useful tool to investigate novel sensors, data processing techniques, and refractive models.
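The modular idea above can be sketched in a few lines: each module returns its value together with its analytical Jacobian, and the chain rule gives the Jacobian of the composed projection as a product of module Jacobians. The sketch below is a minimal Python/NumPy illustration under assumed module names (DBAT itself is a Matlab toolbox with its own interfaces), using a rigid transform followed by a pinhole projection.

```python
import numpy as np

# Hypothetical module interface: each module returns (value, Jacobian).

def world_to_camera(X, R, t):
    """Rigid-transform module: Y = R @ X + t; Jacobian w.r.t. X is R."""
    return R @ X + t, R

def pinhole(Y, f):
    """Pinhole projection module: x = f * Y[:2] / Y[2], with its 2x3 Jacobian."""
    x = f * Y[:2] / Y[2]
    J = np.array([[f / Y[2], 0.0, -f * Y[0] / Y[2]**2],
                  [0.0, f / Y[2], -f * Y[1] / Y[2]**2]])
    return x, J

def project(X, R, t, f):
    """Composed projection; by the chain rule, dx/dX = J_pinhole @ J_rigid."""
    Y, J1 = world_to_camera(X, R, t)
    x, J2 = pinhole(Y, f)
    return x, J2 @ J1
```

Swapping in a different distortion or refraction module only requires that it follow the same (value, Jacobian) convention; the composed Jacobian can be checked against finite differences.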

    Accuracy of radiographic and radiostereometric wear measurement of different hip prostheses: an experimental study.

    Background In vivo measurement of wear in the ball and socket articulation of total hip arthroplasties is of interest in the evaluation of both existing and new implants. Controversy reigns regarding the accuracy of different radiological measurement techniques and, in particular, how accuracy has been assessed. Material and methods We assessed the accuracy of 2 radiostereometric (RSA) techniques for wear measurement and 3 standard radiographic techniques, namely Imagika (image-analyzing software), Imagika corrected for head center displacement, and the Charnley Duo method. 5 custom-made adjustable phantoms with different prosthetic components were used. Results In 20 measurements of all 5 phantoms at 3 levels of simulated wear (0.2 mm, 1.0 mm, and 1.5 mm), the mean measurement error of the digital RSA examinations was 0.010 mm (accuracy 0.42 mm). The corresponding error values for the three radiographic techniques were 0.19 mm (accuracy 1.3 mm) for Charnley Duo, 0.13 mm (accuracy 1.3 mm) for Imagika corrected, and 1.021 mm (accuracy 2.99 mm) for Imagika. Measurement error decreased from 0.011 mm with ordinary RSA to 0.004 mm with digital RSA measurement. Head size, direction of wear in relation to the cup, and type of prosthetic component did not influence the measurement error. The results of Charnley Duo and Imagika corrected were similar, but the latter had an inexplicable systematic error in measuring one of the phantoms. Imagika had the worst results due to its inability to compensate for the out-of-head-center effect. Alumina heads were difficult to analyze with all methods. Interpretation By using the ISO standard for assessing accuracy, RSA can be expected to measure wear with an accuracy of about 0.4 mm irrespective of the prosthetic component studied or the direction of wear, whereas the best technique in our study based on standard radiographs can be accurate to about 1.3 mm.

    Adaptive least squares matching as a non-linear least squares optimization problem

    Adaptive Least Squares Matching (ALSM) is a powerful technique for precisely locating objects in digital images. The method was introduced to the photogrammetric community by Gruen in 1985 and has since been developed further. The purpose of this paper is to study the basic ALSM formulation from a least squares optimization point of view. It turns out that it is possible to describe the basic algorithm as a variation of the Gauss-Newton method for solving weighted non-linear least squares optimization problems. This opens the possibility of applying optimization theory to the ALSM problem. In particular, the line-search algorithm for obtaining global convergence is described and illustrated.
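The Gauss-Newton view described above can be made concrete: a weighted non-linear least squares problem minimizes r(x)ᵀWr(x) by repeatedly solving the normal equations (JᵀWJ)dx = -JᵀWr using the analytical Jacobian J. The following is a generic Python/NumPy sketch of that iteration (not Gruen's ALSM code), applied to a toy exponential-fitting problem for illustration.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, W, n_iter=20):
    """Undamped Gauss-Newton for minimizing r(x)^T W r(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = residual(x)
        J = jacobian(x)
        # Solve the weighted normal equations for the update step.
        dx = np.linalg.solve(J.T @ W @ J, -J.T @ W @ r)
        x = x + dx
    return x

# Toy problem: fit y ~ a * exp(b * t) in the weighted least-squares sense.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * np.exp(-0.5 * t)

def residual(p):
    a, b = p
    return a * np.exp(b * t) - y

def jacobian(p):
    a, b = p
    return np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])

p = gauss_newton(residual, jacobian, [1.5, -0.3], np.eye(4))
```

With a reasonable starting point the iteration converges quadratically on this zero-residual problem; the line-search modification discussed in the abstract addresses starting points where the undamped full step fails.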

    Improving the robustness of least squares template matching with a line-search algorithm

    The Adaptive Least Squares Matching (ALSM) problem of Gruen is conventionally described as a statistical estimation problem. This paper shows that the ALSM problem may also be interpreted as a weighted non-linear least squares problem. This enables optimization theory to be applied to the ALSM problem. The ALSM algorithm may be interpreted as an instance of the well-known Gauss-Newton algorithm. A problem-independent termination criterion based on angles in high-dimensional vector spaces is introduced. The line-search modification of the Gauss-Newton method is explained and applied to the ALSM problem. The implications of the line-search modification are increased robustness, reduced oscillations, and an increased pull-in range. A potential drawback is the increased number of convergences toward side minima in images with repeating patterns.
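The line-search modification referred to above can be sketched as Gauss-Newton with Armijo backtracking: the full step is halved until a sufficient-decrease condition holds. The Python sketch below is illustrative rather than the paper's implementation, and uses a simple gradient-norm test as a stand-in for the paper's angle-based termination criterion. The Rosenbrock-type test problem is an assumption chosen because the undamped full step overshoots badly there.

```python
import numpy as np

def gn_armijo(residual, jacobian, x0, n_iter=100, c=1e-4):
    """Gauss-Newton with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = residual(x)
        J = jacobian(x)
        g = J.T @ r                      # gradient of f(x) = 0.5 * ||r(x)||^2
        if np.linalg.norm(g) < 1e-14:    # simplified stand-in for the paper's
            break                        # angle-based termination criterion
        dx = np.linalg.solve(J.T @ J, -g)
        f0 = 0.5 * (r @ r)
        s = 1.0
        # Armijo condition: shrink the step until sufficient decrease holds.
        while 0.5 * np.sum(residual(x + s * dx)**2) > f0 + c * s * (g @ dx):
            s *= 0.5
        x = x + s * dx
    return x

# Rosenbrock-type residuals: minimum at (1, 1); the damped iteration follows
# the curved valley from a starting point where the full step fails.
def rb_residual(x):
    return np.array([10.0 * (x[1] - x[0]**2), 1.0 - x[0]])

def rb_jacobian(x):
    return np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])

xr = gn_armijo(rb_residual, rb_jacobian, [-1.2, 1.0])
```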

    Bundle adjustment with and without damping

    The least squares adjustment (LSA) method is studied as an optimisation problem and shown to be equivalent to the undamped Gauss-Newton (GN) optimisation method. Three problem-independent damping modifications of the GN method are presented: the line-search method of Armijo (GNA); the Levenberg-Marquardt algorithm (LM); and Levenberg-Marquardt-Powell (LMP). Furthermore, an additional problem-specific "veto" damping technique, based on the chirality condition, is suggested. In a perturbation study on a terrestrial bundle adjustment problem, the GNA and LMP methods with veto damping can increase the size of the pull-in region compared to the undamped method; the LM method showed less improvement. The results suggest that damped methods can, in many cases, provide a solution where undamped methods fail, and should be available in any LSA software package. Matlab code for the algorithms discussed is available from the authors.
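To make the damping idea concrete, here is a textbook Levenberg-Marquardt loop in Python (a generic sketch; the paper's LM and LMP variants differ in their step-control details, and the authors' own code is in Matlab). The damping parameter interpolates between Gauss-Newton (small mu) and short gradient-descent-like steps (large mu), which is what enlarges the pull-in region.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, mu=1e-3, n_iter=300):
    """Basic Levenberg-Marquardt with a simple accept/reject mu schedule."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = residual(x)
        J = jacobian(x)
        # Damped normal equations: mu -> 0 gives Gauss-Newton,
        # large mu gives a short gradient-descent-like step.
        dx = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -J.T @ r)
        if np.sum(residual(x + dx)**2) < r @ r:
            x = x + dx      # step reduced the cost: accept, damp less
            mu *= 0.5
        else:
            mu *= 4.0       # step rejected: damp more and retry
    return x

# Rosenbrock-type residuals with minimum at (1, 1), a standard hard test case.
def rb_residual(x):
    return np.array([10.0 * (x[1] - x[0]**2), 1.0 - x[0]])

def rb_jacobian(x):
    return np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])

xl = levenberg_marquardt(rb_residual, rb_jacobian, [-1.2, 1.0])
```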

    Camera Calibration using the Damped Bundle Adjustment Toolbox

    Camera calibration is one of the fundamental photogrammetric tasks. The standard procedure is to apply an iterative adjustment to measurements of known control points. The iterative adjustment needs initial values of internal and external parameters. In this paper we investigate a procedure where only one parameter, the focal length, is given a specific initial value. The procedure is validated using the freely available Damped Bundle Adjustment Toolbox on five calibration data sets with varying narrow- and wide-angle lenses. The results show that the Gauss-Newton-Armijo and Levenberg-Marquardt-Powell bundle adjustment methods implemented in the toolbox converge for initial focal length values between 1/2 and 32 times the true focal length, even when the parameters are highly correlated. Standard statistical analysis methods in the toolbox enable manual selection of the lens distortion parameters to estimate, something not available in other camera calibration toolboxes. A standardised camera calibration procedure that does not require any information about the camera sensor or focal length is suggested based on the convergence results. The toolbox source and data sets used in this paper are available from the authors.
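For readers unfamiliar with the lens distortion parameters being selected, one common form of the Brown model is sketched below in Python. Sign and parameterization conventions vary between packages, so this is an illustrative version, not necessarily DBAT's exact convention; `k` holds the radial terms k1..k3 and `p` the tangential terms p1, p2, applied to normalized image coordinates.

```python
import numpy as np

def brown_distort(xy, k, p):
    """Apply Brown radial and tangential distortion to normalized coords."""
    x, y = xy
    r2 = x * x + y * y                                   # squared radius
    radial = 1.0 + k[0] * r2 + k[1] * r2**2 + k[2] * r2**3
    xt = 2.0 * p[0] * x * y + p[1] * (r2 + 2.0 * x * x)  # tangential, x
    yt = p[0] * (r2 + 2.0 * y * y) + 2.0 * p[1] * x * y  # tangential, y
    return np.array([x * radial + xt, y * radial + yt])
```

Selecting which of k1..k3, p1, p2 to estimate amounts to freezing the remaining coefficients at zero, which is where the correlation analysis mentioned above becomes useful.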

    Experiments with Metadata-derived Initial Values and Linesearch Bundle Adjustment in Architectural Photogrammetry

    According to the Waldhäusl and Ogleby (1994) "3x3 rules", a well-designed close-range architectural photogrammetric project should include a sketch of the project site with the approximate position and viewing direction of each image. This orientation metadata is important to determine which part of the object each image covers. In principle, the metadata could be used as initial values for the camera external orientation (EO) parameters. However, this has rarely been done, partly due to convergence problems for the bundle adjustment procedure. In this paper we present a photogrammetric reconstruction pipeline based on classical methods and investigate if and how the linesearch bundle algorithms of Börlin et al. (2004) and/or metadata can be used to aid the reconstruction process in architectural photogrammetry when the classical methods fail. The primary initial values for the bundle are calculated by the five-point algorithm of Nistér (Stewénius et al., 2006). Should the bundle fail, initial values derived from metadata are calculated and used for a second bundle attempt. The pipeline was evaluated on an image set of the INSA building in Strasbourg. The data set includes mixed convex and non-convex subnetworks and a combination of manual and automatic measurements. The results show that, in general, the classical bundle algorithm with five-point initial values worked well. However, in cases where it did fail, linesearch bundle and/or metadata initial values did help. The presented approach is interesting for solving EO problems when the automatic orientation processes fail, as well as for maintaining a link between the metadata, which records how the project was planned, and the actual reconstructed network as it turned out to be.
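Turning sketch-map metadata into EO initial values essentially requires building a rotation from an approximate position and viewing direction. The hypothetical Python sketch below (not the paper's code) uses a standard "look-at" construction: the viewing direction becomes the camera z axis and the world up direction fixes the roll, assuming the viewing direction is not vertical.

```python
import numpy as np

def eo_from_metadata(position, look_dir, up=(0.0, 0.0, 1.0)):
    """World-to-camera rotation R and translation t from approximate
    camera position and viewing direction (look_dir must not be
    parallel to 'up', or the cross product below vanishes)."""
    z = np.asarray(look_dir, dtype=float)
    z = z / np.linalg.norm(z)            # camera z axis = viewing direction
    x = np.cross(z, up)
    x = x / np.linalg.norm(x)            # camera x axis, horizontal
    y = np.cross(z, x)                   # completes a right-handed frame
    R = np.vstack([x, y, z])             # rows = camera axes in world coords
    t = -R @ np.asarray(position, dtype=float)   # so X_cam = R @ X_world + t
    return R, t
```

Because the sketch-map values are only approximate, such EO estimates are useful as initial values for a damped bundle adjustment rather than as final orientations.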

    3D measurements of buildings and environment for harbor simulators

    Oryx Simulations develops and manufactures real-time physics simulators for training of harbor crane operators in several of the world’s major harbors. Currently, the modelling process is labor-intensive, and a faster solution that can produce accurate, textured models of harbor scenes is desired. The accuracy requirements vary across the scene, and in some areas accuracy can be traded for speed. Due to the heavy equipment involved, reliable error estimates are important throughout the scene. This report surveys the scientific literature on 3D reconstruction algorithms from aerial and terrestrial imagery and laser scanner data. Furthermore, available software solutions are evaluated. The conclusion is that the most useful data source is terrestrial images, optionally complemented by terrestrial laser scanning. Although robust, automatic algorithms exist for several low-level subproblems, no automatic high-level 3D modelling algorithm exists that satisfies all the requirements. Instead, the most successful high-level methods are semi-automatic, and their respective success depends on how well user input is incorporated into an efficient workflow. Furthermore, the conclusion is that existing software cannot handle the full suite of varying requirements within the harbor reconstruction problem. Instead, we suggest that a 3D reconstruction toolbox be implemented in a high-level language, Matlab. The toolbox should contain state-of-the-art low-level algorithms that can be used as “building blocks” in automatic or semi-automatic higher-level algorithms. All critical algorithms must produce reliable error estimates. The toolbox approach in Matlab will be able to simultaneously support basic research of core algorithms, evaluation of problem-specific high-level algorithms, and production of industry-grade solutions that can be ported to other programming languages and environments.

    Photogrammetric calibration of image sequences acquired with a rotating camera

    This paper reports theory and examples concerning the calibration and orientation of fixed but freely rotating cameras with possible changes of the interior parameters. We consider cameras that are generally rotating, without any special adapter to remove the eccentricity between the perspective center and the rotation axis. This is the typical case of surveillance cameras or sports videos. Projective and perspective camera models are analyzed and, among the reported examples, self-acquired images and old monocular videos of sports events are considered. We also show the possibility of achieving 3D object reconstruction using rotating cameras. Finally, we report mosaic generation from images acquired with a rotating system.
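For the ideal case of a camera rotating exactly about its perspective centre with fixed calibration K, two views are related by the homography H = K R K⁻¹, which is the standard basis of mosaic generation; the paper's contribution is handling deviations from this ideal (eccentricity, varying interior parameters). The Python sketch below illustrates only the ideal-case relation, not the paper's method.

```python
import numpy as np

def rotation_homography(K, R):
    """Homography H = K R K^-1 relating two views of a camera that
    rotates by R about its perspective centre with fixed calibration K."""
    return K @ R @ np.linalg.inv(K)

def warp_point(H, uv):
    """Map a pixel (u, v) through a homography; return inhomogeneous coords."""
    p = H @ np.array([uv[0], uv[1], 1.0])
    return p[:2] / p[2]
```

A small rotation about the vertical axis shifts the principal point horizontally by roughly f·tan(theta) pixels, which is a quick sanity check for any such implementation.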

    External Verification of the Bundle Adjustment in Photogrammetric Software Using the Damped Bundle Adjustment Toolbox

    The aim of this paper is to investigate whether the Matlab-based Damped Bundle Adjustment Toolbox (DBAT) can be used to provide independent verification of the BA computation of two popular software packages, PhotoModeler (PM) and PhotoScan (PS). For frame camera data sets with lens distortion, DBAT is able to reprocess and replicate subsets of PM results with high accuracy. For lens-distortion-free data sets, DBAT can furthermore provide comparative results between PM and PS. Data sets for the discussed projects are available from the authors. The use of an external verification tool such as DBAT will enable users to get an independent verification of the computations of their software. In addition, DBAT can provide computation of quality parameters such as estimated standard deviations, correlations between parameters, etc., something that should be part of best practice for any photogrammetric software. Finally, as the code is free and open-source, users can add computations of their own.