
    Calibration of non-conventional imaging systems

    3D Scene Geometry Estimation from 360° Imagery: A Survey

    This paper provides a comprehensive survey of pioneering and state-of-the-art 3D scene geometry estimation methodologies based on single, two, or multiple images captured with omnidirectional optics. We first revisit the basic concepts of the spherical camera model and review the most common acquisition technologies and representation formats suitable for omnidirectional (also called 360°, spherical, or panoramic) images and videos. We then survey monocular layout and depth inference approaches, highlighting recent advances in learning-based solutions suited for spherical data. Classical stereo matching is then revisited in the spherical domain, where methodologies for detecting and describing sparse and dense features become crucial. The stereo matching concepts are then extrapolated to multiple-view camera setups, categorized among light fields, multi-view stereo, and structure from motion (or visual simultaneous localization and mapping). We also compile and discuss commonly adopted datasets and figures of merit indicated for each purpose and list recent results for completeness. We conclude the paper by pointing out current and future trends. (Published in ACM Computing Surveys.)
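
    For orientation, a minimal sketch (Python, not taken from the survey) of the equirectangular parameterization that underlies the spherical camera model mentioned above: pixels of a 360° panorama are mapped to unit viewing rays and back. Axis conventions and image-origin placement vary between works, so the mapping below is only one common choice.

        import numpy as np

        def equirect_to_ray(u, v, width, height):
            """Map an equirectangular pixel (u, v) to a unit viewing ray.

            Longitude (azimuth) spans [-pi, pi] across the image width,
            latitude (elevation) spans [pi/2, -pi/2] down the image height.
            """
            lon = (u / width - 0.5) * 2.0 * np.pi
            lat = (0.5 - v / height) * np.pi
            x = np.cos(lat) * np.sin(lon)
            y = np.sin(lat)
            z = np.cos(lat) * np.cos(lon)
            return np.array([x, y, z])

        def ray_to_equirect(ray, width, height):
            """Inverse mapping: unit ray -> equirectangular pixel coordinates."""
            x, y, z = ray / np.linalg.norm(ray)
            lon = np.arctan2(x, z)
            lat = np.arcsin(y)
            u = (lon / (2.0 * np.pi) + 0.5) * width
            v = (0.5 - lat / np.pi) * height
            return u, v

        # Example: the image centre maps to the forward-looking ray (0, 0, 1).
        print(equirect_to_ray(2048, 1024, 4096, 2048))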

    Modelling and automated calibration of a general multi-projective camera

    Recently, multi-projective cameras (MPCs), often based on frame-mounted multiple cameras with a small baseline and arbitrary overlap, have found a remarkable place in geomatics and vision-based applications. This paper outlines the geometric calibration of a general MPC by presenting a mathematical model that describes its unknown generic geometry. A modified bundle block adjustment is employed to calibrate an industrial-level 360° non-metric camera. The structure of any MPC can be retrieved as a calibration set of relative and interior orientation parameters (as well as the poses of the MPC shots) using a calibration room that has been accurately determined by close-range photogrammetry. To demonstrate the efficiency and precision of the model, a Panono camera (an MPC with 36 individual cameras) was calibrated. After the adjustment, sub-pixel image residuals and acceptable object-space errors were observed.
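
    As a rough illustration of what a calibration set of relative orientation parameters provides, the sketch below (Python, hypothetical names, one common pose-composition convention that may differ from the paper's exact parameterization) composes the calibrated relative orientation of a sub-camera with the pose of an MPC shot to obtain that sub-camera's exterior orientation.

        import numpy as np

        def subcamera_pose(R_shot, C_shot, R_rel, c_rel):
            """Compose the exterior orientation of one sub-camera of an MPC.

            R_shot, C_shot : world-to-reference-camera rotation and projection
                             centre of the MPC shot (the reference camera's pose).
            R_rel, c_rel   : calibrated relative orientation of the sub-camera:
                             rotation from the reference-camera frame to the
                             sub-camera frame, and the sub-camera's projection
                             centre expressed in the reference-camera frame.

            Convention assumed here: a world point X maps to reference-camera
            coordinates as x0 = R_shot @ (X - C_shot) and to sub-camera
            coordinates as xi = R_rel @ (x0 - c_rel).
            """
            R_cam = R_rel @ R_shot                 # world-to-sub-camera rotation
            C_cam = C_shot + R_shot.T @ c_rel      # sub-camera centre in world coordinates
            return R_cam, C_cam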

    Accurate Calibration Scheme for a Multi-Camera Mobile Mapping System

    Mobile mapping systems (MMS) are increasingly used for many photogrammetric and computer vision applications, especially encouraged by fast and accurate geospatial data generation. The accuracy of point positioning in an MMS depends mainly on the quality of calibration, the accuracy of sensor synchronization, the accuracy of georeferencing, and the stability of the geometric configuration of space intersections. In this study, we focus on multi-camera calibration (interior and relative orientation parameter estimation) and MMS calibration (mounting parameter estimation). The objective was to develop a practical scheme for rigorous and accurate system calibration of a photogrammetric mapping station equipped with a multi-projective camera (MPC) and a global navigation satellite system (GNSS) and inertial measurement unit (IMU) for direct georeferencing. The proposed technique comprises two steps. First, the interior orientation parameters of each individual camera in the MPC and the relative orientation parameters of each camera with respect to the first camera are estimated. In the second step, the offset and misalignment between the MPC and the GNSS/IMU are estimated. The global accuracy of the proposed method was assessed using independent check points. A correspondence map for a panorama is also introduced that provides metric information. Our results highlight that the proposed calibration scheme reaches centimeter-level global accuracy for 3D point positioning, which demonstrates the feasibility of the proposed technique and its potential for accurate mapping purposes.
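
    The mounting parameters estimated in the second step feed directly into direct georeferencing. The sketch below shows the standard lever-arm and boresight composition in Python; the names and frame conventions are illustrative assumptions rather than the paper's notation.

        import numpy as np

        def georeference_camera(X_gnss, R_body_to_map, lever_arm, R_boresight):
            """Direct georeferencing of a camera rigidly mounted on a GNSS/IMU platform.

            X_gnss         : GNSS/IMU reference position in the mapping frame.
            R_body_to_map  : IMU attitude, rotating the body frame into the mapping frame.
            lever_arm      : camera projection centre expressed in the IMU body frame
                             (the offset estimated in the second calibration step).
            R_boresight    : misalignment rotation from the camera frame to the body frame.
            Returns the camera projection centre and the camera-to-mapping rotation.
            """
            C_cam = X_gnss + R_body_to_map @ lever_arm
            R_cam_to_map = R_body_to_map @ R_boresight
            return C_cam, R_cam_to_map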

    Mathematical models for geometric calibration of a hyperhemispheric camera for planetary exploration

    Hyperhemispheric lenses belong to the ultra-wide field-of-view optical objectives. The lens considered here was first introduced in 2018; its field of view is 360° in azimuth and 135° in the off-boresight angle. Calibrating the lens consists in computing its extrinsic and intrinsic parameters. This camera is particularly interesting for planetary exploration, since it can acquire large field-of-view images without moving parts. The calibration process modifies the toolbox proposed by Davide Scaramuzza, introducing a moving pinhole that better describes the behaviour of the lens. To test the model, images were acquired using the OMNICAM lens. The images contain black-and-white checkerboards whose internal vertices are used as benchmarks to assess the accuracy of the model.
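
    Since the work builds on Davide Scaramuzza's omnidirectional calibration toolbox, the Python sketch below reproduces that toolbox's polynomial back-projection model as a baseline (the moving-pinhole modification described above is not reproduced here). Coefficient ordering and the affine sensor-misalignment term follow the OCamCalib convention as commonly documented; treat the details as assumptions.

        import numpy as np

        def scaramuzza_backproject(u, v, poly_coeffs, center, affine=None):
            """Back-project an image pixel to a viewing ray with the omnidirectional
            polynomial model of Scaramuzza's OCamCalib toolbox.

            poly_coeffs : [a0, a1, a2, a3, a4] of f(rho) = a0 + a1*rho + a2*rho^2 + ...
                          (a1 is typically fixed to 0 during calibration).
            center      : (cx, cy) distortion centre in pixels.
            affine      : optional 2x2 matrix modelling sensor misalignment;
                          identity if omitted.
            Returns a unit ray direction in the camera frame.
            """
            p = np.array([u - center[0], v - center[1]])
            if affine is not None:
                p = np.linalg.solve(np.asarray(affine), p)
            rho = np.linalg.norm(p)
            z = np.polyval(poly_coeffs[::-1], rho)  # evaluate f(rho), lowest order last
            ray = np.array([p[0], p[1], z])
            return ray / np.linalg.norm(ray)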

    Global Pose Estimation from Aerial Images: Registration with Elevation Models

    Refractive Geometry for Underwater Domes

    Underwater cameras are typically placed behind glass windows to protect them from the water. Spherical glass, a dome port, is well suited for high water pressures at great depth, allows a large field of view, and avoids refraction if a pinhole camera is positioned exactly at the sphere's center. Adjusting a real lens perfectly to the dome center is a challenging task, both in terms of how to guide the centering process (e.g. by visual servoing), how to measure the alignment quality, and how to mechanically perform the alignment. Consequently, such systems are prone to being decentered by some offset, leading to challenging refraction patterns at the sphere that invalidate the pinhole camera model. We show that the overall camera system becomes an axial camera, even for thick domes as used in deep-sea exploration, and provide a non-iterative way to compute the center of refraction without requiring knowledge of exact air, glass, or water properties. We also analyze the refractive geometry at the sphere, looking at effects such as forward vs. backward decentering and iso-refraction curves, and derive a 6th-degree polynomial equation for the forward projection of 3D points in thin domes. We then propose a pure underwater calibration procedure to estimate the decentering from multiple images. This estimate can either be used during adjustment to guide the mechanical positioning of the lens, or can be accounted for in photogrammetric underwater applications.
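
    To make the decentering effect concrete, here is a minimal Python sketch that traces a camera ray through a thin dome (glass thickness neglected, unlike the thick-dome analysis above) using Snell's law in vector form. When the camera centre coincides with the dome centre, the ray crosses the sphere along its normal and is not bent; any offset produces the refraction patterns described in the abstract. The names and the water refractive index are illustrative assumptions.

        import numpy as np

        def refract(d, n, eta):
            """Snell's law in vector form: refract unit direction d at unit normal n
            (pointing against the incident ray) for relative index eta = n1 / n2.
            Returns None on total internal reflection (not expected air -> water)."""
            cos_i = -np.dot(n, d)
            k = 1.0 - eta**2 * (1.0 - cos_i**2)
            if k < 0.0:
                return None
            t = eta * d + (eta * cos_i - np.sqrt(k)) * n
            return t / np.linalg.norm(t)

        def trace_through_thin_dome(cam_center, ray_dir, dome_center, dome_radius,
                                    n_air=1.0, n_water=1.334):
            """Trace a camera ray through a thin dome port (glass neglected).

            A camera at the dome centre sees every ray hit the sphere along its
            normal, so nothing bends; a decentred camera gets refracted rays.
            """
            d = ray_dir / np.linalg.norm(ray_dir)
            # Intersect the ray with the sphere |x - dome_center| = dome_radius.
            oc = cam_center - dome_center
            b = np.dot(d, oc)
            disc = b**2 - (np.dot(oc, oc) - dome_radius**2)
            s = -b + np.sqrt(disc)                     # exit point in front of the camera
            p = cam_center + s * d                     # intersection with the dome
            normal = (p - dome_center) / dome_radius   # outward surface normal
            d_out = refract(d, -normal, n_air / n_water)
            return p, d_out                            # refracted ray (point, direction) in water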