
    The importance of camera calibration and distortion correction to obtain measurements with video surveillance systems

    Video surveillance systems are commonly used as important sources of qualitative information, but from the acquired images it is also possible to obtain a large amount of metric information. Yet, several methodological issues must be considered in order to perform accurate measurements using images. The most important one is camera calibration, that is, the estimation of the parameters defining the camera model. One of the most widely used camera calibration methods is Zhang's method, which allows the estimation of the linear parameters of the camera model. This method is widespread because it requires a simple setup and allows cameras to be calibrated with a simple and fast procedure, but it does not consider lens distortion, which must be taken into account with the short-focal-length lenses commonly used in video surveillance systems. In order to perform accurate measurements, the linear camera model and Zhang's method are extended to take the nonlinear parameters into account and compensate for the distortion contribution. In this paper we first describe the pinhole camera model, which treats cameras as central projection systems. After a brief introduction to the camera calibration process, and in particular Zhang's method, we describe the different types of lens distortion and the techniques used for distortion compensation. Finally, some numerical examples are shown to demonstrate the importance of distortion compensation for obtaining accurate measurements.
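
    As a concrete illustration of the workflow this abstract describes, the sketch below calibrates a camera with OpenCV's implementation of Zhang's method, which also estimates the radial and tangential distortion coefficients needed for measurement; the checkerboard geometry and file names are placeholder assumptions, not values from the paper.

```python
import glob
import cv2 as cv
import numpy as np

# Planar checkerboard target (placeholder geometry; adapt to the real one).
pattern = (9, 6)        # inner corners per row and column
square = 0.025          # square size in metres

# 3-D coordinates of the target corners, with Z = 0 as in Zhang's method.
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for fname in glob.glob("calib_*.png"):          # hypothetical image set
    gray = cv.imread(fname, cv.IMREAD_GRAYSCALE)
    found, corners = cv.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# One call estimates the linear (intrinsic) parameters and the nonlinear
# distortion coefficients (k1, k2, p1, p2, k3).
rms, K, dist, rvecs, tvecs = cv.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)

# Measurements should be taken on distortion-compensated images.
frame = cv.imread("frame.png")                  # hypothetical frame
undistorted = cv.undistort(frame, K, dist)
```

    The returned RMS reprojection error gives a quick sanity check of how well the estimated model, including its distortion terms, fits the observed corners.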

    The construction of a highly transportable laser ranging station

    The technology of the Transportable Laser Ranging Station (TLRS) used in crustal dynamics studies was examined. The TLRS used a single-photoelectron beam of limited energy density returned from the Laser Geodynamic Satellite (LAGEOS). Calibration was accomplished by diverting a small portion of the outgoing beam, attenuated to the same level as the satellite return. Timing for the system was based on a self-calibrating Ortec TD811 100-picosecond time-interval device. The system was contained in a modified, single-chassis recreational vehicle that allowed rapid deployment, although the TLRS was air-mobile only on the largest transport aircraft. A 30 cm simple plano/concave transfer-lens telescope aided in beam direction. The TLRS fulfills the need for an accurate method of obtaining range measurements to the LAGEOS satellite, incorporated in a mobile, air-transportable, and economical configuration.
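
    For context on how such a timing measurement becomes a range, the sketch below applies the basic two-way time-of-flight relation and subtracts the internally calibrated system delay; the numbers are illustrative, not TLRS specifications. At the quoted 100-picosecond resolution, the corresponding one-way range resolution is about 1.5 cm.

```python
C = 299_792_458.0   # speed of light in vacuum, m/s

def range_from_interval(round_trip_s, system_delay_s):
    """One-way range from a round-trip laser time-of-flight measurement,
    after removing the internally calibrated system delay."""
    return C * (round_trip_s - system_delay_s) / 2.0

# Illustrative values only: a LAGEOS round trip is on the order of 40 ms,
# and the calibration path measures a delay of a few hundred nanoseconds.
print(range_from_interval(40.0e-3, 250.0e-9))   # ≈ 5.9958e6 m
```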

    Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.

    Premise of the study: Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. Methods and results: Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background, and compares leaf pixel counts to a red calibration area to eliminate the need for the camera distance calculations or manual ruler-scale measurements that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. Conclusions: Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
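
    The colour-ratio and calibration-area idea summarised above can be sketched in a few lines; the thresholds, the red-square size, and the function name below are illustrative assumptions, not the published Easy Leaf Area defaults.

```python
import numpy as np
from imageio.v3 import imread

def estimate_leaf_area(path, red_square_cm2=4.0, ratio=1.1):
    """Classify pixels by channel ratios: green-dominant pixels count as
    leaf, red-dominant pixels as the calibration square, and the leaf
    pixel count is scaled by the known area of that square."""
    img = imread(path)[..., :3].astype(float) + 1e-6   # avoid divide-by-zero
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    leaf = (g / r > ratio) & (g / b > ratio)
    calib = (r / g > ratio) & (r / b > ratio)
    return leaf.sum() / max(calib.sum(), 1) * red_square_cm2

print(estimate_leaf_area("rosette.jpg"))   # hypothetical image
```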

    Positioning in time and space: cost-effective exterior orientation for airborne archaeological photographs

    Since manned, airborne aerial reconnaissance for archaeological purposes is often characterised by more-or-less random photographing of archaeological features on the Earth, the exact position and orientation of the camera during image acquisition become very important for an effective inventorying and interpretation workflow for these aerial photographs. Although positioning is generally achieved by simultaneously logging the flight path or directly recording the camera's position with a GNSS receiver, this approach does not allow recording of the necessary roll, pitch and yaw angles of the camera. The latter are essential elements of the complete exterior orientation of the camera, which, together with the inner orientation of the camera, makes it possible to accurately define the portion of the Earth recorded in the photograph. This paper proposes a cost-effective, accurate and precise GNSS/IMU solution (image position: 2.5 m and orientation: 2°, both at 1σ) to record all essential exterior orientation parameters for the direct georeferencing of the images. After introducing the hardware used, the paper presents the software developed to record and estimate these parameters; this direct georeferencing information can also be embedded into the image's metadata. Subsequently, the first results of the estimation of the mounting calibration (i.e. the misalignment between the camera and GNSS/IMU coordinate frames) are provided, and a comparison with a dedicated commercial photographic GNSS/IMU solution demonstrates the superiority of the introduced solution. Finally, an outlook on future tests and improvements concludes this article.
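
    A minimal sketch of the direct-georeferencing step discussed above: the camera's exterior orientation is obtained by composing the logged GNSS/IMU attitude with the estimated mounting (boresight) misalignment. The angle values and the use of SciPy are assumptions for illustration, not the paper's software.

```python
from scipy.spatial.transform import Rotation as R

# Illustrative values only: IMU attitude logged at the moment of exposure.
yaw, pitch, roll = 118.4, 3.2, -1.5            # degrees
# Mounting calibration: misalignment between camera and GNSS/IMU frames,
# estimated once per installation (degrees).
d_yaw, d_pitch, d_roll = 0.8, -0.3, 0.1

imu_to_world = R.from_euler("zyx", [yaw, pitch, roll], degrees=True)
cam_to_imu = R.from_euler("zyx", [d_yaw, d_pitch, d_roll], degrees=True)

# The camera's exterior orientation is the composition of the two rotations;
# the GNSS antenna lever arm is neglected in this sketch.
cam_to_world = imu_to_world * cam_to_imu
print(cam_to_world.as_euler("zyx", degrees=True))
```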

    Camera distortion self-calibration using the plumb-line constraint and minimal Hough entropy

    In this paper we present a simple and robust method for self-correction of camera distortion using single images of scenes which contain straight lines. Since the most common distortion can be modelled as radial distortion, we illustrate the method using the Harris radial distortion model, but the method is applicable to any distortion model. The method is based on transforming the edgels of the distorted image to a 1-D angular Hough space and optimizing the distortion correction parameters which minimize the entropy of the corresponding normalized histogram. Properly corrected imagery will have fewer curved lines, and therefore less spread in Hough space. Since the method does not rely on any image structure beyond the existence of edgels sharing some common orientations, and does not use edge fitting, it is applicable to a wide variety of image types. For instance, it can be applied equally well to images of texture with weak but dominant orientations, or to images with strong vanishing points. Finally, the method is evaluated on both synthetic and real data, revealing that it is particularly robust to noise.
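
    A compact sketch of the optimisation loop is given below. The exact Harris model convention and the way edgel orientations are re-estimated after warping are simplified assumptions; only the overall idea follows the abstract: undistort the edgels, build a normalised 1-D angular histogram, and minimise its entropy.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def undistort(pts, k):
    """Radial model in the spirit of the Harris model (convention assumed):
    p_u = p_d / sqrt(1 + k * |p_d|^2), in centred, normalised coordinates."""
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)
    return pts / np.sqrt(1.0 + k * r2)

def angular_entropy(edgels, tangents, k, n_bins=180, step=1e-3):
    """Entropy of the normalised orientation histogram after correcting the
    edgels with candidate parameter k; straighter lines concentrate the
    histogram and lower the entropy."""
    a = undistort(edgels, k)
    b = undistort(edgels + step * tangents, k)   # warp a small tangent step
    theta = np.arctan2(b[:, 1] - a[:, 1], b[:, 0] - a[:, 0]) % np.pi
    hist, _ = np.histogram(theta, bins=n_bins, range=(0.0, np.pi))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log(p)).sum())

def self_calibrate(edgels, tangents):
    """edgels: Nx2 normalised positions; tangents: Nx2 unit edge directions
    (e.g. from a gradient/Canny pass). Both are assumed inputs here."""
    res = minimize_scalar(lambda k: angular_entropy(edgels, tangents, k),
                          bounds=(-0.5, 0.5), method="bounded")
    return res.x
```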

    Active Estimation of Distance in a Robotic Vision System that Replicates Human Eye Movement

    Many visual cues, both binocular and monocular, provide 3D information. When an agent moves with respect to a scene, an important cue is the differing motion of objects located at various distances. While motion parallax is evident for large translations of the agent, in most head/eye systems a small parallax also occurs during rotations of the cameras. A similar parallax is also present in the human eye. During a relocation of gaze, the shift in the retinal projection of an object depends not only on the amplitude of the movement, but also on the distance of the object with respect to the observer. This study proposes a method for estimating distance on the basis of the parallax that emerges from rotations of a camera. A pan/tilt system specifically designed to reproduce the oculomotor parallax present in the human eye was used to replicate the oculomotor strategy by which humans scan visual scenes. We show that the oculomotor parallax provides accurate estimation of distance during sequences of eye movements. In a system that actively scans a visual scene, challenging tasks such as image segmentation and figure/ground segregation benefit greatly from this cue.
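
    The sketch below illustrates the geometric idea in this abstract: when the camera rotates about an axis offset from its optical centre, the image shift of a point depends on its distance, so measuring the shift for a known rotation constrains the distance. The offset, focal length, and brute-force inversion are illustrative assumptions, not the paper's system parameters or algorithm.

```python
import numpy as np

def image_shift(distance, pan_deg, offset=0.006, focal=0.017):
    """Horizontal image displacement (metres on the sensor) of a point
    initially straight ahead at `distance`, after panning the camera by
    `pan_deg` about an axis `offset` metres behind the optical centre."""
    phi = np.radians(pan_deg)
    zc = distance + offset             # point measured from the rotation axis
    x_r = -zc * np.sin(phi)            # point in the rotated camera frame
    z_r = zc * np.cos(phi) - offset
    return focal * x_r / z_r           # pinhole projection

def estimate_distance(measured_shift, pan_deg,
                      candidates=np.linspace(0.2, 5.0, 2000)):
    """Invert the forward model by brute force: pick the candidate distance
    whose predicted shift best matches the measured one."""
    preds = np.array([image_shift(d, pan_deg) for d in candidates])
    return float(candidates[np.argmin(np.abs(preds - measured_shift))])

# Example: simulate the shift for an object at 1.2 m and recover the distance.
print(estimate_distance(image_shift(1.2, 10.0), 10.0))   # ≈ 1.2
```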