Image intensifier characterization
An image intensifier forms an integral part of a full-field image range finder under development at the University of Waikato. The intensifier operates as a high speed shutter with repetition rates up to 100 MHz. We describe a method to characterise its response, both temporally and spatially, in order to correct for variations across the field of view and to optimise the operating conditions. A short pulse of visible light emitted by a laser diode uniformly illuminates the image intensifier, while a CCD camera captures the output from the intensifier. The phase of the laser pulse is continuously varied using a heterodyne configuration, automatically producing a set of samples covering the modulation cycle. The results show some anomalies in the response of our system, and some simple solutions are proposed to correct for these.
A fast Maximum Likelihood method for improving AMCW lidar precision using waveform shape
Amplitude Modulated Continuous Wave imaging lidar systems use the time-of-flight principle to determine the range to objects in a scene. Typical systems use modulated illumination of a scene and a modulated sensor or image intensifier. By changing the relative phase of the two modulation signals it is possible to measure the phase shift induced in the illumination signal, and thus the range to the scene. In practical systems, the resultant correlation waveform contains harmonics that typically result in a non-linear range response. Nevertheless, these harmonics can be used to improve range precision. We model a waveform continuously variable in phase and intensity as a linear interpolation. By formulating this as a Maximum Likelihood problem, an analytic solution is derived that enables an entire range image to be processed in a few seconds. The method yields a substantial improvement in overall RMS error and precision over the standard Fourier phase analysis approach.
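As a point of reference for the comparison above, the standard Fourier phase analysis can be sketched in a few lines (a minimal illustration with synthetic bucket values, not the authors' code):

```python
import numpy as np

def fourier_phase(samples):
    # Phase of the fundamental DFT bin of the sampled correlation
    # waveform; this is the conventional N-bucket phase estimate.
    n = len(samples)
    k = np.arange(n)
    c1 = np.sum(samples * np.exp(-2j * np.pi * k / n))
    return np.angle(c1)

# Synthetic 4-bucket capture of a sinusoidal correlation waveform
# with a known phase of 1.0 rad (illustrative data, not from the paper).
true_phase = 1.0
buckets = 1.0 + 0.5 * np.cos(2 * np.pi * np.arange(4) / 4 + true_phase)
print(fourier_phase(buckets))  # recovers 1.0 rad
```

This estimator is exact for a purely sinusoidal correlation waveform; the harmonics discussed above are precisely what breaks its linearity, motivating the Maximum Likelihood treatment.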
Resolving depth measurement ambiguity with commercially available range imaging cameras
Time-of-flight range imaging is typically performed with the amplitude modulated continuous wave method. This involves illuminating a scene with amplitude modulated light. Reflected light from the scene is received by the sensor with the range to the scene encoded as a phase delay of the modulation envelope. Due to the cyclic nature of phase, an ambiguity in the measured range occurs every half wavelength in distance, thereby limiting the maximum useable range of the camera.
This paper proposes a procedure to resolve depth ambiguity using software post processing. First, the range data is processed to segment the scene into separate objects. The average intensity of each object can then be used to determine which pixels are beyond the non-ambiguous range. The results demonstrate that depth ambiguity can be resolved for various scenes using only the available depth and intensity information. This proposed method reduces the sensitivity to objects with very high and very low reflectance, normally a key problem with basic threshold approaches.
This approach is very flexible as it can be used with any range imaging camera. Furthermore, capture time is not extended, keeping the artifacts caused by moving objects at a minimum. This makes it suitable for applications such as robot vision where the camera may be moving during captures.
The key limitation of the method is its inability to distinguish between two overlapping objects that are separated by a distance of exactly one non-ambiguous range. Overall, the reliability of this method is higher than the basic threshold approach, but not as high as the multiple frequency method of resolving ambiguity.
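The non-ambiguous range interval referred to above follows directly from the modulation frequency; a small sketch (the 30 MHz frequency and wrap count are assumed examples, not values from the paper):

```python
C = 299_792_458.0  # speed of light, m/s

def ambiguity_distance(f_mod_hz):
    # Range wraps every half modulation wavelength, because the
    # light travels to the scene and back.
    return C / (2.0 * f_mod_hz)

def unwrap_range(measured_m, n_wraps, f_mod_hz):
    # Once post-processing decides an object lies n_wraps intervals
    # beyond the non-ambiguous range, the true distance follows.
    return measured_m + n_wraps * ambiguity_distance(f_mod_hz)

print(ambiguity_distance(30e6))      # ~5.0 m at 30 MHz modulation
print(unwrap_range(2.0, 1, 30e6))    # ~7.0 m for one wrap
```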
Illumination waveform optimization for time-of-flight range imaging cameras
Time-of-flight range imaging sensors acquire an image of a scene, where in addition to standard intensity information, the range (or distance) is also measured concurrently by each pixel. Range is measured using a correlation technique, where an amplitude modulated light source illuminates the scene and the reflected light is sampled by a gain modulated image sensor. Typically the illumination source and image sensor are amplitude modulated with square waves, leading to a range measurement linearity error caused by aliased harmonic components within the correlation waveform. A simple method to improve measurement linearity by reducing the duty cycle of the illumination waveform to suppress problematic aliased harmonic components is demonstrated. If the total optical power is kept constant, the measured correlation waveform amplitude also increases at these reduced illumination duty cycles. Measurement performance is evaluated over a range of illumination duty cycles, both for a standard range imaging camera configuration and for a more complicated phase encoding method that is designed to cancel aliased harmonics during the sampling process. The standard configuration benefits from improved measurement linearity for illumination duty cycles around 30%, while the measured amplitude, and hence range precision, is increased for both methods as the duty cycle is reduced below 50% (while maintaining constant optical power).
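The effect of duty cycle can be illustrated with the Fourier series of a rectangular pulse train: in a four-sample system the 3rd and 5th harmonics alias onto the fundamental, and the 3rd vanishes near one-third duty cycle (a sketch of the underlying arithmetic, not the authors' analysis):

```python
import numpy as np

def harmonic_amplitude(k, duty):
    # Magnitude of the k-th Fourier harmonic of a unit rectangular
    # pulse train with the given duty cycle (relative units).
    return abs(np.sin(np.pi * k * duty)) / (np.pi * k)

# With four samples per correlation period, the 3rd and 5th
# harmonics alias onto the fundamental and distort measured phase.
for duty in (0.50, 1 / 3):
    h3 = harmonic_amplitude(3, duty)
    h5 = harmonic_amplitude(5, duty)
    print(f"duty {duty:.0%}: 3rd harmonic {h3:.4f}, 5th harmonic {h5:.4f}")
# At exactly one-third duty cycle sin(pi * 3 * 1/3) = 0, so the 3rd
# harmonic vanishes, consistent with the linearity improvement
# reported around 30% duty cycle.
```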
A synchronised Direct Digital Synthesiser
We describe a Direct Digital Synthesiser (DDS) which provides three frequency-locked synchronised outputs to generate frequencies from DC to 160 MHz. Primarily designed for use in a heterodyning range imaging system, the flexibility of the design allows its use in a number of other applications that require any number of stable, synchronised high frequency outputs. A 32 bit frequency tuning word provides 0.1 Hz resolution when operating at the maximum clock rate of 400 MSPS, while 14 bit phase tuning provides 0.4 mrad resolution. The DDS technique provides very high relative accuracy between outputs, while the onboard oscillator's stability of ±1 ppm adds absolute accuracy to the design.
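The quoted resolutions follow directly from the tuning-word widths and clock rate given above:

```python
import math

# Values taken from the description above.
f_clock = 400e6     # maximum system clock, 400 MSPS
freq_bits = 32      # frequency tuning word width
phase_bits = 14     # phase tuning word width

freq_resolution = f_clock / 2**freq_bits            # Hz per LSB
phase_resolution = 2 * math.pi / 2**phase_bits      # rad per LSB

print(freq_resolution)   # ~0.093 Hz, quoted as 0.1 Hz
print(phase_resolution)  # ~0.00038 rad, quoted as 0.4 mrad
```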
Separating true range measurements from multi-path and scattering interference in commercial range cameras
Time-of-flight range cameras acquire a three-dimensional image of a scene simultaneously for all pixels from a single viewing location. Attempts to use range cameras for metrology applications have been hampered by the multi-path problem, which causes range distortions when stray light interferes with the range measurement in a given pixel. Correcting multi-path distortions by post-processing the three-dimensional measurement data has been investigated, but enjoys limited success because the interference is highly scene dependent. An alternative approach based on separating the strongest and weaker sources of light returned to each pixel, prior to range decoding, is more successful, but has only been demonstrated on custom-built range cameras, and has not been suitable for general metrology applications. In this paper we demonstrate an algorithm applied to both the Mesa Imaging SR-4000 and Canesta Inc. XZ-422 Demonstrator unmodified off-the-shelf range cameras. Additional raw images are acquired and processed using an optimization approach, rather than relying on the processing provided by the manufacturer, to determine the individual component returns in each pixel. Substantial improvements in accuracy are observed, especially in the darker regions of the scene.
A power-saving modulation technique for time-of-flight range imaging sensors
Time-of-flight range imaging cameras measure distance and intensity simultaneously for every pixel in an image. With the continued advancement of the technology, a wide variety of new depth sensing applications are emerging; however, a number of these potential applications have stringent electrical power constraints that are difficult to meet with the current state-of-the-art systems. Sensor gain modulation contributes a significant proportion of the total image sensor power consumption, and as higher spatial resolution range image sensors operating at higher modulation frequencies (to achieve better measurement precision) are developed, this proportion is likely to increase. The authors have developed a new sensor modulation technique using resonant circuit concepts that is more power efficient than the standard mode of operation. With a proof of principle system, a 93–96% reduction in modulation drive power was demonstrated across a range of modulation frequencies from 1 to 11 MHz. Finally, an evaluation of the range imaging performance revealed an improvement in measurement linearity in the resonant configuration, due primarily to the more sinusoidal shape of the resonant electrical waveforms, while the average precision values were comparable between the standard and resonant operating modes.
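The power saving can be motivated with a back-of-envelope comparison between hard-switched and resonant drive of a capacitive modulation electrode; all component values below are illustrative assumptions, not measurements from the paper:

```python
import math

# Assumed values for illustration only.
C = 1e-9      # assumed sensor modulation capacitance, 1 nF
V = 3.3       # assumed drive amplitude, V
f = 10e6      # modulation frequency within the paper's 1-11 MHz range

# Hard-switched square-wave drive dissipates roughly C*V^2*f in the
# driver, since the charge on the electrode is thrown away each cycle.
p_direct = C * V**2 * f
print(f"direct drive: {p_direct * 1e3:.1f} mW")

# A resonant drive recycles that energy through an inductor tuned so
# that f = 1 / (2*pi*sqrt(L*C)); losses then occur only in the small
# series resistance of the coil and tracks.
L = 1 / ((2 * math.pi * f)**2 * C)
print(f"matching inductor: {L * 1e6:.3f} uH")
```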
Heterodyne range imaging as an alternative to photogrammetry
Solid-state full-field range imaging technology, capable of determining the distance to objects in a scene simultaneously for every pixel in an image, has recently achieved sub-millimeter distance measurement precision. With this level of precision, it is becoming practical to use this technology for high precision three-dimensional metrology applications. Compared to photogrammetry, range imaging has the advantages of requiring only one viewing angle, a relatively short measurement time, and simple, fast data processing. In this paper we first review the range imaging technology, then describe an experiment comparing both photogrammetric and range imaging measurements of a calibration block with attached retro-reflective targets. The results show that the range imaging approach exhibits errors of approximately 0.5 mm in-plane and almost 5 mm out-of-plane; however, these errors appear to be mostly systematic. We then proceed to examine the physical nature and characteristics of the range imaging technology and discuss the possible causes of these systematic errors. Also discussed is the potential for further system characterization and calibration to compensate for the range determination and other errors, which could possibly lead to three-dimensional measurement precision approaching that of photogrammetry.
Characterizing an image intensifier in a full-field range imaging system
We are developing a high precision full-field range imaging system. An integral component in this system is an image intensifier, which is modulated at frequencies up to 100 MHz. The range measurement precision is dictated by the image intensifier performance, in particular, the achievable modulation frequency, modulation depth, and waveform shape. By characterizing the image intensifier response, undesirable effects can be observed and quantified with regard to their consequence on the resulting range measurements, and the optimal operating conditions can be selected to minimize these disturbances. The characterization process utilizes a pulsed laser source to temporally probe the gain of the image intensifier. The laser is pulsed at a repetition rate slightly different to the image intensifier modulation frequency, producing a continuous phase shift between the two signals. A charge coupled device samples the image intensifier output, capturing the response over a complete modulation period. Deficiencies in our measured response are clearly identifiable, and simple modifications to the configuration of our electrical driver circuit improve the modulation performance.
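The heterodyne timing described above can be sketched numerically; the frequency offset and frame count are illustrative assumptions, not the system's actual values:

```python
import math

# Assumed example values for illustration.
f_intensifier = 100e6        # image intensifier modulation, 100 MHz
f_laser = 100e6 + 1.0        # laser repetition rate offset by 1 Hz

# The relative phase between laser pulse and intensifier gain drifts
# at the beat (difference) frequency.
f_beat = abs(f_laser - f_intensifier)

# One full beat cycle sweeps the laser pulse through every phase of
# the intensifier's modulation period; a slow CCD capturing frames
# across that cycle therefore samples the whole gain waveform.
frames_per_cycle = 100                       # assumed CCD frames per beat
dt_phase = 2 * math.pi / frames_per_cycle    # phase step between frames
print(f_beat, dt_phase)
```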
The Waikato range imager
We are developing a high precision, simultaneous full-field acquisition range imager. This device measures range with sub-millimetre precision simultaneously over a full field of view of the scene. Laser diodes illuminate the scene with amplitude modulated light at frequencies from 10 MHz up to 100 MHz. The received light is gated by a high speed shutter operating in a heterodyne configuration, producing a low-frequency signal that is sampled with a digital camera. By detecting the phase of this signal at each pixel, the range to the scene is determined. We show 3D reconstructions of several viewed objects to demonstrate the capabilities of the ranger.
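The phase-to-range conversion underlying this approach accounts for the round-trip travel of the modulated light:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_range(phase_rad, f_mod_hz):
    # The modulation envelope is delayed by the round trip, so
    # range = c * phase / (4 * pi * f_mod).
    return C * phase_rad / (4 * math.pi * f_mod_hz)

# At 100 MHz modulation, a phase shift of pi/2 corresponds to:
print(phase_to_range(math.pi / 2, 100e6))  # ~0.375 m
```

Higher modulation frequencies spread the same distance over more phase, which is why pushing toward 100 MHz improves precision.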
