3,109 research outputs found
A high-resolution full-field range imaging system
There exist a number of applications where the range to all objects in a field of view needs to be obtained. Specific examples include obstacle avoidance for autonomous mobile robots, process automation in assembly factories, surface profiling for shape analysis, and surveying. Ranging systems can typically be characterized as either laser scanning systems, where a laser point is sequentially scanned over a scene, or full-field systems, where the range to every point in the image is obtained simultaneously. The former offer advantages in range resolution, while the latter tend to be faster and involve no moving parts. We present a system for determining the range to any object within a camera's field of view at the speed of a full-field system and with the range resolution of some point laser scanners. Initial results achieve centimeter range resolution for a 10 second acquisition time. Modifications to the existing system are discussed that should provide faster results with submillimeter resolution
Volume measurement using 3D Range Imaging
3D range imaging has widespread applications, one of which is measuring the volumes of objects. In this paper, 3D range imaging is used to determine the volumes of different objects using two algorithms based on a straightforward means of calculating volume. The algorithms implemented successfully calculate volume provided that the objects have uniform colour; objects with multi-coloured or glossy surfaces presented particular difficulties in determining volume
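The abstract does not give the algorithms' details, but the basic idea of computing a volume from a range image can be sketched as summing one rectangular column per pixel between the measured surface and a background reference plane. The pinhole model, focal lengths, and reference plane below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def volume_from_depth(depth, plane_z, fx, fy):
    """Approximate the volume between a reference plane at range plane_z
    and the measured surface, one rectangular column per pixel.
    depth: 2D array of ranges in metres; fx, fy: focal lengths in
    pixels (illustrative pinhole camera model)."""
    # Column height per pixel; pixels behind the plane contribute nothing
    height = np.clip(plane_z - depth, 0.0, None)
    # Under the pinhole model a pixel's footprint grows with distance
    pixel_area = (depth / fx) * (depth / fy)
    return float(np.sum(height * pixel_area))
```

For a flat-topped object this reduces to (number of pixels) × (height) × (per-pixel footprint), which is a useful sanity check on the implementation.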
Closed-form inverses for the mixed pixel/multipath interference problem in AMCW lidar
We present two new closed-form methods for mixed pixel/multipath interference separation in AMCW lidar systems. The mixed pixel/multipath interference problem arises from the violation of a standard range-imaging assumption that each pixel integrates over only a single, discrete backscattering source. While a numerical inversion method has previously been proposed, no closed-form inverses have previously been posited. The first new method models reflectivity as a Cauchy distribution over range and uses four measurements at different modulation frequencies to determine the amplitude, phase and reflectivity distribution of up to two component returns within each pixel. The second new method uses attenuation ratios to determine the amplitude and phase of up to two component returns within each pixel. The methods are tested on both simulated and real data and shown to produce a significant improvement in overall error. While this paper focusses on the AMCW mixed pixel/multipath interference problem, the algorithms contained herein have applicability to the reconstruction of a sparse one dimensional signal from an extremely limited number of discrete samples of its Fourier transform
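Recovering two discrete returns from four Fourier-domain samples is closely related to Prony's annihilating-filter method for sparse signals. A minimal sketch under assumed noise-free measurements at equally spaced modulation frequencies (this is the generic textbook approach, not necessarily the paper's exact parameterisation):

```python
import numpy as np

def two_return_prony(m):
    """m: four complex measurements at modulation frequencies
    f0, 2*f0, 3*f0, 4*f0 (illustrative). Assumes the noise-free model
    m[n] = a1*z1**n + a2*z2**n, where each z encodes a return's phase;
    recovers the complex amplitudes a and phase factors z."""
    m = np.asarray(m, dtype=complex)
    # Annihilating filter: m[n+2] + c1*m[n+1] + c0*m[n] = 0 for n = 0, 1
    A = np.array([[m[1], m[0]], [m[2], m[1]]])
    c = np.linalg.solve(A, -np.array([m[2], m[3]]))
    z = np.roots([1.0, c[0], c[1]])           # phase factors of the returns
    V = np.vander(z, 2, increasing=True).T    # [[1, 1], [z1, z2]]
    a = np.linalg.solve(V, m[:2])             # complex amplitudes
    return a, z
```

With noisy data this 2x2 solve becomes ill-conditioned, which is presumably why purpose-built inverses such as those in the paper are of interest.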
Analysis of ICP variants for the registration of partially overlapping time-of-flight range images
The iterative closest point (ICP) algorithm is one of the most commonly used methods for registering partially overlapping range images. Nevertheless, this algorithm was not originally designed for this task, and many variants have been proposed in an effort to improve its proficiency. The relatively new full-field amplitude-modulated time-of-flight range imaging cameras present further complications to registration in the form of measurement errors due to mixed and scattered light. This paper investigates the effectiveness of the most common ICP variants applied to range image data acquired from full-field range imaging cameras. The original ICP algorithm combined with boundary rejection performed the same as or better than the majority of variants tested. In fact, many of these variants proved to degrade the registration accuracy
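A minimal sketch of one point-to-point ICP iteration with boundary rejection (2D, brute-force matching, Kabsch/SVD alignment). The boundary mask is assumed precomputed, and this is a generic textbook formulation rather than the paper's implementation:

```python
import numpy as np

def icp_step(src, dst, dst_boundary):
    """One point-to-point ICP iteration with boundary rejection:
    correspondences landing on the target's boundary are discarded, a
    common variant for partially overlapping scans.
    src, dst: (N, 2) point arrays; dst_boundary: boolean mask of dst."""
    # Nearest-neighbour correspondences (brute force for clarity)
    d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
    nn = d2.argmin(axis=1)
    keep = ~dst_boundary[nn]                  # boundary rejection
    p, q = src[keep], dst[nn[keep]]
    # Optimal rigid transform via the Kabsch/SVD method
    pc, qc = p - p.mean(0), q - q.mean(0)
    U, _, Vt = np.linalg.svd(pc.T @ qc)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = q.mean(0) - R @ p.mean(0)
    return R, t
```

In a full registration loop this step is iterated, applying the recovered transform to `src` until the alignment error converges.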
Undue influence: Mitigating range-intensity coupling in AMCW ‘flash’ lidar using scene texture
We present a new algorithm for mitigating range-intensity coupling caused by scattered light in full-field amplitude modulated continuous wave lidar systems using scene texture. Full-field lidar uses the time-of-flight principle to measure the range to thousands of points in a scene simultaneously. Mixed pixels are erroneous range measurements caused by pixels integrating light from more than one object at a time. Conventional optics suffer from internal reflections and light scattering, which can result in every pixel being mixed with scattered light, causing erroneous range measurements and range-intensity coupling. By measuring how range changes with intensity over local regions it is possible to determine the phase and intensity of the scattered light without the complex calibration inherent in deconvolution-based restoration. The new method is shown to produce a substantial improvement in range image quality. An additional range-from-texture method is demonstrated which is resistant to scattered light. Variations of the algorithms are tested with and without segmentation: the variant without segmentation is faster, but causes erroneous ranges around the edges of objects which are not present in the segmented algorithm
Image intensifier characterization
An image intensifier forms an integral part of a full-field image range finder under development at the University of Waikato. The intensifier operates as a high-speed shutter with repetition rates up to 100 MHz. A method is described to characterise its response, both temporally and spatially, in order to correct for variations across the field of view and to optimise the operating conditions. A short pulse of visible light is emitted by a laser diode, uniformly illuminating the image intensifier, while a CCD camera captures the output from the intensifier. The phase of the laser pulse is continuously varied using a heterodyne configuration, automatically producing a set of samples covering the modulation cycle. The results show some anomalies in the response of our system, and some simple solutions are proposed to correct for these
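The heterodyne configuration yields, for each pixel, evenly spaced samples over one modulation cycle, from which the intensifier's amplitude and phase response follow from the first DFT bin. A sketch assuming a sinusoidal response (not the Waikato group's actual processing):

```python
import numpy as np

def intensifier_response(frames):
    """Per-pixel modulation amplitude and phase from a stack of CCD
    frames sampled evenly over one beat cycle of the heterodyne setup.
    Assumes a sinusoidal response plus a DC offset.
    frames: array of shape (n_samples, rows, cols)."""
    n = frames.shape[0]
    w = np.exp(-2j * np.pi * np.arange(n) / n)
    bin1 = np.tensordot(w, frames, axes=(0, 0))  # first DFT bin per pixel
    amplitude = 2 * np.abs(bin1) / n             # modulation depth
    phase = np.angle(bin1)                       # temporal response offset
    return amplitude, phase
```

Spatial variations across the field of view then show up directly as structure in the returned amplitude and phase images.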
The synthesis of boron carbide filaments final report
Synthesis, strength, crystal structure, and composite material formation of boron carbide whisker
A synchronised Direct Digital Synthesiser
We describe a Direct Digital Synthesiser (DDS) which provides three frequency-locked, synchronised outputs generating frequencies from DC to 160 MHz. Primarily designed for use in a heterodyning range imaging system, the flexibility of the design allows its use in a number of other applications which require any number of stable, synchronised high frequency outputs. A 32 bit frequency tuning word provides 0.1 Hz resolution when operating at the maximum clock rate of 400 MSPS, while 14 bit phase tuning provides 0.4 mrad resolution. The DDS technique provides very high relative accuracy between outputs, while the onboard oscillator’s stability of ±1 ppm adds absolute accuracy to the design
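The quoted resolutions follow directly from the tuning-word lengths, since a DDS output frequency is f_out = FTW × f_clk / 2^N. A quick check of the arithmetic (sketch only):

```python
import math

# DDS resolution from the abstract's parameters
f_clk = 400e6                            # maximum clock rate, 400 MSPS
freq_resolution = f_clk / 2**32          # 32-bit frequency tuning word
phase_resolution = 2 * math.pi / 2**14   # 14-bit phase tuning word, radians

print(freq_resolution)    # ~0.093 Hz, consistent with the quoted 0.1 Hz
print(phase_resolution)   # ~0.00038 rad, i.e. ~0.4 mrad
```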
A fast Maximum Likelihood method for improving AMCW lidar precision using waveform shape
Amplitude Modulated Continuous Wave imaging lidar systems use the time-of-flight principle to determine the range to objects in a scene. Typical systems use modulated illumination of a scene and a modulated sensor or image intensifier. By changing the relative phase of the two modulation signals it is possible to measure the phase shift induced in the illumination signal, and hence the range to the scene. In practical systems, the resultant correlation waveform contains harmonics that typically result in a non-linear range response. Nevertheless, these harmonics can be used to improve range precision. We model a waveform continuously variable in phase and intensity as a linear interpolation. By casting this as a Maximum Likelihood problem, an analytic solution is derived that enables an entire range image to be processed in a few seconds. A substantial improvement in overall RMS error and precision over the standard Fourier phase analysis approach results
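The harmonic-induced non-linearity the abstract mentions is easy to reproduce numerically: with four samples per cycle, a third harmonic aliases onto the fundamental DFT bin and produces a cyclic phase error. The third-harmonic amplitude `a3` and the pure-cosine model are illustrative assumptions:

```python
import numpy as np

def four_bucket_phase(phi, a3=0.1, n=4):
    """Standard Fourier (four-bucket) phase estimate of a correlation
    waveform containing a third harmonic of relative amplitude a3.
    Demonstrates the cyclic error of the standard approach; the
    waveform model is an assumption of this sketch."""
    k = np.arange(n)
    theta = phi + 2 * np.pi * k / n
    samples = np.cos(theta) + a3 * np.cos(3 * theta)  # correlation samples
    bin1 = np.sum(samples * np.exp(-2j * np.pi * k / n))
    return np.angle(bin1)
```

With `n = 4` the aliased harmonic adds a term proportional to exp(-3j*phi) to the fundamental bin, so the estimate is exact at some phases and biased by up to roughly `a3` radians at others, producing the characteristic cyclic (wiggle) error in range.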
Resolving depth measurement ambiguity with commercially available range imaging cameras
Time-of-flight range imaging is typically performed with the amplitude modulated continuous wave method. This involves illuminating a scene with amplitude modulated light. Reflected light from the scene is received by the sensor with the range to the scene encoded as a phase delay of the modulation envelope. Due to the cyclic nature of phase, an ambiguity in the measured range occurs every half wavelength in distance, thereby limiting the maximum useable range of the camera.
This paper proposes a procedure to resolve depth ambiguity using software post processing. First, the range data is processed to segment the scene into separate objects. The average intensity of each object can then be used to determine which pixels are beyond the non-ambiguous range. The results demonstrate that depth ambiguity can be resolved for various scenes using only the available depth and intensity information. This proposed method reduces the sensitivity to objects with very high and very low reflectance, normally a key problem with basic threshold approaches.
This approach is very flexible as it can be used with any range imaging camera. Furthermore, capture time is not extended, keeping the artifacts caused by moving objects at a minimum. This makes it suitable for applications such as robot vision where the camera may be moving during captures.
The key limitation of the method is its inability to distinguish between two overlapping objects that are separated by a distance of exactly one non-ambiguous range. Overall the reliability of this method is higher than the basic threshold approach, but not as high as the multiple frequency method of resolving ambiguity
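The intensity-based disambiguation idea can be sketched as choosing, per pixel, the wrap count whose implied distance best matches an assumed inverse-square intensity falloff. The falloff model and reference intensity `i0` are illustrative assumptions, and the paper's method averages intensity over segmented objects rather than deciding per pixel:

```python
import numpy as np

C = 299792458.0  # speed of light, m/s

def disambiguate(measured_range, intensity, f_mod, i0):
    """Resolve range ambiguity by testing wrap counts 0..3 against an
    assumed 1/d**2 intensity falloff (i0 = intensity of a reference
    reflector at 1 m; illustrative, not the paper's exact procedure).
    measured_range, intensity: 1D arrays over pixels."""
    d_amb = C / (2 * f_mod)                 # non-ambiguous range interval
    candidates = measured_range + d_amb * np.arange(4)[:, None]
    predicted = i0 / candidates**2          # expected intensity per wrap
    best = np.abs(predicted - intensity).argmin(axis=0)
    return measured_range + best * d_amb
```

As the abstract notes, a falloff model like this cannot separate reflectance variation from distance, which is why segmenting the scene and averaging intensity per object makes the decision more robust.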
