
    Design and construction of a configurable full-field range imaging system for mobile robotic applications

    Mobile robotic devices rely critically on extrospective sensors to determine the range to objects in the robot’s operating environment. This gives the robot the ability both to navigate safely around obstacles and to map its environment, facilitating path planning and navigation. There is a requirement for a full-field range imaging system that can determine the range to any obstacle in a camera lens’s field of view accurately and in real time. This paper details the development of a portable full-field ranging system whose bench-top version has demonstrated sub-millimetre precision; however, that precision required non-real-time acquisition rates and expensive hardware. By iterative replacement of components, a portable, modular and inexpensive version of this full-field ranger has been constructed, capable of real-time operation with a user-defined trade-off in precision.

    Development of a Full-Field Time-of-Flight Range Imaging System

    A full-field, time-of-flight image ranging system, or 3D camera, has been developed from proof-of-principle to a working prototype stage, capable of determining the intensity and range for every pixel in a scene. The system can be adapted to the requirements of various applications, producing high-precision range measurements with sub-millimetre resolution, or high-speed measurements at video frame rates. Parallel data acquisition at each pixel provides high spatial resolution independent of the operating speed. The range imaging system uses a heterodyne technique to indirectly measure time of flight. Laser diodes with highly diverging beams are intensity modulated at radio frequencies and used to illuminate the scene. Reflected light is focused onto an image intensifier used as a high-speed optical shutter, which is modulated at a slightly different frequency from that of the laser source. The output from the shutter is a low-frequency beat signal, which is sampled by a digital video camera. Optical propagation delay is encoded in the phase of the beat signal; hence, from a captured time-variant intensity sequence, the beat-signal phase can be measured to determine the range for every pixel in the scene. A direct digital synthesiser (DDS) is designed and constructed, capable of generating up to three outputs at frequencies beyond 100 MHz with the relative frequency stability, in excess of nine orders of magnitude, required to control the laser and shutter modulation. Driver circuits were also designed to modulate the image intensifier photocathode at 50 Vpp and four laser diodes with a combined power output of 320 mW, both over a frequency range of 10–100 MHz. The DDS, laser, and image intensifier responses are characterised. A unique method of measuring the image intensifier optical modulation response is developed, requiring the construction of a picosecond pulsed laser source. This characterisation revealed deficiencies in the measured responses, which were mitigated through hardware modifications where possible. The effects of the remaining imperfections, such as modulation waveform harmonics and image intensifier irising, can be calibrated and removed from the range measurements during software processing using the characterisation data. Finally, a digital method of generating the high-frequency modulation signals using an FPGA to replace the analogue DDS is developed, providing a highly integrated solution, reducing complexity and enhancing flexibility. In addition, a novel modulation coding technique is developed to remove the undesirable influence of waveform harmonics from the range measurement without extending the acquisition time. When combined with a proposed modification to the laser illumination source, the digital system can enhance range measurement precision and linearity. From this work, a flexible full-field image ranging system is successfully realised. The system is demonstrated operating in a high-precision mode with sub-millimetre depth resolution, and also in a high-speed mode operating at video update rates (25 fps), in both cases providing high (512 × 512) spatial resolution over distances of several metres.
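
    To make the heterodyne principle concrete, the sketch below recovers per-pixel range from a stack of frames sampled over one beat period, by correlating each pixel's intensity sequence with one cycle of a complex exponential and converting the resulting beat-signal phase to distance via d = c·φ/(4π·f_mod). This is a minimal illustration only, not the thesis' processing code; the frame count, modulation frequency and NumPy-based implementation are assumptions.

        # Minimal sketch (assumptions noted above), not the system's actual software.
        import numpy as np

        C = 299_792_458.0  # speed of light, m/s

        def range_from_beat_frames(frames, f_mod):
            """frames: (N, H, W) intensity samples covering exactly one beat period."""
            n = frames.shape[0]
            t = np.arange(n)
            basis = np.exp(-2j * np.pi * t / n)         # one cycle of the beat frequency
            bin1 = np.tensordot(basis, frames, axes=(0, 0))
            phase = np.mod(np.angle(bin1), 2 * np.pi)   # beat-signal phase, 0..2*pi
            amplitude = 2.0 * np.abs(bin1) / n          # modulation amplitude per pixel
            distance = C * phase / (4 * np.pi * f_mod)  # metres, within the ambiguity range
            return distance, amplitude

        # Example: 40 frames per beat cycle at an assumed 80 MHz laser modulation.
        # distance, amplitude = range_from_beat_frames(frames, 80e6)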

    A power-saving modulation technique for time-of-flight range imaging sensors

    Time-of-flight range imaging cameras measure distance and intensity simultaneously for every pixel in an image. With the continued advancement of the technology, a wide variety of new depth sensing applications are emerging; however, a number of these potential applications have stringent electrical power constraints that are difficult to meet with current state-of-the-art systems. Sensor gain modulation contributes a significant proportion of the total image sensor power consumption, and as higher spatial resolution range image sensors operating at higher modulation frequencies (to achieve better measurement precision) are developed, this proportion is likely to increase. The authors have developed a new sensor modulation technique using resonant circuit concepts that is more power efficient than the standard mode of operation. With a proof-of-principle system, a 93–96% reduction in modulation drive power was demonstrated across a range of modulation frequencies from 1 to 11 MHz. Finally, an evaluation of the range imaging performance revealed an improvement in measurement linearity in the resonant configuration, due primarily to the more sinusoidal shape of the resonant electrical waveforms, while the average precision values were comparable between the standard and resonant operating modes.
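
    To illustrate the resonant-drive idea, the fragment below applies the standard LC resonance relation f = 1/(2π√(LC)) to choose an inductance that resonates with an assumed effective sensor capacitance at the desired modulation frequency, so that the reactive drive energy circulates in the tank rather than being dissipated each cycle. The capacitance and frequency values are hypothetical; the paper's actual circuit topology and component values are not reproduced here.

        # Illustrative only; component values are assumptions, not from the paper.
        import math

        def resonant_inductance(f_mod_hz, c_load_f):
            """Inductance (H) that resonates with capacitance c_load_f at f_mod_hz."""
            return 1.0 / ((2.0 * math.pi * f_mod_hz) ** 2 * c_load_f)

        # e.g. 5 MHz modulation into an assumed 1 nF effective sensor load:
        L = resonant_inductance(5e6, 1e-9)
        print(f"{L * 1e6:.2f} uH")  # ~1.01 uH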

    Efficient and Fast Implementation of Embedded Time-of-Flight Ranging System Based on FPGAs

    FPGA Based Pattern Generation and Synchronization for High Speed Structured Light 3D Camera

    Recently, structured light 3D imaging devices have gained keen attention due to their potential applications in robotics, industrial manufacturing and medical imaging. Most of these applications require high 3D precision as well as high-speed image capture for hard and/or soft real-time environments. This paper presents a method of high-speed image capture for structured light 3D imaging sensors, using FPGA-based structured light pattern generation and projector–camera synchronization. The suggested setup reduces the time for pattern projection and camera triggering from the 100 ms required by conventional methods to 16 ms.
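
    The reported timing benefit can be illustrated with a simple budget: with FPGA-generated patterns and hardware projector–camera triggering, the per-pattern overhead collapses to roughly the exposure time, whereas software-paced projection adds per-frame latency. The pattern count and per-frame figures in the sketch below are assumptions chosen only for illustration; the paper's actual breakdown is not given here.

        # Back-of-the-envelope timing sketch; all numbers are assumed, not measured.
        def capture_time_ms(n_patterns, exposure_ms, per_frame_overhead_ms):
            # Total sequence time = patterns x (exposure + trigger/latency overhead per frame).
            return n_patterns * (exposure_ms + per_frame_overhead_ms)

        n_patterns = 8                                  # assumed pattern count
        print(capture_time_ms(n_patterns, 1.0, 1.0))    # tight hardware sync   -> 16.0 ms
        print(capture_time_ms(n_patterns, 1.0, 11.5))   # software-paced frames -> 100.0 ms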

    Doppler Lidar Sensor for Precision Landing on the Moon and Mars

    Landing mission concepts being developed for the exploration of planetary bodies are increasingly ambitious in their implementations and objectives. Most of these missions require accurate position and velocity data during their descent phase in order to ensure a safe soft landing at the pre-designated sites. To address this need, a Doppler lidar is being developed by NASA under the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This lidar sensor is a versatile instrument capable of providing precision velocity vectors, vehicle ground-relative altitude, and attitude. The capabilities of this advanced technology have been demonstrated through two helicopter flight test campaigns conducted over vegetation-free terrain in 2008 and 2010. Presently, a prototype version of this sensor is being assembled for integration into a rocket-powered terrestrial free-flyer vehicle. The sensor, operating in a closed loop with the vehicle's guidance and navigation system, will demonstrate its viability for future landing missions through a series of flight tests in 2012.

    Automotive Three-Dimensional Vision Through a Single-Photon Counting SPAD Camera

    We present an optical 3-D ranging camera for automotive applications that is able to provide centimeter depth resolution over a 40° × 20° field of view, up to 45 m, with just 1.5 W of active illumination at 808 nm. The enabling technology we developed is based on a CMOS imager chip of 64 × 32 pixels, each with a single-photon avalanche diode (SPAD) and three 9-bit digital counters, able to perform lock-in time-of-flight calculation of individual photons emitted by a laser illuminator, reflected by the objects in the scene, and eventually detected by the camera. Due to the SPAD single-photon sensitivity and the smart in-pixel processing, the camera provides state-of-the-art performance at both high frame rates and very low light levels, without the need for scanning and with global-shutter benefits. Furthermore, the CMOS process is automotive certified.
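
    The three in-pixel counters lend themselves to a lock-in phase estimate. The sketch below shows one common way three photon-count bins, spaced 120° apart over the modulation period, can be converted to a phase and hence a distance; it is a generic three-bin formula given for illustration, is not claimed to be the chip's exact processing, and uses assumed example counts and frequency.

        # Generic three-bin lock-in estimate (illustrative; not the chip's documented scheme).
        import math

        C = 299_792_458.0  # m/s

        def three_bin_range(c0, c1, c2, f_mod):
            """c0, c1, c2: photon counts in bins offset by 0, 120 and 240 degrees."""
            phase = math.atan2(math.sqrt(3.0) * (c1 - c2), 2.0 * c0 - c1 - c2) % (2.0 * math.pi)
            return C * phase / (4.0 * math.pi * f_mod)  # metres, modulo c / (2 * f_mod)

        # e.g. hypothetical counts (400, 260, 150) at 10 MHz modulation:
        print(round(three_bin_range(400, 260, 150, 10e6), 3))  # ~1.08 m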

    Range-resolved optical interferometric signal processing

    The ability to identify the range of an interferometric signal is very useful in interferometry, allowing the suppression of parasitic signal components or permitting several signal sources to be multiplexed. Two novel range-resolved optical interferometric signal processing techniques, employing very different working principles, are theoretically described and experimentally demonstrated in this thesis. The first technique is based on code-division multiplexing (CDM), which is combined with single-sideband signal processing, resulting in a technique that, unlike prior work, only uses a single, regular electro-optic phase modulator to perform both range-based signal identification and interferometric phase evaluation. The second approach uses sinusoidal optical frequency modulation (SFM), induced by injection current modulation of a diode laser, to introduce range-dependent carriers to determine phase signals in interferometers of non-zero optical path difference. Here, a key innovation is the application of a smooth window function, which, when used together with a time-variant demodulation approach, allows optical path lengths of constituent interferometers to be continuously and independently variable, subject to a minimum separation, greatly increasing the practicality of the approach. Both techniques are applied to fibre segment interferometry, where fibre segments that act as long-gauge length interferometric sensors are formed between pairs of partial in-fibre reflectors. Using a regular single-mode laser diode, six fibre segments of length 12.5 cm are multiplexed with a quadrature bandwidth of 43 kHz and a phase noise floor of 0.19 mrad/√Hz using the SFM technique. In contrast, the 16.5 m spatial resolution achieved with the CDM technique points towards its applicability in medium-to-long range sensing. The SFM technique also allows high linearity, with cyclic errors as low as 1 mrad demonstrated, and with modelling indicating further room for improvement. Additionally, in an industrial measurement, the SFM technique is applied to single-beam, multi-surface vibrometry, allowing simultaneous differential measurements between two vibrating surfaces.
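
    For a sense of scale, the quoted phase-noise floor can be converted to an equivalent optical-path-length noise for a reflectively interrogated fibre segment using Δφ = 4πnΔL/λ. The wavelength and fibre refractive index in the sketch below are assumed values for illustration only; they are not stated in the abstract.

        # Conversion sketch; wavelength and refractive index are assumptions.
        import math

        def path_noise(phase_noise_rad_rtHz, wavelength_m, n_fibre):
            """Equivalent path-length noise (m/sqrt(Hz)) for a double-pass fibre segment."""
            return phase_noise_rad_rtHz * wavelength_m / (4.0 * math.pi * n_fibre)

        # 0.19 mrad/sqrt(Hz) with an assumed 1550 nm source and n ~ 1.45:
        print(path_noise(0.19e-3, 1550e-9, 1.45))  # ~1.6e-14 m/sqrt(Hz), i.e. tens of fm/sqrt(Hz)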

    Development of a Compact, Configurable, Real-Time Range Imaging System

    This thesis documents the development of a time-of-flight (ToF) camera suitable for autonomous mobile robotics applications. By measuring the round-trip time of emitted light to and from objects in the scene, the system is capable of simultaneous full-field range imaging. This is achieved by projecting amplitude modulated continuous wave (AMCW) light onto the scene, and recording the reflection using an image sensor array with a high-speed shutter amplitude modulated at the same frequency (of the order of tens of MHz). The effect is to encode the phase delay of the reflected light as a change in pixel intensity, which is then interpreted as distance. A full-field range imaging system has been constructed based on the PMD Technologies PMD19k image sensor, where the high-speed shuttering mechanism is built into the integrated circuit. This produces a system that is considerably more compact and power efficient than previous iterations that employed an image intensifier to provide sensor modulation. The new system has comparable performance to commercially available systems in terms of distance measurement precision and accuracy, but is much more flexible with regard to its operating parameters. All of the operating parameters, including the image integration time, sensor modulation phase offset and modulation frequency, can be changed in real time, either manually or automatically, through software. This highly configurable system serves as an excellent platform for research into novel range imaging techniques. One promising technique is the use of measurements at multiple modulation frequencies in order to maximise precision over an extended operating range. Each measurement gives an independent estimate of the distance, with a maximum unambiguous range that depends on the modulation frequency. These estimates are combined to give a measurement with an extended maximum range using a novel algorithm based on the New Chinese Remainder Theorem. A theoretical model for the measurement precision and accuracy of the new algorithm is presented and verified with experimental results. All distance image processing is performed on a per-pixel basis in real time using a Field Programmable Gate Array (FPGA). An efficient hardware implementation of the phase determination algorithm for calculating distance is investigated. The limiting resource for such an implementation is random access memory (RAM), and a detailed analysis of the trade-off between this resource and measurement precision is also presented.
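
    The multiple-modulation-frequency idea can be illustrated with a generic two-frequency disambiguation sketch: each frequency yields a distance known only modulo its ambiguity interval c/(2f), and searching for a consistent pair of wrap counts extends the unambiguous range towards c/(2·gcd(f1, f2)). The brute-force search below stands in for, and is not, the thesis' algorithm based on the New Chinese Remainder Theorem; the example frequencies are assumptions.

        # Generic two-frequency disambiguation (illustrative; not the thesis' New CRT algorithm).
        import math

        C = 299_792_458.0  # m/s

        def disambiguate(phi1, phi2, f1, f2, max_range):
            """phi1, phi2: measured phases (rad) at modulation frequencies f1, f2 (Hz)."""
            amb1, amb2 = C / (2.0 * f1), C / (2.0 * f2)      # single-frequency ambiguity intervals
            d1, d2 = phi1 / (2.0 * math.pi) * amb1, phi2 / (2.0 * math.pi) * amb2
            best, best_err = None, float("inf")
            for k1 in range(int(max_range / amb1) + 1):      # candidate wrap counts for f1
                for k2 in range(int(max_range / amb2) + 1):  # candidate wrap counts for f2
                    err = abs((d1 + k1 * amb1) - (d2 + k2 * amb2))
                    if err < best_err:
                        best_err = err
                        best = 0.5 * ((d1 + k1 * amb1) + (d2 + k2 * amb2))
            return best

        # e.g. 30 MHz and 25 MHz modulation (about 5 m and 6 m ambiguity, ~30 m combined):
        # d = disambiguate(phi1, phi2, 30e6, 25e6, max_range=30.0)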