
    Algorithm and design improvements for indirect time of flight range imaging cameras

    This thesis describes the development of a compact and modularised indirect time of flight range imaging camera. These cameras commonly use the Amplitude Modulated Continuous Wave (AMCW) technique, in which an entire scene is illuminated with light modulated at a high frequency. An image sensor is modulated as well, and the phase shift introduced between the two modulation signals, due to the transit time of the light reflecting off objects in the scene and returning to the camera, is used to measure distance. The system constructed for this thesis is controlled by a Cyclone III FPGA and is capable of producing full field of view range images in real time with no additional computational resources. A PMD19K-2 sensor is used as the modulatable image sensor, supporting modulation frequencies up to 40 MHz. One significant issue identified with this range imaging technology is that the precision of the range measurements is often dependent on the properties of the object being measured. The dynamic range of the camera is therefore very important when imaging high contrast scenes. Variable Frame Rate Imaging, a novel technique developed as part of this thesis, is shown to have promise for addressing this issue. Traditional theory for indirect time of flight cameras is expanded to describe this technique and is experimentally verified. A comparison is made between this technique and traditional High Dynamic Range Imaging. Furthermore, the technique is extended to provide a constant-precision measurement of a scene, regardless of the properties of the objects in it. It is shown that replacing the standard phase detection algorithm with a different algorithm can both reduce the linearity error in the phase measurements caused by harmonics in the correlation waveform and ameliorate axial motion error caused by relative motion of the camera and the object being measured. The new algorithm requires only a trivial increase in computational power over the standard algorithm and can be implemented without any significant changes to the standard hardware used in indirect time of flight cameras. Finally, the complete system is evaluated in a number of real world scenarios. Applications in both 3D modelling and mobile robotics are demonstrated, and tests are performed for a variety of scenarios, including dynamic scenes, using a Pioneer 2 robot.
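    The AMCW principle described above maps the measured phase shift to distance via the round-trip travel time of the modulated light. A minimal sketch of that conversion (function and variable names are illustrative, not taken from the thesis):

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def amcw_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Convert a measured phase shift to object distance for an AMCW camera.

    The light travels to the object and back, so the round trip covers twice
    the object distance: d = c * phi / (4 * pi * f).
    """
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# Example: a pi/2 phase shift at 40 MHz (the PMD19K-2 upper limit noted above)
d = amcw_distance(math.pi / 2, 40e6)  # roughly 0.94 m
```

    Note that at 40 MHz a full 2π of phase corresponds to c/(2f) ≈ 3.75 m, the unambiguous range of a single-frequency measurement.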

    Design and construction of a configurable full-field range imaging system for mobile robotic applications

    Mobile robotic devices rely critically on extrospective sensors to determine the range to objects in the robot’s operating environment. This provides the robot with the ability both to navigate safely around obstacles and to map its environment, and hence facilitates path planning and navigation. There is a requirement for a full-field range imaging system that can determine the range to any obstacle in a camera lens’ field of view accurately and in real time. This paper details the development of a portable full-field ranging system whose bench-top version has demonstrated sub-millimetre precision. However, that precision required non-real-time acquisition rates and expensive hardware. By iterative replacement of components, a portable, modular and inexpensive version of this full-field ranger has been constructed, capable of real-time operation with a (user-defined) trade-off in precision.

    Development of a real-time full-field range imaging system

    This article describes the development of a full-field range imaging system employing a high frequency amplitude modulated light source and image sensor. Depth images are produced at video frame rates, in which each pixel in the image represents the distance from the sensor to objects in the scene. The various hardware subsystems are described, as are the details of the firmware and software implementation for processing the images in real time. The system is flexible in that precision can be traded off for decreased acquisition time. Results are reported to illustrate this versatility for both high-speed (reduced precision) and high-precision operating modes.
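    The phase measurement underlying such systems is commonly the four-bucket correlation method, in which the correlation waveform is sampled at four relative modulation phases. A minimal sketch of that standard demodulation (a hypothetical helper, not the system's actual firmware):

```python
import math

def four_phase_demodulate(s0, s1, s2, s3):
    """Four-bucket AMCW demodulation from correlation samples taken at
    0, 90, 180 and 270 degrees of relative modulation phase.

    Returns (phase, amplitude, offset) of the correlation waveform.
    """
    phase = math.atan2(s3 - s1, s0 - s2) % (2 * math.pi)
    amplitude = 0.5 * math.hypot(s3 - s1, s0 - s2)
    offset = 0.25 * (s0 + s1 + s2 + s3)
    return phase, amplitude, offset
```

    Precision can be traded for speed here by changing how long each of the four samples is integrated: longer integration lowers noise in `phase` at the cost of acquisition time.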

    Multiple frequency range imaging to remove measurement ambiguity

    Range imaging systems use a specialised sensor to capture an image where object distance (range) is measured for every pixel using time-of-flight. The scene is illuminated with an amplitude modulated light source, and the phase of the modulation envelope of the reflected light is measured to determine flight time, and hence object distance, for each pixel. As the modulation waveform is cyclic, an ambiguity problem exists if the phase shift exceeds 2π radians. To overcome this problem we demonstrate a method that superposes two different modulation frequencies within a single capture. This technique reduces the associated overhead compared with performing two sequential measurements, allowing the system to retain high range measurement precision at rapid acquisition rates. A method is also provided to avoid interference from aliased harmonics during sampling, which would otherwise contaminate the resulting range measurement. Experimental results show the potential of the multiple frequency approach, producing high measurement precision while avoiding ambiguity. The results also demonstrate the limitation of this technique, where large errors can be introduced through a combination of a low signal-to-noise ratio and suboptimal selection of system parameters.
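    The ambiguity resolution described above can be illustrated by combining wrapped phases from two modulation frequencies. The sketch below uses a brute-force search over wrap counts and assumes two sequential (not superposed) captures, so it shows the general idea rather than the paper's single-capture method; all names are illustrative:

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def unwrap_two_freq(phi1, phi2, f1, f2, d_max):
    """Resolve the 2*pi range ambiguity from wrapped phases phi1, phi2
    measured at modulation frequencies f1, f2, assuming the object lies
    within d_max metres. Searches all wrap-count pairs and keeps the most
    consistent distance estimate."""
    best, best_err = None, float("inf")
    n1_max = int(d_max * 2 * f1 / C) + 1
    n2_max = int(d_max * 2 * f2 / C) + 1
    for n1 in range(n1_max + 1):
        d1 = C * (phi1 + 2 * math.pi * n1) / (4 * math.pi * f1)
        for n2 in range(n2_max + 1):
            d2 = C * (phi2 + 2 * math.pi * n2) / (4 * math.pi * f2)
            if abs(d1 - d2) < best_err:
                best, best_err = 0.5 * (d1 + d2), abs(d1 - d2)
    return best
```

    With 40 MHz and 31 MHz, for example, the pair's combined unambiguous range extends to c/(2 · gcd) ≈ 150 m, far beyond either frequency alone; the paper's observed failure mode corresponds to noise making a wrong wrap-count pair look more consistent than the true one.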

    Highly precise AMCW time-of-flight scanning sensor based on digital-parallel demodulation

    In this paper, a novel amplitude-modulated continuous wave (AMCW) time-of-flight (ToF) scanning sensor based on digital-parallel demodulation is proposed, and its distance measurement precision is demonstrated. Since digital-parallel demodulation utilizes a high-amplitude demodulation signal with zero offset, the proposed sensor platform can maintain extremely high demodulation contrast. Meanwhile, as all cross-correlated samples are calculated in parallel and within an extremely short integration time, the proposed sensor platform can utilize a 2D laser scanning structure with a single photodetector while maintaining a moderate frame rate. This optical structure increases the received optical SNR and removes the crosstalk of an image pixel array. Based on these measurement properties, the proposed AMCW ToF scanning sensor shows highly precise 3D depth measurement performance. In this study, this measurement performance is explained in detail, and the actual performance of the proposed sensor platform is experimentally validated under various conditions.

    In Vivo Time-Resolved Microtomography Reveals the Mechanics of the Blowfly Flight Motor

    Dipteran flies are amongst the smallest and most agile of flying animals. Their wings are driven indirectly by large power muscles, which cause cyclical deformations of the thorax that are amplified through the intricate wing hinge. Asymmetric flight manoeuvres are controlled by 13 pairs of steering muscles acting directly on the wing articulations. Collectively the steering muscles account for <3% of total flight muscle mass, raising the question of how they can modulate the vastly greater output of the power muscles during manoeuvres. Here we present the results of a synchrotron-based study performing micrometre-resolution, time-resolved microtomography on the 145 Hz wingbeat of blowflies. These data represent the first four-dimensional visualizations of an organism's internal movements on sub-millisecond and micrometre scales. This technique allows us to visualize and measure the three-dimensional movements of five of the largest steering muscles, and to place these in the context of the deforming thoracic mechanism that the muscles actuate. Our visualizations show that the steering muscles operate through a diverse range of nonlinear mechanisms, revealing several unexpected features that could not have been identified using any other technique. The tendons of some steering muscles buckle on every wingbeat to accommodate high-amplitude movements of the wing hinge. Other steering muscles absorb kinetic energy from an oscillating control linkage, which rotates at low wingbeat amplitude but translates at high wingbeat amplitude. Kinetic energy is distributed differently in these two modes of oscillation, which may play a role in asymmetric power management during flight control. Structural flexibility is known to be important to the aerodynamic efficiency of insect wings, and to the function of their indirect power muscles. We show that it is also integral to the operation of the steering muscles, and so to the functional flexibility of the insect flight motor.

    Computational periscopy with an ordinary digital camera

    Computing the amounts of light arriving from different directions enables a diffusely reflecting surface to play the part of a mirror in a periscope, that is, to perform non-line-of-sight imaging around an obstruction. Because computational periscopy has so far depended on light-travel distances being proportional to the times of flight, it has mostly been performed with expensive, specialized ultrafast optical systems [1-12]. Here we introduce a two-dimensional computational periscopy technique that requires only a single photograph captured with an ordinary digital camera. Our technique recovers the position of an opaque object and the scene behind (but not completely obscured by) the object, when both the object and scene are outside the line of sight of the camera, without requiring controlled or time-varying illumination. Such recovery is based on the visible penumbra of the opaque object having a linear dependence on the hidden scene that can be modelled through ray optics. Non-line-of-sight imaging using inexpensive, ubiquitous equipment may have considerable value in monitoring hazardous environments, navigation and detecting hidden adversaries.
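    The stated linear dependence of the penumbra on the hidden scene means recovery reduces to a linear inverse problem: the photograph is (approximately) a transport matrix applied to the scene. A toy sketch with a synthetic transport matrix, where every name, dimension and regularization weight is an illustrative assumption rather than the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical light-transport matrix mapping 50 hidden-scene elements to
# 200 penumbra pixels; in the paper this would come from a ray-optics model
# of the occluder geometry, not random numbers.
A = rng.random((200, 50))
x_true = rng.random(50)   # hidden scene radiance (ground truth for the demo)
y = A @ x_true            # observed penumbra photograph (noiseless here)

# Ridge-regularized least-squares inversion; lam would be tuned to the
# camera's noise level in a real system.
lam = 1e-6
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(50), A.T @ y)
```

    With noiseless data the regularized solve recovers the scene almost exactly; with real photographs, the conditioning of the transport matrix and the choice of regularizer govern how much of the hidden scene survives inversion.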