
    Patch based synthesis for single depth image super-resolution

    We present an algorithm to synthetically increase the resolution of a solitary depth image using only a generic database of local patches. Modern range sensors measure depths with non-Gaussian noise and at lower starting resolutions than typical visible-light cameras. While patch based approaches for upsampling intensity images continue to improve, this is the first exploration of patch based upsampling for depth images. We match against the height field of each low resolution input depth patch, and search our database for a list of appropriate high resolution candidate patches. Selecting the right candidate at each location in the depth image is then posed as a Markov random field labeling problem. Our experiments also show how further depth-specific processing, such as noise removal and correct patch normalization, dramatically improves our results. Perhaps surprisingly, even better results are achieved on a variety of real test scenes by providing our algorithm with only synthetic training depth data.
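
    As a rough picture of the matching and labeling pipeline described above, the Python sketch below pairs a nearest-neighbour candidate search with an iterated-conditional-modes pass over the resulting Markov random field. It is a minimal illustration under assumed conventions, not the paper's implementation: the function names, the database layout, the ICM optimizer, and the simplified smoothness term (standing in for an overlap-consistency cost between neighbouring high resolution candidates) are all assumptions made for illustration.

        import numpy as np

        def normalize(patch):
            # Zero-mean normalization; the abstract notes that correct patch
            # normalization matters for depth data.
            return patch - patch.mean()

        def candidate_lists(lr_depth, db_lr, db_hr, patch=3, k=5):
            # For every low resolution location, return the k database patches
            # whose normalized height fields best match the input patch.
            # db_lr: (N, patch*patch) normalized low-res patches,
            # db_hr: (N, s*patch, s*patch) paired high-res patches.
            r = patch // 2
            H, W = lr_depth.shape
            cands = {}
            for y in range(r, H - r):
                for x in range(r, W - r):
                    q = normalize(lr_depth[y - r:y + r + 1, x - r:x + r + 1]).ravel()
                    cost = np.sum((db_lr - q) ** 2, axis=1)   # unary matching cost
                    idx = np.argsort(cost)[:k]
                    cands[(y, x)] = (idx, cost[idx])
            return cands

        def neighbours(p, cands):
            y, x = p
            return [q for q in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)) if q in cands]

        def icm_select(cands, db_hr, iters=3, smooth=1.0):
            # Approximate the MRF labeling with iterated conditional modes: each
            # site picks the candidate minimizing its matching cost plus a crude
            # smoothness penalty against its neighbours' current choices.
            labels = {p: 0 for p in cands}
            for _ in range(iters):
                for p, (idx, unary) in cands.items():
                    costs = unary.astype(float).copy()
                    for q in neighbours(p, cands):
                        j = cands[q][0][labels[q]]
                        costs += smooth * np.abs(db_hr[idx].mean(axis=(1, 2)) - db_hr[j].mean())
                    labels[p] = int(np.argmin(costs))
            return labels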

    Algorithm and design improvements for indirect time of flight range imaging cameras

    This thesis describes the development of a compact and modularised indirect time of flight range imaging camera. These cameras commonly use the Amplitude Modulated Continuous Wave (AMCW) technique. For this technique, an entire scene is illuminated with light modulated at a high frequency. An image sensor is also modulated, and the phase shift introduced between the two modulation signals, due to the transit time of the light reflecting off objects in the scene and returning to the camera, is used to measure the distance. The system constructed for this thesis is controlled by a Cyclone III FPGA and is capable of producing full field of view range images in real time with no additional computational resources. A PMD19K-2 sensor is used as the modulatable image sensor and is capable of modulation frequencies up to 40 MHz.

    One significant issue identified with this range imaging technology is that the precision of the range measurements is often dependent on the properties of the object being measured. The dynamic range of the camera is therefore very important when imaging high contrast scenes. Variable Frame Rate Imaging is a novel technique developed as part of this thesis and is shown to have promise for addressing this issue. Traditional theory for indirect time of flight cameras is expanded to describe this technique and is experimentally verified. A comparison is made between this technique and traditional High Dynamic Range Imaging. Furthermore, this technique is extended to provide a constant precision measurement of a scene, regardless of the properties of the objects in the scene.

    It is shown that replacing the standard phase detection algorithm with a different algorithm can both reduce the linearity error in the phase measurements caused by harmonics in the correlation waveform and ameliorate axial motion error caused by relative motion of the camera and the object being measured. The new algorithm requires only a trivial increase in computational power over the standard algorithm and can be implemented without any significant changes to the standard hardware used in indirect time of flight cameras.

    Finally, the complete system is evaluated in a number of real world scenarios. Applications in both 3D modelling and mobile robotics are demonstrated, and tests are performed for a variety of scenarios, including dynamic scenes using a Pioneer 2 robot.
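
    For orientation, the standard phase detection baseline that the abstract refers to is commonly the four-step correlation estimator, in which phase, amplitude, and offset are recovered from four correlation samples. The Python sketch below illustrates that generic baseline, not the thesis's improved algorithm; the function name, the sign convention inside arctan2, and the sample values are assumptions for illustration.

        import numpy as np

        C0 = 299_792_458.0   # speed of light, m/s

        def amcw_range(a0, a1, a2, a3, f_mod):
            # Four correlation samples taken at 0, 90, 180 and 270 degrees of
            # relative phase between the illumination and sensor modulation.
            # The sign convention inside arctan2 varies between implementations.
            phase = np.arctan2(a3 - a1, a0 - a2) % (2.0 * np.pi)
            amplitude = 0.5 * np.hypot(a3 - a1, a0 - a2)   # modulation amplitude
            offset = 0.25 * (a0 + a1 + a2 + a3)            # background / DC level
            # Light travels to the object and back, hence 4*pi rather than 2*pi.
            distance = C0 * phase / (4.0 * np.pi * f_mod)
            return distance, amplitude, offset

        # At the PMD19K-2's 40 MHz upper modulation frequency, the unambiguous
        # range is c / (2 * f_mod), roughly 3.75 m.
        d, amp, off = amcw_range(1000.0, 400.0, 600.0, 1200.0, f_mod=40e6)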