    Real-time Backward Disparity-based Rendering for Dynamic Scenes using Programmable Graphics Hardware

Figure 1: Novel views synthesized using different approaches. Since the scene is sparsely sampled using 16 images, the color interpolation-based approach keeps only part of the scene (the background wall) in focus, and the rest is blurry. With additional estimated disparity information, the forward warping-based approach gives sharper results, but artifacts appear due to errors in the disparity maps. Using the same set of disparity maps, our approach yields better results.

This paper presents a backward disparity-based rendering algorithm that runs at real-time speed on programmable graphics hardware. The algorithm requires only a handful of image samples of the scene and estimated, noisy disparity maps, whereas most existing techniques need either dense samples or accurate depth information. To color a given pixel in the novel view, a backward search is conducted to find the corresponding pixels in the closest four reference images. This backward search makes the algorithm more robust to errors in the estimated disparity maps than existing forward warping-based approaches. In addition, since the computations for different pixels are independent, they can be performed in parallel on the Graphics Processing Units (GPUs) of modern graphics hardware. Experimental results demonstrate that our algorithm can synthesize accurate novel views of dynamic real scenes at a high frame rate.
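To make the backward-search idea concrete, the sketch below shows, in Python/NumPy, how the color of a single novel-view pixel might be recovered by sweeping candidate disparities and testing them against the reference views' own disparity maps. This is an illustrative CPU-side sketch only: it assumes rectified, horizontally aligned cameras and a simple disparity-consistency test, and all function and parameter names (e.g. `backward_search_pixel`, `ref_offsets`, `tol`) are hypothetical rather than taken from the paper, whose actual GPU implementation and camera model may differ.

```python
import numpy as np

def backward_search_pixel(x, y, ref_images, ref_disparities, ref_offsets,
                          d_min=0.0, d_max=64.0, d_step=0.5, tol=1.0):
    """Estimate the color of novel-view pixel (x, y) by a backward disparity search.

    ref_images      : list of HxWx3 float arrays (reference colors)
    ref_disparities : list of HxW float arrays (estimated, possibly noisy disparities)
    ref_offsets     : horizontal offset of each reference camera from the novel
                      viewpoint, in units such that the pixel shift is d * offset
    (All names and the rectified-camera setup are illustrative assumptions.)
    """
    h, w, _ = ref_images[0].shape

    # Sweep candidate disparities front to back (large disparity = near surface).
    for d in np.arange(d_max, d_min, -d_step):
        colors = []
        for img, disp, off in zip(ref_images, ref_disparities, ref_offsets):
            # Project the candidate (x, y, d) into this reference view.
            xr = int(round(x + d * off))
            if 0 <= xr < w:
                # Consistency check: the reference view's own disparity estimate
                # at the projected pixel should agree with the candidate d.
                if abs(disp[y, xr] - d) < tol:
                    colors.append(img[y, xr])
        if colors:
            # Nearest consistent surface found: blend the contributions from
            # the (up to four) reference views that agree.
            return np.mean(colors, axis=0)

    # No consistent correspondence found; fall back to the closest reference view.
    return ref_images[0][y, min(max(int(round(x)), 0), w - 1)]
```

Because each novel-view pixel is processed independently of all others, a loop over pixels calling a routine like this maps naturally onto per-fragment programs on the GPU, which is the parallelism the abstract refers to.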