Parallax-Based View Synthesis From Uncalibrated Images

Abstract

In this paper we present an image-based system for novel view synthesis from multiple model views. Our method segments images of a static scene into background and foreground based on motion parallax. From this segmentation we recover the relative affine structure of the scene. Finally, we synthesize novel views with an original method based on step-wise replication of the epipolar geometry acquired from a few model or "seed" views. The method is uncalibrated, for it does not need the rigid displacements in the (unknown) Euclidean frame, and it is automatic, for it does not require the user to manually specify viewing parameters.
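The relative affine structure mentioned above is commonly used in a plane-plus-parallax representation, where a point transfers between views as x' ~ H x + k e', with H the homography induced by the reference (background) plane, e' the epipole, and k the point's relative affine structure, which stays fixed across views. As an illustration only (the function names and values below are hypothetical, not taken from the paper), a minimal point-transfer sketch under this model is:

```python
# Hypothetical sketch of plane-plus-parallax point transfer:
# x' ~ H x + k e', where k (relative affine structure) is constant
# per scene point across views. Names and values are illustrative.

def mat_vec(H, x):
    """Multiply a 3x3 matrix by a homogeneous 3-vector."""
    return [sum(H[i][j] * x[j] for j in range(3)) for i in range(3)]

def transfer(H, e, x, k):
    """Transfer homogeneous point x into a novel view.
    H: 3x3 homography of the reference (background) plane;
    e: epipole of the novel view; k: relative affine structure
    (zero for points on the reference plane)."""
    Hx = mat_vec(H, x)
    p = [Hx[i] + k * e[i] for i in range(3)]
    return (p[0] / p[2], p[1] / p[2])  # dehomogenize

# Background points (k = 0) map by the homography alone;
# foreground points are displaced along the epipolar direction.
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
epipole = (1.0, 0.0, 1.0)
print(transfer(I3, epipole, (0.0, 0.0, 1.0), 0.0))  # background point
print(transfer(I3, epipole, (0.0, 0.0, 1.0), 1.0))  # parallax-displaced point
```

Synthesizing a new view then amounts to choosing H and e' for the target viewpoint while reusing the per-point k recovered from the seed views.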
