In this paper we present a new approach to the synthesis of novel views from two images given by an uncalibrated stereo system. Unlike methods based on inferring the 3D structure of the scene or on dense correspondence between the source images, we use the epipolar constraints associated with the two-camera configuration, represented by fundamental matrices, to reproject corresponding features into the image plane of the view to be synthesized. This requires only sparse correspondence between features in the source images. Perspective image warping then renders the remaining dense set of image points via texture mapping. The approach allows interactive view synthesis in applications such as immersive telepresence systems, virtual and augmented reality, and telerobotics. Only an initialization step, consisting of matching features between the source views, is required. The efficiency of the method is illustrated on images of synthetic and real scenes.
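The feature reprojection described above corresponds to classical epipolar transfer: a point in the synthesized view lies at the intersection of the two epipolar lines induced by its matches in the source views. The sketch below is a minimal NumPy illustration under assumed fundamental matrices `F13` and `F23` (mapping points in source views 1 and 2 to epipolar lines in the target view 3), not the paper's implementation:

```python
import numpy as np

def transfer_point(x1, x2, F13, F23):
    """Epipolar transfer of a matched point pair into a third view.

    x1, x2: homogeneous image points (3-vectors) in source views 1 and 2.
    F13, F23: assumed fundamental matrices taking points in views 1 and 2
              to epipolar lines in the synthesized view 3 (in practice
              estimated from the sparse feature matches).
    Returns the transferred homogeneous point in view 3.
    """
    l1 = F13 @ x1          # epipolar line of x1 in the target view
    l2 = F23 @ x2          # epipolar line of x2 in the target view
    x3 = np.cross(l1, l2)  # in homogeneous coordinates, lines meet at their cross product
    return x3 / x3[2]      # normalize so the last coordinate is 1
```

Note that this transfer degenerates when the two epipolar lines coincide (points near the trifocal plane), a standard limitation of purely epipolar methods.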