Spacetime Tetrahedra: Image-Based Viewpoint Navigation through Space and Time

Abstract

We present a purely image-based rendering system to viewpoint-navigate through space and time of arbitrary dynamic scenes. Unlike previous methods, our approach does not rely on synchronized and calibrated multi-video footage as input. Instead of estimating scene depth or reconstructing 3D geometry, our approach is based on dense image correspondences, treating view interpolation equally in space and time. In a nutshell, we tetrahedrally partition the volume spanned by camera directions and time, determine the warp field along each tetrahedral edge, and warp-blend-interpolate any viewpoint inside a tetrahedron from the four video frames representing its vertices. Besides fast and easy acquisition that makes outdoor recordings feasible, our space-time symmetric approach allows for smooth interpolation of view perspective and time, i.e., for simultaneous free-viewpoint and slow-motion rendering.
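The interpolation step described above can be illustrated with a minimal sketch: a query point in the three-dimensional (camera-direction, time) parameter volume is expressed in barycentric coordinates with respect to its enclosing tetrahedron, and those weights blend the four pre-warped vertex frames. The function names, the choice of NumPy, and the assumption that the frames have already been warped into the target view are illustrative, not part of the paper's actual implementation.

```python
import numpy as np

def barycentric_weights(verts, p):
    """Barycentric coordinates of a query point p inside a tetrahedron.

    verts: (4, 3) array of tetrahedron vertices in the parameter volume
           spanned by camera direction (2D) and time (1D).
    p:     (3,) query point, e.g. a desired viewpoint at a desired time.
    """
    # Solve T * w = p - v3 for the first three weights; the fourth
    # follows from the partition-of-unity constraint sum(w) == 1.
    T = np.column_stack([verts[i] - verts[3] for i in range(3)])
    w = np.linalg.solve(T, p - verts[3])
    return np.append(w, 1.0 - w.sum())

def blend_frames(warped_frames, weights):
    """Weighted blend of the four video frames at the tetrahedron's
    vertices, assumed already warped toward the query viewpoint."""
    return sum(w * f for w, f in zip(weights, warped_frames))
```

For a query point at the centroid of the tetrahedron, all four weights equal 0.25, so the output is the uniform average of the four warped frames; at a vertex, the corresponding frame is reproduced exactly.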
