Image-Based Remodeling: A Framework for Creating, Visualizing, and Editing Image-Based Models

Abstract

Thesis (Ph.D.)--University of Washington, 2014

Image-based models are geometric models created from photographs and textured with those photographs for realistic rendering. In recent years, it has become increasingly easy to capture many photographs of an object and use computer vision techniques to model the object with image-based geometry. However, it can be difficult to display and interact with these models in a manner that reproduces the visual fidelity of the original photographs, and it is more difficult still to support the interactive manipulation and editing expected of other types of geometric models. The overarching goal of this thesis is to create a framework for building, visualizing, and editing image-based models. Toward this goal, I first review the related literature and outline the research challenges associated with the geometric representation, navigation, creation, and editing of image-based geometry.

One challenge in visualizing image-based geometry is discovering camera viewpoints and navigation paths that are both visually pleasing and support storytelling. To address this challenge, I describe a GPU-accelerated, image-based rendering algorithm that enables the creation of such visualizations in the form of camera paths and cinematic effects commonly used by cinematographers. To avoid objectionable rendering artifacts such as occlusion holes, the fast rendering algorithm is used to quickly sample the parameter space of camera viewpoints and define a viable region of camera parameters. This region then constrains an optimization for a camera path that maximizes parallax and conforms to cinematic conventions. The rendering algorithm also allows users to create more complex camera paths interactively while experimenting with effects such as focal length, depth of field, and selective, depth-based desaturation or brightening.

A greater challenge is editing image-based geometry. To move image-based models beyond the limitations of view-only geometry to the broader class of editable geometry, I describe a new approach for creating and visualizing editable image-based architectural models in a photorealistic manner. Using this approach, I present an interactive system that enables modeling and remodeling of image-based geometry in the context of home interior architecture. The system supports the creation of concise, parameterized, and constrained geometry, as well as remodeling directly within the photographs. Real-time texturing of modified geometry is made possible by precomputing view-dependent textures for all faces that are potentially visible to each original camera viewpoint, blending multiple viewpoints and filling holes where necessary. The resulting textures are stored and accessed efficiently, enabling intuitive, real-time, realistic visualization, modeling, and editing of the building interior.

Finally, I demonstrate how the image-based remodeling system enables lighting effects. Using a combination of the textures created in the image-based remodeling system and radiosity form-factor calculations, we estimate the irradiance at any location in the model. This irradiance estimate is then used to further refine light source estimates and to calculate surface reflectance properties. Given these additional estimates, edited models can be re-rendered to reflect lighting changes due to edited geometry, changed light properties, and the addition of synthetic objects.
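The viable-region computation summarized above amounts to densely sampling the camera-parameter space with the fast renderer, keeping only viewpoints that render nearly hole-free, and then searching for a path inside that region. The following Python sketch illustrates the idea under stated assumptions: the renderer stub (hole_fraction), the hole threshold, the parallax proxy, and the straight-line path search are hypothetical stand-ins, not the thesis's actual implementation.

    import itertools
    import math

    # Hypothetical stand-in for the thesis's GPU-accelerated image-based renderer.
    # It is assumed to report the fraction of screen pixels that fall in occlusion
    # holes for a candidate camera pose; a smooth synthetic function plays that
    # role here so the sketch runs on its own.
    def hole_fraction(x, y, yaw):
        return 0.15 * (1.0 + math.sin(1.3 * x) * math.cos(0.9 * y + yaw))

    HOLE_THRESHOLD = 0.05  # assumed tolerance for visible occlusion holes

    def sample_viable_region(xs, ys, yaws):
        """Sample the camera-parameter grid with the fast renderer and keep the
        poses whose renders are nearly hole-free (the 'viable region')."""
        return {
            (x, y, yaw)
            for x, y, yaw in itertools.product(xs, ys, yaws)
            if hole_fraction(x, y, yaw) < HOLE_THRESHOLD
        }

    def parallax_score(path):
        """Crude proxy for parallax: total lateral camera translation along the path."""
        return sum(math.hypot(b[0] - a[0], b[1] - a[1]) for a, b in zip(path, path[1:]))

    def best_linear_path(viable, samples=8):
        """Among straight-line paths between viable poses, keep the one that stays
        inside the viable region and maximizes the parallax proxy."""
        best_path, best_score = None, -1.0
        for a, b in itertools.combinations(sorted(viable), 2):
            path = [
                tuple(a[i] + (b[i] - a[i]) * t / (samples - 1) for i in range(3))
                for t in range(samples)
            ]
            if any(hole_fraction(*p) >= HOLE_THRESHOLD for p in path):
                continue  # an intermediate pose would show occlusion holes
            score = parallax_score(path)
            if score > best_score:
                best_path, best_score = path, score
        return best_path, best_score

    if __name__ == "__main__":
        grid = [0.5 * i for i in range(-4, 5)]
        yaws = [0.3 * i for i in range(-3, 4)]
        viable = sample_viable_region(grid, grid, yaws)
        path, score = best_linear_path(viable)
        print(f"{len(viable)} viable poses; best straight-line parallax score {score:.2f}")

In the full system the camera path would be optimized over a richer parameterization and additional cinematic constraints; the sketch only shows how a fast renderer makes exhaustive viability sampling practical.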
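View-dependent texturing of remodeled geometry relies on blending contributions from several source photographs per face. A minimal sketch of one common blending heuristic, weighting each source camera by how directly it views the surface, is shown below; the data layout, weighting rule, and function names are illustrative assumptions rather than the system's exact formulation.

    import math

    def normalize(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    def blend_weights(face_normal, camera_dirs):
        """Weight each source camera by how directly it views the surface:
        w_k = max(0, -n . d_k), with d_k the camera's viewing direction,
        then normalize the weights so they sum to one."""
        n = normalize(face_normal)
        raw = [max(0.0, -sum(a * b for a, b in zip(n, normalize(d))))
               for d in camera_dirs]
        total = sum(raw)
        return [w / total for w in raw] if total > 0 else [0.0] * len(raw)

    def blend_texels(texels_per_camera, weights):
        """Blend per-camera texel colors (RGB tuples) with the given weights."""
        return tuple(
            sum(w * t[c] for w, t in zip(weights, texels_per_camera))
            for c in range(3)
        )

    if __name__ == "__main__":
        # Two source cameras looking at a wall whose normal points along +z.
        face_normal = (0.0, 0.0, 1.0)
        camera_dirs = [(0.0, 0.0, -1.0), (0.7, 0.0, -0.7)]   # viewing directions
        weights = blend_weights(face_normal, camera_dirs)
        texels = [(200, 180, 160), (150, 140, 130)]          # same texel in two photos
        print(weights, blend_texels(texels, weights))

In the actual system such blends are precomputed per original viewpoint and stored, with hole-filling applied where no source photograph covers a face, so that editing and re-texturing remain real-time.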
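The irradiance and reflectance estimates combine the recovered textures with radiosity form factors. For reference, the textbook form-factor and radiosity relations consistent with that description can be written as below; the notation is assumed here and may differ from the thesis's own formulation.

    % Standard radiosity relations (assumed notation):
    % F_{ij}: form factor from patch i to patch j, with mutual visibility V;
    % E_i:    irradiance gathered at patch i from the radiosities B_j of all patches;
    % rho_i:  diffuse reflectance recovered from the observed radiosity B_i and
    %         emission B_{e,i} via the radiosity equation B_i = B_{e,i} + rho_i E_i.
    \begin{align}
    F_{ij}  &= \frac{1}{A_i}\int_{A_i}\int_{A_j}
               \frac{\cos\theta_i\,\cos\theta_j}{\pi r^{2}}\,
               V(x_i, x_j)\,\mathrm{d}A_j\,\mathrm{d}A_i,\\
    E_i     &= \sum_{j\neq i} B_j\,F_{ij},\\
    \rho_i  &= \frac{B_i - B_{e,i}}{E_i}.
    \end{align}

With irradiance and reflectance estimated per surface, edited geometry, modified light sources, and inserted synthetic objects can be re-rendered consistently with the captured photographs.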
