Point-based surfaces can be generated directly by 3D scanners and avoid the generation and storage of an explicit topology for a sampled geometry, which saves time and storage space for very dense and large objects, such as scanned statues and other archaeological artefacts [DDGM∗]. We propose a fast processing pipeline that turns large point-based surfaces into real-time, appearance-preserving polygonal renderings. Our goal is to reduce the time needed between a point set made of hundreds of millions of samples and a high-resolution visualization that takes advantage of modern graphics hardware tuned for normal mapping of polygons. Our approach starts with an out-of-core generation of a coarse local triangulation of the original model. The resulting coarse mesh is then enriched with a set of maps that capture the high-frequency features of the original data set. As an example, we use the normal component of the samples for these maps, since normal maps efficiently provide accurate local illumination; our approach is, however, also suitable for other point attributes such as color or position (displacement maps). These maps are likewise generated out-of-core, streaming over the complete input data. Undersampling issues in the maps are addressed with an efficient 2D diffusion algorithm. Our main contribution is to directly handle such large unorganized point clouds through this two-pass algorithm, without the time-consuming meshing or parameterization step required by current state-of-the-art high-resolution visualization methods.
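The 2D diffusion step mentioned above can be illustrated with a minimal sketch: empty texels of a map (texels that received no point sample during map construction) are filled by iteratively averaging their already-filled 4-neighbours until the hole is closed. This is a hedged, simplified illustration in NumPy, not the paper's actual implementation; the function name, the 4-neighbour stencil, and the iteration cap are all assumptions made for the example.

```python
import numpy as np

def diffuse_fill(texels, valid, iterations=64):
    """Fill empty texels by iteratively averaging filled 4-neighbours.

    texels: (H, W, C) float array, e.g. packed normals; undefined where ~valid.
    valid:  (H, W) bool mask marking texels covered by at least one sample.
    Note: np.roll wraps around the borders, so this sketch diffuses across
    texture edges as if the map were toroidal; a real pipeline would clamp.
    """
    tex = np.where(valid[..., None], texels, 0.0)
    filled = valid.copy()
    for _ in range(iterations):
        if filled.all():
            break
        acc = np.zeros_like(tex)                      # neighbour sums
        cnt = np.zeros(filled.shape, dtype=np.float64)  # neighbour counts
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            shifted = np.roll(tex, (dy, dx), axis=(0, 1))
            smask = np.roll(filled, (dy, dx), axis=(0, 1))
            acc += shifted * smask[..., None]
            cnt += smask
        # Fill only still-empty texels that have at least one filled neighbour.
        grow = (~filled) & (cnt > 0)
        tex[grow] = acc[grow] / cnt[grow][:, None]
        filled |= grow
    return tex, filled
```

For a normal map, the averaged vectors would additionally be renormalized after filling; the same growth loop applies unchanged to color or displacement maps.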