
    Multi-Resolution Meshes for Feature-Aware Hardware Tessellation

    Hardware tessellation is de facto the preferred mechanism to adaptively control mesh resolution with maximal performance. However, owing to its fixed and uniform pattern, leveraging tessellation for feature-aware LOD rendering remains a challenging problem. We relax this fundamental constraint by introducing a new spatial and temporal blending mechanism of tessellation levels, built on top of a novel hierarchical representation of multi-resolution meshes. This mechanism makes it possible to finely control topological changes so that vertices can be removed or added at the most appropriate location to preserve geometric features in a continuous and artifact-free manner. We then show how to extend edge-collapse based decimation methods to build feature-aware multi-resolution meshes that match the tessellation patterns. Our approach is fully compatible with current hardware tessellators and only adds a small overhead in memory consumption and tessellation cost.
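    As a concrete illustration of level-of-detail control, here is a minimal sketch of how a per-edge tessellation level could be picked from screen-space edge length and blended over time to avoid popping; the pixels-per-segment heuristic, the smoothing factor, and the function names are illustrative assumptions, not the paper's blending mechanism or the hardware tessellator interface.
```python
# Hypothetical sketch: choosing and temporally blending a tessellation level.
# The pixels-per-segment heuristic and smoothing factor are illustrative only.

def target_tess_level(edge_length_px, px_per_segment=8.0, max_level=64.0):
    """Pick a level so that each tessellated segment spans ~px_per_segment pixels."""
    return max(1.0, min(max_level, edge_length_px / px_per_segment))

def blend_tess_level(previous_level, target_level, alpha=0.2):
    """Move a fraction alpha toward the target each frame to avoid popping."""
    return previous_level + alpha * (target_level - previous_level)

# Example: an edge shrinking on screen over a few frames.
level = 16.0
for edge_px in (120.0, 90.0, 60.0, 40.0):
    level = blend_tess_level(level, target_tess_level(edge_px))
    print(f"edge {edge_px:5.1f}px -> fractional tessellation level {level:.2f}")
```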

    OpenFab: A programmable pipeline for multimaterial fabrication

    Figure 1 (caption): Three rhinos, defined and printed using OpenFab. For each print, the same geometry was paired with a different fablet, a shader-like program which procedurally defines surface detail and material composition throughout the object volume. This produces three unique prints by using displacements, texture mapping, and continuous volumetric material variation as a function of distance from the surface.
    3D printing hardware is rapidly scaling up to output continuous mixtures of multiple materials at increasing resolution over ever larger print volumes. This poses an enormous computational challenge: large high-resolution prints comprise trillions of voxels and petabytes of data, and simply modeling and describing the input with spatially varying material mixtures at this scale is challenging. Existing 3D printing software is insufficient; in particular, most software is designed to support only a few million primitives, with discrete material choices per object. We present OpenFab, a programmable pipeline for synthesis of multi-material 3D printed objects that is inspired by RenderMan and modern GPU pipelines. The pipeline supports procedural evaluation of geometric detail and material composition, using shader-like fablets, allowing models to be specified easily and efficiently. We describe a streaming architecture for OpenFab; only a small fraction of the final volume is stored in memory and output is fed to the printer with little startup delay. We demonstrate it on a variety of multi-material objects.
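    To make the fablet idea concrete, the following is a hedged sketch of a shader-like procedural material function evaluated slab by slab, mirroring the streaming idea; the toy sphere geometry, the 2 mm transition, and the function names are assumptions for illustration, not OpenFab's actual API.
```python
import math

# Hypothetical fablet-style function: material mixture as a function of distance
# from the object surface. Illustrative only; not OpenFab's actual API.
def fablet(distance_inside_mm):
    """Return (rigid_fraction, flexible_fraction); rigid near the core, flexible at the skin."""
    t = min(1.0, max(0.0, distance_inside_mm / 2.0))  # 2 mm transition, an assumed value
    return (t, 1.0 - t)

def stream_slab(z, resolution=4, radius_mm=10.0):
    """Evaluate one z-slab of a toy spherical part; only this slab is held in memory."""
    slab = []
    for y in range(resolution):
        for x in range(resolution):
            depth = radius_mm - math.sqrt(x * x + y * y + z * z)  # distance inside the sphere
            slab.append(fablet(max(0.0, depth)))
    return slab

# Example: stream two slabs, as a printer back end would consume them.
for z in (0, 8):
    print(z, stream_slab(z)[:3])
```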

    Interactive global illumination on the CPU

    Computing realistic physically-based global illumination in real time remains one of the major goals in the fields of rendering and visualisation; one that has not yet been achieved due to its inherent computational complexity. This thesis focuses on CPU-based interactive global illumination approaches with an aim to develop generalisable, hardware-agnostic algorithms. Interactive ray tracing relies on spatial and cache coherency to achieve interactive rates, which conflicts with the needs of global illumination solutions, which require a large number of incoherent secondary rays to be computed. Methods that reduce the total number of rays that need to be processed, such as selective rendering, were investigated to determine how best they can be utilised. The impact that selective rendering has on interactive ray tracing was analysed and quantified, and two novel global illumination algorithms were developed, with the structured methodology used presented as a framework. Adaptive Interleaved Sampling is a generalisable approach that combines interleaved sampling with an adaptive approach, using efficient component-specific adaptive guidance methods to drive the computation. Results of up to 11 frames per second were demonstrated for multiple components, including participating media. Temporal Instant Caching is a caching scheme for accelerating the computation of diffuse interreflections to interactive rates. This approach achieved frame rates exceeding 9 frames per second for the majority of scenes. Validation of the results for both approaches showed little perceptual difference when comparing against a gold-standard path-traced image. Further research into caching led to the development of a new wait-free data access control mechanism for sharing the irradiance cache among multiple rendering threads on a shared-memory parallel system. By not serialising accesses to the shared data structure, the irradiance values were shared among all the threads without any overhead or contention when reading and writing simultaneously. This new approach achieved efficiencies between 77% and 92% for 8 threads when calculating static images and animations. This work demonstrates that, due to the flexibility of the CPU, CPU-based algorithms remain a valid and competitive choice for achieving global illumination interactively, and an alternative to the generally brute-force GPU-centric algorithms.
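    The sketch below illustrates the interleaved-sampling idea that Adaptive Interleaved Sampling builds on: pixels within a tile use different members of a small sample-pattern set, and tiles whose estimates disagree the most receive additional samples; the stand-in shading function, tile size, and variance threshold are assumptions, not the thesis' actual guidance methods or parameters.
```python
import random

# Illustrative sketch of interleaved sampling with adaptive refinement.
# The stand-in shading function, tile size, and variance threshold are assumptions.

def shade(x, y, pattern_index):
    """Stand-in for tracing one secondary ray using the given sample-pattern member."""
    rng = random.Random(x * 92821 + y * 68917 + pattern_index)
    return 0.5 + 0.5 * rng.random()

def render_tile(x0, y0, tile=4, patterns=4, variance_threshold=0.02):
    # Interleave: neighbouring pixels start with different members of the pattern set.
    estimates = [shade(x0 + i % tile, y0 + i // tile, i % patterns)
                 for i in range(tile * tile)]
    mean = sum(estimates) / len(estimates)
    variance = sum((e - mean) ** 2 for e in estimates) / len(estimates)
    if variance > variance_threshold:
        # Adaptive guidance: spend extra samples only where the estimates disagree.
        estimates += [shade(x0, y0, patterns + k) for k in range(patterns)]
        mean = sum(estimates) / len(estimates)
    return mean, len(estimates)

print(render_tile(0, 0))
```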

    Spatial description of large scale structure and reionisation

    This thesis is the result of a PhD project that set out to investigate and find new descriptions of entities arising in large scale structure based upon their spatial configuration. For this we analyse N-body simulations of gravitational collapse in a cold dark matter universe with cosmological constant (ΛCDM) and Monte Carlo ray-tracing radiative transfer (MCRTRT) simulations of reionisation. We also use an N-body simulation to investigate possible problems with observational results connected to large scale clustering. In the first part of this thesis we develop a novel technique to characterise the density field in cosmological N-body simulations based upon a density estimate and the connectivity between particles obtained from a Voronoi tessellation of their positions. We use this estimate to find a hierarchical set of peaks in the Millennium and Millennium II simulations. This hierarchy completely decomposes the particle load of the simulations into nodes in a single tree structure we call the “Tessellation Level Tree” (TLT). We investigate the properties of these peaks and concentrate on two novel aspects: the percolation of the connected set of peaks above densities of a few (6 − 7) along the cosmic web, and the very strong assembly bias effect if peaks are split by saddle point density. This assembly bias effect is the strongest ever obtained from quintiles in a local property of the dark matter distribution in simulations. The second part of the thesis investigates the morphology of ionised bubbles in hydrogen and helium during reionisation. For this we use MCRTRT on regular grids and create binary representations of the ionisation fields using a threshold. We then apply techniques of mathematical morphology to extract a hierarchy of bubbles ordered by local diameter. We show the shift in the global bubble size distribution throughout reionisation and how the ionised volume is more and more unequally distributed among the bubbles as they grow and overlap. The overlap also results in a percolation process, which we identify at z ≳ 8, that increasingly delocalises the reionisation process. Finally, we connect the bubbles to the properties of the underlying density field. For the first time we show that the largest bubbles in the post-overlap regime are not densest in the centre and are very strongly biased with respect to the large scale matter distribution. We also quantify how ionisation reaches the most underdense parts of the universe last, reconfirming the inside-out scenario of reionisation. In the final part of the thesis we test the assembly bias and splashback radius measurements claimed by previous publications using clusters obtained with the optical cluster finder redMaPPer. For this we develop a mock version of the algorithm that incorporates the core aspects of the cluster identification and apply it to a semi-analytic galaxy population of the Millennium simulation. We show that the claimed concentration differences in the optically selected clusters most likely stem from projection effects that arise more in overdense regions, leading to a coupling between concentration and large scale clustering and therefore a false positive assembly bias detection.
The claimed splashback radius identification, which is inverse in its connection with cluster properties compared to the results of numerical simulations, is shown to be an artifact of the circular mask of the selection algorithm.
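    The following is a hedged sketch of the peak-merging idea behind a structure like the Tessellation Level Tree: cells are visited in order of decreasing density and connected regions are merged at saddles using union-find; the 1D toy density field and neighbourhood are illustrative assumptions, whereas the thesis operates on a Voronoi tessellation of 3D particle positions.
```python
# Toy density-peak hierarchy on a 1D field; the thesis builds the equivalent
# structure on a Voronoi tessellation of 3D particle positions.

density = [1.0, 3.0, 7.0, 4.0, 2.0, 5.0, 9.0, 6.0, 1.5]
order = sorted(range(len(density)), key=lambda i: -density[i])

parent = {}
def find(i):
    """Union-find root lookup with path compression."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

peaks, saddles = [], []
for i in order:  # sweep cells from the highest density downwards
    parent[i] = i
    roots = {find(j) for j in (i - 1, i + 1) if j in parent}
    if len(roots) == 0:
        peaks.append(i)                    # a new local maximum starts its own peak
    elif len(roots) == 1:
        parent[i] = roots.pop()            # the cell grows an existing peak
    else:
        saddles.append((i, density[i]))    # two peaks meet: record the saddle cell
        high, low = sorted(roots, key=lambda r: -density[r])
        parent[i] = high
        parent[low] = high                 # merge the lower peak into the higher one

print("peaks:", peaks)                               # -> [6, 2]
print("saddle merges (cell, density):", saddles)     # -> [(4, 2.0)]
```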

    Hierarchical Variance Reduction Techniques for Monte Carlo Rendering

    Ever since the first three-dimensional computer graphics appeared half a century ago, the goal has been to model and simulate how light interacts with materials and objects to form an image. The ultimate goal is photorealistic rendering, where the created images reach a level of accuracy that makes them indistinguishable from photographs of the real world. There are many applications: visualization of products and architectural designs yet to be built, special effects, computer-generated films, virtual reality, and video games, to name a few. However, the problem has proven tremendously complex; the illumination at any point is described by a recursive integral to which a closed-form solution seldom exists. Instead, computer simulation and Monte Carlo methods are commonly used to statistically estimate the result. This introduces undesirable noise, or variance, and a large body of research has been devoted to finding ways to reduce the variance. I continue along this line of research and present several novel techniques for variance reduction in Monte Carlo rendering, as well as a few related tools. The research in this dissertation focuses on using importance sampling to pick a small set of well-distributed point samples. As the primary contribution, I have developed the first methods to explicitly draw samples from the product of distant high-frequency lighting and complex reflectance functions. By sampling the product, low-noise results can be achieved using a very small number of samples, which is important to minimize the rendering times. Several different hierarchical representations are explored to allow efficient product sampling. In the first publication, the key idea is to work in a compressed wavelet basis, which allows fast evaluation of the product. Many of the initial restrictions of this technique were removed in follow-up work, allowing higher-resolution uncompressed lighting and avoiding precomputation of reflectance functions. My second main contribution is one of the first techniques to take the triple product of lighting, visibility and reflectance into account to further reduce the variance in Monte Carlo rendering. For this purpose, control variates are combined with importance sampling to solve the problem in a novel way. A large part of the technique also focuses on analysis and approximation of the visibility function. To further refine the above techniques, several useful tools are introduced. These include a fast, low-distortion map to represent (hemi)spherical functions, a method to create high-quality quasi-random points, and an optimizing compiler for analyzing shaders using interval arithmetic. The latter automatically extracts bounds for importance sampling of arbitrary shaders, as opposed to using a priori known reflectance functions. In summary, the work presented here takes the field of computer graphics one step further towards making photorealistic rendering practical for a wide range of uses. By introducing several novel Monte Carlo methods, more sophisticated lighting and materials can be used without increasing the computation times. The research is aimed at domain-specific solutions to the rendering problem, but I believe that much of the new theory is applicable in other parts of computer graphics, as well as in other fields.
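    As a small illustration of product importance sampling, the sketch below draws samples proportionally to the product of a tabulated lighting term and a tabulated reflectance term by building a discrete CDF; the toy tables stand in for the dissertation's wavelet-compressed lighting and measured reflectance functions and are not its actual representation.
```python
import bisect
import random

# Hedged sketch of product importance sampling over tabulated direction bins:
# sample proportionally to lighting * reflectance. The tables are toy data, not
# the dissertation's wavelet representation.

lighting = [0.1, 4.0, 0.3, 2.5, 0.05, 1.2]   # incoming radiance per direction bin
brdf     = [0.9, 0.2, 0.8, 0.7, 0.10, 0.4]   # reflectance toward the viewer per bin

product = [l * b for l, b in zip(lighting, brdf)]
total = sum(product)
pdf = [p / total for p in product]

cdf, acc = [], 0.0
for p in pdf:                                # cumulative distribution for inversion
    acc += p
    cdf.append(acc)

def sample_bin(u):
    """Inverse-CDF sampling: map a uniform number u in [0, 1) to a direction bin."""
    return min(bisect.bisect_left(cdf, u), len(cdf) - 1)

rng = random.Random(7)
counts = [0] * len(pdf)
for _ in range(10000):
    counts[sample_bin(rng.random())] += 1

# Bins where both the lighting and the reflectance are large receive most samples.
print([round(c / 10000, 3) for c in counts])
print([round(p, 3) for p in pdf])
```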

    Distance based heterogeneous volume modelling.

    Natural objects, such as bones and watermelons, often have a heterogeneous composition and complex internal structures. Material properties inside the object can change abruptly or gradually, and representing such changes digitally can be problematic. Attribute functions represent the distribution of physical properties in the volumetric object. Modelling complex attributes within a volume is a challenging task. There are several approaches to modelling attributes, but distance functions have gained popularity for heterogeneous object modelling because, in addition to their usefulness, they lead to predictability and intuitiveness. In this thesis, we consider a unified framework for heterogeneous volume modelling, specifically using distance fields. In particular, we tackle various issues associated with them, such as the interpolation of volumetric attributes through time for shape transformation and the intuitive and predictable interpolation of attributes inside a shape. To achieve these results, we rely on smooth approximate distance fields and interior distances. This thesis deals with outstanding issues in heterogeneous object modelling, and more specifically in modelling functionally graded materials and structures using different types of distances and approximations thereof. We demonstrate the benefits of heterogeneous volume modelling using smooth approximate distance fields with various applications, such as adaptive microstructures, morphological shape generation, shape-driven interpolation of material properties through time, and shape-conforming interpolation of properties. Distance-based modelling of attributes allows us to have a better parametrization of the object volume and to design gradient properties across an object. This becomes more important nowadays with the growing interest in rapid prototyping and digital fabrication of heterogeneous objects, and it can find practical applications in different industries.
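    A minimal sketch of a distance-driven attribute function follows: a functionally graded material whose stiffness varies with distance to the boundary of a simple shape; the sphere distance function, the smoothstep profile, and the parameter values are illustrative assumptions rather than the thesis' smooth approximate distance fields.
```python
import math

# Hedged sketch: a functionally graded material driven by distance to the boundary
# of a simple shape. The sphere distance and the smoothstep profile are assumptions.

def sphere_sdf(p, radius=1.0):
    """Signed distance to a sphere: negative inside, positive outside."""
    return math.sqrt(sum(c * c for c in p)) - radius

def stiffness(p, core=0.9, shell=0.1, transition=0.5):
    """Blend from a stiff core to a soft shell as the boundary is approached."""
    depth = max(0.0, -sphere_sdf(p))     # how far the point lies inside the surface
    t = min(1.0, depth / transition)     # 0 at the surface, 1 deep inside
    t = t * t * (3.0 - 2.0 * t)          # smoothstep for a gradual, C1 gradient
    return shell + (core - shell) * t

for x in (0.0, 0.5, 0.8, 0.95):
    print(f"point ({x:.2f}, 0, 0): stiffness {stiffness((x, 0.0, 0.0)):.3f}")
```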