1,058 research outputs found

    Large-scale Geometric Data Decomposition, Processing and Structured Mesh Generation

    Get PDF
    Mesh generation is a fundamental and critical problem in geometric data modeling and processing. In most scientific and engineering tasks that involve numerical computation or simulation on 2D/3D regions or on curved geometric objects, discretizing or approximating the geometric data with a polygonal or polyhedral mesh is the first step of the procedure. The quality of this tessellation often dictates the accuracy, efficiency, and numerical stability of the subsequent computation. Compared with unstructured meshes, structured meshes are favored in many scientific and engineering tasks because of their good properties. However, generating high-quality structured meshes remains challenging, especially for complex or large-scale geometric data. In industrial Computer-aided Design/Engineering (CAD/CAE) pipelines, the geometry processing needed to create a desirable structured mesh of a complex model is the most costly step: it is semi-manual and often takes up to several weeks to finish. Several technical challenges remain unsolved in existing structured mesh generation techniques. This dissertation studies the effective generation of structured meshes on large and complex geometric data. We study a general geometric computation paradigm that solves this problem via model partitioning and divide-and-conquer. To apply divide-and-conquer effectively, we study two key technical components: shape decomposition in the divide stage, and structured meshing in the conquer stage. We test our algorithm on various data sets; the results demonstrate the efficiency and effectiveness of our framework. The comparisons also show that our algorithm outperforms existing partitioning methods in final meshing quality. We also show that our pipeline scales up efficiently in HPC environments.
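
To make the conquer stage concrete, the following is a minimal, runnable sketch of one classical structured meshing technique, transfinite (Coons) interpolation, applied to a single simple sub-patch such as those produced by the divide stage. The quarter-annulus boundary curves and grid resolution are illustrative assumptions, not the dissertation's actual algorithm.

```python
import numpy as np

def coons_patch(bottom, top, left, right, n, m):
    """Structured (n+1) x (m+1) node grid bounded by four parametric curves.

    Corner compatibility is assumed: bottom(0) == left(0), bottom(1) == right(0),
    top(0) == left(1), top(1) == right(1).
    """
    grid = np.zeros((n + 1, m + 1, 2))
    for i in range(n + 1):
        u = i / n
        for j in range(m + 1):
            v = j / m
            # Blend the two ruled surfaces and subtract the bilinear corner term.
            ruled_u = (1 - v) * bottom(u) + v * top(u)
            ruled_v = (1 - u) * left(v) + u * right(v)
            corners = ((1 - u) * (1 - v) * bottom(0.0) + u * (1 - v) * bottom(1.0)
                       + (1 - u) * v * top(0.0) + u * v * top(1.0))
            grid[i, j] = ruled_u + ruled_v - corners
    return grid

# Illustrative "simple part": a quarter annulus between radii 1 and 2.
arc = lambda v, r: r * np.array([np.cos(v * np.pi / 2), np.sin(v * np.pi / 2)])
bottom = lambda u: np.array([1.0 + u, 0.0])   # radial edge on the x-axis
top    = lambda u: np.array([0.0, 1.0 + u])   # radial edge on the y-axis
left   = lambda v: arc(v, 1.0)                # inner circular arc
right  = lambda v: arc(v, 2.0)                # outer circular arc

nodes = coons_patch(bottom, top, left, right, 16, 8)  # regular quad-mesh nodes
```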

    3D photogrammetric data modeling and optimization for multipurpose analysis and representation of Cultural Heritage assets

    Get PDF
    This research deals with the issues concerning the processing, management, and representation, for further dissemination, of the large amount of 3D data that can be acquired and stored today with modern geomatic techniques of 3D metric survey. In particular, this thesis is focused on the optimization process applied to 3D photogrammetric data of Cultural Heritage assets. Modern geomatic techniques enable the acquisition and storage of large amounts of data, with high metric and radiometric accuracy and precision, also in the very close range field, and the processing of very detailed 3D textured models. Nowadays, the photogrammetric pipeline has well-established potentialities and is considered one of the principal techniques for producing, at low cost, detailed 3D textured models. The potentialities offered by high-resolution textured 3D models are today well known, and such representations are a powerful tool for many multidisciplinary purposes, at different scales and resolutions, from documentation, conservation and restoration to visualization and education. For example, their sub-millimetric precision makes them suitable for scientific studies applied to geometry and materials (i.e. for structural and static tests, for planning restoration activities, or for historical sources); their high fidelity to the real object and their navigability make them optimal for web-based visualization and dissemination applications. Thanks to improvements in new visualization standards, they can easily be used as a visualization interface linking different kinds of information in a highly intuitive way. Furthermore, many museums today look for more interactive exhibitions that may increase the visitors' emotions, and many recent applications make use of 3D contents (i.e. in virtual or augmented reality applications and through virtual museums). What all of these applications have to deal with is the difficulty of managing the large amount of data that has to be represented and navigated. Indeed, reality-based models have very heavy file sizes (up to tens of GB), which makes them difficult to handle on common and portable devices, publish on the internet, or manage in real-time applications. Even though recent advances produce more and more sophisticated and capable hardware and internet standards, empowering the ability to easily handle, visualize and share such contents, other research aims to define a common pipeline for the generation and optimization of 3D models with a reduced number of polygons that are nevertheless able to satisfy detailed radiometric and geometric requests. This thesis is set in this scenario and focuses on the 3D modeling process of photogrammetric data aimed at easy sharing and visualization. In particular, this research tested a 3D model optimization, a process which aims at the generation of Low Poly models, with very small file sizes, processed starting from the data of High Poly ones, that nevertheless offer a level of detail comparable to the original models. To do this, several tools borrowed from the game industry and game engines have been used. For this test, three case studies have been chosen: a modern sculpture by a contemporary Italian artist, a Roman marble statue preserved in the Civic Archaeological Museum of Torino, and the frieze of the Augustus arch preserved in the city of Susa (Piedmont, Italy).
All the test cases have been surveyed by means of a close-range photogrammetric acquisition, and three highly detailed 3D models have been generated by means of a Structure from Motion and image matching pipeline. On the final High Poly models, different optimization and decimation tools have been tested, with the aim of evaluating the quality of the information that can be extracted from the final optimized models in comparison with that of the original High Poly ones. This study showed how tools borrowed from Computer Graphics offer great potentialities also in the Cultural Heritage field. This approach may meet the needs of multipurpose and multiscale studies, using different levels of optimization, and the procedure could be applied to different kinds of objects, with a variety of sizes and shapes, also on multiscale and multisensor data, such as buildings, architectural complexes, data from UAV surveys, and so on.
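
As a minimal illustration of the High Poly to Low Poly optimisation step described above, the snippet below applies quadric-error decimation with Open3D. Open3D stands in here for the game-industry tools actually used in the thesis, and the file names and triangle budget are placeholders.

```python
import open3d as o3d

# Load the photogrammetric High Poly model (placeholder file name).
high_poly = o3d.io.read_triangle_mesh("statue_high_poly.ply")
print("input triangles:", len(high_poly.triangles))

# Collapse edges in order of quadric error until the triangle budget is met,
# trading file size against geometric fidelity.
low_poly = high_poly.simplify_quadric_decimation(target_number_of_triangles=50_000)
low_poly.compute_vertex_normals()  # recompute shading normals on the decimated mesh
o3d.io.write_triangle_mesh("statue_low_poly.ply", low_poly)
print("output triangles:", len(low_poly.triangles))
```

In game-engine practice, the fine detail lost in decimation is commonly re-injected by baking normal maps from the High Poly model onto the Low Poly one, which is consistent with the texture-based optimisation workflow the thesis borrows.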

    Analysis and Generation of Quality Polytopal Meshes with Applications to the Virtual Element Method

    Get PDF
    This thesis explores the concept of the quality of a mesh, the latter being intended as the discretization of a two- or three-dimensional domain. The topic is interdisciplinary in nature, as meshes are massively used in several fields from both the geometry processing and the numerical analysis communities. The goal is to produce a mesh with good geometrical properties and the lowest possible number of elements, able to produce results in a target range of accuracy: in other words, a good quality mesh that is also cheap to handle, overcoming the typical trade-off between quality and computational cost. To reach this goal, we first need to answer the question: ''How, and how much, does the accuracy of a numerical simulation or a scientific computation (e.g., rendering, printing, modeling operations) depend on the particular mesh adopted to model the problem? And which geometrical features of the mesh most influence the result?'' We present a comparative study of the different mesh types, mesh generation techniques, and mesh quality measures currently available in the literature related to both engineering and computer graphics applications. This analysis leads to a precise definition of the notion of quality for a mesh, in the particular context of numerical simulations of partial differential equations with the virtual element method, and the consequent construction of criteria to determine and optimize the quality of a given mesh. Our main contribution is a new mesh quality indicator for polytopal meshes, able to predict the performance of the virtual element method over a particular mesh before running the simulation. Strictly related to this, we also define a quality agglomeration algorithm that optimizes the quality of a mesh by wisely agglomerating groups of neighboring elements. The accuracy and reliability of both tools are thoroughly verified in a series of tests in different scenarios.
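
To make the notion of a per-element geometric quality score concrete, here is a small self-contained example of one classical proxy, the isoperimetric ratio. It is not the VEM-specific indicator proposed in the thesis, only an illustration of how such an indicator assigns each polygon a score that an agglomeration algorithm could threshold.

```python
import math

def polygon_area_perimeter(vertices):
    """Shoelace area and perimeter of a simple polygon given CCW vertices."""
    area, perim = 0.0, 0.0
    n = len(vertices)
    for i in range(n):
        (x0, y0), (x1, y1) = vertices[i], vertices[(i + 1) % n]
        area += x0 * y1 - x1 * y0
        perim += math.hypot(x1 - x0, y1 - y0)
    return abs(area) / 2.0, perim

def isoperimetric_quality(vertices):
    # 4*pi*A / P^2 equals 1 for a disc and decreases towards 0 for stretched
    # or spiky elements; an agglomeration step could merge neighbours only
    # while this score stays above a chosen threshold.
    area, perim = polygon_area_perimeter(vertices)
    return 4.0 * math.pi * area / perim ** 2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
sliver = [(0, 0), (1, 0), (1, 0.05), (0, 0.05)]
print(isoperimetric_quality(square))  # ~0.785
print(isoperimetric_quality(sliver))  # ~0.14, a poor-quality element
```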

    Electronic chip cooling system using graphite fins

    Get PDF
    As electronic devices get smaller, cooling systems with higher thermal efficiency are in demand from the fast-growing electronics industry. A great amount of research has been performed on cooling systems, but research on the materials of the cooling systems needs more work. Graphite, with its high thermal conductivity and light weight, is a great candidate for use in electronic devices. The bottleneck of using graphite in cooling systems is the thermal transport across the interface from the substrate to the graphite fin system. In this research, a finite element simulation of a graphite fin cooling system has been carried out to study the effect of different applied pressures on the cooling system performance. The study of this cooling system showed a good improvement in comparison with common copper fin cooling systems.
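
A back-of-the-envelope calculation illustrates why a higher-conductivity fin material helps, using the classical straight-fin efficiency eta = tanh(mL)/(mL) with m = sqrt(hP/(kA)). The geometry, convection coefficient, and conductivity values below are illustrative assumptions, not the paper's simulation inputs.

```python
import math

def fin_efficiency(k, h, L, t, w):
    """Rectangular straight fin: conductivity k [W/m/K], convection h [W/m^2/K],
    length L, thickness t, width w (all in metres)."""
    P = 2 * (w + t)                      # perimeter exposed to convection
    A = w * t                            # conduction cross-section
    m = math.sqrt(h * P / (k * A))
    return math.tanh(m * L) / (m * L)    # fraction of ideal heat dissipation

# Assumed air cooling and fin geometry (illustrative values only).
h, L, t, w = 50.0, 0.03, 0.001, 0.02
print(f"copper   (k ~ 400):           eta = {fin_efficiency(400.0, h, L, t, w):.3f}")
print(f"graphite (k ~ 1500, in-plane): eta = {fin_efficiency(1500.0, h, L, t, w):.3f}")
```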

    Aspects of Unstructured Grids and Finite-Volume Solvers for the Euler and Navier-Stokes Equations

    Get PDF
    One of the major achievements in engineering science has been the development of computer algorithms for solving nonlinear differential equations such as the Navier-Stokes equations. In the past, limited computer resources motivated the development of efficient numerical schemes in computational fluid dynamics (CFD) utilizing structured meshes. The use of structured meshes greatly simplifies the implementation of CFD algorithms on conventional computers. Unstructured grids, on the other hand, offer an alternative for modeling complex geometries. Unstructured meshes have irregular connectivity and usually contain combinations of triangles, quadrilaterals, tetrahedra, and hexahedra. The generation and use of unstructured grids pose new challenges in CFD. The purpose of this note is to present recent developments in unstructured grid generation and flow solution technology.
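
The "irregular connectivity" mentioned above is the defining data-structure difference: cells reference vertices by index, and neighbour relations must be computed rather than read off (i, j) indices as on a structured grid. Below is a minimal sketch, on an illustrative mixed triangle/quad grid, of building the face list a cell-centred finite-volume scheme iterates over.

```python
import numpy as np
from collections import defaultdict

# Tiny illustrative mixed mesh: 6 points, one quad and two triangles.
points = np.array([[0, 0], [1, 0], [2, 0], [0, 1], [1, 1], [2, 1]], float)
cells = [
    [0, 1, 4, 3],   # quadrilateral
    [1, 2, 4],      # triangle
    [2, 5, 4],      # triangle
]

# Each interior face stores its two adjacent cells for flux exchange;
# faces with a single adjacent cell lie on the boundary.
face_cells = defaultdict(list)
for c, verts in enumerate(cells):
    for i in range(len(verts)):
        face = tuple(sorted((verts[i], verts[(i + 1) % len(verts)])))
        face_cells[face].append(c)

for face, adj in sorted(face_cells.items()):
    kind = "interior" if len(adj) == 2 else "boundary"
    print(face, kind, adj)
```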

    Low-discrepancy point sampling of 2D manifolds for visual computing

    Get PDF
    Point distributions are used to sample surfaces for a wide variety of applications within the fields of graphics and computational geometry, such as point-based graphics, remeshing and area/volume measurement. The quality of such point distributions is important, and quality criteria are often application dependent. Common quality criteria include visual appearance, an even distribution whilst avoiding aliasing and other artifacts, and minimisation of the number of points required to accurately sample a surface. Previous work suggests that discrepancy measures the uniformity of a point distribution, and hence a point distribution of minimal discrepancy is expected to be of high quality. We investigate discrepancy as a measure of sampling quality, and present a novel approach for generating low-discrepancy point distributions on parameterised surfaces. Our approach uses the idea of converting the 2D sampling problem into a 1D problem by adaptively mapping a space-filling curve onto the surface. A 1D sequence is then generated and used to sample the surface along the curve. The sampling process takes into account the parametric mapping, employing a corrective approach similar to histogram equalisation, to ensure that it gives a 2D low-discrepancy point distribution on the surface. The local sampling density can be controlled by a user-defined density function, e.g. to preserve local features, or to achieve desired data reduction rates. Experiments show that our approach efficiently generates low-discrepancy distributions on arbitrary parametric surfaces, demonstrating nearly as good results as popular low-discrepancy sampling methods designed for particular surfaces like planes and spheres. We develop a generalised notion of the standard discrepancy measure, which considers a broader set of sample shapes used to compute the discrepancy. In this more thorough testing, our sampling approach produces results superior to popular distributions. We also demonstrate that the point distributions produced by our approach closely adhere to the blue noise criterion, in contrast to the popular low-discrepancy methods tested, which show high levels of structure, undesirable for visual representation. Furthermore, we present novel sampling algorithms to generate low-discrepancy distributions on triangle meshes. To sample the mesh, it is cut into a disc topology, and a parameterisation is generated. Our sampling algorithm can then be used to sample the parameterised mesh, using robust methods for computing discrete differential properties of the surface. After these pre-processing steps, the sampling density can be adjusted in real-time. Experiments also show that our sampling approach can accurately resample existing meshes with low discrepancy, demonstrating error rates when reducing the mesh complexity as good as the best results in the literature. We present three applications of our mesh sampling algorithm. We first describe a point-based graphics sampling approach, which includes a global hole-filling algorithm. We investigate the coverage of sample discs for this approach, demonstrating results superior to random sampling and a popular low-discrepancy method. Moreover, we develop levels of detail and view-dependent rendering approaches, providing very fine-grained density control with distance and angle, and silhouette enhancement. We further discuss a triangle-based remeshing technique, producing high quality, topologically unaltered meshes.
Finally, we describe a complete framework for sampling and painting engineering prototype models. This approach provides density control according to surface texture, and gives full dithering control of the point sample distribution. Results exhibit high quality point distributions for painting that are invariant to surface orientation or complexity. The main contributions of this thesis are novel algorithms to generate high-quality density-controlled point distributions on parametric surfaces and triangular meshes. Qualitative assessment, discrepancy measures, and blue noise criteria show their high sampling quality in general. We introduce generalised discrepancy measures which indicate that the sampling quality of our approach is superior to other low-discrepancy sampling techniques. Moreover, we present novel approaches towards remeshing, point-based rendering and robotic painting of prototypes by adapting our sampling algorithms, and demonstrate the overall good quality of the results for these specific applications.
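
A minimal sketch of the core 2D-to-1D idea on the unit square (rather than a curved surface): a 1D van der Corput low-discrepancy sequence is pushed through a Hilbert space-filling curve to produce a 2D low-discrepancy point set. The histogram-equalisation-style correction for parametric area distortion described in the thesis is omitted here.

```python
def van_der_corput(i, base=2):
    """i-th element of the 1D van der Corput low-discrepancy sequence."""
    q, denom = 0.0, 1.0
    while i:
        denom *= base
        i, rem = divmod(i, base)
        q += rem / denom
    return q

def hilbert_d2xy(order, d):
    """Cell (x, y) at distance d along a Hilbert curve on a 2^order grid."""
    x = y = 0
    t, s = d, 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                          # rotate the quadrant if needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x, y = x + s * rx, y + s * ry
        t //= 4
        s *= 2
    return x, y

order, n = 8, 512
side = 1 << order
samples = []
for i in range(1, n + 1):
    d = int(van_der_corput(i) * side * side)  # 1D sequence -> curve distance
    x, y = hilbert_d2xy(order, d)
    samples.append(((x + 0.5) / side, (y + 0.5) / side))
```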

    Abstractions and performance optimisations for finite element methods

    Get PDF
    Finding numerical solutions to partial differential equations (PDEs) is an essential task in the discipline of scientific computing. In designing software tools for this task, one of the ultimate goals is to balance the needs for generality, ease of use, and high performance. Domain-specific systems based on code generation techniques, such as Firedrake, attempt to address this problem with a design consisting of a hierarchy of abstractions, where users specify mathematical problems via a high-level, descriptive interface which is progressively lowered through the intermediate abstractions. Well-designed abstraction layers are essential for performing code transformations and optimisations robustly and efficiently, generating high-performance code without user intervention. This thesis discusses several topics on the design of the abstraction layers of Firedrake, and presents the benefit of its software architecture by providing examples of various optimising code transformations at the appropriate abstraction layers. In particular, we discuss the advantage of describing the local assembly stage of a finite element solver in an intermediate representation based on symbolic tensor algebra. We successfully lift specific loop optimisations, previously implemented by rewriting ASTs of the local assembly kernels, to this higher-level tensor language, improving both compilation speed and optimisation effectiveness. The global assembly phase involves the application of local assembly kernels to a collection of entities of an unstructured mesh. We redesign the abstraction to express the global assembly loop nests using tools and concepts based on the polyhedral model. This enables us to implement a cross-element vectorisation algorithm that delivers stable vectorisation performance on CPUs automatically. This abstraction also improves the portability of Firedrake, as we demonstrate by targeting GPU devices transparently from the same software stack.
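
For readers unfamiliar with Firedrake, the following is a representative example of the high-level, descriptive interface referred to above: a Poisson problem stated symbolically in UFL. Everything beneath the solve call (local assembly kernels, global assembly loops, vectorisation) is produced by the code-generation stack this thesis works on; the mesh size and forcing term are arbitrary choices for illustration.

```python
from firedrake import *

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "CG", 1)

u = TrialFunction(V)
v = TestFunction(V)
x, y = SpatialCoordinate(mesh)
f = Function(V).interpolate(sin(pi * x) * sin(pi * y))

a = inner(grad(u), grad(v)) * dx   # bilinear form: drives the local assembly kernel
L = f * v * dx                     # linear form

u_h = Function(V)
bc = DirichletBC(V, 0.0, "on_boundary")
solve(a == L, u_h, bcs=bc)         # code generation, assembly and solve happen here
```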
