
    Transfinite interpolation for well-definition in error analysis in solid modelling

    An overall approach to the problem of error analysis in the context of solid modelling, analogous to the standard forward/backward error analysis of Numerical Analysis, was described in a recent paper by Hoffmann and Stewart. An important subproblem within this overall approach is the well-definition of the sets specified by inconsistent data. These inconsistencies may come from the use of finite-precision real-number arithmetic, from the use of low-degree curves to approximate boundaries, or from terminating an infinite convergent (subdivision) process after only a finite number of steps. An earlier paper, by Andersson and the present authors, showed how to resolve this problem of well-definition, in the context of standard trimmed-NURBS representations, by using the Whitney Extension Theorem. In this paper we will show how an analogous approach can be used in the context of trimmed surfaces based on combined-subdivision representations, such as those proposed by Litke, Levin and Schröder. A further component of the problem of well-definition is ensuring that adjacent patches in a representation do not have extraneous intersections. (Here, "extraneous intersections" refers to intersections, between two patches forming part of the boundary, other than prescribed intersections along a common edge or at a common vertex.) The paper also describes the derivation of a bound for normal vectors that can be used for this purpose. This bound is relevant both in the case of trimmed-NURBS representations, and in the case of combined subdivision with trimming.
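    As a generic illustration of transfinite interpolation itself (not the paper's combined-subdivision or Whitney-extension construction), the classic bilinearly blended Coons patch interpolates four given boundary curves exactly; the boundary curves below are hypothetical examples:

    ```python
    import numpy as np

    def coons_patch(c0, c1, d0, d1, u, v):
        """Bilinearly blended Coons patch: a transfinite interpolant of four
        boundary curves c0(u), c1(u) (bottom/top) and d0(v), d1(v)
        (left/right) whose corner values must agree."""
        P00, P10 = c0(0.0), c0(1.0)   # corners shared with d0(0), d1(0)
        P01, P11 = c1(0.0), c1(1.0)   # corners shared with d0(1), d1(1)
        ruled_u = (1 - v) * c0(u) + v * c1(u)
        ruled_v = (1 - u) * d0(v) + u * d1(v)
        bilinear = ((1 - u) * (1 - v) * P00 + u * (1 - v) * P10
                    + (1 - u) * v * P01 + u * v * P11)
        return ruled_u + ruled_v - bilinear

    # Hypothetical boundaries: a unit square with a bulged bottom edge.
    c0 = lambda u: np.array([u, 0.25 * np.sin(np.pi * u)])
    c1 = lambda u: np.array([u, 1.0])
    d0 = lambda v: np.array([0.0, v])
    d1 = lambda v: np.array([1.0, v])

    # The patch reproduces its boundary curves exactly.
    print(coons_patch(c0, c1, d0, d1, 0.5, 0.0))  # equals c0(0.5)
    ```

    Exact boundary reproduction is what makes the interpolant "transfinite": it matches whole curves, not just finitely many points.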

    Distance based heterogeneous volume modelling

    Natural objects, such as bones and watermelons, often have a heterogeneous composition and complex internal structures. Material properties inside the object can change abruptly or gradually, and representing such changes digitally can be problematic. Attribute functions represent the distribution of physical properties within a volumetric object, and modelling complex attributes within a volume is a difficult task. There are several approaches to modelling attributes, but distance functions have gained popularity for heterogeneous object modelling because, in addition to their usefulness, they lead to predictable and intuitive results. In this thesis, we consider a unified framework for heterogeneous volume modelling, specifically using distance fields. In particular, we tackle various issues associated with them, such as the interpolation of volumetric attributes through time for shape transformation, and the intuitive and predictable interpolation of attributes inside a shape. To achieve these results, we rely on smooth approximate distance fields and interior distances. This thesis deals with outstanding issues in heterogeneous object modelling, and more specifically in modelling functionally graded materials and structures using different types of distances and approximations thereof. We demonstrate the benefits of heterogeneous volume modelling using smooth approximate distance fields with various applications, such as adaptive microstructures, morphological shape generation, shape-driven interpolation of material properties through time, and shape-conforming interpolation of properties. Distance-based modelling of attributes allows us to have a better parametrization of the object volume and to design gradient properties across an object. This becomes more important nowadays with the growing interest in rapid prototyping and digital fabrication of heterogeneous objects and can find practical applications in different industries.
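    The basic idea of distance-based attribute modelling can be sketched in a few lines: a distance field parameterizes the volume, and an attribute function maps that distance to a material property. This is a minimal sketch only; the thesis's framework uses smooth approximate distance fields, and the sphere geometry, function names, and density values here are hypothetical:

    ```python
    import math

    def sphere_distance(p, center, radius):
        """Signed Euclidean distance from point p to a sphere surface:
        negative inside the sphere, zero on it, positive outside."""
        return math.dist(p, center) - radius

    def graded_density(p, center, radius, rho_core=1.0, rho_shell=0.2):
        """Hypothetical attribute function: grade material density from
        rho_core at the centre to rho_shell at the boundary, driven by
        the normalized interior distance (a functionally graded material)."""
        d = sphere_distance(p, center, radius)
        t = min(max(-d / radius, 0.0), 1.0)   # 0 on the boundary, 1 at the centre
        return rho_shell + (rho_core - rho_shell) * t

    print(graded_density((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0))  # 1.0 at centre
    print(graded_density((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0))  # 0.2 on boundary
    ```

    The same pattern generalizes: any field that measures "how far inside" a point lies can drive an attribute, which is what makes distance fields a convenient volume parametrization.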

    Theory and applications of bijective barycentric mappings

    Barycentric coordinates provide a convenient way to represent a point inside a triangle as a convex combination of the triangle's vertices, and to linearly interpolate data given at these vertices. Due to their favourable properties, they are commonly applied in geometric modelling, finite element methods, computer graphics, and many other fields. In some of these applications it is desirable to extend the concept of barycentric coordinates from triangles to polygons. Several variants of such generalized barycentric coordinates have been proposed in recent years. An important application of barycentric coordinates consists of barycentric mappings, which allow one to naturally warp a source polygon to a corresponding target polygon, or more generally, to create mappings between closed curves or polyhedra. The principal practical application is image warping, which takes as input a control polygon drawn around an image and smoothly warps the image by moving the polygon vertices. A required property of image warping is to avoid fold-overs in the resulting image. The problem of fold-overs is a manifestation of a larger problem related to the lack of bijectivity of the barycentric mapping. Unfortunately, bijectivity of such barycentric mappings can only be guaranteed for the special case of warping between convex polygons or by triangulating the domain and hence renouncing smoothness. In fact, for any barycentric coordinates, it is always possible to construct a pair of polygons such that the barycentric mapping is not bijective. In the first part of this thesis we illustrate three methods to achieve bijective mappings. The first method is based on the intuition that, if two polygons are sufficiently close, then the mapping is close to the identity and hence bijective. 
This suggests ``splitting'' the mapping into several intermediate mappings and creating a composite barycentric mapping which is guaranteed to be bijective between arbitrary polygons, polyhedra, or closed planar curves. We provide theoretical bounds on the bijectivity of the composite mapping related to the norm of the gradient of the coordinates. The fact that the bound depends on the gradient implies that these bounds exist only if the gradient of the coordinates is bounded. We focus on mean value coordinates and analyse the behaviour of their directional derivatives and gradient at the vertices of a polygon. The composition of barycentric mappings for closed planar curves leads to the problem of blending between two planar curves. We suggest solving it by linearly interpolating the signed curvature and then reconstructing the intermediate curve from the interpolated curvature values. However, when both input curves are closed, this strategy can lead to open intermediate curves. We present a new algorithm for solving this problem, which finds the closed curve whose curvature is closest to the interpolated values. Our method relies on the definition of a suitable metric for measuring the distance between two planar curves and an appropriate discretization of the signed curvature functions. The second method to construct smooth bijective mappings with prescribed behaviour along the domain boundary exploits the properties of harmonic maps. These maps can be approximated in different ways, and we discuss their respective advantages and disadvantages. We further present a simple procedure for reducing their distortion and demonstrate the effectiveness of our approach by providing examples. The last method relies on a reformulation of complex barycentric mappings, which allows us to modify the ``speed'' along the edges to create complex bijective mappings. We provide some initial results and an optimization procedure which creates complex bijective maps. 
In the second part we provide two main applications of bijective mappings. The first one is in the context of finite element simulations, where the discretization of the computational domain plays a central role. In the standard discretization, the domain is triangulated with a mesh and its boundary is approximated by a polygon. We present an approach which combines parametric finite elements with smooth bijective mappings, leaving the choice of approximation spaces free. This approach allows us to represent arbitrarily complex geometries on coarse meshes with curved edges, regardless of the domain boundary complexity. The main idea is to use a bijective mapping for automatically warping the volume of a simple parametrization domain to the complex computational domain, thus creating a curved mesh of the latter. The second application addresses the meshing problem and the possibility of solving finite element simulations on polygonal meshes. In this context we present several methods to discretize the bijective mapping to create polygonal and piecewise polynomial meshes.
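    The starting point of this line of work, generalized barycentric coordinates on polygons, can be sketched with Floater's mean value coordinates. This is a standard textbook construction shown for orientation only, not the thesis's bijective-mapping machinery, and the demo polygon is a hypothetical example:

    ```python
    import numpy as np

    def mean_value_coordinates(poly, p):
        """Mean value coordinates of a point p strictly inside a closed
        polygon given as an (n, 2) array of ordered vertices:
        w_i = (tan(a_{i-1}/2) + tan(a_i/2)) / |v_i - p|, then normalize,
        where a_i is the signed angle at p between v_i and v_{i+1}."""
        v = np.asarray(poly, float) - np.asarray(p, float)
        r = np.linalg.norm(v, axis=1)
        v_next = np.roll(v, -1, axis=0)
        cross = v[:, 0] * v_next[:, 1] - v[:, 1] * v_next[:, 0]
        dot = (v * v_next).sum(axis=1)
        t = np.tan(np.arctan2(cross, dot) / 2.0)
        w = (t + np.roll(t, 1)) / r
        return w / w.sum()

    # A barycentric mapping sends p to sum_i lambda_i(p) * w_i for target
    # vertices w_i; with source == target it is the identity (linear precision).
    square = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
    lam = mean_value_coordinates(square, (0.3, 0.6))
    print(lam @ square)   # reproduces (0.3, 0.6)
    ```

    Replacing `square` in the final product `lam @ target` with a deformed target polygon gives the barycentric warp discussed above; the thesis's contribution is making such warps bijective for non-convex targets.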

    Development and Monte Carlo validation of a finite element reactor analysis framework

    This study presents the development and Monte Carlo validation of a continuous Galerkin finite element reactor analysis framework. In its current state, the framework acts as an interface between the mesh preparation software GMSH and the sparse linear solvers in MATLAB, for the discretization and approximation of 1-D, 2-D, and 3-D linear partial differential equations. Validity of the framework is assessed through two benchmarking activities: the 2-D IAEA PWR benchmark, and the 2-D Missouri Science and Technology Reactor (MSTR) benchmark proposed within this study. The 2-D IAEA PWR multi-group diffusion benchmark is conducted with the following discretization schemes: linear, quadratic, and cubic triangular elements, and linear and quadratic rectangular elements, at mesh sizes of 10, 5, 2, 1, and 0.5 cm. Convergence to the reference criticality eigenvalue of 1.02985 is observed for all cases. The proposed 2-D MSTR benchmark is prepared by translating an experimentally validated 120w core configuration MCNP model into Serpent 2. Validation of the Serpent 2 model is attained from the comparison of criticality eigenvalues, flux traverses, and two 70-group energy spectra within fuel elements D5 and D9. Then, a two-group 2-D MSTR benchmark of the 120w core configuration is prepared with the spatial homogenization methodology implemented within Serpent 2. Final validation of the framework is assessed from the comparison of criticality eigenvalues and spatial flux solutions of the diffusion and simplified spherical harmonics (SP3) models. The diffusion model resulted in a difference in reactivity of Δρ = -1673.93 pcm, and the SP3 model in a difference of Δρ = -777.60 pcm, with respect to the Serpent 2 criticality eigenvalues --Abstract, page iii
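    The criticality eigenvalue calculation at the heart of such a framework can be sketched in heavily simplified form: a one-group, 1-D slab diffusion problem discretized by finite differences and solved by power iteration (rather than the study's multi-group finite elements). All cross sections and dimensions below are illustrative, not benchmark values:

    ```python
    import numpy as np

    # One-group, 1-D slab criticality:  -D phi'' + Sig_a phi = (1/k) nu_Sig_f phi,
    # with phi = 0 at both ends. Illustrative data, not from the IAEA/MSTR benchmarks.
    D, sig_a, nu_sig_f, L, n = 1.0, 0.07, 0.08, 100.0, 200
    h = L / (n + 1)

    # Finite-difference loss operator A and fission operator F on interior nodes.
    main = 2 * D / h**2 + sig_a
    off = -D / h**2
    A = (np.diag([main] * n) + np.diag([off] * (n - 1), 1)
         + np.diag([off] * (n - 1), -1))
    F = nu_sig_f * np.eye(n)

    # Power iteration on the generalized eigenproblem A phi = (1/k) F phi.
    phi, k = np.ones(n), 1.0
    for _ in range(500):
        src = F @ phi
        phi_new = np.linalg.solve(A, src / k)
        k_new = k * (F @ phi_new).sum() / src.sum()
        converged = abs(k_new - k) < 1e-10
        k, phi = k_new, phi_new
        if converged:
            break

    # Analytic check for this slab: k = nu_Sig_f / (Sig_a + D * (pi/L)**2)
    print(k)
    ```

    Multi-group diffusion and SP3 replace the scalars with coupled group equations, and finite elements replace the difference stencil, but the outer power (source) iteration on k is the same pattern.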