
    Piecewise polynomial monotonic interpolation of 2D gridded data

    A method is presented for interpolating monotone increasing 2D scalar data with a monotone, piecewise cubic, C^1-continuous surface. Monotonicity is a sufficient condition for a function to be free of critical points inside its domain. The standard axial monotonicity for tensor-product surfaces is, however, too restrictive, so we introduce a more relaxed monotonicity constraint. We derive sufficient conditions on the partial derivatives of the interpolating function to ensure its monotonicity, and we then develop two algorithms to construct a monotone C^1 surface composed of cubic triangular Bézier surfaces interpolating a monotone gridded data set. Our method makes it possible to interpolate given topological data, such as minima, maxima and saddle points at the corners of a rectangular domain, without adding spurious extrema inside the function domain. Numerical examples are given to illustrate the performance of the algorithm.
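    The standard axial monotonicity mentioned above is simple to state on gridded data: values must be non-decreasing along both grid directions. A minimal Python sketch of that check, as a point of reference for the relaxed constraint the paper develops (the function name is illustrative, and the paper's own constraint and C^1 Bézier construction are not reproduced here):

```python
import numpy as np

def is_axially_monotone(z: np.ndarray) -> bool:
    """Standard axial monotonicity of gridded values z[i, j]:
    non-decreasing along both grid directions. This is the restrictive
    condition the paper relaxes."""
    return bool(np.all(np.diff(z, axis=0) >= 0) and np.all(np.diff(z, axis=1) >= 0))

# Example: a monotone increasing 4x4 grid, z[i, j] = i + j
z = np.add.outer(np.arange(4), np.arange(4)).astype(float)
print(is_axially_monotone(z))  # True
```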

    Piecewise polynomial Reconstruction of Scalar Fields from Simplified Morse-Smale Complexes

    Morse-Smale (MS) complexes have been proposed to visualize topological features of scalar fields defined on manifold domains. Three main problems have been addressed in past work: (a) efficient computation of the initial combinatorial structure connecting the critical points; (b) simplification of these combinatorial structures; and (c) reconstruction of a scalar field in accordance with the simplified Morse-Smale complex. The present paper addresses the third problem by proposing a novel approach for computing a scalar field coherent with a given simplified MS complex that favours the use of piecewise polynomial functions. Based on techniques borrowed from shape-preserving design in Computer Aided Geometric Design, our method constructs the surface cell by cell using piecewise polynomial curves and surfaces. The benefits and limitations of using polynomials to reconstruct surfaces from topological data are presented in this paper.
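    The critical points (minima, maxima, and saddles) whose connectivity an MS complex encodes can be located, in the simplest discrete setting, by comparing each sample with its neighbours. A minimal sketch under that assumption, restricted to a regular grid and to extrema only (the function name and the 4-neighbour rule are illustrative simplifications, not the paper's method, which works with general simplified MS complexes):

```python
import numpy as np

def grid_extrema(z: np.ndarray):
    """Naive interior classification on a regular grid using the four axis
    neighbours: a sample below all of them is a local minimum, above all of
    them a local maximum. Saddle detection needs the full neighbour ring
    and is omitted from this sketch."""
    minima, maxima = [], []
    for i in range(1, z.shape[0] - 1):
        for j in range(1, z.shape[1] - 1):
            nbrs = [z[i - 1, j], z[i + 1, j], z[i, j - 1], z[i, j + 1]]
            if z[i, j] < min(nbrs):
                minima.append((i, j))
            elif z[i, j] > max(nbrs):
                maxima.append((i, j))
    return minima, maxima

z = np.fromfunction(lambda i, j: np.sin(i / 3.0) * np.cos(j / 3.0), (20, 20))
print(grid_extrema(z))
```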

    A conservative reconstruction scheme for the interpolation of extensive quantities in the Lagrangian particle dispersion model FLEXPART

    Lagrangian particle dispersion models require interpolation of all meteorological input variables to the position in space and time of computational particles. The widely used model FLEXPART uses linear interpolation for this purpose, implying that the discrete input fields contain point values. As this is not the case for precipitation (and other fluxes), which represent cell averages or integrals, a preprocessing scheme is applied that ensures conservation of the integral quantity under the linear interpolation in FLEXPART, at least in the temporal dimension. However, this mass conservation is not ensured per grid cell, and the scheme thus has undesirable properties such as temporal smoothing of the precipitation rates. Therefore, a new reconstruction algorithm was developed in two variants. It introduces additional supporting grid points in each time interval and is to be used with piecewise linear interpolation to reconstruct the precipitation time series in FLEXPART. It fulfils the desired requirements by preserving the integral of precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. The function values of the reconstruction at the sub-grid and boundary grid points constitute the degrees of freedom, which can be prescribed in various ways. With these requirements it was possible to derive a suitable piecewise linear reconstruction. To improve the monotonicity behaviour, two versions of a filter were also developed that form part of the final algorithm. Currently, the algorithm is intended primarily for the temporal dimension. It was shown to significantly improve the reconstruction of hourly precipitation time series from 3-hourly input data. Preliminary considerations for the extension to additional dimensions are included, as are suggestions for a range of possible applications beyond the case of precipitation in a Lagrangian particle model.
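    A minimal sketch of the core idea of inserting one supporting point per interval and fixing its value so that the interval integral of the piecewise linear reconstruction matches the prescribed cell average. The endpoint rule (averaging neighbouring cell means), the function name, and the final clipping step are illustrative assumptions rather than the published FLEXPART scheme; in particular, simple clipping to non-negative values sacrifices exact conservation, which the paper's filters are designed to handle properly:

```python
import numpy as np

def conservative_pw_linear(cell_means, h=1.0):
    """Piecewise linear reconstruction from interval-average data with one
    extra midpoint node per interval.

    Boundary values are set to averages of neighbouring cell means (an
    illustrative choice); each midpoint value is then chosen so that the
    trapezoidal integral over its interval reproduces the cell mean:
    m_i = 2*a_i - (f_i + f_{i+1}) / 2.
    """
    a = np.asarray(cell_means, dtype=float)
    n = a.size
    # values at interval boundaries, shared between adjacent intervals
    f = np.empty(n + 1)
    f[1:-1] = 0.5 * (a[:-1] + a[1:])
    f[0], f[-1] = a[0], a[-1]
    # midpoint values enforcing the prescribed interval means
    m = 2.0 * a - 0.5 * (f[:-1] + f[1:])
    # interleave boundary and midpoint nodes: t0, t0+h/2, t1, t1+h/2, ...
    t = np.empty(2 * n + 1)
    y = np.empty(2 * n + 1)
    t[::2], t[1::2] = np.arange(n + 1) * h, (np.arange(n) + 0.5) * h
    y[::2], y[1::2] = f, m
    # naive non-negativity fix; note that this step breaks exact conservation
    return t, np.maximum(y, 0.0)

t, y = conservative_pw_linear([0.0, 3.0, 1.0, 0.0], h=3.0)  # e.g. 3-hourly cell means
print(np.round(y, 2))
```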

    Parametric Interpolation to Scattered Data

    Two schemes of parametric interpolation are described: a global scheme to interpolate general scattered data and a local positivity-preserving scheme to interpolate positive scattered data.

    Advances in radial and spherical basis function interpolation

    The radial basis function method is a widely used technique for interpolation of scattered data. The method is meshfree, easy to implement independently of the number of dimensions, and for certain types of basis functions it provides spectral accuracy. All these properties also apply to the spherical basis function method, but the class of applicable basis functions, positive definite functions on the sphere, is not as well studied and understood as the radial basis functions for Euclidean space. The aim of this thesis is mainly to introduce new techniques for the construction of Euclidean basis functions and to establish new criteria for positive definiteness of functions on spheres. We study multiply and completely monotone functions, which are important for radial basis function interpolation because their monotonicity properties are in some cases necessary and in some cases sufficient for the positive definiteness of a function. We extend many results originally stated for completely monotone functions to the larger class of multiply monotone functions and use them to derive new radial basis functions. Further, we study the connection between monotonicity properties and positive definiteness of spherical basis functions. In the process, several new sufficient and some new necessary conditions for the positive definiteness of spherical radial functions are proven. We also describe different techniques for constructing new radial and spherical basis functions, for example shifts. For the shifted versions in Euclidean space we prove conditions for positive definiteness, compute their Fourier transforms and give integral representations. Furthermore, we prove that the cosine transforms of multiply monotone functions are positive definite under some mild extra conditions. Additionally, a new class of radial basis functions, derived as the Fourier transforms of the generalised Gaussian φ(t) = exp(−t^β), is investigated. We conclude with a comparison of the spherical basis functions derived in this thesis with well-known spherical basis functions. For this numerical test, a set of test functions as well as recordings of electroencephalographic data are used to evaluate the performance of the different basis functions.
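    As background to the scattered-data setting described above, a minimal sketch of classical radial basis function interpolation, using a kernel of the generalised Gaussian form exp(−(εr)^β) mentioned in the abstract (with β = 2 giving the standard Gaussian; the function name, parameter values, and this direct use of the kernel are illustrative assumptions, since the thesis studies the Fourier transforms of the generalised Gaussian rather than using it directly):

```python
import numpy as np

def rbf_interpolate(centers, values, query, beta=2.0, eps=1.0):
    """Classical radial basis function interpolation of scattered data.

    Kernel: phi(r) = exp(-(eps * r)**beta). For beta = 2 this is the usual
    Gaussian, which is positive definite in every dimension; other values of
    beta are shown only to echo the generalised Gaussian exp(-t**beta) and
    are not positive definite in general.
    """
    def kernel(r):
        return np.exp(-(eps * r) ** beta)

    # pairwise distances: centre-centre for the linear system, query-centre to evaluate
    d_cc = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    d_qc = np.linalg.norm(query[:, None, :] - centers[None, :, :], axis=-1)
    coeffs = np.linalg.solve(kernel(d_cc), values)  # expansion coefficients
    return kernel(d_qc) @ coeffs

rng = np.random.default_rng(0)
centers = rng.random((30, 2))                        # scattered sites in the unit square
values = np.sin(4 * centers[:, 0]) * centers[:, 1]   # samples of a test function
query = rng.random((5, 2))
print(rbf_interpolate(centers, values, query))
```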

    Bayesian seismic tomography using normalizing flows

    A Parametrization-Based Surface Reconstruction System for Triangular Mesh Simplification with Application to Large Scale Scenes

    The laser scanner is nowadays widely used to capture the geometry of art, animation maquettes, or large architectural, industrial, and landform models. It thus poses specific problems depending on the model scale. This thesis provides a solution for the simplification of triangulated data and for surface reconstruction of large data sets where feature edges provide an obvious segmentation structure. It also explores a new method for model segmentation, with the goal of applying multiresolution techniques to data sets characterized by curvy areas and a lack of clear demarcation features. The preliminary stage of surface segmentation, which takes as input single or multiple scan data files, generates surface patches which are processed independently. The surface components are mapped onto a two-dimensional domain with boundary constraints, using a novel parametrization weight coefficient. This stage generates valid parameter-domain points, which can be fed as arguments to parametric modeling functions or surface approximation schemes. On this domain, our approach explores two types of remeshing. First, we generate points in a regular grid pattern, achieving multiresolution through a flexible grid step, which is nevertheless designed to produce a globally uniform resampling aspect. In this case, for reconstruction, we attempt to solve the open problem of border reconciliation across adjacent domains by retriangulating the border gap between the grid and the fixed irregular border. Alternatively, we straighten the domain borders in the parameter domain and coarsely triangulate the resulting simplified polygons, resampling the base domain triangles in a 1-4 subdivision pattern and achieving multiresolution from the number of subdivision steps. For mesh reconstruction, we use a linear interpolation method based on the original mesh triangles as control points on local planes, using a saved triangle correspondence between the original mesh and the parametric domain. We also use a region-wide approximation method, applied to the parameter grid points, which first generates data-trained control points and then uses them to obtain the reconstruction values at the resamples. In the grid resampling scheme, the reassembly of the segmented, sequentially processed data sets is seamless owing to the border constraints. In the subdivision scheme, we align adjacent border fragments in the parameter space and use a region-to-fragment map to achieve the same border reconstruction across two neighboring components. We successfully process data sets of up to 1,000,000 points in one pass of our program and are capable of assembling larger scenes from sequential runs. Our program consists of a single run, without intermediate storage. When we process large input data files, we fragment the input using a nested application of our segmentation algorithm to reduce the size of the input scenes, and our pipeline reassembles the reconstruction output from multiple data files into a unique view.
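    The abstract mentions resampling the base-domain triangles in a 1-4 subdivision pattern to obtain multiresolution. A minimal sketch of that pattern in the 2D parameter domain (the function name and the standalone setting are illustrative; the thesis applies the pattern to the straightened, coarsely triangulated parameter domain before mapping the samples back to the surface):

```python
import numpy as np

def subdivide_1_to_4(triangle):
    """Split one triangle (3x2 array of parameter-domain vertices) into four
    by connecting the edge midpoints: three corner triangles plus the central
    one. Repeating the split k times yields 4**k triangles."""
    a, b, c = triangle
    ab, bc, ca = (a + b) / 2, (b + c) / 2, (c + a) / 2
    return [np.array(t) for t in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca))]

base = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
level1 = subdivide_1_to_4(base)
level2 = [t for tri in level1 for t in subdivide_1_to_4(tri)]
print(len(level1), len(level2))  # 4 16
```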