
    Monotonicity preserving approximation of multivariate scattered data

    Full text link
    This paper describes a new method for monotone interpolation and smoothing of multivariate scattered data. It is based on the assumption that the function to be approximated is Lipschitz continuous. The method provides the optimal approximation in the worst-case scenario, together with tight error bounds. Smoothing of noisy data subject to monotonicity constraints is converted into a quadratic programming problem. Estimation of the unknown Lipschitz constant from the data by sample splitting and cross-validation is described, and an extension of the method to locally Lipschitz functions is presented.
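The worst-case-optimal construction described above can be illustrated in one dimension: under a Lipschitz assumption with known constant L, the tightest upper and lower bounds compatible with the samples are cones anchored at the data points, and their midpoint is the optimal interpolant in the worst case. A minimal sketch (1-D only, monotonicity constraints omitted; function and variable names are illustrative, not the paper's):

```python
def lipschitz_interpolate(data, L, x):
    """Worst-case-optimal interpolant of scattered samples under a
    Lipschitz-continuity assumption: the midpoint of the tightest
    upper and lower bounds compatible with the data."""
    upper = min(y + L * abs(x - xi) for xi, y in data)  # tightest upper cone
    lower = max(y - L * abs(x - xi) for xi, y in data)  # tightest lower cone
    return 0.5 * (upper + lower)

data = [(0.0, 0.0), (1.0, 2.0), (2.0, 3.0)]
print(lipschitz_interpolate(data, 2.0, 0.5))  # → 1.0
```

At the data points themselves the upper and lower bounds coincide, so the construction interpolates exactly.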

    Weighted Quasi Interpolant Spline Approximations: Properties and Applications

    Get PDF
    Continuous representations are fundamental for modeling sampled data and for performing computations and numerical simulations directly on the model or its elements. To approximate point clouds effectively and efficiently, we propose the Weighted Quasi Interpolant Spline Approximation method (wQISA). We provide global and local bounds for the method and discuss how it preserves the shape properties of the classical quasi-interpolation scheme. This approach is particularly useful when the data noise can be represented as a probability distribution: from the point of view of nonparametric regression, the wQISA estimator is robust to random perturbations such as noise and outliers. Finally, we show the effectiveness of the method with several numerical simulations on real data, including curve fitting on images, surface approximation, and simulation of rainfall precipitation.
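The coefficient rule behind a weighted quasi-interpolant can be sketched in one dimension: each spline coefficient is taken as a weighted mean of the samples near its abscissa. This is a drastic simplification of wQISA (the paper's weight functions, spline basis, and bounds are not reproduced), with all names assumed:

```python
def weighted_coefficient(points, t, h):
    """One quasi-interpolant coefficient at abscissa t, computed as a
    weighted mean of nearby samples using hat-function weights of
    support width h (a hypothetical simplification of wQISA)."""
    num = den = 0.0
    for x, y in points:
        w = max(0.0, 1.0 - abs(x - t) / h)  # compactly supported weight
        num += w * y
        den += w
    return num / den if den > 0.0 else 0.0

pts = [(0.0, 0.0), (0.25, 0.25), (0.5, 0.5), (0.75, 0.75), (1.0, 1.0)]
print(weighted_coefficient(pts, 0.5, 0.3))  # weighted mean near t = 0.5
```

Because each coefficient is a local average rather than an exact fit, isolated perturbations in the data are damped, which is the intuition behind the robustness claim above.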

    Hägusad teist liiki integraalvõrrandid (Fuzzy integral equations of the second kind)

    Get PDF
    In this thesis we investigate fuzzy integral equations of the second kind. These equations contain fuzzy functions, i.e. functions whose values are fuzzy numbers. We prove a regularity result for the solution of fuzzy Volterra integral equations with smooth kernels; if the kernel changes sign, the solution is in general not smooth. For solving these equations we propose collocation methods with triangular (piecewise linear) and rectangular (piecewise constant) basis functions, and use the regularity result to estimate their order of convergence. We also investigate fuzzy Volterra integral equations with weakly singular kernels: the existence, regularity, and fuzziness of the exact solution are studied, collocation methods on discontinuous piecewise polynomial spaces are proposed, a convergence analysis is given, and the fuzziness of the approximate solution is investigated. Both the analysis and the numerical experiments show that graded meshes give a better convergence rate than uniform meshes for these problems. Finally, we propose a new numerical method for solving fuzzy Fredholm integral equations of the second kind, based on approximating all functions involved by Chebyshev polynomials. We analyse the existence and uniqueness of both the exact and the approximate fuzzy solution, and prove the convergence and fuzziness of the approximate solution.
    https://www.ester.ee/record=b539569
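The graded meshes mentioned in the abstract cluster collocation points near the endpoint where the weakly singular kernel limits the regularity of the solution. A common construction (notation assumed here, not taken from the thesis) is t_j = T (j/N)^r for a grading exponent r ≥ 1:

```python
def graded_mesh(T, N, r):
    """Graded mesh on [0, T]: points cluster near 0 as the grading
    exponent r >= 1 grows; r = 1 recovers the uniform mesh."""
    return [T * (j / N) ** r for j in range(N + 1)]

print(graded_mesh(1.0, 4, 2))  # clusters points near 0
```

With r chosen according to the strength of the singularity, piecewise polynomial collocation on such a mesh typically recovers the convergence order that a uniform mesh loses.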

    Video Data Compression by Progressive Iterative Approximation

    Get PDF
    In the present paper, B-spline curves are used to reduce the entropy of video data. We consider the color or luminance variations of a spatial position across a series of frames as input data points in the Euclidean space R or R^3. The progressive and iterative approximation (PIA) method is a direct and intuitive way of generating a sequence of curves with progressively higher fitting accuracy. The video data points are approximated using progressive and iterative approximation for least-squares (LSPIA) fitting. Lossless video compression is achieved by storing the B-spline control points (CPs) and the difference between the fitted and the original video data. The proposed method is applied to two classes of video sequences, synthetically produced and naturally recorded, and reduces the entropy of both; the reduction is, however, larger for synthetically created sequences than for naturally recorded ones. A comparative analysis of experiments on a variety of video sequences shows that the entropy of the output video data is much lower than that of the input video data.
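The LSPIA update mentioned above adjusts all control values simultaneously by pushing the fitting residual back through the basis (collocation) matrix. A toy sketch with plain lists standing in for the B-spline machinery (the matrix and step size are illustrative; convergence requires 0 < mu < 2/lambda_max(A^T A)):

```python
def lspia(A, d, mu, iters):
    """Least-squares progressive-iterative approximation: repeatedly
    update control values c by the residual projected through the
    collocation matrix A:  c <- c + mu * A^T (d - A c)."""
    m, n = len(A), len(A[0])
    c = [0.0] * n
    for _ in range(iters):
        # residual of the current fit at every data point
        r = [d[i] - sum(A[i][j] * c[j] for j in range(n)) for i in range(m)]
        # distribute the residual back onto each control value
        for j in range(n):
            c[j] += mu * sum(A[i][j] * r[i] for i in range(m))
    return c

A = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]  # 3 samples, 2 control values
d = [1.0, 2.0, 1.5]                       # consistent data: exact fit c = [1, 2]
c = lspia(A, d, mu=0.5, iters=200)
```

Each iteration needs only matrix-vector products, which is why PIA-style fitting scales well to long frame sequences; the residual that remains after convergence is exactly the difference data the compression scheme stores.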

    Computerized Analysis of Magnetic Resonance Images to Study Cerebral Anatomy in Developing Neonates

    Get PDF
    The study of cerebral anatomy in developing neonates is of great importance for understanding brain development during the early period of life. This dissertation therefore focuses on three challenges in the modelling of cerebral anatomy in neonates during brain development. The methods that have been developed all use Magnetic Resonance Images (MRI) as source data. To facilitate the study of vascular development in the neonatal period, a set of image analysis algorithms is developed to automatically extract and model cerebral vessel trees. The whole process consists of cerebral vessel tracking from automatically placed seed points, vessel tree generation, and vasculature registration and matching. These algorithms have been tested on clinical Time-of-Flight (TOF) MR angiographic datasets. To facilitate the study of the neonatal cortex, a complete cerebral cortex segmentation and reconstruction pipeline has been developed. Segmentation of the neonatal cortex is not handled effectively by existing algorithms designed for the adult brain, because the contrast between grey and white matter is reversed; this causes pixels containing tissue mixtures to be mislabelled by conventional methods. The neonatal cortical segmentation method that has been developed is based on a novel expectation-maximization (EM) method with explicit correction for mislabelled partial volume voxels. Based on the resulting cortical segmentation, an implicit surface evolution technique is adopted for the reconstruction of the cortex in neonates. The performance of the method is investigated through a detailed landmark study. To facilitate the study of cortical development, a cortical surface registration algorithm for aligning cortical surfaces is developed. The method first inflates extracted cortical surfaces and then performs a non-rigid surface registration using free-form deformations (FFDs) to remove residual misalignment. Validation experiments using data labelled by an expert observer demonstrate that the method can capture local changes and follow the growth of specific sulci.
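The EM tissue-labelling step can be illustrated with a two-class 1-D Gaussian mixture on voxel intensities; the dissertation adds explicit partial-volume correction, which this sketch (with assumed initialisation choices) omits:

```python
import math

def em_two_class(x, iters=50):
    """Two-class 1-D Gaussian-mixture EM on intensities x; returns the
    two class means. A minimal stand-in for the tissue-labelling core
    (no partial-volume model, no spatial prior)."""
    def g(v, m, s):  # Gaussian density
        return math.exp(-0.5 * ((v - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    m1, m2 = min(x), max(x)                # crude initial class means
    s1 = s2 = (m2 - m1) / 4 or 1.0         # crude initial spreads
    p = 0.5                                # mixing weight of class 1
    for _ in range(iters):
        # E-step: responsibility of class 1 for each intensity
        r = [p * g(v, m1, s1) / (p * g(v, m1, s1) + (1 - p) * g(v, m2, s2))
             for v in x]
        # M-step: re-estimate means, spreads, and the mixing weight
        n1 = sum(r); n2 = len(x) - n1
        m1 = sum(ri * v for ri, v in zip(r, x)) / n1
        m2 = sum((1 - ri) * v for ri, v in zip(r, x)) / n2
        s1 = math.sqrt(sum(ri * (v - m1) ** 2 for ri, v in zip(r, x)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - ri) * (v - m2) ** 2 for ri, v in zip(r, x)) / n2) or 1e-6
        p = n1 / len(x)
    return m1, m2
```

Voxels whose responsibilities sit near 0.5 are exactly the mixed-tissue voxels that plain EM mislabels in neonatal scans, which motivates the explicit partial-volume correction described above.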

    Designing of objects using smooth cubic splines

    Get PDF

    Feasible Form Parameter Design of Complex Ship Hull Form Geometry

    Get PDF
    This thesis introduces a new methodology for robust form parameter design of complex hull form geometry via constraint programming, automatic differentiation, interval arithmetic, and truncated hierarchical B-splines. To date, there has been no clearly stated methodology for assuring consistency of general (equality and inequality) constraints across an entire geometric form parameter ship hull design space. In contrast, the method given here can be used to produce a guaranteed narrowing of the design space, such that infeasible portions are eliminated. Furthermore, we can guarantee that any set of form parameters generated by our method will be self-consistent; it is for this reason that we use the title feasible form parameter design. In form parameter design, a design space is represented by a tuple of design parameters, one extending over each design-space dimension. In this representation, a single feasible design is a consistent set of real-valued parameters, one for every component of the design space tuple. Using the methodology given here, we pick out designs consisting of consistent parameters, narrowed to any desired precision up to that of the machine, even for equality constraints. Furthermore, the method is developed to enable the generation of complex hull forms using an extension of the basic rules idea that allows automated generation of rules networks, plus the use of truncated hierarchical B-splines, a wavelet-adaptive extension of standard B-splines and hierarchical B-splines. The adaptive resolution methods are employed in order to give an automated program the freedom to generate complex B-spline representations of the geometry in a robust manner across multiple levels of detail. Thus two complementary objectives are pursued: ensuring feasible starting sets of form parameters, and enabling the generation of complex hull form geometry.
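The guaranteed narrowing described above rests on interval constraint propagation: each constraint acts as a contractor that shrinks the variable intervals without discarding any feasible point. A minimal sketch for a single equality constraint (the representation and names are illustrative, not the thesis's implementation):

```python
def contract_sum(x, y, c):
    """Interval contractor for the equality constraint x + y = c.
    Intervals are (lo, hi) pairs; each variable's interval is narrowed
    using the others, never excluding a feasible point."""
    xl, xh = x
    yl, yh = y
    cl, ch = c
    # x = c - y and y = c - x, intersected with the current boxes
    nx = (max(xl, cl - yh), min(xh, ch - yl))
    ny = (max(yl, cl - xh), min(yh, ch - xl))
    return nx, ny

# x in [0, 10], y in [0, 10]; the constraint x + y = 4 narrows both
print(contract_sum((0.0, 10.0), (0.0, 10.0), (4.0, 4.0)))
# → ((0.0, 4.0), (0.0, 4.0))
```

Iterating such contractors over a network of constraints until no interval shrinks further yields the self-consistent, machine-precision narrowing of the form parameter space claimed above; an empty interval signals an infeasible region that can be eliminated outright.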