1,192 research outputs found

    Stability of subdivision schemes

    The stability of stationary interpolatory subdivision schemes for univariate data is investigated. If the subdivision scheme is linear, its stability follows from the convergence of the scheme, but for nonlinear subdivision schemes one needs stronger conditions and the stability analysis is more involved. Apart from the fact that it is natural to demand that subdivision schemes be stable, stability also has a theoretical advantage: it is shown that the approximation properties of stable schemes can be determined very easily.
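    For the linear case mentioned above, the following minimal Python sketch performs one refinement step of the classical 4-point Deslauriers-Dubuc interpolatory scheme. The function name refine_dd4 and the boundary handling (stencils without full support are simply dropped) are illustrative assumptions, not part of the paper.

```python
import numpy as np

def refine_dd4(f):
    """One step of the linear 4-point Deslauriers-Dubuc scheme (interior only).

    Old samples are kept unchanged (the scheme is interpolatory); the new
    value inserted between f[i] and f[i+1] is
        -1/16*f[i-1] + 9/16*f[i] + 9/16*f[i+1] - 1/16*f[i+2].
    Boundary stencils are dropped here for brevity.
    """
    f = np.asarray(f, dtype=float)
    mid = (-f[:-3] + 9.0 * f[1:-2] + 9.0 * f[2:-1] - f[3:]) / 16.0
    out = np.empty(2 * len(mid) + 1)
    out[0::2] = f[1:-1]   # retained coarse samples
    out[1::2] = mid       # inserted midpoint values
    return out

# Repeated refinement converges to a smooth limit function through the data.
data = np.sin(np.linspace(0.0, np.pi, 8))
for _ in range(3):
    data = refine_dd4(data)
```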

    Learning Theory and Approximation

    The main goal of this workshop – the third of its kind at the MFO – was to blend mathematical results from statistical learning theory and approximation theory, to strengthen both disciplines, and to exploit synergies in current research questions. Learning theory aims at modelling unknown functional relations and data structures from samples in an automatic manner. Approximation theory is closely connected to, and naturally used in, the further development of learning theory, in particular for the exploration of new, useful algorithms and for the theoretical understanding of existing methods. Conversely, the study of learning theory also gives rise to interesting theoretical problems for approximation theory, such as the approximation and sparse representation of functions or the construction of rich reproducing kernel Hilbert spaces on general metric spaces. The workshop concentrated on the following recent topics: pitchfork bifurcations of dynamical systems arising from the mathematical foundations of cell development; regularized kernel-based learning in the Big Data setting; deep learning; convergence rates of learning and online learning algorithms; numerical refinement algorithms for learning; and statistical robustness of regularized kernel-based learning.

    Optimising Spatial and Tonal Data for PDE-based Inpainting

    Some recent methods for lossy signal and image compression store only a few selected pixels and fill in the missing structures by inpainting with a partial differential equation (PDE). Suitable operators include the Laplacian, the biharmonic operator, and edge-enhancing anisotropic diffusion (EED). The quality of such approaches depends substantially on the selection of the data that is kept. Optimising this data in the domain and codomain gives rise to challenging mathematical problems that shall be addressed in our work. In the 1D case, we prove results that provide insights into the difficulty of this problem, and we give evidence that splitting the task into spatial and tonal (i.e. function value) optimisation hardly deteriorates the results. In the 2D setting, we present generic algorithms that achieve a high reconstruction quality even if the specified data is very sparse. To optimise the spatial data, we use a probabilistic sparsification, followed by a nonlocal pixel exchange that avoids getting trapped in bad local optima. After this spatial optimisation we perform a tonal optimisation that modifies the function values in order to reduce the global reconstruction error. For homogeneous diffusion inpainting, this comes down to a least squares problem for which we prove that it has a unique solution. We demonstrate that it can be found efficiently with a gradient descent approach that is accelerated with fast explicit diffusion (FED) cycles. Our framework allows the desired density of the inpainting mask to be specified a priori. Moreover, it is more generic than other data optimisation approaches for the sparse inpainting problem, since it can also be extended to nonlinear inpainting operators such as EED. This is exploited to achieve reconstructions with state-of-the-art quality. We also give an extensive literature survey on PDE-based image compression methods.
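    To make the reconstruction step concrete: for homogeneous diffusion inpainting, filling in the missing pixels amounts to solving a linear system in which the stored pixels are fixed and the discrete Laplace equation is enforced elsewhere. The following is a minimal 1D sketch under assumed reflecting boundary conditions; it uses a direct solve for illustration only, not the FED-accelerated solver or the tonal optimisation described in the paper.

```python
import numpy as np

def inpaint_diffusion_1d(f, mask):
    """Homogeneous diffusion inpainting of a 1D signal (minimal sketch).

    At mask positions the value f[i] is prescribed; elsewhere the discrete
    Laplace equation u[i-1] - 2*u[i] + u[i+1] = 0 is enforced (with
    reflecting boundary conditions), yielding a small linear system.
    """
    n = len(f)
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n):
        if mask[i]:
            A[i, i] = 1.0            # keep the stored pixel value
            b[i] = f[i]
        else:
            left, right = max(i - 1, 0), min(i + 1, n - 1)
            A[i, left] += 1.0        # discrete Laplacian row
            A[i, right] += 1.0
            A[i, i] -= 2.0
    return np.linalg.solve(A, b)

# Keep only 4 of 9 samples and reconstruct the rest by diffusion.
signal = np.array([0.0, 1.0, 4.0, 9.0, 16.0, 25.0, 36.0, 49.0, 64.0])
keep = np.array([1, 0, 0, 1, 0, 0, 1, 0, 1], dtype=bool)
reconstruction = inpaint_diffusion_1d(signal, keep)
```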

    Interpolating m-refinable functions with compact support: The second generation class

    We present an algorithm for the construction of a new class of compactly supported interpolating refinable functions, which we call the second-generation class since, contrary to the existing class, it is associated with subdivision schemes whose even-symmetric mask does not contain the submask 0,…,0,1,0,…,0. As application examples of the proposed algorithm, we present interpolating 4-refinable functions generated by parameter-dependent, even-symmetric quaternary schemes not previously considered in the literature.
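    By way of contrast with the classical ("first-generation") setting, the sketch below performs one refinement step of the simplest quaternary (arity-4) interpolatory scheme, piecewise-linear insertion: its mask equals 1 at index 0 and 0 at every other multiple of 4, so the old points are kept unchanged. This is exactly the submask structure that the second-generation class described above dispenses with; the paper's parameter-dependent schemes are not reproduced here.

```python
import numpy as np

def refine_quaternary_linear(f):
    """One step of the simplest quaternary (arity-4) interpolatory scheme.

    Each old sample is kept, and three new samples are inserted between
    f[i] and f[i+1] by linear interpolation at parameters 1/4, 1/2, 3/4.
    """
    f = np.asarray(f, dtype=float)
    out = np.empty(4 * (len(f) - 1) + 1)
    out[0::4] = f                                   # retained old points
    for k, t in enumerate((0.25, 0.5, 0.75), start=1):
        out[k::4] = (1.0 - t) * f[:-1] + t * f[1:]  # inserted points
    return out
```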

    On four-point penalized Lagrange subdivision schemes

    This paper is devoted to the definition and analysis of new subdivision schemes called penalized Lagrange schemes. Their construction is based on an original reformulation of the construction of the coefficients of the mask associated with the classical 4-point Lagrange interpolatory subdivision scheme: these coefficients can be formally interpreted as the solution of a linear system similar to the one resulting from the constrained minimization problem in Kriging theory, which is commonly used for reconstruction in geostatistical studies. In such a framework, the introduction of a so-called error variance into the formulation can be viewed as a penalization of the oscillations of the coefficients. Following this idea, we propose to penalize the 4-point Lagrange system. This penalization transforms the interpolatory schemes into approximating ones with specific properties suitable for the subdivision of locally noisy or strongly oscillating data. According to a so-called penalization vector, a family of schemes can be generated. A full theoretical study is first performed to analyze this new type of non-stationary subdivision scheme. Then, in the framework of a position-dependent penalization vector, several numerical tests are provided to demonstrate the efficiency of these schemes compared to standard approaches.
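    The linear-system reading of the 4-point coefficients can be illustrated as follows: the classical mask values (-1/16, 9/16, 9/16, -1/16) are the unique weights that reproduce cubic polynomials at the midpoint of the nodes -1, 0, 1, 2, i.e. the solution of a small 4x4 system. The eps-penalized variant below is only a ridge-type stand-in for the paper's error-variance penalization (the exact penalized system and the penalization vector are not reproduced), but it conveys how a penalty damps the oscillating weights and turns the interpolatory rule into an approximating one.

```python
import numpy as np

def lagrange4_midpoint_weights(eps=0.0):
    """Weights used to insert a new value between f[i] and f[i+1].

    For eps = 0 the weights solve the 4x4 system enforcing exact
    reproduction of cubic polynomials at the midpoint t = 1/2 of the nodes
    x = -1, 0, 1, 2, recovering the classical 4-point Lagrange mask.
    For eps > 0 a simple ridge-type penalty replaces the exact solve
    (an assumed stand-in for an error-variance penalization), which damps
    the oscillating weights and gives an approximating rule.
    """
    nodes = np.array([-1.0, 0.0, 1.0, 2.0])
    V = np.vander(nodes, 4, increasing=True).T    # rows: x^0, x^1, x^2, x^3
    rhs = np.array([0.5 ** k for k in range(4)])  # midpoint monomial values
    if eps == 0.0:
        return np.linalg.solve(V, rhs)
    return np.linalg.solve(V.T @ V + eps * np.eye(4), V.T @ rhs)

print(lagrange4_midpoint_weights())         # approx. [-0.0625, 0.5625, 0.5625, -0.0625]
print(lagrange4_midpoint_weights(eps=0.1))  # smoother, non-interpolatory weights
```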

    A subdivision scheme for continuous-scale B-splines and affine invariant progressive smoothing

    Caption title. Includes bibliographical references (p. 19-22). Partially supported by the Rothschild Foundation-Yad Hanadiv. Partially supported by the Army Research Office (DAAL03-92-G-0115). By Guillermo Sapiro, Albert Cohen, and Alfred M. Bruckstein.