8 research outputs found

    Variational image regularization with Euler's elastica using a discrete gradient scheme

    This paper concerns an optimization algorithm for unconstrained non-convex problems in which the objective function has sparse connections between the unknowns. The algorithm is based on applying a dissipation-preserving numerical integrator, the Itoh–Abe discrete gradient scheme, to the gradient flow of the objective function, guaranteeing energy decrease regardless of step size. We introduce the algorithm, prove a convergence rate estimate for non-convex problems with Lipschitz continuous gradients, and show an improved convergence rate when the objective function has sparse connections between unknowns. The algorithm is presented in serial and parallel versions. Numerical tests demonstrate its use in Euler's elastica regularized imaging problems, verify its convergence rate, and compare its execution time with that of the iPiano, gradient descent, and Heavy-ball algorithms.
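    A minimal sketch of a coordinate-wise Itoh–Abe discrete gradient sweep of the kind the abstract describes, assuming only a generic objective V; the fixed-point solver for the scalar implicit equation, the step size tau, and the tolerances are illustrative choices, not the paper's implementation.

```python
# Sketch of an Itoh-Abe discrete gradient sweep (illustrative, not the paper's code).
# Each coordinate i is updated by solving the scalar implicit equation
#   (y_i - x_i)^2 = -tau * ( V(y_1..y_i, x_{i+1}..x_n) - V(y_1..y_{i-1}, x_i..x_n) ),
# so that the exact scheme decreases V for any step size tau > 0.
import numpy as np


def itoh_abe_sweep(V, x, tau=1.0, inner_iters=10, h=1e-7):
    """One sequential sweep of the Itoh-Abe discrete gradient scheme."""
    y = x.astype(float).copy()
    for i in range(len(y)):
        base = V(y)                      # coordinates 0..i-1 already updated
        # explicit finite-difference guess for the implicit update
        z = y.copy()
        z[i] += h
        yi = y[i] - tau * (V(z) - base) / h
        # fixed-point refinement of the implicit discrete gradient equation
        for _ in range(inner_iters):
            if abs(yi - y[i]) < 1e-12:   # (near-)stationary coordinate: skip
                break
            z[i] = yi
            yi = y[i] - tau * (V(z) - base) / (yi - y[i])
        y[i] = yi
    return y


if __name__ == "__main__":
    # Simple smooth test objective; the exact scheme decreases V at every sweep.
    V = lambda u: 0.5 * np.sum(u**2) + 0.1 * np.sum(np.cos(3 * u))
    x = np.array([2.0, -1.5, 0.7])
    for _ in range(20):
        x = itoh_abe_sweep(V, x, tau=0.5)
    print(x, V(x))
```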

    A Fast Method to Segment Images with Additive Intensity Value

    Master's thesis (Master of Science)

    A tight frame algorithm in image inpainting.

    Cheng, Kei Tsi Daniel. Thesis (M.Phil.), Chinese University of Hong Kong, 2007. Includes bibliographical references (leaves 45-49). Abstracts in English and Chinese.
    Contents: Abstract; Acknowledgement; 1. Introduction; 2. Background Knowledge (2.1 Image Restoration using Total Variation Norm; 2.2 An Example of Tight Frame System; 2.3 Sparse and Compressed Representation; 2.4 Existence of Minimizer in Convex Analysis); 3. Tight Frame Based Minimization (3.1 Tight Frames; 3.2 Minimization Problems and Algorithms; 3.3 Other Minimization Problems); 4. Algorithm from Minimization Problem 3; 5. Algorithm from Minimization Problem 4; 6. Convergence of Algorithm 2 (6.1 Inner Iteration; 6.2 Outer Iteration; 6.2.1 Existence of Minimizer); 7. Numerical Results; 8. Conclusion
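    For context, a minimal 1-D sketch of a tight-frame inpainting iteration of the general type this thesis studies: alternate a data projection with soft-thresholding of framelet coefficients. The two-tap Haar-type filters, the threshold lam, and the iteration count are illustrative assumptions, not the thesis's Algorithm 2.

```python
# Generic tight-frame inpainting sketch (illustrative). The two filters below
# form a tight frame under circular convolution (synthesis(analysis(x)) == x).
import numpy as np

FILTERS = [np.array([0.5, 0.5]), np.array([0.5, -0.5])]   # low-pass, high-pass


def _fft_filters(n):
    return [np.fft.fft(h, n) for h in FILTERS]


def analysis(x):
    """Framelet coefficients: circular convolution with each filter."""
    Fx = np.fft.fft(x)
    return [np.fft.ifft(Fx * Fh).real for Fh in _fft_filters(len(x))]


def synthesis(coeffs):
    """Adjoint transform; reconstructs x exactly from analysis(x)."""
    n = len(coeffs[0])
    return sum(np.fft.ifft(np.fft.fft(c) * np.conj(Fh)).real
               for c, Fh in zip(coeffs, _fft_filters(n)))


def soft(c, lam):
    """Soft-thresholding of the framelet coefficients."""
    return np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)


def inpaint(f, mask, lam=0.02, iters=200):
    """Fill the samples where mask == 0, keeping observed samples fixed."""
    x = f * mask
    for _ in range(iters):
        y = mask * f + (1.0 - mask) * x          # enforce the observed data
        low, high = analysis(y)
        x = synthesis([low, soft(high, lam)])    # threshold the high-pass band only
    return mask * f + (1.0 - mask) * x


if __name__ == "__main__":
    t = np.linspace(0, 1, 128, endpoint=False)
    signal = np.sin(2 * np.pi * 3 * t)
    mask = (np.random.default_rng(0).random(128) > 0.3).astype(float)  # ~30% missing
    restored = inpaint(signal, mask)
    print(np.max(np.abs(restored - signal)[mask == 0]))   # error on missing samples
```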

    Generalized averaged Gaussian quadrature and applications

    A simple numerical method for constructing the optimal generalized averaged Gaussian quadrature formulas is presented. These formulas exist in many cases in which real positive Gauss-Kronrod formulas do not exist, and can be used as an adequate alternative to estimate the error of a Gaussian rule. We also investigate the conditions under which the optimal averaged Gaussian quadrature formulas and their truncated variants are internal.
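    To illustrate the underlying idea, a minimal sketch of Laurie's averaged Gaussian rule for the Legendre weight on [-1, 1], built from the Jacobi matrix via Golub-Welsch; the optimal generalized averaged rules discussed in the abstract refine this basic construction, and the integrand and degree below are illustrative choices.

```python
# Averaged Gaussian rule sketch: average the n-point Gauss rule with Laurie's
# (n+1)-point anti-Gauss rule, obtained by doubling the last recurrence
# coefficient beta_n in the Jacobi matrix. The difference between the averaged
# value and the Gauss value estimates the error of the Gauss rule.
import numpy as np


def legendre_recurrence(m):
    """First m recurrence coefficients (alpha_k, beta_k), k = 0..m-1, of the
    monic Legendre polynomials; beta_0 holds the total mass of the weight."""
    alpha = np.zeros(m)
    beta = np.empty(m)
    beta[0] = 2.0
    k = np.arange(1, m)
    beta[1:] = k**2 / (4.0 * k**2 - 1.0)
    return alpha, beta


def gauss_from_jacobi(alpha, beta):
    """Golub-Welsch: nodes and weights from a symmetric tridiagonal Jacobi matrix."""
    J = np.diag(alpha) + np.diag(np.sqrt(beta[1:]), 1) + np.diag(np.sqrt(beta[1:]), -1)
    nodes, vecs = np.linalg.eigh(J)
    return nodes, beta[0] * vecs[0, :] ** 2


def gauss_and_anti_gauss(n):
    """n-point Gauss rule and Laurie's (n+1)-point anti-Gauss rule."""
    alpha, beta = legendre_recurrence(n + 1)
    xg, wg = gauss_from_jacobi(alpha[:n], beta[:n])
    beta_anti = beta.copy()
    beta_anti[n] *= 2.0                          # anti-Gauss modification: beta_n -> 2*beta_n
    xa, wa = gauss_from_jacobi(alpha, beta_anti)
    return (xg, wg), (xa, wa)


if __name__ == "__main__":
    f = np.exp                                   # test integrand on [-1, 1]
    (xg, wg), (xa, wa) = gauss_and_anti_gauss(5)
    gauss = wg @ f(xg)
    averaged = 0.5 * (gauss + wa @ f(xa))        # (2n+1)-node averaged Gaussian rule
    print(gauss, averaged, abs(averaged - gauss))  # last number estimates the Gauss error
```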

    MS FT-2-2 7 Orthogonal polynomials and quadrature: Theory, computation, and applications

    Quadrature rules find many applications in science and engineering. Their analysis is a classical area of applied mathematics and continues to attract considerable attention. This seminar brings together speakers with expertise in a large variety of quadrature rules. The aim of the seminar is to provide an overview of recent developments in the analysis of quadrature rules. The computation of error estimates and novel applications are also described.