
    Iterated regularization methods for solving inverse problems

    Typical inverse problems are ill-posed, which frequently leads to difficulties in calculating numerical solutions. A common approximation method for solving ill-posed inverse problems is iterated Tikhonov-Lavrentiev regularization.

    We examine iterated Tikhonov-Lavrentiev regularization and show that, in the case that regularity properties are not globally satisfied, certain projections of the error converge faster than the theoretical predictions of the global error. We also explore the sensitivity of iterated Tikhonov regularization to the choice of the regularization parameter and show that calculating higher-order sensitivities improves the accuracy. We present a simple-to-implement algorithm that calculates the iterated Tikhonov updates and the sensitivities to the regularization parameter; the cost of this new algorithm is one vector addition and one scalar multiplication per step more than the standard iterated Tikhonov calculation.

    In considering the inverse problem of inverting the Helmholtz differential filter (with filter radius δ), we propose iterating a modification of Tikhonov-Lavrentiev regularization (with regularization parameter α and J iteration steps). We show that this modification decreases the theoretical error bounds from O(α(δ^2 + 1)) for Tikhonov regularization to O((αδ^2)^(J+1)). We apply this modified iterated Tikhonov regularization method to the Leray deconvolution model of fluid flow. We discretize the problem with finite elements in space and Crank-Nicolson in time and show existence, uniqueness, and convergence of this solution.

    We examine the combination of iterated Tikhonov regularization, the L-curve method, a new stopping criterion, and a bootstrapping algorithm as a general solution method in brain mapping. This method robustly handles the difficulties associated with brain mapping: uncertainty quantification, collinearity of the data, and data noise. We use this method to estimate correlation coefficients between brain regions and a quantified performance, as well as to identify regions of interest for future analysis.
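
    For concreteness, here is a minimal dense-matrix sketch of the basic iteration only (it does not reproduce the paper's sensitivity recursion or the Helmholtz-filter modification), assuming the standard stationary update x_{k+1} = (A + αI)^{-1}(b + α x_k) for a symmetric positive semidefinite operator A:

        import numpy as np

        def iterated_tikhonov_lavrentiev(A, b, alpha, J):
            # Iterated Tikhonov-Lavrentiev regularization for A x = b,
            # with A symmetric positive semidefinite (Lavrentiev setting).
            # Each step solves (A + alpha I) x_{k+1} = b + alpha x_k;
            # iterating progressively removes the regularization bias.
            n = A.shape[0]
            M = A + alpha * np.eye(n)
            x = np.zeros(n)
            for _ in range(J):
                x = np.linalg.solve(M, b + alpha * x)
            return x

    Since the matrix A + αI is fixed across iterations, in practice one would factor it once and reuse the factorization, so each additional iteration step is cheap.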

    Projected Newton Method for noise constrained Tikhonov regularization

    Tikhonov regularization is a popular approach to obtain a meaningful solution for ill-conditioned linear least-squares problems. A relatively simple way of choosing a good regularization parameter is given by Morozov's discrepancy principle. However, most approaches require the solution of the Tikhonov problem for many different values of the regularization parameter, which is computationally demanding for large-scale problems. We propose a new and efficient algorithm which simultaneously solves the Tikhonov problem and finds the corresponding regularization parameter such that the discrepancy principle is satisfied. We achieve this by formulating the problem as a nonlinear system of equations and solving this system using a line search method. We obtain a good search direction by projecting the problem onto a low-dimensional Krylov subspace and computing the Newton direction for the projected problem. This projected Newton direction, which is significantly less computationally expensive to calculate than the true Newton direction, is then combined with a backtracking line search to obtain a globally convergent algorithm, which we refer to as the Projected Newton method. We prove convergence of the algorithm and illustrate the improved performance over current state-of-the-art solvers with some numerical experiments.
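
    As a rough illustration of the underlying idea (the dense, unprojected version, not the paper's Krylov-subspace algorithm), the discrepancy principle can be posed as a scalar root-finding problem f(λ) = ||A x(λ) − b||² − ε² = 0 and solved with Newton's method; all names below are illustrative:

        import numpy as np

        def discrepancy_newton(A, b, noise_level, lam=1.0, maxit=50, tol=1e-8):
            # Newton's method on f(lam) = ||A x(lam) - b||^2 - noise_level^2,
            # where x(lam) solves the Tikhonov problem
            #     min_x ||A x - b||^2 + lam ||x||^2.
            # Dense sketch only; the paper instead computes the Newton
            # direction for a problem projected onto a Krylov subspace.
            AtA, Atb = A.T @ A, A.T @ b
            I = np.eye(AtA.shape[0])
            for _ in range(maxit):
                x = np.linalg.solve(AtA + lam * I, Atb)
                r = A @ x - b
                f = r @ r - noise_level**2
                # Differentiating the normal equations gives
                # dx/dlam = -(A^T A + lam I)^{-1} x, hence:
                dx = -np.linalg.solve(AtA + lam * I, x)
                fprime = 2.0 * r @ (A @ dx)
                step = f / fprime
                lam = max(lam - step, 1e-14)  # keep the parameter positive
                if abs(step) <= tol * lam:
                    break
            return x, lam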

    Multidirectional Subspace Expansion for One-Parameter and Multiparameter Tikhonov Regularization

    Tikhonov regularization is a popular method to approximate solutions of linear discrete ill-posed problems when the observed or measured data is contaminated by noise. Multiparameter Tikhonov regularization may improve the quality of the computed approximate solutions. We propose a new iterative method for large-scale multiparameter Tikhonov regularization with general regularization operators based on a multidirectional subspace expansion. The multidirectional subspace expansion may be combined with subspace truncation to avoid excessive growth of the search space. Furthermore, we introduce a simple and effective parameter selection strategy based on the discrepancy principle and related to perturbation results.
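
    For reference, the multiparameter Tikhonov problem min_x ||Ax − b||² + Σ_i μ_i ||L_i x||² can be solved for fixed parameters via its normal equations. A small dense sketch follows (the paper's actual contribution, the multidirectional subspace expansion, is not reproduced here):

        import numpy as np

        def multiparameter_tikhonov(A, b, Ls, mus):
            # Solve min_x ||A x - b||^2 + sum_i mu_i ||L_i x||^2
            # via the normal equations
            #     (A^T A + sum_i mu_i L_i^T L_i) x = A^T b.
            # Dense illustration; large-scale methods instead build an
            # iteratively expanded search subspace.
            M = A.T @ A
            for L, mu in zip(Ls, mus):
                M = M + mu * (L.T @ L)
            return np.linalg.solve(M, A.T @ b)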

    On a continuation approach in Tikhonov regularization and its application in piecewise-constant parameter identification

    We present a new approach to convexification of Tikhonov regularization using a continuation-method strategy. We embed the original minimization problem into a one-parameter family of minimization problems. Both the penalty term and the minimizer of the Tikhonov functional become dependent on a continuation parameter. In this way we can independently treat the two main roles of the regularization term: stabilization of the ill-posed problem and introduction of the a priori knowledge. For zero continuation parameter we solve a relaxed regularization problem, which stabilizes the ill-posed problem in a weaker sense. The continuation method then recasts the problem into the original minimization, so that the a priori knowledge is enforced. We apply this approach in the context of topology-to-shape geometry identification, where it allows us to avoid the convergence of gradient-based methods to a local minimum. We present illustrative results for magnetic induction tomography, which is an example of a PDE-constrained inverse problem.
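
    A generic continuation loop conveys the strategy; the linearly interpolated penalty family below is a hypothetical stand-in for the paper's embedding, with J the data-misfit term, R0 a relaxed stabilizing penalty, and R1 the penalty encoding the a priori knowledge:

        import numpy as np
        from scipy.optimize import minimize

        def continuation_solve(J, R0, R1, x0, ts=np.linspace(0.0, 1.0, 11)):
            # Minimize F_t(x) = J(x) + (1 - t) * R0(x) + t * R1(x)
            # for an increasing sequence of continuation parameters t,
            # warm-starting each solve from the previous minimizer.
            # At t = 0 the relaxed penalty R0 stabilizes the problem;
            # at t = 1 the original regularized problem is recovered.
            x = np.asarray(x0, dtype=float)
            for t in ts:
                F = lambda x, t=t: J(x) + (1.0 - t) * R0(x) + t * R1(x)
                x = minimize(F, x, method="BFGS").x
            return x

    The warm starts are what let the gradient-based inner solver track a path of minimizers instead of being attracted to a local minimum of the final non-convex functional.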