115 research outputs found

    Hybrid and Iteratively Reweighted Regularization by Unbiased Predictive Risk and Weighted GCV for Projected Systems

    abstract: Tikhonov regularization for projected solutions of large-scale ill-posed problems is considered. The Golub-Kahan iterative bidiagonalization is used to project the problem onto a subspace, and regularization is then applied to find a subspace approximation to the full problem. Determination of the regularization parameter for the projected problem by unbiased predictive risk estimation, generalized cross validation, and discrepancy principle techniques is investigated. It is shown that the regularization parameter obtained by the unbiased predictive risk estimator can provide a good estimate which can be used for a full problem that is moderately to severely ill-posed. A similar analysis provides the weight parameter for the weighted generalized cross validation such that the approach is also useful in these cases, and also explains why generalized cross validation without weighting is not always useful. All results are independent of whether systems are over- or underdetermined. Numerical simulations for standard one-dimensional test problems and two-dimensional data, for both image restoration and tomographic image reconstruction, support the analysis and validate the techniques. The size of the projected problem is found using an extension of a noise-revealing function for the projected problem [I. Hnětynková, M. Plešinger, and Z. Strakoš, BIT Numer. Math., 49 (2009), pp. 669-696]. Furthermore, an iteratively reweighted regularization approach for edge-preserving regularization is extended to projected systems, providing stabilization of the solutions of the projected systems and reducing dependence on the determination of the size of the projected subspace.
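    As a rough sketch of the approach the abstract describes (not the paper's exact algorithm), Golub-Kahan bidiagonalization can project the problem onto a small Krylov subspace, with Tikhonov regularization and a weighted-GCV parameter choice applied to the projected system. The function names, the log-spaced parameter grid, and the weight handling below are illustrative assumptions:

    ```python
    import numpy as np

    def golub_kahan(A, b, k):
        """k steps of Golub-Kahan bidiagonalization of A, started from b.
        Returns U (m x (k+1)), lower-bidiagonal B ((k+1) x k), V (n x k),
        satisfying A @ V = U @ B by construction."""
        m, n = A.shape
        U = np.zeros((m, k + 1)); V = np.zeros((n, k)); B = np.zeros((k + 1, k))
        U[:, 0] = b / np.linalg.norm(b)
        for j in range(k):
            r = A.T @ U[:, j]
            if j > 0:
                r -= B[j, j - 1] * V[:, j - 1]
            B[j, j] = np.linalg.norm(r); V[:, j] = r / B[j, j]
            p = A @ V[:, j] - B[j, j] * U[:, j]
            B[j + 1, j] = np.linalg.norm(p); U[:, j + 1] = p / B[j + 1, j]
        return U, B, V

    def projected_tikhonov(A, b, k, w=1.0):
        """Tikhonov-regularize the k-dimensional projected problem, choosing
        the regularization parameter by weighted GCV (w = 1 is plain GCV)."""
        U, B, V = golub_kahan(A, b, k)
        c = U.T @ b                                # projected right-hand side
        P, s, Qt = np.linalg.svd(B, full_matrices=True)
        d = P.T @ c
        def wgcv(lam):
            f = s**2 / (s**2 + lam**2)             # Tikhonov filter factors
            res2 = np.sum(((1 - f) * d[:k])**2) + d[k]**2
            return res2 / ((k + 1) - w * np.sum(f))**2
        lams = np.logspace(-8, 2, 200)             # illustrative search grid
        lam = lams[np.argmin([wgcv(l) for l in lams])]
        y = Qt.T @ (s / (s**2 + lam**2) * d[:k])   # regularized projected solution
        return V @ y, lam
    ```

    Because the weighted-GCV functional is evaluated on the SVD of the small bidiagonal matrix B, scanning the parameter grid costs O(k) per candidate rather than a solve with the full operator.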

    Comparing RSVD and Krylov methods for linear inverse problems

    In this work we address regularization parameter estimation for ill-posed linear inverse problems with a penalty. Regularization parameter selection is of utmost importance for all inverse problems, and estimating it generally relies on the experience of the practitioner. For regularization with a penalty, many parameter selection methods exist that exploit the fact that the solution and the residual can be written in explicit form. Parameter selection methods are functionals of the regularization parameter whose minimizer is the desired regularization parameter, which should lead to a good solution. Evaluating these parameter selection methods still requires solving the inverse problem multiple times. Efficient evaluation of the parameter selection methods can be achieved through model order reduction. Two popular model order reduction techniques are Lanczos-based methods (a Krylov subspace method) and the randomized singular value decomposition (RSVD). In this work we compare the two approaches. We derive error bounds for the parameter selection methods using the RSVD. We compare the performance of the Lanczos process versus the performance of the RSVD for efficient parameter selection. The RSVD algorithm we use i
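    A minimal sketch of the RSVD side of such a comparison, assuming the standard randomized SVD with oversampling and power iterations (Halko-Martinsson-Tropp style) and a GCV functional evaluated cheaply on the approximate SVD; the function names, fixed seed, and search grid are illustrative assumptions, not the paper's implementation:

    ```python
    import numpy as np

    def rsvd(A, k, p=10, q=1):
        """Randomized SVD: rank-k approximation of A with oversampling p
        and q power iterations to sharpen the spectral decay."""
        m, n = A.shape
        rng = np.random.default_rng(0)
        Omega = rng.standard_normal((n, k + p))    # random test matrix
        Y = A @ Omega
        for _ in range(q):                         # power iterations
            Y = A @ (A.T @ Y)
        Q, _ = np.linalg.qr(Y)                     # orthonormal range basis
        Uh, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
        return (Q @ Uh)[:, :k], s[:k], Vt[:k, :]

    def gcv_lambda(U, s, b, m):
        """Minimize the GCV functional over a log grid, using an (exact or
        approximate) SVD so each evaluation is O(len(s))."""
        d = U.T @ b
        r0 = b - U @ d                             # residual outside the captured subspace
        def gcv(lam):
            f = s**2 / (s**2 + lam**2)             # Tikhonov filter factors
            res2 = np.sum(((1 - f) * d)**2) + r0 @ r0
            return m * res2 / (m - np.sum(f))**2
        lams = np.logspace(-8, 2, 200)             # illustrative search grid
        return lams[np.argmin([gcv(l) for l in lams])]
    ```

    Once the (approximate) SVD is in hand, every candidate regularization parameter is scored without touching A again, which is exactly why model order reduction makes these parameter selection functionals affordable.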

    Stabilization Algorithms for Large-Scale Problems

    No full text