2,292 research outputs found

    Algorithm 873: LSTRS: MATLAB Software for Large-Scale Trust-Region Subproblems and Regularization

    Get PDF
    A MATLAB 6.0 implementation of the LSTRS method is presented. LSTRS, described in Rojas et al. [2000], is designed for large-scale quadratic problems with one norm constraint. The method is based on a reformulation of the trust-region subproblem as a parameterized eigenvalue problem, and consists of an iterative procedure that finds the optimal value for the parameter. The adjustment of the parameter requires the solution of a large-scale eigenvalue problem at each step. LSTRS relies on matrix-vector products only and has low and fixed storage requirements, features that make it suitable for large-scale computations. In the MATLAB implementation, the Hessian matrix of the quadratic objective function can be specified either explicitly or in the form of a matrix-vector multiplication routine; therefore, the implementation preserves the matrix-free nature of the method. A description of the LSTRS method and of the MATLAB software, version 1.2, is presented. Comparisons with other techniques and applications of the method are also included. A guide for using the software and examples are provided.
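    The matrix-free interface described above can be sketched with SciPy: the Hessian is supplied only as a matrix-vector product routine, and a Lanczos-type eigensolver extracts the smallest eigenpair, which is the kind of computation LSTRS performs at each step. This is a hypothetical illustration (the 1-D Laplacian Hessian and all names are assumptions), not the LSTRS code itself:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

# Assumed example Hessian: a 1-D discrete Laplacian, applied matrix-free.
n = 200

def hess_matvec(v):
    # Tridiagonal stencil [-1, 2, -1] applied without ever forming the matrix.
    w = 2.0 * v
    w[:-1] -= v[1:]
    w[1:] -= v[:-1]
    return w

# Only the action v -> H v is exposed, preserving the matrix-free nature.
H = LinearOperator((n, n), matvec=hess_matvec, dtype=float)

# Smallest eigenvalue/eigenvector via a Lanczos-type iteration,
# using matrix-vector products only.
lam, v = eigsh(H, k=1, which='SA')
print(lam[0])
```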

    Optimization Methods for Inverse Problems

    Full text link
    Optimization plays an important role in solving many inverse problems. Indeed, the task of inversion often either involves or is fully cast as the solution of an optimization problem. In this light, the non-linear, non-convex, and large-scale nature of many of these inversions gives rise to some very challenging optimization problems. The inverse problem community has long been developing various techniques for solving such optimization tasks. However, other, seemingly disjoint communities, such as that of machine learning, have developed, almost in parallel, interesting alternative methods which might have stayed under the radar of the inverse problem community. In this survey, we aim to change that. In doing so, we first discuss current state-of-the-art optimization methods widely used in inverse problems. We then survey recent related advances in addressing similar challenges in problems faced by the machine learning community, and discuss their potential advantages for solving inverse problems. By highlighting the similarities among the optimization challenges faced by the inverse problem and the machine learning communities, we hope that this survey can serve as a bridge in bringing together these two communities and encourage cross-fertilization of ideas. (Comment: 13 pages)

    An interior-point trust-region-based method for large-scale non-negative regularization

    Get PDF
    We present a new method for solving large-scale quadratic problems with quadratic and nonnegativity constraints. Such problems arise, for example, in the regularization of ill-posed problems in image restoration where, in addition, some of the matrices involved are very ill-conditioned. The new method uses recently developed techniques for the large-scale trust-region subproblem.
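    The problem class the abstract refers to — a quadratic objective minimized subject to nonnegativity — can be illustrated with a simple projected-gradient sketch. This is an assumed toy stand-in for exposition (random data, no norm constraint), not the authors' interior-point trust-region method:

```python
import numpy as np

# Toy instance: minimize 0.5*||Ax - b||^2 subject to x >= 0,
# solved by projected gradient descent (NOT the paper's algorithm).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.abs(rng.standard_normal(10))   # nonnegative ground truth
b = A @ x_true                             # consistent right-hand side

x = np.zeros(10)
step = 1.0 / np.linalg.norm(A.T @ A, 2)    # 1/L, L = gradient Lipschitz constant
for _ in range(2000):
    grad = A.T @ (A @ x - b)
    x = np.maximum(x - step * grad, 0.0)   # gradient step, then project onto x >= 0

print(np.linalg.norm(A @ x - b))
```

Because the system is consistent and the true solution is nonnegative, the iterates converge to it while every iterate stays feasible.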

    Accelerating the LSTRS Algorithm

    Get PDF
    In a recent paper [Rojas, Santos, Sorensen: ACM TOMS 34 (2008), Article 11] an efficient method for solving the large-scale trust-region subproblem was suggested, based on recasting it in terms of a parameter-dependent eigenvalue problem and adjusting the parameter iteratively. The essential work at each iteration is the solution of an eigenvalue problem for the smallest eigenvalue of the Hessian matrix (or the two smallest eigenvalues in the potential hard case) and the associated eigenvector(s). Replacing the implicitly restarted Lanczos method in the original paper with the Nonlinear Arnoldi method makes it possible to recycle most of the work from previous iterations, which can substantially accelerate LSTRS.
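    The recycling idea can be illustrated on the parameterized bordered matrix B(alpha) = [[alpha, g^T], [g, H]] that LSTRS works with: as alpha changes a little between iterations, the previous eigenvector is a good starting point for the next eigensolve. In this sketch, eigsh's v0 warm start stands in for the Nonlinear Arnoldi search-space recycling, and all problem data are assumed toy values:

```python
import numpy as np
from scipy.sparse.linalg import eigsh

n = 100
rng = np.random.default_rng(1)
M = rng.standard_normal((n, n))
H = (M + M.T) / 2.0            # symmetric toy "Hessian"
g = rng.standard_normal(n)

def B(alpha):
    # Bordered matrix [[alpha, g^T], [g, H]] of the parameterized
    # eigenvalue problem at the core of LSTRS.
    top = np.concatenate(([alpha], g))
    bottom = np.hstack((g[:, None], H))
    return np.vstack((top, bottom))

v0 = None
for alpha in np.linspace(0.0, -2.0, 5):    # toy parameter sweep
    lam, V = eigsh(B(alpha), k=1, which='SA', v0=v0)
    v0 = V[:, 0]                           # recycle the previous eigenvector
print(lam[0])
```

By Cauchy interlacing, the smallest eigenvalue of B(alpha) lies below the smallest eigenvalue of H, so the warm-started solves all target the same end of the spectrum.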

    Fractional regularization matrices for linear discrete ill-posed problems

    Get PDF
    The numerical solution of linear discrete ill-posed problems typically requires regularization. Two of the most popular regularization methods are due to Tikhonov and Lavrentiev. These methods require the choice of a regularization matrix. Common choices include the identity matrix and finite difference approximations of a derivative operator. It is the purpose of the present paper to explore the use of fractional powers of the matrices A^T A (for Tikhonov regularization) and A (for Lavrentiev regularization) as regularization matrices, where A is the matrix that defines the linear discrete ill-posed problem. Both small- and large-scale problems are considered. © 2013 Springer Science+Business Media Dordrecht
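    For small problems, a fractional regularization matrix of this kind can be applied through the SVD. The sketch below assumes the Tikhonov regularization matrix L = (A^T A)^(alpha/2) (notation assumed for illustration; mu and alpha are arbitrary toy values), in which case the penalized normal equations diagonalize in the right singular vector basis:

```python
import numpy as np

# Toy Tikhonov problem  min ||Ax - b||^2 + mu * ||L x||^2
# with fractional regularization matrix L = (A^T A)^(alpha/2).
rng = np.random.default_rng(2)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
mu, alpha = 0.1, 0.5                      # illustrative values

U, s, Vt = np.linalg.svd(A, full_matrices=False)
# With this L, the normal equations (A^T A + mu L^T L) x = A^T b reduce to
# the diagonal filter  x = V diag(s / (s^2 + mu * s^(2*alpha))) U^T b.
filt = s / (s**2 + mu * s**(2 * alpha))
x = Vt.T @ (filt * (U.T @ b))
print(np.linalg.norm(x))
```

With alpha = 1 the filter reduces to standard Tikhonov with L equal to the identity; smaller alpha damps the small singular values less aggressively.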

    Parametric Level Set Methods for Inverse Problems

    Full text link
    In this paper, a parametric level set method for the reconstruction of obstacles in general inverse problems is considered. General evolution equations for the reconstruction of unknown obstacles are derived in terms of the underlying level set parameters. We show that using an appropriate parameterization of the level set function results in a significantly lower dimensional problem, which bypasses many difficulties of traditional level set methods, such as regularization, re-initialization and the use of signed distance functions. Moreover, we show that from a computational point of view, a low-order representation of the problem paves the path for easier use of Newton and quasi-Newton methods. Specifically, for the purposes of this paper, we parameterize the level set function in terms of adaptive compactly supported radial basis functions, which, used in the proposed manner, provide flexibility in representing a larger class of shapes with fewer terms. They also provide a "narrow-banding" advantage, which can further reduce the number of active unknowns at each step of the evolution. The performance of the proposed approach is examined in three examples of inverse problems, i.e., electrical resistance tomography, X-ray computed tomography and diffuse optical tomography.
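    A minimal sketch of such a parameterization, assuming Wendland C2 compactly supported RBFs and illustrative centers and weights (none of these specific choices are from the paper): the shape is the region where the level-set function is positive, and each basis function vanishes outside a ball of radius r, which is what gives the "narrow-banding" effect.

```python
import numpy as np

# Level-set function phi(x) = sum_i w_i * psi(||x - c_i|| / r) - c0,
# built from compactly supported RBFs; the shape is {phi > 0}.
def wendland_c2(t):
    # Wendland C2 basis: (1 - t)^4 (4t + 1) for t < 1, zero for t >= 1.
    t = np.minimum(t, 1.0)
    return (1.0 - t) ** 4 * (4.0 * t + 1.0)

# Illustrative parameters (assumed, not from the paper): two bumps.
centers = np.array([[0.3, 0.5], [0.7, 0.5]])
weights = np.array([1.0, 1.0])
r = 0.25           # support radius of each basis function
c0 = 0.5           # shift choosing the zero level

# Evaluate phi on a 64 x 64 grid over the unit square.
xs = np.linspace(0.0, 1.0, 64)
X, Y = np.meshgrid(xs, xs)
pts = np.stack([X.ravel(), Y.ravel()], axis=1)

dists = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
phi = (wendland_c2(dists / r) @ weights).reshape(64, 64) - c0
inside = phi > 0   # reconstructed shape: two disk-like blobs
print(inside.sum())
```

A handful of parameters (centers, weights, radius) describes the whole shape, which is the dimensionality reduction the abstract refers to.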

    EIT Reconstruction Algorithms: Pitfalls, Challenges and Recent Developments

    Full text link
    We review developments, issues and challenges in Electrical Impedance Tomography (EIT), for the 4th Workshop on Biomedical Applications of EIT, Manchester 2003. We focus on the necessity for three-dimensional data collection and reconstruction, efficient solution of the forward problem, and present and future reconstruction algorithms. We also suggest common pitfalls or "inverse crimes" to avoid. (Comment: A review paper for the 4th Workshop on Biomedical Applications of EIT, Manchester, UK, 2003)

    Modular Regularization Algorithms

    Get PDF