4 research outputs found

    Preconditioned iterative methods for solving linear least squares problems

    New preconditioning strategies for solving m × n overdetermined large and sparse linear least squares problems using the conjugate gradient for least squares (CGLS) method are described. First, direct preconditioning of the normal equations by the balanced incomplete factorization (BIF) for symmetric and positive definite matrices is studied, and a new breakdown-free strategy is proposed. Preconditioning based on the incomplete LU factors of an n × n submatrix of the system matrix is the second approach. A new way to find this submatrix, based on a specific weighted transversal problem, is proposed. Numerical experiments demonstrate different algebraic and implementational features of the new approaches and put them into the context of current progress in preconditioning of CGLS. It is shown, in particular, that the robustness demonstrated earlier by the BIF preconditioning strategy carries over to the linear least squares solvers, and that the use of the weighted transversal helps to improve the LU-based approach.

    This work was partially supported by Spanish grant MTM 2010-18674 and by project 13-06684S of the Grant Agency of the Czech Republic.

    Bru García, R.; Marín Mateos-Aparicio, J.; Mas Marí, J.; Tuma, M. (2014). Preconditioned iterative methods for solving linear least squares problems. SIAM Journal on Scientific Computing. 36(4):2002-2022. https://doi.org/10.1137/130931588
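
    Both strategies ultimately supply CGLS with an approximation of the normal-equations matrix. As a rough illustration of where such a preconditioner enters the iteration, the following sketch implements preconditioned CGLS in Python; since neither BIF nor the paper's incomplete-LU construction is available in SciPy, a simple Jacobi preconditioner (the diagonal of A^T A) and a random test matrix stand in as assumptions.

        # A minimal sketch of preconditioned CGLS, i.e. conjugate gradients applied
        # to the normal equations A^T A x = A^T b.  The BIF and incomplete-LU
        # preconditioners from the paper are not available in SciPy, so a simple
        # Jacobi preconditioner (diagonal of A^T A) stands in for illustration only.
        import numpy as np
        import scipy.sparse as sp

        def cgls(A, b, apply_Minv, maxiter=500, tol=1e-10):
            """Preconditioned CGLS for min ||A x - b||_2 with M approximating A^T A."""
            x = np.zeros(A.shape[1])
            r = b - A @ x                     # data-space residual
            s = A.T @ r                       # normal-equations residual
            z = apply_Minv(s)
            p = z.copy()
            gamma = s @ z
            nrm_Atb = np.linalg.norm(A.T @ b)
            for _ in range(maxiter):
                q = A @ p
                alpha = gamma / (q @ q)
                x += alpha * p
                r -= alpha * q
                s = A.T @ r
                if np.linalg.norm(s) <= tol * nrm_Atb:
                    break
                z = apply_Minv(s)
                gamma_new = s @ z
                p = z + (gamma_new / gamma) * p
                gamma = gamma_new
            return x

        rng = np.random.default_rng(0)
        A = sp.random(2000, 300, density=0.02, format="csr", random_state=rng)
        b = A @ rng.standard_normal(300) + 1e-3 * rng.standard_normal(2000)

        d = np.asarray(A.multiply(A).sum(axis=0)).ravel()   # diag(A^T A)
        d[d == 0.0] = 1.0                                   # guard empty columns
        x = cgls(A, b, lambda s: s / d)
        print("normal-equations residual:", np.linalg.norm(A.T @ (b - A @ x)))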

    A penalty method for PDE-constrained optimization in inverse problems

    Many inverse and parameter estimation problems can be written as PDE-constrained optimization problems. The goal, then, is to infer the parameters, typically coefficients of the PDE, from partial measurements of the solutions of the PDE for several right-hand sides. Such PDE-constrained problems can be solved by finding a stationary point of the Lagrangian, which entails simultaneously updating the parameters and the (adjoint) state variables. For large-scale problems, such an all-at-once approach is not feasible as it requires storing all the state variables. In this case one usually resorts to a reduced approach where the constraints are explicitly eliminated (at each iteration) by solving the PDEs. These two approaches, and variations thereof, are the main workhorses for solving PDE-constrained optimization problems arising from inverse problems. In this paper, we present an alternative method that aims to combine the advantages of both approaches. Our method is based on a quadratic penalty formulation of the constrained optimization problem. By eliminating the state variable, we develop an efficient algorithm that has roughly the same computational complexity as the conventional reduced approach while exploiting a larger search space. Numerical results show that this method indeed reduces some of the non-linearity of the problem and is less sensitive to the initial iterate.
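
    To make the penalty idea concrete, the sketch below evaluates the reduced quadratic-penalty objective for a toy linear "PDE" A(m) u = q with partial observations P u ≈ d, eliminating the state u through the shifted normal equations. The operator A(m), the scalar parameterization, the observation matrix P and the weight lam are illustrative assumptions, not the paper's formulation or experiments.

        # A minimal sketch of the quadratic-penalty reduced objective
        #   phi_lam(m) = min_u 0.5*||P u - d||^2 + 0.5*lam*||A(m) u - q||^2,
        # where the state u is eliminated by solving the shifted normal equations.
        # All names (A, P, q, d, lam) and the toy operator are illustrative only.
        import numpy as np

        n = 50
        q = np.ones(n)                       # right-hand side of the toy "PDE"
        P = np.eye(n)[::5]                   # observe every 5th entry of u

        def A(m):
            """Toy 1-D diffusion-like operator parameterized by a scalar m."""
            T = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
            return m * T + np.eye(n)

        m_true = 3.0
        d = P @ np.linalg.solve(A(m_true), q)          # synthetic data

        def penalty_objective(m, lam):
            Am = A(m)
            H = P.T @ P + lam * Am.T @ Am              # shifted normal matrix
            u = np.linalg.solve(H, P.T @ d + lam * Am.T @ q)   # eliminated state
            return 0.5 * np.linalg.norm(P @ u - d) ** 2 \
                 + 0.5 * lam * np.linalg.norm(Am @ u - q) ** 2

        # the reduced objective can now be handed to any optimizer over m
        for m in (1.0, 2.0, 3.0, 4.0):
            print(m, penalty_objective(m, lam=1e2))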

    Updating preconditioners for modified least squares problems

    In this paper, we analyze how to update incomplete Cholesky preconditioners for solving least squares problems with iterative methods when the set of linear relations is updated with some new information, a new variable is added or, conversely, some information or a variable is removed from the set. The proposed method computes a low-rank update of the preconditioner using a bordering method, which is inexpensive compared with the cost of computing a new preconditioner. Moreover, the numerical experiments presented show that this strategy gives, in many cases, a better preconditioner than other choices, including the computation of a new preconditioner from scratch or the reuse of an existing one.

    Partially supported by Spanish Grants MTM2014-58159-P and MTM2015-68805-REDT.

    Marín Mateos-Aparicio, J.; Mas Marí, J.; Guerrero-Flores, DJ.; Hayami, K. (2017). Updating preconditioners for modified least squares problems. Numerical Algorithms. 75(2):491-508. https://doi.org/10.1007/s11075-017-0315-z
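
    As a small illustration of the bordering idea when one new variable (a column a_new) is appended to the matrix, the sketch below extends an existing Cholesky factor of the normal matrix by one row. An exact dense factor stands in for the paper's incomplete Cholesky preconditioner, and all names are illustrative.

        # A minimal dense sketch of a bordering-style update: appending a column
        # a_new to A extends the normal matrix to [[A^T A, c], [c^T, g]] with
        # c = A^T a_new and g = a_new^T a_new, whose Cholesky factor is
        # [[L, 0], [w^T, sqrt(g - w^T w)]] where L w = c.  An exact dense factor
        # stands in for the incomplete one used as a preconditioner in the paper.
        import numpy as np
        from scipy.linalg import cholesky, solve_triangular

        rng = np.random.default_rng(1)
        A = rng.standard_normal((200, 30))
        a_new = rng.standard_normal(200)

        L = cholesky(A.T @ A, lower=True)         # A^T A = L L^T

        c = A.T @ a_new
        g = a_new @ a_new
        w = solve_triangular(L, c, lower=True)    # solve L w = c

        L_new = np.zeros((31, 31))
        L_new[:30, :30] = L
        L_new[30, :30] = w
        L_new[30, 30] = np.sqrt(g - w @ w)

        # check against the normal matrix of the extended problem
        A_ext = np.hstack([A, a_new[:, None]])
        print(np.allclose(L_new @ L_new.T, A_ext.T @ A_ext))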

    Preconditioners for rank deficient least squares problems

    In this paper we present a method for computing sparse preconditioners for iteratively solving rank-deficient least squares (LS) problems by the LSMR method. The main idea of the proposed method is to update an incomplete factorization computed for a regularized problem in order to recover the solution of the original one. Numerical experiments for a wide set of matrices arising from different science and engineering applications show that the proposed preconditioner can, in most cases, be successfully applied to accelerate the convergence of the iterative Krylov subspace method.

    This work was supported by the Spanish Ministerio de Economía, Industria y Competitividad under grants MTM2017-85669-P and MTM2017-90682-REDT.

    Cerdán Soriano, JM.; Guerrero, D.; Marín Mateos-Aparicio, J.; Mas Marí, J. (2020). Preconditioners for rank deficient least squares problems. Journal of Computational and Applied Mathematics. 372:1-11. https://doi.org/10.1016/j.cam.2019.112621
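
    The sketch below illustrates the regularization idea in its simplest form: factor the shifted normal matrix A^T A + alpha^2 I, which is nonsingular even when A is rank deficient, and use the factor as a right preconditioner for LSMR on the original problem. A dense Cholesky factor stands in for the paper's incomplete factorization, and alpha and the test matrix are illustrative assumptions.

        # A minimal sketch: factor the shifted normal matrix A^T A + alpha^2 I and
        # use its upper-triangular factor R as a right preconditioner for LSMR,
        # i.e. solve min ||A R^{-1} y - b|| and recover x = R^{-1} y.  A dense
        # Cholesky factor stands in for the paper's incomplete factorization;
        # alpha and the rank-deficient test matrix are illustrative assumptions.
        import numpy as np
        from scipy.linalg import cholesky, solve_triangular
        from scipy.sparse.linalg import LinearOperator, lsmr

        rng = np.random.default_rng(2)
        B = rng.standard_normal((300, 20))
        A = np.hstack([B, B[:, :5]])            # repeated columns -> rank deficient
        b = rng.standard_normal(300)
        alpha = 1e-2

        # upper-triangular R with R^T R = A^T A + alpha^2 I
        R = cholesky(A.T @ A + alpha**2 * np.eye(A.shape[1]), lower=False)

        def matvec(y):
            return A @ solve_triangular(R, y, lower=False)      # A R^{-1} y

        def rmatvec(r):
            return solve_triangular(R.T, A.T @ r, lower=True)   # R^{-T} A^T r

        ARinv = LinearOperator(A.shape, matvec=matvec, rmatvec=rmatvec)
        y = lsmr(ARinv, b, atol=1e-10, btol=1e-10)[0]
        x = solve_triangular(R, y, lower=False)                 # x = R^{-1} y
        print("residual norm:", np.linalg.norm(A @ x - b))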