20 research outputs found

    Nonmonotone globalization techniques for the Barzilai-Borwein gradient method.

    In this paper we propose new globalization strategies for the Barzilai and Borwein gradient method, based on suitable relaxations of the monotonicity requirements. In particular, we define a class of algorithms that combine nonmonotone watchdog techniques with nonmonotone linesearch rules, and we prove the global convergence of these schemes. Then we perform an extensive computational study, which shows the effectiveness of the proposed approach in the solution of large-dimensional unconstrained optimization problems.
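    The two ingredients the abstract refers to, the Barzilai-Borwein step size and a relaxed (nonmonotone) acceptance test, can be sketched as follows. This is a minimal illustration in Python using a GLL-style reference value (the maximum over the last few objective values), not the authors' watchdog scheme; the function names, safeguards, and parameter defaults are ours.

        import numpy as np

        def bb_nonmonotone_gradient(f, grad, x0, max_iter=500, M=10, tol=1e-6,
                                    sigma=1e-4, alpha0=1.0):
            """Barzilai-Borwein gradient method with a nonmonotone Armijo test.

            Illustrative sketch only: the reference value is the maximum of the
            last M objective values (GLL-style), not the paper's watchdog rule.
            """
            x = np.asarray(x0, dtype=float)
            g = grad(x)
            f_hist = [f(x)]
            alpha = alpha0
            for _ in range(max_iter):
                if np.linalg.norm(g) < tol:
                    break
                d = -alpha * g                # BB-scaled steepest-descent direction
                f_ref = max(f_hist[-M:])      # nonmonotone reference value
                t = 1.0
                while f(x + t * d) > f_ref + sigma * t * g.dot(d) and t > 1e-10:
                    t *= 0.5                  # backtrack until the relaxed Armijo test holds
                x_new = x + t * d
                g_new = grad(x_new)
                s, y = x_new - x, g_new - g
                sy = s.dot(y)
                alpha = s.dot(s) / sy if sy > 1e-12 else alpha0   # safeguarded BB1 step size
                x, g = x_new, g_new
                f_hist.append(f(x))
            return x

    In practice the relaxed test tends to accept the raw BB step far more often than a monotone Armijo rule would, which is the motivation for nonmonotone globalization.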

    Efficient Inexact Proximal Gradient Algorithm for Nonconvex Problems

    The proximal gradient algorithm has been popularly used for convex optimization. Recently, it has also been extended to nonconvex problems, and the current state of the art is the nonmonotone accelerated proximal gradient algorithm. However, it typically requires two exact proximal steps in each iteration, and can be inefficient when the proximal step is expensive. In this paper, we propose an efficient proximal gradient algorithm that requires only one inexact (and thus less expensive) proximal step in each iteration. Convergence to a critical point of the nonconvex problem is still guaranteed, with an O(1/k) convergence rate, which is the best rate for nonconvex problems with first-order methods. Experiments on a number of problems demonstrate that the proposed algorithm has performance comparable to the state of the art, but is much faster.
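    For context, the exact (non-accelerated) proximal gradient iteration that such methods build on is sketched below; the inexact proximal step and the acceleration of the proposed algorithm are not reproduced, and soft-thresholding is used only as one convenient example of a proximal operator.

        import numpy as np

        def soft_threshold(v, lam):
            """Proximal operator of lam * ||.||_1 (one example prox, for illustration)."""
            return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

        def proximal_gradient(grad_f, prox_g, x0, step, max_iter=1000, tol=1e-8):
            """Basic iteration x_{k+1} = prox_{step*g}(x_k - step * grad_f(x_k))."""
            x = np.asarray(x0, dtype=float)
            for _ in range(max_iter):
                x_new = prox_g(x - step * grad_f(x), step)
                if np.linalg.norm(x_new - x) < tol:
                    return x_new
                x = x_new
            return x

        # Illustrative usage on l1-regularized least squares (A, b, lam, L assumed given;
        # L is an upper bound on the Lipschitz constant of grad_f):
        #   grad_f = lambda x: A.T @ (A @ x - b)
        #   prox_g = lambda v, t: soft_threshold(v, t * lam)
        #   x_hat  = proximal_gradient(grad_f, prox_g, np.zeros(A.shape[1]), step=1.0 / L)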

    On Iterative Algorithms for Quantitative Photoacoustic Tomography in the Radiative Transport Regime

    In this paper, we describe a numerical reconstruction method for quantitative photoacoustic tomography (QPAT) based on the radiative transfer equation (RTE), which models light propagation more accurately than the diffusion approximation (DA). We investigate the reconstruction of the absorption coefficient and/or the scattering coefficient of biological tissues. Given the scattering coefficient, an improved fixed-point iterative method is proposed to retrieve the absorption coefficient, chosen for its low computational cost, and we prove its convergence. To retrieve the two coefficients simultaneously, the Barzilai-Borwein (BB) method is applied. Since the reconstruction of the optical coefficients involves solving the original and adjoint RTEs within an optimization framework, an efficient, high-accuracy solver is developed by improving the one in [Gao]. Simulation experiments illustrate the performance of the improved fixed-point iterative method and the BB method for QPAT in the two cases. Comment: 21 pages, 44 figures.
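    For orientation, the classical fixed-point reconstruction of the absorption coefficient iterates mu_a <- H / Phi(mu_a), where H is the measured absorbed-energy map and Phi the fluence returned by a forward light-transport solve. The sketch below uses a placeholder forward_fluence callback and shows only this textbook scheme, not the improved RTE-based variant or the BB reconstruction proposed in the paper.

        import numpy as np

        def fixed_point_absorption(H, forward_fluence, mu0, max_iter=50, tol=1e-6):
            """Classical fixed-point update mu_a <- H / Phi(mu_a) for QPAT.

            forward_fluence(mu_a) is a placeholder for an RTE (or diffusion)
            forward solver returning the fluence on the same grid as the
            absorbed-energy data H.
            """
            mu = np.asarray(mu0, dtype=float)
            for _ in range(max_iter):
                phi = forward_fluence(mu)
                mu_new = H / np.maximum(phi, 1e-12)   # guard against vanishing fluence
                if np.linalg.norm(mu_new - mu) <= tol * np.linalg.norm(mu):
                    return mu_new
                mu = mu_new
            return mu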

    The Barzilai and Borwein gradient method with nonmonotone line search for nonsmooth convex optimization problems

    The Barzilai and Borwein gradient algorithm has received a great deal of attention in recent decades, since it is simple and effective for smooth optimization problems. Can it be extended to solve nonsmooth problems? In this paper, we answer this question positively. The Barzilai and Borwein gradient algorithm, combined with a nonmonotone line search technique, is proposed for nonsmooth convex minimization. The global convergence of the given algorithm is established under suitable conditions. Numerical results show that this method is efficient.
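    One standard route for bringing a gradient-type scheme such as BB to bear on a nonsmooth convex function, shown here only as an assumption and not necessarily the construction used in the paper, is to work with its Moreau-Yosida regularization, which is differentiable and whose gradient is available from the proximal map.

        import numpy as np

        def moreau_gradient(prox_f, x, lam):
            """Gradient of the Moreau envelope F_lam(x) = min_y f(y) + ||y - x||^2 / (2*lam).

            prox_f(x, lam) is an assumed interface returning the proximal point of f at x.
            """
            return (x - prox_f(x, lam)) / lam

        def bb_on_moreau_envelope(prox_f, x0, lam=1.0, max_iter=300, tol=1e-8):
            """Plain BB iteration applied to the smooth Moreau envelope of a convex f."""
            x = np.asarray(x0, dtype=float)
            g = moreau_gradient(prox_f, x, lam)
            alpha = 1.0
            for _ in range(max_iter):
                if np.linalg.norm(g) < tol:
                    break
                x_new = x - alpha * g
                g_new = moreau_gradient(prox_f, x_new, lam)
                s, y = x_new - x, g_new - g
                sy = s.dot(y)
                alpha = s.dot(s) / sy if sy > 1e-12 else 1.0   # safeguarded BB1 step
                x, g = x_new, g_new
            return x

    A nonmonotone line search of the kind sketched for the first paper above would be layered on top of this plain BB update to obtain a globally convergent scheme.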

    Nonmonotone globalization of the finite-difference Newton-GMRES method for nonlinear equations.

    In this paper, we study nonmonotone globalization strategies, in connection with the finite-difference inexact Newton-GMRES method for nonlinear equations. We first define a globalization algorithm that combines nonmonotone watchdog rules and nonmonotone derivative-free linesearches related to a merit function, and prove its global convergence under the assumption that the Jacobian is nonsingular and that the iterations of the GMRES subspace method can be completed at each step. Then we introduce a hybrid stabilization scheme employing occasional line searches along positive bases, and establish global convergence towards a solution of the system under the less demanding condition that the Jacobian is nonsingular at stationary points of the merit function. Through a set of numerical examples, we show that the proposed techniques may constitute useful options to be added to solvers for nonlinear systems of equations. © 2010 Taylor & Francis
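    The finite-difference Newton-GMRES kernel that such globalization strategies wrap can be sketched in a few lines; the watchdog rules, derivative-free line searches, and positive-basis searches of the paper are deliberately omitted, and the full Newton step with a fixed difference increment is a simplistic placeholder.

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, gmres

        def newton_gmres_fd(F, x0, max_iter=50, tol=1e-8, eps=1e-7):
            """Inexact Newton method with finite-difference Jacobian-vector products."""
            x = np.asarray(x0, dtype=float)
            n = x.size
            for _ in range(max_iter):
                Fx = F(x)
                if np.linalg.norm(Fx) < tol:
                    break
                # Approximate J(x) v by a forward difference of F along v.
                def jac_vec(v, x=x, Fx=Fx):
                    return (F(x + eps * v) - Fx) / eps
                J = LinearOperator((n, n), matvec=jac_vec)
                d, _ = gmres(J, -Fx)          # inexact Newton step from GMRES
                x = x + d                     # full step; no globalization here
            return x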