
    A Spectral Dai-Yuan-Type Conjugate Gradient Method for Unconstrained Optimization

    A new spectral conjugate gradient method (SDYCG) is presented in this paper for solving unconstrained optimization problems. Our method provides a new expression for the spectral parameter; this formula ensures that the sufficient descent condition holds. The search direction in the SDYCG can be viewed as a combination of the spectral gradient and the Dai-Yuan conjugate gradient directions. The global convergence of the SDYCG is also established. Numerical results suggest that the SDYCG is capable of solving large-scale nonlinear unconstrained optimization problems.
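
    To make the construction concrete, below is a minimal Python sketch of a spectral Dai-Yuan-type iteration. The abstract does not state the paper's formula for the spectral parameter, so the Barzilai-Borwein-style choice of theta, the Armijo line search, and the restart safeguard are illustrative assumptions, not the authors' method.

```python
import numpy as np

def sdycg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Spectral Dai-Yuan-type CG sketch (placeholder spectral parameter)."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g  # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (the paper's analysis would
        # typically assume Wolfe-type conditions instead).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if abs(d @ y) < 1e-12 or abs(s @ y) < 1e-12:
            d = -g_new                           # degenerate step: restart
        else:
            beta_dy = (g_new @ g_new) / (d @ y)  # Dai-Yuan parameter
            theta = (s @ s) / (s @ y)            # placeholder spectral parameter
            d = -theta * g_new + beta_dy * d     # combined search direction
            if g_new @ d >= 0:
                d = -g_new                       # safeguard: keep d a descent direction
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(sdycg(f, grad, [-1.2, 1.0]))
```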

    A quasi-Newton proximal splitting method

    A new result in convex analysis on the calculation of proximity operators in certain scaled norms is derived. We describe efficient implementations of the proximity calculation for a useful class of functions; the implementations exploit the piecewise-linear nature of the dual problem. The second part of the paper applies this result to the acceleration of convex minimization problems, leading to an elegant quasi-Newton method. The optimization method compares favorably against state-of-the-art alternatives. The algorithm has extensive applications, including signal processing, sparse recovery, machine learning, and classification.
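
    The central primitive here is a proximity operator evaluated in a scaled norm. The sketch below illustrates the simplest case, assuming a fixed diagonal metric, under which the scaled prox of the l1 norm stays separable; the diagonal-plus-rank-one metrics and the piecewise-linear dual solver of the paper are not reproduced, and all names are illustrative.

```python
import numpy as np

def prox_l1_diag(z, lam, d):
    """Prox of lam*||x||_1 in the metric (1/2)(x - z)^T diag(d) (x - z).

    A diagonal metric keeps the problem separable, so it reduces to
    componentwise soft-thresholding with thresholds lam / d_i.
    """
    return np.sign(z) * np.maximum(np.abs(z) - lam / d, 0.0)

def scaled_prox_gradient(grad_f, d, lam, x0, n_iter=300):
    """Variable-metric forward-backward iteration with a fixed diagonal D:
    x <- prox_D(x - D^{-1} grad_f(x)). A simplification of the paper's
    quasi-Newton splitting, which also handles rank-one corrections."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iter):
        x = prox_l1_diag(x - grad_f(x) / d, lam, d)
    return x

# Example: sparse recovery, min 1/2 ||A x - b||^2 + lam ||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = A[:, 0]                                   # signal = first coordinate
d = np.full(50, np.linalg.norm(A, 2) ** 2)    # diagonal >= Lipschitz constant
x = scaled_prox_gradient(lambda x: A.T @ (A @ x - b), d, 0.1, np.zeros(50))
```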

    An optimal subgradient algorithm for large-scale convex optimization in simple domains

    This paper shows that the optimal subgradient algorithm, OSGA, proposed in [NeuO] can be used for solving structured large-scale convex constrained optimization problems. Only first-order information is required, and the optimal complexity bounds for both smooth and nonsmooth problems are attained. More specifically, we consider two classes of problems: (i) a convex objective with a simple closed convex domain, where the orthogonal projection onto this feasible domain is efficiently available; (ii) a convex objective with a simple convex functional constraint. If we equip OSGA with an appropriate prox-function, the OSGA subproblem can be solved either in closed form or by a simple iterative scheme, which is especially important for large-scale problems. We report numerical results for some applications to show the efficiency of the proposed scheme. A software package implementing OSGA for the above domains is available.
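
    OSGA's per-iteration cost hinges on the feasible domain being "simple", i.e., on a cheap orthogonal projection or prox subproblem. As a hedged illustration of what "simple" means in practice, here are two classical Euclidean projections in Python; these are standard routines, not the OSGA subproblem solver itself.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box {x : lo <= x <= hi}."""
    return np.clip(x, lo, hi)

def project_simplex(v):
    """Euclidean projection onto the probability simplex
    {x : x >= 0, sum(x) = 1}, via the classical sort-based method."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u > (css - 1.0) / np.arange(1, v.size + 1))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

print(project_simplex(np.array([2.0, 0.0, -1.0])))   # -> [1. 0. 0.]
```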

    Optimization algorithms for the solution of the frictionless normal contact between rough surfaces

    This paper revisits the fundamental equations for the solution of the frictionless unilateral normal contact problem between a rough rigid surface and a linear elastic half-plane using the boundary element method (BEM). After recasting the resulting Linear Complementarity Problem (LCP) as a convex quadratic program (QP) with nonnegativity constraints, different optimization algorithms are compared for its solution: (i) a Greedy method, based on different solvers for the unconstrained linear system (Conjugate Gradient (CG), Gauss-Seidel, Cholesky factorization); (ii) a constrained CG algorithm; (iii) the Alternating Direction Method of Multipliers (ADMM); and (iv) the Non-Negative Least Squares (NNLS) algorithm, possibly warm-started by accelerated gradient projection steps or by taking advantage of a loading history. The latter method is two orders of magnitude faster than the Greedy CG method and one order of magnitude faster than the constrained CG algorithm. Finally, we propose another type of warm start based on a refined criterion for the identification of the initial trial contact domain that can be used in conjunction with all the previous optimization algorithms. This method, called Cascade Multi-Resolution (CMR), takes advantage of physical considerations regarding the scaling of the contact predictions by changing the surface resolution. The method is very efficient and accurate when applied to real or numerically generated rough surfaces, provided that their power spectral density function is of power-law type, as in the case of self-similar fractal surfaces.
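
    As a toy version of the NNLS route in item (iv), the contact QP min (1/2) p^T A p - b^T p with p >= 0 can be rewritten as a non-negative least-squares problem through a Cholesky factor of the influence matrix. The sketch below assumes A is symmetric positive definite and omits the warm starts and the CMR refinement described above.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from scipy.optimize import nnls

def contact_pressures_nnls(A, b):
    """Solve  min 1/2 p^T A p - b^T p  s.t. p >= 0  via NNLS.

    With A symmetric positive definite and A = L L^T (Cholesky), the
    objective equals 1/2 ||L^T p - L^{-1} b||^2 up to a constant, so
    the QP becomes a standard non-negative least-squares problem.
    """
    L = cholesky(A, lower=True)
    c = solve_triangular(L, b, lower=True)   # c = L^{-1} b
    p, _ = nnls(L.T, c)
    return p

# Tiny demo with a random SPD matrix standing in for the BEM kernel.
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 30))
p = contact_pressures_nnls(M @ M.T + 30 * np.eye(30), rng.standard_normal(30))
assert (p >= 0).all()
```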

    Diagonal preconditioned conjugate gradient algorithm for unconstrained optimization

    Nonlinear conjugate gradient (CG) methods have been widely used for solving unconstrained optimization problems. They are well suited to large-scale problems owing to their low memory requirements and low computational cost. In this paper, a new diagonal preconditioned conjugate gradient (PRECG) algorithm is designed, motivated by the fact that a preconditioner can greatly enhance the performance of the CG method. Under mild conditions, it is shown that the algorithm is globally convergent for strongly convex functions. Numerical results show that the new diagonal PRECG method works better than the standard CG method.
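
    Below is a minimal Python sketch of one way a diagonal preconditioner can enter a nonlinear CG iteration. The paper's diagonal updating formula is not given in the abstract, so the secant-based diagonal and the preconditioned Fletcher-Reeves parameter used here are placeholders for illustration.

```python
import numpy as np

def precg(f, grad, x0, tol=1e-6, max_iter=500):
    """Diagonally preconditioned nonlinear CG sketch.

    The diagonal m is a secant-based guess (clipped positive); the
    paper's own diagonal updating rule is not reproduced here.
    """
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    m = np.ones_like(x)          # initial diagonal preconditioner
    d = -g / m
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Secant-based diagonal update, clipped to stay positive.
        m = np.clip(np.abs(y) / np.maximum(np.abs(s), 1e-12), 1e-8, 1e8)
        # Preconditioned Fletcher-Reeves parameter (same m in both
        # terms, a further simplification).
        beta = (g_new @ (g_new / m)) / (g @ (g / m))
        d = -g_new / m + beta * d
        if g_new @ d >= 0:       # safeguard: restart on non-descent
            d = -g_new / m
        x, g = x_new, g_new
    return x

# Example: an ill-conditioned quadratic, where the diagonal helps.
D = np.diag([1.0, 10.0, 100.0])
print(precg(lambda x: x @ D @ x, lambda x: 2 * D @ x, np.ones(3)))
```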