
    How grossone can be helpful to iteratively compute negative curvature directions

    We consider an iterative computation of negative curvature directions within large-scale optimization frameworks. We show that, for the latter purpose, borrowing the ideas in [1, 3] and [4], we can fruitfully pair the Conjugate Gradient (CG) method with a recently introduced numerical approach involving the use of grossone [5]. In particular, although in principle the CG method is well-posed only on positive definite linear systems, the use of grossone can enhance the performance of the CG method, allowing the computation of negative curvature directions, too. The overall method in our proposal significantly generalizes the theory proposed in [1] and [3], and straightforwardly allows the use of a CG-based method on indefinite Newton's equations.
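    The situation the abstract refers to can be illustrated with a plain CG iteration that monitors the curvature term p^T A p: on an indefinite system this quantity may become non-positive, at which point standard CG breaks down but the current direction p is exactly a direction of non-positive curvature. The sketch below is a generic illustration in Python/NumPy; the function name and stopping rule are ours, and it does not reproduce the grossone-based machinery of the paper:

    ```python
    import numpy as np

    def cg_neg_curvature(A, b, tol=1e-8, max_iter=None):
        """Run CG on A x = b, but stop if a direction of non-positive
        curvature (p^T A p <= 0) is encountered, returning that direction.

        Returns (x, d): the current iterate and either the detected
        negative-curvature direction d, or None if CG converged."""
        n = len(b)
        if max_iter is None:
            max_iter = n
        x = np.zeros(n)
        r = b.copy()          # residual b - A x (x starts at 0)
        p = r.copy()          # current search direction
        for _ in range(max_iter):
            Ap = A @ p
            curv = p @ Ap     # curvature of the quadratic model along p
            if curv <= 0:
                # A is not positive definite along p: hand p back to the
                # outer optimization method as a negative-curvature direction
                return x, p
            alpha = (r @ r) / curv
            x = x + alpha * p
            r_new = r - alpha * Ap
            if np.linalg.norm(r_new) < tol:
                return x, None
            beta = (r_new @ r_new) / (r @ r)
            p = r_new + beta * p
            r = r_new
        return x, None
    ```

    On a positive definite system this behaves as ordinary CG; on an indefinite Newton system the returned direction p can be exploited by the outer method, which is the setting the paper's grossone-based variant addresses.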

    Bridging the gap between Trust-Region Methods (TRMs) and Linesearch Based Methods (LBMs) for Nonlinear Programming: quadratic sub-problems

    We consider the solution of a recurrent sub-problem arising in both constrained and unconstrained Nonlinear Programming: namely, the minimization of a quadratic function subject to linear constraints. This problem appears in a number of LBM frameworks and, to some extent, reveals a close analogy with the solution of trust-region sub-problems. In particular, we refer to a structured quadratic problem where five linear inequality constraints are included. We show that our proposal retains an appreciable versatility, despite its particular structure, so that a number of different real instances may be reformulated following the pattern in our proposal. Moreover, we detail how to compute an exact global solution of our quadratic sub-problem, exploiting first-order KKT conditions.
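    The equality-constrained building block of such a KKT-based approach can be sketched as follows (a minimal Python/NumPy illustration; the function name is ours, and the paper's actual sub-problem involves five inequality constraints, which would additionally require enumerating or identifying the active set):

    ```python
    import numpy as np

    def solve_eq_qp(Q, c, A, b):
        """Solve  min 0.5 x^T Q x + c^T x  s.t.  A x = b
        by solving the first-order KKT linear system

            [Q  A^T] [x  ]   [-c]
            [A   0 ] [lam] = [ b]

        Returns the primal solution x and the Lagrange multipliers lam.
        Assumes the KKT matrix is nonsingular (e.g. Q positive definite
        on the null space of A, and A of full row rank)."""
        n = Q.shape[0]
        m = A.shape[0]
        K = np.block([[Q, A.T],
                      [A, np.zeros((m, m))]])
        rhs = np.concatenate([-c, b])
        sol = np.linalg.solve(K, rhs)
        return sol[:n], sol[n:]
    ```

    For inequality constraints, the first-order KKT conditions add sign restrictions and complementarity; with only five constraints, the finitely many active-set combinations can be examined directly to certify an exact global solution, in the spirit of the approach described above.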