
    On the behavior of the gradient norm in the steepest descent method

    It is well known that the norm of the gradient may be unreliable as a stopping test in unconstrained optimization, and that it often exhibits oscillations in the course of the optimization. In this paper we present results describing the properties of the gradient norm for the steepest descent method applied to quadratic objective functions. We also make some general observations that apply to nonlinear problems, relating the gradient norm, the objective function value, and the path generated by the iterates.
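    The oscillation described in the abstract is easy to reproduce. Below is a minimal sketch (assuming NumPy; the matrix, starting point, and iteration count are arbitrary illustrative choices, not the paper's experiments) of steepest descent with exact line search on a convex quadratic, recording the gradient norm at each iterate: on ill-conditioned problems the function value decreases monotonically, yet the recorded gradient norms are typically non-monotone.

```python
import numpy as np

def steepest_descent(A, b, x0, iters=30):
    """Steepest descent with exact line search on f(x) = 0.5*x^T A x - b^T x."""
    x = x0.copy()
    grad_norms = []
    for _ in range(iters):
        g = A @ x - b                    # gradient of the quadratic
        grad_norms.append(np.linalg.norm(g))
        alpha = (g @ g) / (g @ (A @ g))  # exact minimizer along -g
        x = x - alpha * g
    return x, grad_norms

# Ill-conditioned diagonal quadratic: f(x_k) decreases at every step,
# but the sequence ||g_k|| usually oscillates rather than decaying smoothly.
A = np.diag([1.0, 10.0, 100.0])
b = np.zeros(3)
x, norms = steepest_descent(A, b, np.array([1.0, 1.0, 1.0]))
print([round(v, 4) for v in norms])
```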

    Geometrical inverse preconditioning for symmetric positive definite matrices

    We focus on inverse preconditioners based on minimizing $F(X) = 1 - \cos(XA, I)$, where $XA$ is the preconditioned matrix and $A$ is symmetric and positive definite. We present and analyze gradient-type methods to minimize $F(X)$ on a suitable compact set. For that we use the geometrical properties of the non-polyhedral cone of symmetric and positive definite matrices, and also the special properties of $F(X)$ on the feasible set. Preliminary and encouraging numerical results are also presented, in which dense and sparse approximations are included.
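    To make the objective concrete, here is a sketch (assuming NumPy; the test matrix, starting point, step rule, and iteration count are illustrative choices, not taken from the paper) that reads $\cos(XA, I)$ as the cosine under the Frobenius inner product, evaluates $F(X)$, uses a hand-derived unconstrained gradient of $F$ for symmetric $A$, and runs a plain normalized-step gradient descent. The paper's actual methods additionally keep the iterates in a compact subset of the SPD cone, a safeguard this sketch omits.

```python
import numpy as np

def F(X, A):
    # F(X) = 1 - cos(XA, I), with the Frobenius-inner-product cosine:
    # cos(M, I) = trace(M) / (||M||_F * sqrt(n)).
    n = A.shape[0]
    M = X @ A
    return 1.0 - np.trace(M) / (np.linalg.norm(M, 'fro') * np.sqrt(n))

def grad_F(X, A):
    # Unconstrained gradient of F for symmetric A, from differentiating
    # trace(XA) / (sqrt(n) * ||XA||_F) with respect to X.
    n = A.shape[0]
    M = X @ A                          # M = XA
    r = np.linalg.norm(M, 'fro')
    c = np.trace(M)
    return -(A / r - c * (M @ A) / r**3) / np.sqrt(n)

# Arbitrary SPD test matrix and a normalized-step gradient descent
# (no projection onto a compact subset of the SPD cone, unlike the paper).
rng = np.random.default_rng(0)
B = rng.standard_normal((6, 6))
A = B @ B.T + 6 * np.eye(6)
X = np.eye(6)
for _ in range(300):
    G = grad_F(X, A)
    X -= 0.05 * (np.linalg.norm(X, 'fro') / np.linalg.norm(G, 'fro')) * G

# F should decrease from the starting value; F(X) = 0 iff XA is a
# positive multiple of the identity, i.e. X is proportional to inv(A).
print(F(np.eye(6), A), F(X, A))
```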