
    Preconditioned conjugate gradient method for generalized least squares problems

    A variant of the preconditioned conjugate gradient method to solve generalized least squares problems is presented. If the problem is min_x (Ax − b)^T W^{-1} (Ax − b) with A ∈ R^{m×n} and W ∈ R^{m×m} symmetric and positive definite, the method needs only a preconditioner A_1 ∈ R^{n×n}, but not the inverse of the matrix W or of any of its submatrices. Freund's comparison result for regular least squares problems is extended to generalized least squares problems. An error bound is also given.
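
    Below is a minimal numerical sketch of the problem class just described, assuming NumPy/SciPy; the test matrices, the diagonal preconditioner (a stand-in for the A_1 of the paper), and the Cholesky solve for the action of W^{-1} are all illustrative assumptions. Unlike the paper's variant, this textbook route does apply W^{-1}, so it illustrates the generalized least squares setting rather than the paper's method.

    # Generalized least squares:  min_x (Ax - b)^T W^{-1} (Ax - b),
    # solved here via preconditioned CG on the weighted normal equations
    #     A^T W^{-1} A x = A^T W^{-1} b.
    import numpy as np
    from scipy.linalg import cho_factor, cho_solve
    from scipy.sparse.linalg import LinearOperator, cg

    rng = np.random.default_rng(0)
    m, n = 50, 8
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)
    L = rng.standard_normal((m, m))
    W = L @ L.T + m * np.eye(m)            # symmetric positive definite weight matrix

    cW = cho_factor(W)                     # Cholesky factorization of W

    def normal_matvec(x):                  # x -> A^T W^{-1} A x
        return A.T @ cho_solve(cW, A @ x)

    Aop = LinearOperator((n, n), matvec=normal_matvec)
    rhs = A.T @ cho_solve(cW, b)

    # Simple Jacobi preconditioner: inverse of diag(A^T W^{-1} A),
    # standing in for the preconditioner A_1 discussed in the paper.
    d = np.array([normal_matvec(e)[i] for i, e in enumerate(np.eye(n))])
    M = LinearOperator((n, n), matvec=lambda v: v / d)

    x, info = cg(Aop, rhs, M=M)
    assert info == 0                       # 0 means CG converged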

    A New Criterion for Optimality in Nonlinear Programming

    We establish a sufficient condition for the existence of minimizers of real-valued convex functions on closed subsets of finite-dimensional spaces. We compare this condition with other related results.

    A Characterization Of The Set Of Fixed Points Of Some Smoothed Operators

    We characterize the set F of fixed points of an operator T(x) = SQ(x), where S is a positive definite, symmetric, and stochastic matrix and Q is a convex combination of orthogonal projections onto closed convex sets. We show that F is the set of minimizers of a convex function: the sum of a weighted average of the squares of the distances to the convex sets and a nonnegative quadratic related to the matrix S. © 1992.
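
    The following small sketch, written under assumed data (two box constraints in R^2, weights 0.6/0.4, and a 2x2 doubly stochastic S), illustrates the operator T(x) = SQ(x) described above; it is not taken from the paper.

    import numpy as np

    def proj_box(x, lo, hi):
        # Orthogonal projection onto the box [lo, hi]^n.
        return np.clip(x, lo, hi)

    # Two illustrative closed convex sets and their convex weights.
    projections = [lambda x: proj_box(x, -1.0, 1.0),
                   lambda x: proj_box(x, 0.5, 2.0)]
    weights = np.array([0.6, 0.4])          # nonnegative, summing to 1

    # A symmetric, doubly stochastic, positive definite matrix S.
    S = np.array([[0.75, 0.25],
                  [0.25, 0.75]])

    def T(x):
        Qx = sum(w * P(x) for w, P in zip(weights, projections))
        return S @ Qx

    # Repeatedly applying T in this small example settles at a fixed point;
    # by the paper's characterization it minimizes the weighted average of
    # squared distances to the sets plus a nonnegative quadratic tied to S.
    x = np.array([3.0, -2.0])
    for _ in range(200):
        x = T(x)
    print(x)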

    Minimization Of Nonsmooth Convex Functionals In Banach Spaces

    We develop a unified framework for convergence analysis of subgradient and subgradient projection methods for minimization of nonsmooth convex functionals in Banach spaces. The important novel features of our analysis are that we neither assume that the functional is uniformly or strongly convex, nor use regularization techniques. Moreover, no boundedness assumptions are made on the level sets of the functional or the feasible set of the problem. In fact, the solution set can be unbounded. Under very mild assumptions, we prove that the sequence of iterates is bounded and has at least one weak accumulation point which is a minimizer. Moreover, all weak accumulation points of the sequence of CesĂ ro averages of the iterates are solutions of the minimization problem. Under certain additional assumptions (which are satisfied for several important instances of Banach spaces), we are able to exhibit weak convergence of the whole sequence of iterates to one of the solutions of the optimization problem.
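
    As a concrete finite-dimensional illustration of the method class analyzed above, here is a projected subgradient iteration on an assumed toy problem (minimizing the l1 norm over a hyperplane); the problem, step-size rule, and function names are the sketch's own choices, not the paper's.

    import numpy as np

    def subgrad_l1(x):
        # One valid subgradient of the (nonsmooth) l1 norm: sign(x).
        return np.sign(x)

    def proj_hyperplane(x, target=1.0):
        # Orthogonal projection onto the closed convex set {x : sum(x) = target}.
        return x - (x.sum() - target) / x.size

    x = np.array([2.0, -1.0, 0.5])
    for k in range(1, 2001):
        t = 1.0 / k                          # diminishing, non-summable step sizes
        x = proj_hyperplane(x - t * subgrad_l1(x))
    print(x)                                 # approaches a minimizer of ||x||_1 on the hyperplane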

    Singularities of monotone vector fields and an extragradient-type algorithm

    Abstract. Bearing in mind the notion of a monotone vector field on Riemannian manifolds, see [12–16], we study the set of their singularities and, for a particular class of manifolds, develop an extragradient-type algorithm convergent to singularities of such vector fields. In particular, our method can be used for solving nonlinear constrained optimization problems in Euclidean space with a convex objective function and a constraint set that is a constant-curvature Hadamard manifold. Our paper shows how tools of convex analysis on Riemannian manifolds can be used to solve certain nonconvex constrained problems in Euclidean space.
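
    For orientation, the sketch below shows the Euclidean special case of an extragradient-type iteration for finding a singularity, i.e. a zero, of a monotone vector field; the specific field, step size, and iteration count are illustrative assumptions, and the Riemannian algorithm of the paper replaces these linear steps with exponential maps on the manifold.

    import numpy as np

    # A monotone affine vector field (identity plus a skew-symmetric part);
    # its unique singularity is the origin.
    M = np.array([[1.0, 2.0],
                  [-2.0, 1.0]])

    def F(x):
        return M @ x

    x = np.array([1.0, 1.0])
    step = 0.2                               # below 1 / (Lipschitz constant of F)
    for _ in range(200):
        y = x - step * F(x)                  # predictor (extragradient) step
        x = x - step * F(y)                  # corrector step using the field at y
    print(x)                                 # converges toward the singularity at 0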
    • 
