130 research outputs found

    A weakly convergent fully inexact Douglas-Rachford method with relative error tolerance

    The Douglas-Rachford method is a splitting algorithm for finding a zero of the sum of two maximal monotone operators. Each of its iterations requires the sequential solution of two proximal subproblems. The aim of this work is to present a fully inexact version of the Douglas-Rachford method in which both proximal subproblems are solved approximately within a relative error tolerance. We also present a semi-inexact variant in which the first subproblem is solved exactly and the second one inexactly. We prove that both methods generate sequences that converge weakly to a solution of the underlying inclusion problem, whenever one exists.
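The exact iteration that the paper's inexact variants generalize can be sketched as follows. This is the classical Douglas-Rachford scheme applied to an illustrative scalar problem (minimize |x| + 0.5*(x - 3)^2); the test function, step size, and iteration count are my own choices, not taken from the paper.

```python
# Minimal sketch of the exact Douglas-Rachford iteration. Each step solves
# the two proximal subproblems the abstract refers to, here in closed form.

def prox_abs(v, lam):
    # Proximal map of lam*|x|: soft-thresholding of v by lam.
    return max(abs(v) - lam, 0.0) * (1.0 if v > 0 else -1.0)

def prox_quad(v, lam, a=3.0):
    # Proximal map of lam * 0.5*(x - a)^2: weighted average of v and a.
    return (v + lam * a) / (1.0 + lam)

def douglas_rachford(x0, lam=1.0, iters=60):
    # Governing update: x_{k+1} = x_k + prox_g(2*prox_f(x_k) - x_k) - prox_f(x_k)
    x = x0
    for _ in range(iters):
        y = prox_abs(x, lam)             # first proximal subproblem
        z = prox_quad(2.0 * y - x, lam)  # second proximal subproblem
        x = x + z - y
    return prox_abs(x, lam)              # the shadow sequence y_k converges

print(douglas_rachford(0.0))  # approaches 2.0, the soft-threshold solution
```

In the paper's fully inexact version, the two `prox` calls above would be replaced by approximate solves accepted under a relative error criterion; the sketch only shows the exact skeleton those criteria relax.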

    Linearly Convergent First-Order Algorithms for Semi-definite Programming

    In this paper, we consider two formulations for Linear Matrix Inequalities (LMIs) under a Slater-type constraint qualification assumption, namely, smooth and non-smooth SDP formulations. We also propose two first-order linearly convergent algorithms for solving these formulations. Moreover, we introduce a bundle-level method which converges linearly, uniformly for both smooth and non-smooth problems, and does not require any smoothness information. The convergence properties of these algorithms are also discussed. Finally, we consider a special case of LMIs, a linear system of inequalities, and show that a linearly convergent algorithm can be obtained under a weaker assumption.
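For the special case mentioned last, a linear system of inequalities Ax <= b, the classical cyclic projection (relaxation) method is a standard scheme that converges linearly under strict feasibility, a Slater-type condition. The sketch below is that classical method on a toy instance of my own; it is not necessarily the algorithm proposed in the paper.

```python
# Cyclic projection method for a linear system of inequalities Ax <= b.
# Each step projects the iterate onto one violated half-space.

def project_halfspace(x, a, b):
    # Project x onto {y : <a, y> <= b}; no-op if x already satisfies it.
    viol = sum(ai * xi for ai, xi in zip(a, x)) - b
    if viol <= 0.0:
        return x
    nrm2 = sum(ai * ai for ai in a)
    return [xi - (viol / nrm2) * ai for ai, xi in zip(a, x)]

def cyclic_projections(x, A, b, sweeps=100):
    # Sweep through the half-spaces repeatedly, projecting onto each in turn.
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            x = project_halfspace(x, a_i, b_i)
    return x

# Toy feasible region: x1 >= 0, x2 >= 0, x1 + x2 <= 1 (strictly feasible).
A = [[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]]
b = [0.0, 0.0, 1.0]
x = cyclic_projections([2.0, 2.0], A, b)  # lands at a feasible point
```

Strict feasibility gives the error bound that drives the linear rate; without it, the same iteration still converges to the feasible set, but possibly sublinearly.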

    Global weak sharp minima for convex (semi-)infinite optimization problems

    Abstract: We mainly consider global weak sharp minima for convex infinite and semi-infinite optimization problems (CIP). In terms of the normal cone, the subdifferential, and the directional derivative, we provide several characterizations for (CIP) to have the global weak sharp minimum property.
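For reference, the classical notion of global weak sharp minima discussed above can be stated as follows; the symbols here are standard choices of mine, not necessarily the paper's notation.

```latex
% S: solution set of (CIP), f^*: optimal value, C: feasible set,
% d(x, S): Euclidean distance from x to S.
% S is a set of global weak sharp minima with modulus \alpha > 0 if
f(x) \;\ge\; f^* + \alpha\, d(x, S) \qquad \text{for all } x \in C.
```

The characterizations via the normal cone, subdifferential, and directional derivative are different ways of certifying that such an \alpha exists.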