
    Generalized semi-infinite programming: Numerical aspects

    Generalized semi-infinite optimization problems (GSIP) are considered. It is investigated how the numerical methods for standard semi-infinite programming (SIP) can be extended to GSIP. Newton methods can be extended immediately. For discretization methods the situation is more complicated. These difficulties are discussed, and convergence results for a discretization method and an exchange method are derived under fairly general assumptions. The question of under which conditions GSIP represents a convex problem is also answered.
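
    For orientation, the two problem classes can be written schematically as follows (generic notation, not taken verbatim from the paper); the only structural difference is that in GSIP the index set of the infinitely many constraints itself depends on the decision variable, which is what complicates discretization:

        \[
        \text{SIP:}\quad \min_{x \in \mathbb{R}^n} f(x) \ \ \text{s.t.}\ \ g(x,y) \le 0 \ \ \forall\, y \in Y,
        \qquad
        \text{GSIP:}\quad \min_{x \in \mathbb{R}^n} f(x) \ \ \text{s.t.}\ \ g(x,y) \le 0 \ \ \forall\, y \in Y(x).
        \]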

    Optimality certificates for convex minimization and Helly numbers

    We consider the problem of minimizing a convex function over a subset of R^n that is not necessarily convex (minimization of a convex function over the integer points in a polytope is a special case). We define a family of duals for this problem and show that, under some natural conditions, strong duality holds for a dual problem in this family that is more restrictive than previously considered duals. Comment: 5 pages.
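
    As a schematic illustration (the symbols here are generic placeholders), the problem class and the integer-programming special case mentioned in the abstract can be written as

        \[
        \min\ f(x) \ \ \text{s.t.}\ \ x \in S \subseteq \mathbb{R}^n, \qquad f \ \text{convex},\ S \ \text{not necessarily convex};
        \qquad \text{e.g. } S = P \cap \mathbb{Z}^n \ \text{for a polytope } P.
        \]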

    On representations of the feasible set in convex optimization

    We consider the convex optimization problem $\min \{f(x) : g_j(x) \leq 0,\ j = 1, \ldots, m\}$ where $f$ is convex, the feasible set $K$ is convex and Slater's condition holds, but the functions $g_j$ are not necessarily convex. We show that for any representation of $K$ that satisfies a mild nondegeneracy assumption, every minimizer is a Karush-Kuhn-Tucker (KKT) point and, conversely, every KKT point is a minimizer. That is, the KKT optimality conditions are necessary and sufficient, as in convex programming where one assumes that the $g_j$ are convex. So in convex optimization, as far as one is concerned with KKT points, what really matters is the geometry of $K$ and not so much its representation. Comment: to appear in Optimization Letters.
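
    A small illustrative example (our own, not taken from the paper) of a convex set with a non-convex representation for which the KKT conditions are still necessary and sufficient:

        \[
        K = \{x \in \mathbb{R} : g(x) \le 0\},\quad g(x) = x^3 - 1 \ \text{(not convex)},\quad K = (-\infty, 1] \ \text{(convex)},\quad g(0) < 0 \ \text{(Slater)}.
        \]
        \[
        \text{Minimizing } f(x) = (x-2)^2 \text{ over } K:\quad x^* = 1,\qquad \nabla f(1) + \lambda \nabla g(1) = -2 + 3\lambda = 0 \ \Rightarrow\ \lambda = \tfrac{2}{3} \ge 0.
        \]

    Here $x^* = 1$ is both the unique minimizer and the unique KKT point, and $g'(1) = 3 \neq 0$, so a natural nondegeneracy condition holds at the minimizer (the paper's exact assumption may differ).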

    Tangential Extremal Principles for Finite and Infinite Systems of Sets, II: Applications to Semi-infinite and Multiobjective Optimization

    This paper contains selected applications of the new tangential extremal principles and related results developed in Part I to calculus rules for infinite intersections of sets and to optimality conditions for problems of semi-infinite programming and multiobjective optimization with countable constraints.
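
    For concreteness (generic notation, not the authors'), a semi-infinite program with countably many constraints has the form

        \[
        \min_{x \in \mathbb{R}^n} f(x) \ \ \text{s.t.}\ \ g_i(x) \le 0, \quad i = 1, 2, 3, \ldots,
        \]

    and in the multiobjective case the scalar objective is replaced by a vector objective $(f_1(x), \ldots, f_p(x))$ minimized with respect to a partial order.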

    Inexact Convex Relaxations for AC Optimal Power Flow: Towards AC Feasibility

    Convex relaxations of AC optimal power flow (AC-OPF) problems have attracted significant interest, as in several instances they provably yield the global optimum of the original non-convex problem. If, however, the relaxation is inexact, the obtained solution is not AC-feasible. The quality of the obtained solution is essential for several practical applications of AC-OPF, but detailed analyses are lacking in the existing literature. This paper aims to cover this gap. We provide an in-depth investigation of the solution characteristics when convex relaxations are inexact, we assess the most promising AC feasibility recovery methods for large-scale systems, and we propose two new metrics that lead to a better understanding of the quality of the identified solutions. We perform a comprehensive assessment on 96 different test cases, ranging from 14 to 3120 buses, and we show the following: (i) Despite an optimality gap of less than 1%, several test cases still exhibit substantial distances to both AC feasibility and local optimality, and the newly proposed metrics characterize these deviations. (ii) Penalization methods fail to recover an AC-feasible solution in 15 out of 45 cases, and using the proposed metrics, we show that most failed test instances exhibit substantial distances to both AC feasibility and local optimality. For failed test instances with small distances, we show how our proposed metrics inform a fine-tuning of penalty weights to obtain AC-feasible solutions. (iii) The computational benefits of warm-starting non-convex solvers vary significantly, but a computational speedup exists in over 75% of the cases.
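
    To make the reported quantities concrete, the following Python sketch computes a relative optimality gap and a crude "distance to AC feasibility" from a vector of constraint violations; the function names and the metric definitions are our own illustrative assumptions, not the metrics proposed in the paper:

        import numpy as np

        def optimality_gap(relaxation_obj: float, local_obj: float) -> float:
            """Relative gap between the relaxation lower bound and a local AC-OPF solution."""
            return (local_obj - relaxation_obj) / abs(local_obj)

        def feasibility_distance(violations: np.ndarray) -> float:
            """Worst-case AC constraint violation (illustrative aggregate)."""
            return float(np.max(np.abs(violations)))

        # Hypothetical numbers: a sub-1% optimality gap can coexist with a
        # sizeable constraint violation, i.e. a solution far from AC feasibility.
        print(optimality_gap(relaxation_obj=99.2, local_obj=100.0))   # ~0.008
        print(feasibility_distance(np.array([0.0, 0.12, -0.03])))     # 0.12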