7,449 research outputs found

    On Some Generalized Polyhedral Convex Constructions

    Generalized polyhedral convex sets, generalized polyhedral convex functions on locally convex Hausdorff topological vector spaces, and related constructions such as the sum of sets, the sum of functions, the directional derivative, the infimal convolution, the normal cone, the conjugate function, and the subdifferential are studied thoroughly in this paper. Among other things, we show how a generalized polyhedral convex set can be characterized via the finiteness of the number of its faces. In addition, it is proved that the infimal convolution of a generalized polyhedral convex function and a polyhedral convex function is a polyhedral convex function. The obtained results can be applied to scalar optimization problems described by generalized polyhedral convex sets and generalized polyhedral convex functions.
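    For orientation, here is a minimal sketch of two of the constructions named above, in the convention commonly used for this setting (the symbols X, X*, L, f, g below are illustrative, not the paper's exact notation). A set D in a locally convex Hausdorff topological vector space X is generalized polyhedral convex when it is cut out of a closed affine subspace by finitely many continuous linear inequalities, and the infimal convolution splits the argument between two functions:

        \[ D = \{\, x \in L : \langle x_i^*, x \rangle \le \alpha_i,\ i = 1, \dots, p \,\}, \qquad L \subset X \text{ a closed affine subspace},\ x_i^* \in X^*,\ \alpha_i \in \mathbb{R}, \]
        \[ (f \,\square\, g)(x) = \inf \{\, f(x_1) + g(x_2) : x_1 + x_2 = x \,\}. \]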

    Efficient Solutions in Generalized Linear Vector Optimization

    This paper establishes several new facts on generalized polyhedral convex sets and shows how they can be used in vector optimization. Among other things, a scalarization formula for the efficient solution sets of generalized vector optimization problems is obtained. We also prove that the efficient solution set of a generalized linear vector optimization problem in a locally convex Hausdorff topological vector space is the union of finitely many generalized polyhedral convex sets and is connected by line segments.
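    The scalarization referred to here follows the pattern that is standard for linear vector problems; the following is only a sketch of that standard form (the operator M, ordering cone C, feasible set D, and dual space Y^* are illustrative, not the paper's exact statement). A feasible point is efficient exactly when some functional that is strictly positive on the cone turns the vector problem into a scalar one that the point solves:

        \[ \bar x \in E \iff \exists\, \xi \in Y^* \text{ with } \langle \xi, c \rangle > 0 \ \forall c \in C \setminus \{0\} \ \text{ such that } \ \langle \xi, M \bar x \rangle \le \langle \xi, M x \rangle \ \forall x \in D. \]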

    Generalized Polyhedral Convex Optimization Problems

    Generalized polyhedral convex optimization problems in locally convex Hausdorff topological vector spaces are studied systematically in this paper. We establish solution existence theorems, necessary and sufficient optimality conditions, and weak and strong duality theorems. In particular, we show that the dual problem has the same structure as the primal problem, and that the strong duality relation holds under three different sets of conditions.
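    As background for the duality statements, one standard primal-dual pattern in convex duality (Fenchel-Rockafellar form; offered only as context, not as the paper's exact dual construction) reads

        \[ p = \inf_{x \in X} \{ f(x) + g(Ax) \}, \qquad d = \sup_{y^* \in Y^*} \{ -f^*(A^* y^*) - g^*(-y^*) \}, \]

    where weak duality d \le p always holds, while strong duality (d = p with dual attainment) requires additional assumptions, such as polyhedrality-type conditions of the kind studied in the paper.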

    A Representation of Generalized Convex Polyhedra and Applications

    It is well known that finite-dimensional polyhedral convex sets can be generated by finitely many points and finitely many directions. Representation formulas in this spirit are obtained for convex polyhedra and generalized convex polyhedra in locally convex Hausdorff topological vector spaces. Our results develop those of X. Y. Zheng (Set-Valued Anal., Vol. 17, 2009, 389-408), which were established in a Banach space setting. Applications of the representation formulas to proving solution existence theorems for generalized linear programming problems and generalized linear vector optimization problems are shown.
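    The finite-dimensional statement alluded to is the classical Minkowski-Weyl representation, and the generalized formulas have roughly the same shape with an extra subspace term (the symbols u_i, v_j, X_0 are illustrative):

        \[ D = \operatorname{conv}\{u_1, \dots, u_k\} + \operatorname{cone}\{v_1, \dots, v_m\} \quad \text{(finite dimensions)}, \]
        \[ D = \operatorname{conv}\{u_1, \dots, u_k\} + \operatorname{cone}\{v_1, \dots, v_m\} + X_0 \quad \text{(generalized setting, } X_0 \text{ a closed linear subspace)}. \]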

    Piecewise Linear Vector Optimization Problems on Locally Convex Hausdorff Topological Vector Spaces

    Piecewise linear vector optimization problems in a locally convex Hausdorff topological vector space setting are considered in this paper. The efficient solution sets of these problems are shown to be unions of finitely many semi-closed generalized polyhedral convex sets. If, in addition, the problem is convex, then the efficient solution set and the weakly efficient solution set are unions of finitely many generalized polyhedral convex sets and are connected by line segments. Our results develop the preceding ones of Zheng and Yang [Sci. China Ser. A 51, 1243--1256 (2008)] and Yang and Yen [J. Optim. Theory Appl. 147, 113--124 (2010)], which were established in a normed space setting. Comment: accepted for publication in Acta Mathematica Vietnamica.
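    For readers comparing the two solution notions used in this and the previous abstracts, the standard definitions run as follows (ordering cone C with nonempty interior; notation illustrative): a feasible point is efficient when no feasible point improves the objective in the cone order, and weakly efficient when no feasible point improves it into the cone's interior,

        \[ \bar x \text{ efficient} \iff \bigl(f(D) - f(\bar x)\bigr) \cap (-C) \subseteq \{0\}, \qquad \bar x \text{ weakly efficient} \iff \bigl(f(D) - f(\bar x)\bigr) \cap (-\operatorname{int} C) = \emptyset. \]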

    Optimality conditions based on the Fréchet second-order subdifferential

    This paper focuses on second-order necessary optimality conditions for constrained optimization problems on Banach spaces. For problems in the classical setting, where the objective function is $C^2$-smooth, we show that strengthened second-order necessary optimality conditions are valid if the constraint set is generalized polyhedral convex. For problems in a new setting, where the objective function is only assumed to be $C^1$-smooth and the constraint set is generalized polyhedral convex, we establish sharp second-order necessary optimality conditions based on the Fréchet second-order subdifferential of the objective function and the second-order tangent set to the constraint set. Three examples are given to show that the hypotheses used are essential for the new theorems. Our second-order necessary optimality conditions refine and extend several existing results.
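    In the classical $C^2$ setting, the strengthened condition mentioned above is usually stated over the critical cone (a sketch for orientation only; \Omega, T_\Omega, C(\bar x) are illustrative notation): if \bar x is a local solution of \min f(x) subject to x \in \Omega and \Omega is generalized polyhedral convex, then

        \[ \langle \nabla^2 f(\bar x) d, d \rangle \ge 0 \quad \text{for all } d \in C(\bar x) := \{\, d \in T_\Omega(\bar x) : \langle \nabla f(\bar x), d \rangle = 0 \,\}, \]

    without the extra curvature ("sigma") term that a general closed convex constraint set would require.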

    A vector linear programming approach for certain global optimization problems

    Global optimization problems with a quasi-concave objective function and linear constraints are studied. We point out that various other classes of global optimization problems can be expressed in this way. We present two algorithms, which can be seen as slight modifications of Benson-type algorithms for multiple objective linear programs (MOLP). The modification of the MOLP algorithms results in a more efficient treatment of the studied optimization problems. This paper generalizes results of Schulz and Mittal on quasi-concave problems and of Shao and Ehrgott on multiplicative linear programs. Furthermore, it improves results of Löhne and Wagner on minimizing the difference $f = g - h$ of two convex functions $g$ and $h$, where either $g$ or $h$ is polyhedral. Numerical examples are given and the results are compared with the global optimization software BARON. Comment: same content as the journal version; difference to the previous version: some typos in the text corrected.
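    The problem class in question and the difference-of-convex special case it covers can be written compactly (the notation \phi, A, b, g, h is illustrative):

        \[ \min\ \phi(x) \quad \text{s.t. } Ax \ge b, \qquad \phi \text{ quasi-concave}, \]
        \[ \min\ f(x) = g(x) - h(x), \qquad g, h \text{ convex, at least one of them polyhedral}, \]

    where the second line is the setting attributed to Löhne and Wagner in the abstract.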

    Polyhedral aspects of Submodularity, Convexity and Concavity

    Seminal work by Edmonds and Lovász shows the strong connection between submodularity and convexity. Submodular functions have tight modular lower bounds and subdifferentials in a manner akin to convex functions. They also admit polynomial-time algorithms for minimization and satisfy the Fenchel duality theorem and the Discrete Separation Theorem, both of which are fundamental characteristics of convex functions. Submodular functions also show signs similar to concavity. Submodular maximization, though NP-hard, admits constant-factor approximation guarantees. Concave functions composed with modular functions are submodular, and they also satisfy the diminishing returns property. This manuscript provides a more complete picture of the relationship of submodularity with convexity and concavity by extending many of the results connecting submodularity with convexity to the concave aspects of submodularity. We first show the existence of superdifferentials and of efficiently computable tight modular upper bounds of a submodular function. While we show that it is hard to characterize this polyhedron, we obtain inner and outer bounds on the superdifferential along with certain specific and useful supergradients. We then investigate forms of concave extensions of submodular functions and show interesting relationships to submodular maximization. We next show connections between optimality conditions over the superdifferentials and submodular maximization, and show how forms of approximate optimality conditions translate into approximation factors for maximization. We end this paper by studying versions of the discrete separation theorem and the Fenchel duality theorem when seen from the concave point of view. In every case, we relate our results to the existing results from the convex point of view, thereby improving the analysis of the relationship between submodularity, convexity, and concavity. Comment: 38 pages, 10 figures.
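    The tight modular lower bounds and the diminishing-returns property that this abstract builds on are easy to see on a toy instance. The Python sketch below (not from the paper; the coverage function, ground set, and helper names are illustrative) brute-force checks diminishing returns and constructs the chain-based modular lower bound that is tight along a chosen permutation:

        # Minimal sketch, assuming a small coverage instance (illustrative, not from the paper):
        # (1) diminishing returns characterizes submodularity, and (2) for submodular f with
        # f(emptyset)=0, each permutation of the ground set yields a modular function that
        # lower-bounds f everywhere and agrees with f on the prefixes of that permutation.
        from itertools import combinations

        V = [0, 1, 2, 3]                      # ground set (illustrative)
        SETS = [{0, 1}, {1, 2}, {2, 3}]       # small coverage instance (illustrative)

        def f(S):
            """Coverage function: number of SETS hit by S (submodular, f(empty) = 0)."""
            return sum(1 for T in SETS if T & set(S))

        def is_submodular(f, V):
            """Brute-force check of diminishing returns:
            f(A + e) - f(A) >= f(B + e) - f(B) whenever A <= B and e not in B."""
            subsets = [set(c) for r in range(len(V) + 1) for c in combinations(V, r)]
            for A in subsets:
                for B in subsets:
                    if A <= B:
                        for e in V:
                            if e not in B and f(A | {e}) - f(A) < f(B | {e}) - f(B):
                                return False
            return True

        def greedy_modular_lower_bound(f, order):
            """Weights w[e] = marginal gain of e along the chain given by `order`.
            The modular function h(S) = sum_{e in S} w[e] satisfies h(S) <= f(S) for all S
            and equals f on every prefix of the chain (a tight modular lower bound)."""
            w, prefix = {}, set()
            for e in order:
                w[e] = f(prefix | {e}) - f(prefix)
                prefix.add(e)
            return w

        if __name__ == "__main__":
            assert is_submodular(f, V)
            w = greedy_modular_lower_bound(f, order=[2, 0, 1, 3])
            h = lambda S: sum(w[e] for e in S)
            for r in range(len(V) + 1):
                for S in combinations(V, r):
                    assert h(S) <= f(S)        # lower bound everywhere
            print("weights:", w)               # tight on the chain {2}, {2,0}, {2,0,1}, {2,0,1,3}

    Evaluating the modular function on the prefixes of the chosen permutation recovers f exactly, which is the sense in which the lower bound is "tight"; the paper's contribution concerns the analogous upper bounds and superdifferentials.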

    Variational Analysis of Composite Models with Applications to Continuous Optimization

    The paper is devoted to a comprehensive study of composite models in variational analysis and optimization, whose importance for numerous theoretical, algorithmic, and applied issues of operations research is difficult to overstate. The underlying theme of our study is a systematic replacement of conventional metric regularity and related requirements by much weaker metric subregularity ones, which leads us to significantly stronger and completely new results of first-order and second-order variational analysis and optimization. In this way we develop extended calculus rules for first-order and second-order generalized differential constructions, with the main attention in second-order variational theory paid to the new and rather large class of fully subamenable compositions. Applications to optimization include deriving enhanced no-gap second-order optimality conditions in constrained composite models, complete characterizations of the uniqueness of Lagrange multipliers and strong metric subregularity of KKT systems in parametric optimization, etc.
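    The distinction between the two regularity notions at the heart of this abstract is standard and worth recalling (for a set-valued map F : X \rightrightarrows Y between metric spaces and a point (\bar x, \bar y) in its graph; the notation is generic, not the paper's):

        \[ \text{metric regularity:} \quad d\bigl(x, F^{-1}(y)\bigr) \le \kappa\, d\bigl(y, F(x)\bigr) \quad \text{for all } x \text{ near } \bar x,\ y \text{ near } \bar y, \]
        \[ \text{metric subregularity:} \quad d\bigl(x, F^{-1}(\bar y)\bigr) \le \kappa\, d\bigl(\bar y, F(x)\bigr) \quad \text{for all } x \text{ near } \bar x, \]

    so subregularity fixes the target point \bar y and is the strictly weaker requirement the paper works under.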

    Linearized M-stationarity conditions for general optimization problems

    This paper investigates new first-order optimality conditions for general optimization problems. These optimality conditions are stronger than the commonly used M-stationarity conditions and are particularly useful when the latter cannot be applied because the underlying limiting normal cone cannot be computed effectively. We apply our optimality conditions to an MPEC (mathematical program with equilibrium constraints) to demonstrate their practicability.
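    For context, M-stationarity for a problem of the form \min f(x) subject to g(x) \in D (smooth data, closed set D; one common format, not necessarily the paper's general setting) asks for a multiplier in the limiting (Mordukhovich) normal cone:

        \[ 0 = \nabla f(\bar x) + \nabla g(\bar x)^{*} \lambda, \qquad \lambda \in N_{D}\bigl(g(\bar x)\bigr), \]

    and the conditions developed in the paper are meant to remain usable precisely when this normal cone N_D cannot be computed effectively.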