
    First order optimality conditions in set-valued optimization

    A set-valued optimization problem min_C F(x), x ∈ X0, is considered, where X0 ⊂ X, X and Y are Banach spaces, F : X0 ⇒ Y is a set-valued function and C ⊂ Y is a closed cone. The solutions of the set-valued problem are defined as pairs (x0, y0), y0 ∈ F(x0), and are called minimizers. In particular, the notions of w-minimizer (weakly efficient point), p-minimizer (properly efficient point) and i-minimizer (isolated minimizer) are introduced and their characterization in terms of the so-called oriented distance is given. The relation between p-minimizers and i-minimizers under Lipschitz-type conditions is investigated. The main purpose of the paper is to derive first-order conditions, that is, conditions in terms of suitable first-order derivatives of F, for a pair (x0, y0), where x0 ∈ X0, y0 ∈ F(x0), to be a solution of this problem. For this purpose we define and apply the directional Dini derivative. Necessary conditions and sufficient conditions for a pair (x0, y0) to be a w-minimizer, and similarly to be an i-minimizer, are obtained. The role of the i-minimizers, which seem to be a new concept in set-valued optimization, is underlined. For the case of w-minimizers some comparison with existing results is made. Key words: Vector optimization, Set-valued optimization, First-order optimality conditions.
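
The oriented distance mentioned above is, in one standard convention, Δ_A(y) = d(y, A) − d(y, Y \ A), so it is negative inside A and positive outside. A minimal numerical sketch in Python, assuming the Euclidean norm and the concrete cone A = −ℝ²₊ (the function name is illustrative, not from the paper):

```python
import numpy as np

def oriented_distance_neg_orthant(y):
    """Oriented distance Delta_A(y) = d(y, A) - d(y, complement of A)
    for A = -R^2_+ (the negative orthant), with the Euclidean norm.
    Negative values mean y lies in the interior of A."""
    y = np.asarray(y, dtype=float)
    d_to_A = np.linalg.norm(np.maximum(y, 0.0))  # distance to A
    if np.all(y <= 0):
        d_to_comp = np.min(-y)                   # distance to the complement
    else:
        d_to_comp = 0.0
    return d_to_A - d_to_comp

print(oriented_distance_neg_orthant([1.0, 2.0]))   # ≈ 2.236 (outside A)
print(oriented_distance_neg_orthant([-2.0, -3.0])) # → -2.0 (inside A)
```

The sign convention makes the characterizations in the abstract possible: (x0, y0) being a w-minimizer can be expressed as a nonnegativity condition on the oriented distance applied to −C.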

    First order optimality condition for constrained set-valued optimization

    A constrained optimization problem with set-valued data is considered. Different kinds of solutions are defined for such a problem. We recall weak minimizers, efficient minimizers and proper minimizers; the latter are defined in a way that also embraces the case when the ordering cone is not pointed. Moreover, we present the new concept of an isolated minimizer for set-valued optimization. These notions are investigated and appear when establishing the first-order necessary and sufficient optimality conditions, derived in terms of a Dini-type derivative for set-valued maps. The case of convex (along rays) data is considered when studying sufficient optimality conditions for weak minimizers. Key words: Vector optimization, Set-valued optimization, First-order optimality conditions.

    On constrained set-valued optimization

    The set-valued optimization problem min_C F(x), G(x) ∩ (−K) ≠ ∅, is considered, where F : ℝⁿ ⇒ ℝᵐ and G : ℝⁿ ⇒ ℝᵖ are set-valued functions, and C ⊂ ℝᵐ and K ⊂ ℝᵖ are closed convex cones. Two types of solutions, called w-minimizers (weakly efficient points) and i-minimizers (isolated minimizers), are treated. In terms of the Dini set-valued directional derivative, first-order necessary conditions for a point to be a w-minimizer and first-order sufficient conditions for a point to be an i-minimizer are established, both in primal and dual form. Key words: Set-valued optimization, First-order optimality conditions, Dini derivatives.
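
The Dini set-valued directional derivative invoked above can be sketched in LaTeX as follows; this is one common convention, and the precise definitions in the papers listed here may differ in detail:

```latex
% Dini-type directional derivative of a set-valued map F at a point
% (x_0, y_0) of its graph, in direction u (one common convention,
% with the upper limit understood in the Painlev\'e--Kuratowski sense):
D F(x_0, y_0)(u) \;=\; \limsup_{t \downarrow 0,\; u' \to u}
    \frac{F(x_0 + t u') - y_0}{t}
```

For single-valued differentiable F this set collapses to the usual directional derivative {F'(x_0)u}, which is why the conditions below reduce to classical first-order conditions in the smooth case.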

    Optimality conditions for scalar and vector optimization problems with quasiconvex inequality constraints

    Let X be a real linear space, X0 ⊂ X a convex set, and Y and Z topological real linear spaces. The constrained optimization problem min_C f(x), g(x) ∈ −K, is considered, where f : X0 → Y and g : X0 → Z are given (nonsmooth) functions, and C ⊂ Y and K ⊂ Z are closed convex cones. The weakly efficient solutions (w-minimizers) of this problem are investigated. When g obeys quasiconvexity properties, first-order necessary and first-order sufficient optimality conditions in terms of Dini directional derivatives are obtained. In the special case of problems with pseudoconvex data it is shown that these conditions characterize the global w-minimizers and generalize known results from convex vector programming. The obtained results are applied to the special case of problems whose image spaces are finite-dimensional and whose ordering cones are the positive orthants, in particular to scalar problems with quasiconvex constraints. It is shown that the quasiconvexity of the constraints makes it possible to formulate the optimality conditions using the simpler single-valued Dini derivatives instead of the set-valued ones. Key words: Vector optimization, nonsmooth optimization, quasiconvex vector functions, pseudoconvex vector functions, Dini derivatives, quasiconvex programming, Kuhn-Tucker conditions.
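
For scalar functions the single-valued Dini derivative referred to above is f′₋(x; u) = lim inf_{t↓0} (f(x + tu) − f(x))/t. A crude numerical sketch in Python (the helper name and the fixed grid of step sizes are assumptions for illustration, not a rigorous lim inf):

```python
def dini_lower_derivative(f, x, u, ts=(1e-3, 1e-4, 1e-5, 1e-6)):
    """Crude numerical estimate of the lower Dini directional derivative
    f'_-(x; u) = liminf_{t -> 0+} (f(x + t*u) - f(x)) / t,
    approximated by the smallest difference quotient over a grid of t's."""
    return min((f(x + t * u) - f(x)) / t for t in ts)

# Example: f(x) = |x| is nonsmooth at x = 0 but directionally
# differentiable there, with f'(0; u) = |u|.
f = abs
print(dini_lower_derivative(f, 0.0, 1.0))   # → 1.0
print(dini_lower_derivative(f, 0.0, -1.0))  # → 1.0
```

For directionally differentiable f, as in this example, the estimate coincides with the ordinary directional derivative f′(x; u).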

    A steepest descent method for set optimization problems with set-valued mappings of finite cardinality

    In this paper, we study a first-order solution method for a particular class of set optimization problems where the solution concept is given by the set approach. We consider the case in which the set-valued objective mapping is identified by a finite number of continuously differentiable selections. The corresponding set optimization problem is then equivalent to finding optimistic solutions to vector optimization problems under uncertainty with a finite uncertainty set. We develop optimality conditions for these types of problems and introduce two concepts of critical points. Furthermore, we propose a descent method and provide a convergence result to points satisfying the optimality conditions previously derived. Some numerical examples illustrating the performance of the method are also discussed. This paper is a modified and polished version of Chapter 5 of the dissertation by Quintana (On set optimization with set relations: a scalarization approach to optimality conditions and algorithms, Martin-Luther-Universität Halle-Wittenberg, 2020).
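
The paper's method operates on set-valued objectives given by finitely many smooth selections; as a much simpler scalar building block, a steepest descent iteration with Armijo backtracking can be sketched as follows (all names and parameter values are illustrative, and this is not the paper's algorithm):

```python
import numpy as np

def armijo_descent(f, grad, x0, alpha=1.0, beta=0.5, c=1e-4,
                   tol=1e-8, max_iter=200):
    """Plain steepest descent with Armijo backtracking line search:
    a scalar building block only; the paper's method handles set-valued
    objectives identified by finitely many differentiable selections."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break  # (approximate) stationarity reached
        t = alpha
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f(x - t * g) > f(x) - c * t * np.dot(g, g):
            t *= beta
        x = x - t * g
    return x

# Quadratic test: the minimizer of ||x - (1, 2)||^2 is (1, 2).
f = lambda x: np.sum((x - np.array([1.0, 2.0])) ** 2)
grad = lambda x: 2.0 * (x - np.array([1.0, 2.0]))
print(armijo_descent(f, grad, np.zeros(2)))  # → [1. 2.]
```

In the set-valued setting, the descent direction must instead decrease all active selections simultaneously, which is where the paper's two notions of critical points come in.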

    Necessary Conditions in Multiobjective Optimization With Equilibrium Constraints

    In this paper we study multiobjective optimization problems with equilibrium constraints (MOECs) described by generalized equations of the form 0 ∈ G(x, y) + Q(x, y), where both mappings G and Q are set-valued. Such models arise, in particular, from certain optimization-related problems governed by variational inequalities and from first-order optimality conditions in nondifferentiable programming. We establish verifiable necessary conditions for the general problems under consideration and for their important specifications using modern tools of variational analysis and generalized differentiation. The application of the obtained necessary optimality conditions is illustrated by a numerical example from bilevel programming with convex but nondifferentiable data.
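
One of the simplest concrete instances of a generalized equation 0 ∈ G(x, y) + Q(x, y) is a one-dimensional complementarity problem, where Q is the normal cone to ℝ₊. A toy Python sketch, solved by projected fixed-point iteration (names and step size are assumptions for illustration; the paper's constraint systems are far more general):

```python
def solve_ncp(F, y0, step=0.5, tol=1e-10, max_iter=1000):
    """Fixed-point iteration y <- max(0, y - step*F(y)) for the scalar
    complementarity problem  0 in F(y) + N_{R+}(y),  i.e. the simplest
    generalized equation 0 in G + Q with Q a normal-cone mapping.
    A fixed point is exactly a solution of the complementarity problem."""
    y = float(y0)
    for _ in range(max_iter):
        y_new = max(0.0, y - step * F(y))
        if abs(y_new - y) < tol:
            return y_new
        y = y_new
    return y

# F(y) = y - 1: the solution is y = 1, since F(1) = 0 and y > 0.
print(solve_ncp(lambda y: y - 1.0, 0.0))
```

Equilibrium constraints of this type appear as the lower level of the bilevel programs mentioned in the abstract.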

    Introduction to Nonsmooth Analysis and Optimization

    This book aims to give an introduction to generalized derivative concepts useful in deriving necessary optimality conditions and numerical algorithms for infinite-dimensional nondifferentiable optimization problems that arise in inverse problems, imaging, and PDE-constrained optimization. It covers convex subdifferentials, Fenchel duality, monotone operators and resolvents, Moreau–Yosida regularization, as well as Clarke and (briefly) limiting subdifferentials. Both first-order (proximal point and splitting) methods and second-order (semismooth Newton) methods are treated. In addition, differentiation of set-valued mappings is discussed and used for deriving second-order optimality conditions as well as Lipschitz stability properties of minimizers. The required background from functional analysis and calculus of variations is also briefly summarized. (Comment: arXiv admin note: substantial text overlap with arXiv:1708.0418)
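
One ingredient listed above, Moreau–Yosida regularization, admits a closed form for f(x) = |x|: the proximal map is soft-thresholding and the envelope is a smooth Huber-like function. A minimal Python sketch (function names are illustrative, not from the book):

```python
import numpy as np

def prox_abs(y, lam):
    """Proximal map of f(x) = |x|:
    prox_{lam*f}(y) = argmin_x |x| + (1/(2*lam))*(x - y)^2,
    which evaluates to soft-thresholding at level lam."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def moreau_envelope_abs(y, lam):
    """Moreau--Yosida regularization of |x|: everywhere differentiable
    (Huber-like) even though |x| is nondifferentiable at 0."""
    p = prox_abs(y, lam)
    return np.abs(p) + (p - y) ** 2 / (2.0 * lam)

print(prox_abs(3.0, 1.0))             # → 2.0  (shrunk toward 0)
print(prox_abs(0.5, 1.0))             # → 0.0  (thresholded to 0)
print(moreau_envelope_abs(0.0, 1.0))  # → 0.0  (smooth minimum at 0)
```

Proximal point and splitting methods of the kind the book treats repeat such prox evaluations; their fixed points are minimizers of the original nonsmooth function.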