
    Higher-order conditions for strict efficiency revisited

    D. V. Luu and P. T. Kien propose, in Soochow J. Math. 33 (2007), 17-31, higher-order conditions for strict efficiency of vector optimization problems based on the derivatives introduced by I. Ginchev in Optimization 51 (2002), 47-72. These derivatives are defined for scalar functions, and in their terms necessary and sufficient conditions can be obtained for a point to be a strictly efficient (isolated) minimizer of a given order for a quite arbitrary scalar function. Passing to vector functions, Luu and Kien lose the peculiarity that the optimality conditions work with arbitrary functions. In the present paper, applying the mentioned derivatives to the scalarized problem and restoring the original idea, optimality conditions for strict efficiency of a given order are proposed which work with quite arbitrary vector functions. It is shown that the results of Luu and Kien are corollaries of the given conditions. Key words: nonsmooth vector optimization, higher-order optimality conditions, strict efficiency, isolated minimizers.
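    For orientation, the scalar notion underlying these conditions is usually stated as follows (a sketch of the standard definition, not quoted from the paper):

    ```latex
    % x_0 is an isolated (strictly efficient) minimizer of order k for f
    % if f grows at least with order k near x_0:
    \exists\, A > 0,\ \exists\, \delta > 0 :\quad
    f(x) \;\ge\; f(x_0) + A\,\|x - x_0\|^{k}
    \qquad \text{for all } x \text{ with } \|x - x_0\| < \delta .
    ```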

    Weakened subdifferentials and Frechet differentiability of real functions

    Let X be a real Banach space and f : X → ℝ ∪ {+∞}. It is well known that the Clarke subdifferential ∂f(x) of the function f at x ∈ int dom f is a singleton if and only if f is strongly differentiable (then ∂f(x) = {Dsf(x)}, where Dsf(x) is the strong derivative of f at x). Simple examples show that there exist functions f, Fréchet differentiable at x, for which ∂f(x) is not a singleton. In this sense the Clarke subdifferential is not an exact generalization of the differential of a differentiable function. In the present paper we propose a new subdifferential ∂w f(x), called the weakened subdifferential of f at x, which preserves the nice calculus rules of the Clarke subdifferential and, for X finite dimensional, is a singleton if and only if f is Fréchet differentiable at x, in which case ∂w f(x) = {DF f(x)}. Key words: generalized subdifferentials.
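    As background, the Clarke subdifferential referred to here is built from the Clarke generalized directional derivative; a sketch of the standard definitions (for f locally Lipschitz near x) is:

    ```latex
    % Clarke generalized directional derivative of f at x in direction d:
    f^{\circ}(x; d) \;=\; \limsup_{\substack{y \to x \\ t \downarrow 0}}
        \frac{f(y + t d) - f(y)}{t},
    % and the Clarke subdifferential is the set of supporting functionals:
    \partial f(x) \;=\; \bigl\{\, \xi \in X^{*} \;:\;
        \langle \xi, d \rangle \le f^{\circ}(x; d)\ \ \forall d \in X \,\bigr\}.
    ```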

    Optimality conditions for scalar and vector optimization problems with quasiconvex inequality constraints

    Let X be a real linear space, X0 ⊆ X a convex set, and Y and Z topological real linear spaces. The constrained optimization problem minC f(x), g(x) ∈ −K is considered, where f : X0 → Y and g : X0 → Z are given (nonsmooth) functions, and C ⊆ Y and K ⊆ Z are closed convex cones. The weakly efficient solutions (w-minimizers) of this problem are investigated. When g obeys quasiconvexity properties, first-order necessary and first-order sufficient optimality conditions in terms of Dini directional derivatives are obtained. In the special case of problems with pseudoconvex data it is shown that these conditions characterize the global w-minimizers and generalize known results from convex vector programming. The obtained results are applied to the special case of problems whose image spaces are finite dimensional and whose ordering cones are the positive orthants, in particular to scalar problems with quasiconvex constraints. It is shown that the quasiconvexity of the constraints allows the optimality conditions to be formulated with the simpler single-valued Dini derivatives instead of the set-valued ones. Key words: Vector optimization, nonsmooth optimization, quasiconvex vector functions, pseudoconvex vector functions, Dini derivatives, quasiconvex programming, Kuhn-Tucker conditions.
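    The w-minimizer notion used throughout these abstracts is commonly defined as follows (a standard sketch, not quoted from the paper):

    ```latex
    % A feasible point x_0 is weakly efficient (a w-minimizer) when no nearby
    % feasible point improves f(x_0) with respect to int C:
    \exists\, U \ni x_0 \ \text{(neighborhood)} :\quad
    f(x) \notin f(x_0) - \operatorname{int} C
    \qquad \text{for all feasible } x \in U .
    ```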

    On constrained set-valued optimization

    The set-valued optimization problem minC F(x), G(x) ∩ (−K) ≠ ∅ is considered, where F : ℝⁿ ⇉ ℝᵐ and G : ℝⁿ ⇉ ℝᵖ are set-valued functions, and C ⊆ ℝᵐ and K ⊆ ℝᵖ are closed convex cones. Two types of solutions, called w-minimizers (weakly efficient points) and i-minimizers (isolated minimizers), are treated. In terms of the Dini set-valued directional derivative, first-order necessary conditions for a point to be a w-minimizer and first-order sufficient conditions for a point to be an i-minimizer are established, both in primal and dual form. Key words: Set-valued optimization, First-order optimality conditions, Dini derivatives.

    Optimization problems with quasiconvex inequality constraints

    The constrained optimization problem min f(x), gj(x) ≤ 0 (j = 1, . . . , p) is considered, where f : X → ℝ and gj : X → ℝ are nonsmooth functions with domain X ⊆ ℝⁿ. First-order necessary and first-order sufficient optimality conditions are obtained when the gj are quasiconvex functions. The paper has two main features: to treat nonsmooth problems it makes use of the Dini derivative, and to obtain more sensitive conditions it admits directionally dependent multipliers. The two cases, in which the Lagrange function satisfies a non-strict and a strict inequality, are considered. In the case of a non-strict inequality pseudoconvex functions are involved, and in their terms some properties of convex programming problems are generalized. The efficiency of the obtained conditions is illustrated by an example. Key words: Nonsmooth optimization, Dini directional derivatives, quasiconvex functions, pseudoconvex functions, quasiconvex programming, Kuhn-Tucker conditions.
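    The Dini derivatives invoked in these abstracts are the standard one-sided directional limits (sketch of the usual definitions; naming conventions vary slightly between authors):

    ```latex
    % lower and upper Dini directional derivatives of f at x in direction d:
    f'_{-}(x; d) \;=\; \liminf_{t \downarrow 0} \frac{f(x + t d) - f(x)}{t},
    \qquad
    f'_{+}(x; d) \;=\; \limsup_{t \downarrow 0} \frac{f(x + t d) - f(x)}{t}.
    ```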

    Isolated minimizers, proper efficiency and stability for C^{0,1} constrained vector optimization problems

    In this paper we consider the vector optimization problem minC f(x), g(x) ∈ −K, where f : ℝⁿ → ℝᵐ and g : ℝⁿ → ℝᵖ are C^{0,1} functions and C ⊆ ℝᵐ and K ⊆ ℝᵖ are closed convex cones. We give several notions of solutions (efficiency concepts), among them the notion of a properly efficient point (p-minimizer) of order k and the notion of an isolated minimizer of order k. We show that each isolated minimizer of order k ≥ 1 is a p-minimizer of order k. The possible reversal of this statement in the case k = 1 is the main subject of the investigation. For this study we apply some first-order necessary and sufficient conditions in terms of Dini derivatives. We show that the given optimality conditions are important for solving the posed problem, and a satisfactory solution leads to two approaches toward efficiency concepts, called respectively sense I and sense II concepts. Relations between sense I and sense II isolated minimizers and p-minimizers are obtained. In particular, we are concerned with the stability properties of the p-minimizers and the isolated minimizers. By stability, we mean that they remain the same type of solutions under small perturbations of the problem data. We show that the p-minimizers are stable under perturbations of the cones, while the isolated minimizers are stable under perturbations both of the cones and of the functions in the data set. Further, we show that the sense I concepts are stable under perturbations of the objective data, while the sense II concepts are stable under perturbations both of the objective and the constraints. Key words: Vector optimization, Locally Lipschitz data, Properly efficient points, Isolated minimizers, Optimality conditions, Stability.

    First order optimality condition for constrained set-valued optimization

    A constrained optimization problem with set-valued data is considered. Different kinds of solutions are defined for such a problem. We recall the weak minimizer, the efficient minimizer and the proper minimizer; the latter is defined in a way that also embraces the case when the ordering cone is not pointed. Moreover, we present the new concept of an isolated minimizer for set-valued optimization. These notions are investigated and appear in first-order necessary and sufficient optimality conditions derived in terms of a Dini-type derivative for set-valued maps. The case of convex (along rays) data is considered when studying sufficient optimality conditions for weak minimizers. Key words: Vector optimization, Set-valued optimization, First-order optimality conditions.

    First-order conditions for C^{0,1} constrained vector optimization

    For a Fritz John type vector optimization problem with C^{0,1} data we define different types of solutions, give their scalar characterizations by applying the so-called oriented distance, and give necessary and sufficient first-order optimality conditions in terms of the Dini derivative. While establishing the sufficiency, we introduce a new type of efficient points, referred to as isolated minimizers of first order, and show their relation to properly efficient points. More precisely, the obtained necessary conditions are necessary for weak efficiency, and the sufficient conditions are both sufficient and necessary for a point to be an isolated minimizer of first order. Key words: vector optimization, nonsmooth optimization, C^{0,1} functions, Dini derivatives, first-order optimality conditions, Lagrange multipliers.
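    The oriented distance used for the scalarization is the standard signed distance of Hiriart-Urruty (a sketch of the usual definition, not quoted from the paper):

    ```latex
    % oriented (signed) distance of a point y to a set A:
    \Delta_{A}(y) \;=\; d_{A}(y) - d_{A^{c}}(y),
    \qquad d_{A}(y) = \inf_{a \in A} \|y - a\| ;
    % Delta_A is negative on int A, zero on the boundary of A,
    % and positive outside the closure of A.
    ```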

    Minty variational inequalities, increase-along-rays property and optimization

    Let E be a linear space, K ⊆ E and f : K → ℝ. We pose, in terms of the lower Dini directional derivative, a problem referred to as GMVI(f, K), which can be considered as a generalization of the Minty variational inequality of differential type (for short, MVI(f, K)). We investigate, in the case of K star-shaped (for short, st-sh), the existence of a solution x* of GMVI(f, K) and the property of f to increase along rays starting at x* (for short, f ∈ IAR(K, x*)). We prove that GMVI(f, K) with radially l.s.c. function f has a solution x* ∈ ker K if and only if f ∈ IAR(K, x*). Further, we prove that the solution set of GMVI(f, K) is a convex and radially closed subset of ker K. We show also that, if GMVI(f, K) has a solution x* ∈ K, then x* is a global minimizer of the problem f(x) → min, x ∈ K. Moreover, we observe that the set of global minimizers of the related optimization problem, its kernel, and the solution set of the variational inequality can be different. Finally, we prove that in the case of a quasiconvex function f these sets coincide. Key words: Minty variational inequality, Generalized variational inequality, Existence of solutions, Increase along rays, Quasi-convex functions.
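    For context, the classical Minty variational inequality of differential type and the increase-along-rays property are commonly stated as follows (a standard sketch; the generalized problem GMVI replaces the directional derivative by the lower Dini derivative):

    ```latex
    % MVI(f, K): find x* in K such that
    f'(x;\, x^{*} - x) \;\le\; 0 \qquad \text{for all } x \in K ;
    % f in IAR(K, x*): for each x in K the restriction of f to the ray
    % from x* through x is nondecreasing, i.e.
    t \mapsto f\bigl(x^{*} + t\,(x - x^{*})\bigr)
    \ \text{is nondecreasing in } t \ge 0 \ \text{while the point stays in } K .
    ```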

    Variational inequalities in vector optimization

    In this paper we investigate the links among generalized scalar variational inequalities of differential type, vector variational inequalities and vector optimization problems. The considered scalar variational inequalities are obtained through a nonlinear scalarization by means of the so-called "oriented distance" function [14, 15]. In the case of Stampacchia-type variational inequalities, the solutions of the proposed ones coincide with the solutions of the vector variational inequalities introduced by Giannessi [8]. For Minty-type variational inequalities, an analogous coincidence happens under convexity hypotheses. Furthermore, the considered variational inequalities prove useful in filling a gap between scalar and vector variational inequalities. Namely, in the scalar case Minty variational inequalities of differential type represent a sufficient optimality condition without additional assumptions, while in the vector case a convexity hypothesis is needed. Moreover, it is shown that vector functions admitting a solution of the proposed Minty variational inequality enjoy some well-posedness properties, analogously to the scalar case [4].