24 research outputs found

    Optimality conditions in convex multiobjective SIP

    The purpose of this paper is to characterize the weak efficient solutions, the efficient solutions, and the isolated efficient solutions of a given vector optimization problem with finitely many convex objective functions and infinitely many convex constraints. To do this, we introduce new and already known data qualifications (conditions involving the constraints and/or the objectives) in order to obtain optimality conditions expressed in terms of either Karush–Kuhn–Tucker multipliers or a new gap function associated with the given problem. This research was partially cosponsored by the Ministry of Economy and Competitiveness (MINECO) of Spain and by the European Regional Development Fund (ERDF) of the European Commission, Project MTM2014-59179-C2-1-P.
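    For orientation, the Karush–Kuhn–Tucker multipliers mentioned above fit the standard convex multiobjective semi-infinite model: minimize $f(x)=(f_1(x),\dots,f_p(x))$ subject to $g_t(x)\le 0$ for all $t\in T$, with $T$ possibly infinite and all $f_i, g_t\colon\mathbb{R}^n\to\mathbb{R}$ convex. In that format (an assumed illustration; the precise data qualifications are those developed in the paper), a KKT-type characterization of a weakly efficient feasible point $\bar{x}$ asserts the existence of $\lambda\in\mathbb{R}^p_+\setminus\{0\}$ and $\mu\in\mathbb{R}^{(T)}_+$ with only finitely many nonzero entries such that
    \[
    0\in\sum_{i=1}^{p}\lambda_i\,\partial f_i(\bar{x})+\sum_{t\in T}\mu_t\,\partial g_t(\bar{x}),
    \qquad
    \mu_t\, g_t(\bar{x})=0 \quad\text{for all } t\in T.
    \]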

    Sufficient conditions of optimality for multiobjective optimization problems with γ-paraconvex data

    No full text
    We study multiobjective optimization problems with γ-paraconvex multifunction data. Sufficient optimality conditions for unconstrained and constrained problems are given in terms of contingent derivatives.
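    For reference, the contingent derivative used in such conditions is the standard graphical derivative of a multifunction; the definition below is recalled for convenience and is not quoted from the paper. For $F\colon X\rightrightarrows Y$ between normed spaces and $(\bar{x},\bar{y})\in\operatorname{gph}F$, the contingent derivative $DF(\bar{x},\bar{y})$ is the multifunction whose graph is the contingent (Bouligand) cone to $\operatorname{gph}F$ at $(\bar{x},\bar{y})$, that is,
    \[
    v\in DF(\bar{x},\bar{y})(u)
    \iff
    \exists\, t_n\downarrow 0,\ (u_n,v_n)\to(u,v)\ \text{ such that }\ \bar{y}+t_n v_n\in F(\bar{x}+t_n u_n)\ \text{ for all } n.
    \]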

    Mosco-convergence de suites de fonctions DC et conditions d'optimalité du second ordre (Mosco convergence of sequences of DC functions and second-order optimality conditions)

    No full text

    From scalar to vector optimization

    Initially, second-order necessary optimality conditions and sufficient optimality conditions in terms of Hadamard type derivatives for the unconstrained scalar optimization problem $\phi(x)\rightarrow\min$, $x\in\mathbb{R}^m$, are given. These conditions work with arbitrary functions $\phi\colon\mathbb{R}^m\rightarrow\overline{\mathbb{R}}$, but they show inconsistency with the classical derivatives. This raises the question of whether the formulated optimality conditions remain true when the “inconsistent” Hadamard derivatives are replaced with the “consistent” Dini derivatives. It is shown that the answer is affirmative if $\phi$ is of class $\mathcal{C}^{1,1}$ (i.e., differentiable with locally Lipschitz derivative). Further, considering $\mathcal{C}^{1,1}$ functions, the discussion is raised to unconstrained vector optimization problems. Using the so-called “oriented distance” from a point to a set, we generalize to an arbitrary ordering cone some second-order necessary conditions and sufficient conditions given by Liu, Neittaanmäki, Křížek for a polyhedral cone. Furthermore, we show that the conditions obtained are sufficient not only for efficiency but also for strict efficiency.
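    The “oriented distance” from a point to a set is, in standard terminology, the signed distance function of Hiriart-Urruty; the display below recalls its usual definition for convenience and is not quoted from the paper. For a nonempty set $A\subsetneq Y$ in a normed space $Y$,
    \[
    \Delta_A(y)=d_A(y)-d_{Y\setminus A}(y),
    \qquad
    d_A(y)=\inf_{a\in A}\lVert y-a\rVert,
    \]
    so $\Delta_A$ is negative on the interior of $A$, zero on its boundary, and positive outside its closure, which is what allows efficiency notions with respect to an arbitrary ordering cone to be expressed through a single scalar function.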

    Characterization of solutions of multiobjective optimization problem

    No full text
    A characterization of weakly efficient, efficient and properly efficient solutions of multiobjective optimization problems is given in terms of a scalar optimization problem by using a special "distance" function. The concept of well-posedness for this special scalar problem is then linked with the properly efficient solutions of the multiobjective problem.
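    A familiar instance of such a scalarization (the particular "distance" function constructed in the paper may differ; this is an assumed textbook illustration for the componentwise order) is the following: for objectives $f_1,\dots,f_p$ and feasible set $S$, a point $\bar{x}\in S$ is weakly efficient if and only if it solves
    \[
    \min_{x\in S}\ \max_{1\le i\le p}\bigl(f_i(x)-f_i(\bar{x})\bigr),
    \]
    in which case the optimal value is $0$; well-posedness of such a scalar problem is the kind of property that can then be related to proper efficiency of $\bar{x}$.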

    Second order optimality conditions for bilevel set optimization problems

    No full text
    Keywords: Approximate Jacobian, Approximate Hessian, Recession matrices, Bilevel optimization, Second order approximation, Necessary optimality conditions, Regularity condition, Set valued mappings, Support function. MSC: 90C29, 49J52, 90C30, 49K99