New duality results for evenly convex optimization problems
We present new results on optimization problems where the involved functions are evenly convex. By means of a generalized conjugation scheme and the perturbation theory introduced by Rockafellar, we propose an alternative dual problem for a general optimization problem defined on a separated locally convex topological space. Sufficient conditions for converse and total duality involving the even convexity of the perturbation function and c-subdifferentials are given. Formulae for the c-subdifferential and biconjugate of the objective function of a general optimization problem are provided as well. We also characterize total duality by means of the saddle-point theory for a notion of Lagrangian adapted to the considered framework.
Research partially supported by MINECO of Spain and ERDF of EU, Grant MTM2014-59179-C2-1-P, Austrian Science Fund (FWF), Project M-2045, and German Research Foundation (DFG), Project GR3367/4-1.
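For orientation, the classical perturbation duality scheme of Rockafellar, which the generalized (c-)conjugation approach of the paper extends, can be sketched as follows (standard notation with the usual Fenchel conjugate, not the c-conjugate of the paper):

```latex
% Perturbation function: \Phi \colon X \times Y \to \overline{\mathbb{R}},
% with \Phi(x,0) = f(x) for all x \in X.
%
% Primal problem:
(P)\qquad \inf_{x \in X} \Phi(x,0)
%
% Dual problem, obtained by conjugating the perturbation function:
(D)\qquad \sup_{y^{*} \in Y^{*}} \bigl\{ -\Phi^{*}(0,y^{*}) \bigr\}
%
% Weak duality v(D) \le v(P) always holds; strong duality requires
% regularity conditions such as the ones discussed in the abstract.
```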
Closedness type regularity conditions for surjectivity results involving the sum of two maximal monotone operators
In this note we provide regularity conditions of closedness type which
guarantee some surjectivity results concerning the sum of two maximal monotone
operators by using representative functions. The first regularity condition we
give guarantees the surjectivity of the monotone operator A(p + ·) + B, where p ∈ X, and A and B are maximal monotone operators on
the reflexive Banach space X. Then, this is used to obtain sufficient
conditions for the surjectivity of A + B and for the situation when p belongs
to the range of A + B. Several special cases are discussed, some of them
delivering interesting byproducts.
Comment: 11 pages, no figure
A forward-backward method for solving vector optimization problems
We present an iterative proximal inertial forward-backward method with memory effects for determining weakly efficient solutions to convex vector optimization problems, based on recent advances in solving scalar convex optimization problems and monotone inclusions. The problems under consideration consist in vector-minimizing the sum of a differentiable vector function and a nonsmooth one, and the method makes use of some adaptive linear scalarization techniques. During the talk, the difficulties encountered while formulating the algorithm and proving its convergence will be stressed, and the related (still unsolved) challenge of extending the celebrated FISTA method from scalar to vector optimization problems will be mentioned, too. The talk is based on joint work with Radu Ioan Boț.
Author affiliation: Chemnitz University of Technology
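As a scalar illustration of the kind of iteration described above, the following is a minimal sketch of an inertial proximal forward-backward method for min_x 0.5*||Ax - b||^2 + lam*||x||_1. The function names, the step-size handling and the l1 test problem are illustrative assumptions, not the algorithm from the talk, which operates in the vector-valued setting with adaptive scalarizations.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (the "backward" step for the l1 term).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, lam, step, iters=500, inertia=0.0):
    """Inertial proximal forward-backward iteration (scalar prototype) for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    x = np.zeros(A.shape[1])
    x_prev = x.copy()
    for _ in range(iters):
        y = x + inertia * (x - x_prev)   # inertial extrapolation ("memory effect")
        grad = A.T @ (A @ y - b)         # forward (gradient) step on the smooth part
        x_prev, x = x, soft_threshold(y - step * grad, step * lam)  # backward (proximal) step
    return x
```

With a step size not exceeding 1/||A^T A|| the iteration converges to a minimizer; the extrapolation y = x + beta*(x - x_prev) adds the memory effect mentioned in the abstract.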
New insights into conjugate duality
With this thesis we bring some new results and improve some existing ones in conjugate duality and some of the areas in which it is applied.
First we recall how the Lagrange, Fenchel and Fenchel-Lagrange dual problems to a given primal optimization problem can be obtained via perturbations, and we present some connections between them. For the Fenchel-Lagrange dual problem we prove strong duality under more general conditions than known so far, while for Fenchel duality we show that the convexity assumptions on the functions involved can be weakened without altering the conclusion. To prove the latter, we also show that some formulae concerning conjugate functions given so far only for convex functions hold for almost convex, respectively nearly convex, functions as well.
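As a reminder of the objects discussed above, for proper functions f and g on a space X the classical Fenchel dual can be written as follows (a standard textbook formulation, not a statement from the thesis itself):

```latex
% Primal problem:
(P)\qquad \inf_{x \in X} \{ f(x) + g(x) \}
%
% Fenchel dual problem, with f^{*}, g^{*} the Fenchel conjugates:
(D_F)\qquad \sup_{x^{*} \in X^{*}} \{ -f^{*}(x^{*}) - g^{*}(-x^{*}) \}
%
% Strong duality v(P) = v(D_F), with attainment in (D_F), holds under
% suitable (e.g. interiority or closedness type) regularity conditions.
```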
After proving that the generalized geometric dual problem can be obtained via perturbations, we show that geometric duality is a special case of Fenchel-Lagrange duality and that strong duality can be obtained under weaker conditions than stated in the existing literature. For various problems treated in the literature via geometric duality we show that Fenchel-Lagrange duality is easier to apply, moreover delivering strong duality and optimality conditions under weaker assumptions.
The results presented so far are also applied in convex composite optimization and entropy optimization. For the composed convex cone-constrained optimization problem we give strong duality and the related optimality conditions, then we apply these to show that the formula for the conjugate of the precomposition with a proper convex K-increasing function of a K-convex function on some n-dimensional non-empty convex set X, where K is a k-dimensional non-empty closed convex cone, holds under weaker conditions than known so far. Another field where we apply these results is vector optimization, where we provide a general duality framework based on a more general scalarization that includes as special cases, and improves upon, some previous results in the literature. Concerning entropy optimization, we first treat via duality a problem having an entropy-like objective function, from which some problems found in the entropy optimization literature arise as special cases. Finally, an application of entropy optimization to text classification is presented.
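The precomposition formula alluded to above reads, in its standard form (stated here with the usual notation; the thesis establishes it under weaker hypotheses than the classical ones):

```latex
% For f proper, convex and K-increasing and g a K-convex function,
% under a suitable regularity condition one has, for all x^{*} \in X^{*},
(f \circ g)^{*}(x^{*})
  \;=\; \min_{\lambda \in K^{*}} \bigl\{ f^{*}(\lambda) + (\lambda g)^{*}(x^{*}) \bigr\},
% where K^{*} is the dual cone of K and
% (\lambda g)(x) = \langle \lambda, g(x) \rangle.
```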
Vector optimization and monotone operators via convex duality: recent advances
This book investigates several duality approaches for vector optimization problems and compares them. Special attention is paid to duality for linear vector optimization problems, for which a vector dual that avoids the shortcomings of the classical ones is proposed. Moreover, the book addresses different efficiency concepts for vector optimization problems. Among the problems that appear when the framework is generalized by considering set-valued functions, those involving monotone operators attract increasing interest, especially now that new methods for approaching them by means of convex analysis have been developed. Following this path, the book provides several results on different properties of sums of monotone operators.
Closedness type regularity conditions in convex optimization and beyond
The closedness type regularity conditions have proven during the last decade to be viable alternatives to their more restrictive interiority type counterparts, in both convex optimization and the different areas in which they have been successfully applied. In this review article we de- and reconstruct some closedness type regularity conditions, formulated by means of epigraphs and subdifferentials, respectively, for general optimization problems, in order to stress that they arise naturally when dealing with such problems. The results are then specialized for constrained and unconstrained convex optimization problems. We also hint towards other classes of optimization problems where closedness type regularity conditions have been successfully employed, and discuss other possible applications of them.
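A prototypical closedness type regularity condition of the kind surveyed in the article is the following one for the unconstrained problem of minimizing f + g (a well-known formulation via epigraphs of conjugates, stated under the standard assumptions):

```latex
% f, g \colon X \to \overline{\mathbb{R}} proper, convex, lower
% semicontinuous, with \operatorname{dom} f \cap \operatorname{dom} g \neq \emptyset.
%
% Closedness type regularity condition:
(CC)\qquad \operatorname{epi} f^{*} + \operatorname{epi} g^{*}
  \ \text{is closed in } (X^{*}, w^{*}) \times \mathbb{R}.
%
% Under (CC) one has strong Fenchel duality:
\inf_{x \in X} \{ f(x) + g(x) \}
  \;=\; \max_{x^{*} \in X^{*}} \{ -f^{*}(x^{*}) - g^{*}(-x^{*}) \}.
```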