3 research outputs found
Primal and Dual Approximation Algorithms for Convex Vector Optimization Problems
Two approximation algorithms for solving convex vector optimization problems
(CVOPs) are provided. Both algorithms solve the CVOP and its geometric dual
problem simultaneously. The first algorithm is an extension of Benson's outer
approximation algorithm, and the second one is a dual variant of it. Both
algorithms provide an inner as well as an outer approximation of the (upper and
lower) images. Only one scalar convex program has to be solved in each
iteration. We allow objective and constraint functions that are not necessarily
differentiable, allow solid pointed polyhedral ordering cones, and relate the
approximations to an appropriate ε-solution concept. Numerical examples are provided.
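The papers above do not include code; the following is a minimal illustrative sketch of the primal/dual approximation idea on an assumed toy bi-objective convex problem, minimize (x², (x−1)²) over the reals. Here the single scalar convex program per iteration has a closed-form solution; each solve yields one vertex of an inner approximation and one supporting halfspace of an outer approximation of the lower image. The toy problem, weight grid, and function names are assumptions, not the authors' algorithm.

```python
# Sketch of a Benson-type inner/outer approximation for a toy
# bi-objective convex problem: minimize (x^2, (x-1)^2) over x in R.
# The toy problem and its closed-form scalarization are illustrative
# assumptions; the paper solves one scalar convex program per iteration,
# which here reduces to the formula below.

def solve_scalarization(w):
    """Minimize w*x^2 + (1-w)*(x-1)^2; stationarity gives x* = 1 - w."""
    x = 1.0 - w
    return x, (x**2, (x - 1.0)**2)

def approximate(num_weights=5):
    inner_vertices = []    # images of solutions: vertices of an inner approximation
    outer_halfspaces = []  # supporting halfspaces: w*y1 + (1-w)*y2 >= b
    for k in range(num_weights):
        w = (k + 1) / (num_weights + 1)  # weights strictly inside (0, 1)
        x, y = solve_scalarization(w)
        inner_vertices.append(y)
        b = w * y[0] + (1.0 - w) * y[1]  # support value of the lower image
        outer_halfspaces.append((w, b))
    return inner_vertices, outer_halfspaces

inner, outer = approximate()
# Every inner vertex lies in the lower image, so it satisfies all outer halfspaces.
for y in inner:
    assert all(w * y[0] + (1 - w) * y[1] >= b - 1e-12 for (w, b) in outer)
```

Refining the weight grid (or, as in the paper, choosing scalarizations adaptively) tightens both approximations toward the lower image.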
A Parametric Simplex Algorithm for Linear Vector Optimization Problems
In this paper, a parametric simplex algorithm for solving linear vector
optimization problems (LVOPs) is presented. This algorithm can be seen as a
variant of the multi-objective simplex (Evans-Steuer) algorithm [12]. Unlike
that algorithm, the proposed one works in the parameter space and does not aim
to find the set of all efficient solutions. Instead, it finds a solution in the
sense of Loehne [16], that is, a subset of efficient solutions that allows one
to generate the whole efficient frontier. In that sense, it can also be seen as
a generalization of the parametric self-dual simplex algorithm, which was
originally designed for single-objective linear optimization problems and was
modified by Ruszczynski and Vanderbei [21] to solve bounded two-objective LVOPs
with the positive orthant as the ordering cone. The algorithm proposed here
works for any dimension, any solid pointed polyhedral ordering cone C and for
bounded as well as unbounded problems. Numerical results are provided to
compare the proposed algorithm with an objective space based LVOP algorithm
(Benson algorithm in [13]), that also provides a solution in the sense of [16],
and with the Evans-Steuer algorithm [12]. The results show that for
non-degenerate problems the proposed algorithm outperforms Benson's algorithm
and is on par with the Evans-Steuer algorithm. For highly degenerate problems,
Benson's algorithm [13] outperforms the simplex-type algorithms; however, on
these problems the parametric simplex algorithm is computationally much more
efficient than the Evans-Steuer algorithm.
Comment: 27 pages, 4 figures, 5 tables
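As a rough illustration of the parameter-space idea, consider a hypothetical bi-objective LP, minimize (x1, x2) subject to x1 + x2 ≥ 1, x ≥ 0. The sketch below scans the scalarization parameter and records each change of the optimal vertex; a real parametric simplex would pivot between bases at the breakpoints instead of enumerating known vertices, and the instance is an assumption, not from the paper.

```python
# Illustrative sketch of the parameter-space view behind a parametric
# simplex for a toy bi-objective LP (hypothetical instance):
# minimize (x1, x2) s.t. x1 + x2 >= 1, x >= 0.
# The efficient vertices are known here; a real simplex would pivot
# between bases rather than enumerate vertices.

VERTICES = [(1.0, 0.0), (0.0, 1.0)]  # efficient basic feasible solutions

def optimal_vertex(lam):
    """Vertex minimizing the scalarized objective lam*y1 + (1-lam)*y2."""
    return min(VERTICES, key=lambda v: lam * v[0] + (1.0 - lam) * v[1])

def sweep(num=101):
    """Sweep lam over (0, 1) and record where the optimal vertex changes,
    a crude stand-in for pivoting between breakpoints in parameter space."""
    solution_set, last = [], None
    for k in range(1, num):
        lam = k / num
        v = optimal_vertex(lam)
        if v != last:
            solution_set.append(v)
            last = v
    return solution_set

frontier_generators = sweep()
```

The recorded vertices form a finite solution set whose images generate the whole efficient frontier, which is the solution concept of [16] the paper targets.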
Efficiently Constructing Convex Approximation Sets in Multiobjective Optimization Problems
Convex approximation sets for multiobjective optimization problems are a
well-studied relaxation of the common notion of approximation sets. Instead of
approximating each image of a feasible solution by the image of some solution
in the approximation set up to a multiplicative factor in each component, a
convex approximation set only requires this multiplicative approximation to be
achieved by some convex combination of finitely many images of solutions in the
set. This makes convex approximation sets efficiently computable for a wide
range of multiobjective problems - even for many problems for which (classic)
approximation sets are hard to compute.
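The definition above can be made concrete with a small check: an image y is alpha-convex-approximated (for minimization) if some convex combination of images in the set is componentwise at most alpha times y. The sketch below is an assumed illustration, not the article's code; in two objectives convex combinations of pairs suffice, and the grid over the mixing weight is a crude stand-in for the exact LP feasibility test a real implementation would use.

```python
# Sketch of the convex approximation notion (minimization, factor alpha):
# y is alpha-convex-approximated by a set of images S if some convex
# combination z of points in S satisfies z_i <= alpha * y_i componentwise.
# The grid over lam is an illustrative approximation of an LP check.

def convexly_approximates(S, y, alpha, grid=1000):
    for a in S:
        for b in S:
            for k in range(grid + 1):
                lam = k / grid
                z = tuple(lam * ai + (1 - lam) * bi for ai, bi in zip(a, b))
                if all(zi <= alpha * yi + 1e-12 for zi, yi in zip(z, y)):
                    return True
    return False

# Neither (4, 1) nor (1, 4) alone dominates (2, 2) up to factor 1.25,
# but their midpoint (2.5, 2.5) does, so the convex notion succeeds
# where the classic (single-image) notion fails.
S = [(4.0, 1.0), (1.0, 4.0)]
```

This is exactly the relaxation that makes convex approximation sets computable for problems whose classic approximation sets are hard to obtain.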
In this article, we propose a polynomial-time algorithm to compute convex
approximation sets that builds upon an exact or approximate algorithm for the
weighted sum scalarization and is, therefore, applicable to a large variety of
multiobjective optimization problems. The provided convex approximation quality
is arbitrarily close to the approximation quality of the underlying algorithm
for the weighted sum scalarization. In essence, our algorithm can be
interpreted as an approximate version of the dual variant of Benson's Outer
Approximation Algorithm. Thus, in contrast to existing convex approximation
algorithms from the literature, information on solutions obtained during the
approximation process is utilized to significantly reduce both the practical
running time and the cardinality of the returned solution sets while still
guaranteeing the same worst-case approximation quality. We demonstrate these
advantages in the first comparison of all existing convex approximation
algorithms on several instances of the triobjective knapsack problem and the
triobjective symmetric metric traveling salesman problem.
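The high-level scheme the abstract describes, building a convex approximation set from calls to a weighted-sum oracle, can be sketched as follows. The discrete image set, the exact oracle (approximation ratio 1), and the fixed weight grid are illustrative assumptions; the article's algorithm chooses weights adaptively in a dual-Benson fashion and supports an approximate oracle.

```python
# Sketch: collect solutions of repeated weighted-sum scalarizations as a
# convex approximation set. Toy image set and exact oracle are assumed;
# the article supports approximate oracles and adaptive weight choice.

IMAGES = [(4.0, 1.0), (2.5, 2.0), (1.0, 4.0)]  # images of feasible solutions

def weighted_sum_oracle(w):
    """Exact weighted-sum solver over the toy image set (ratio sigma = 1)."""
    return min(IMAGES, key=lambda y: w * y[0] + (1.0 - w) * y[1])

def convex_approximation_set(num_weights=9):
    out = []
    for k in range(1, num_weights + 1):
        w = k / (num_weights + 1)  # weights strictly inside (0, 1)
        y = weighted_sum_oracle(w)
        if y not in out:
            out.append(y)
    return out

approx_set = convex_approximation_set()
```

With a sigma-approximate oracle in place of the exact one, the same collection procedure yields a convex approximation quality arbitrarily close to sigma, which is the guarantee the article establishes.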