Large-scale optimization with the primal-dual column generation method
The primal-dual column generation method (PDCGM) is a general-purpose column
generation technique that relies on the primal-dual interior point method to
solve the restricted master problems. The use of this interior point method
variant makes it possible to obtain suboptimal, well-centered dual solutions, which
naturally stabilize the column generation. As recently presented in the
literature, reductions in the number of calls to the oracle and in the CPU
times are typically observed when compared to the standard column generation,
which relies on extreme optimal dual solutions. However, these results are
based on relatively small problems obtained from linear relaxations of
combinatorial applications. In this paper, we investigate the behaviour of the
PDCGM in a broader context, namely when solving large-scale convex optimization
problems. We have selected applications that arise in important real-life
contexts such as data analysis (multiple kernel learning problem),
decision-making under uncertainty (two-stage stochastic programming problems)
and telecommunication and transportation networks (multicommodity network flow
problem). In the numerical experiments, we use publicly available benchmark
instances to compare the performance of the PDCGM against recent results for
different methods presented in the literature, which were the best available
results to date. The analysis of these results suggests that the PDCGM offers
an attractive alternative to specialized methods, since it remains competitive
in terms of the number of iterations and CPU time even for large-scale
optimization problems.
Comment: 28 pages, 1 figure, minor revision, scaled CPU time
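To make the column generation setting concrete, the following minimal sketch (with illustrative data and function names that are not from the paper) runs a textbook column generation loop for the cutting-stock LP: the restricted master is solved exactly by an LP solver and a knapsack oracle prices new columns. The PDCGM differs precisely in the master step, which it solves only approximately with a primal-dual interior point method to obtain well-centered duals; that part is not reproduced here. Reading the duals via res.ineqlin.marginals assumes SciPy 1.7+ with the HiGHS backend.

```python
# Illustrative column generation loop for the cutting-stock LP (not the PDCGM
# itself): the restricted master is solved with SciPy's HiGHS LP solver and a
# dynamic-programming knapsack acts as the pricing oracle.
import numpy as np
from scipy.optimize import linprog

roll_width = 10
sizes = np.array([3, 4, 5])      # item sizes
demand = np.array([30, 20, 40])  # demand per item size

# Start with trivial columns: one item type per pattern.
patterns = [np.eye(len(sizes), dtype=int)[i] * (roll_width // s)
            for i, s in enumerate(sizes)]

def price(duals):
    """Knapsack pricing oracle: most valuable pattern w.r.t. the duals."""
    best_value = np.zeros(roll_width + 1)
    choice = [[0] * len(sizes) for _ in range(roll_width + 1)]
    for cap in range(1, roll_width + 1):
        for i, s in enumerate(sizes):
            if s <= cap and best_value[cap - s] + duals[i] > best_value[cap]:
                best_value[cap] = best_value[cap - s] + duals[i]
                choice[cap] = choice[cap - s].copy()
                choice[cap][i] += 1
    return best_value[roll_width], np.array(choice[roll_width])

for it in range(50):
    A = np.column_stack(patterns)
    # Restricted master: minimise the number of rolls subject to covering demand.
    res = linprog(c=np.ones(A.shape[1]), A_ub=-A, b_ub=-demand,
                  bounds=(0, None), method="highs")
    duals = -res.ineqlin.marginals   # duals of the covering constraints (SciPy >= 1.7, HiGHS)
    value, new_pattern = price(duals)
    if value <= 1 + 1e-9:            # no column with negative reduced cost remains
        break
    patterns.append(new_pattern)

print("LP bound on the number of rolls:", res.fun)
```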
Mixed-Integer Linear Optimization: Primal–Dual Relations and Dual Subgradient and Cutting-Plane Methods
This chapter presents several solution methodologies for mixed-integer linear optimization, stated as mixed-binary optimization problems, by means of Lagrangian duals, subgradient optimization, cutting planes, and recovery of primal solutions. It covers Lagrangian duality theory for mixed-binary linear optimization, a problem framework for which ultimate success is, in most cases, hard to accomplish, since strong duality cannot be inferred. First, a simple conditional subgradient optimization method for solving the dual problem is presented. Then, we show how ergodic sequences of Lagrangian subproblem solutions can be computed and used to recover mixed-binary primal solutions. We establish that the ergodic sequences accumulate at solutions to a convexified version of the original mixed-binary optimization problem. We also present a cutting-plane approach to the Lagrangian dual, which amounts to solving the convexified problem by Dantzig–Wolfe decomposition, as well as a two-phase method that benefits from the advantages of both subgradient optimization and Dantzig–Wolfe decomposition. Finally, we describe how the Lagrangian dual approach can be used to find near-optimal solutions to mixed-binary optimization problems by utilizing the ergodic sequences in a Lagrangian heuristic, to construct a core problem, as well as to guide the branching in a branch-and-bound method. The chapter is concluded with a section comprising notes, references, historical downturns, and reading tips.
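As a rough illustration of the dual subgradient and ergodic-averaging ideas (a plain projected subgradient scheme, not the chapter's conditional variant or its convexity-weighted ergodic sequences), the sketch below dualizes the coupling constraints of a small binary program, maximizes the Lagrangian dual by subgradient steps, and keeps a running average of the subproblem solutions; all data and names are made up for the example.

```python
# Illustrative dual subgradient scheme with ergodic primal averaging for
#   min c'x  s.t.  A x <= b,  x in {0,1}^n,
# where the coupling constraints A x <= b are Lagrangian-dualized.
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 5
c = -rng.uniform(1, 10, n)            # negative costs: items are profitable
A = rng.uniform(0, 5, (m, n))
b = 0.3 * A.sum(axis=1)

def subproblem(u):
    """Lagrangian subproblem: separable over the binary variables."""
    reduced = c + A.T @ u
    x = (reduced < 0).astype(float)   # set x_j = 1 exactly when it pays off
    theta = reduced @ x - u @ b       # dual function value theta(u)
    return x, theta

u = np.zeros(m)
x_erg = np.zeros(n)
best_lb = -np.inf
for k in range(1, 501):
    x, theta = subproblem(u)
    best_lb = max(best_lb, theta)               # weak duality: theta(u) is a lower bound
    g = A @ x - b                               # subgradient of theta at u
    u = np.maximum(0.0, u + (1.0 / k) * g)      # projected subgradient ascent step
    x_erg += (x - x_erg) / k                    # ergodic (running) average of subproblem solutions

print("best Lagrangian lower bound:", best_lb)
print("max violation of A x_erg <= b at the ergodic point:", np.max(A @ x_erg - b))
```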
Standard Bundle Methods: Untrusted Models and Duality
We review the basic ideas underlying the vast family of algorithms for nonsmooth convex optimization known as "bundle methods". In a nutshell, these approaches are based on constructing models of the function, but the lack of continuity of first-order information implies that these models cannot be trusted, not even close to an optimum. Therefore, many different forms of stabilization have been proposed to try to avoid being led to areas where the model is so inaccurate as to result in almost useless steps. In the development of these methods, duality arguments are useful, if not outright necessary, to better analyze the behaviour of the algorithms. Also, in many relevant applications the function at hand is itself a dual one, so duality makes it possible to map algorithmic concepts and results back into a "primal space" where they can be exploited; in turn, structure in that space can be exploited to improve the algorithms' behaviour, e.g. by developing better models. We present an updated picture of the many developments around the basic idea along at least three different axes: the form of the stabilization, the form of the model, and the approximate evaluation of the function.
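As a small, hedged illustration of one of the stabilization forms discussed (a box trust region around the stability centre, rather than the classical proximal quadratic term), the sketch below runs a bundle-style cutting-plane iteration on a piecewise-linear convex function, with the usual serious/null-step test; all data and names are illustrative.

```python
# Illustrative bundle-style iteration for the nonsmooth convex function
# f(x) = max_j (a_j'x + b_j): a cutting-plane model stabilized by a box trust
# region around the current centre (a "boxstep"-type stabilization; a proximal
# bundle method would add a quadratic term to the master instead).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
a = rng.normal(size=(30, 2))          # affine pieces of f
b = rng.normal(size=30)

def f_and_subgrad(x):
    vals = a @ x + b
    j = int(np.argmax(vals))
    return vals[j], a[j]              # f(x) and one subgradient

center = np.array([5.0, -3.0])
delta, m_accept = 1.0, 0.1            # trust-region radius, serious-step parameter
f0, g0 = f_and_subgrad(center)
cuts = [(g0, f0, center.copy())]      # bundle of (subgradient, value, point)

for it in range(40):
    # Master: min r  s.t.  r >= f(x_i) + g_i'(x - x_i),  |x - center|_inf <= delta.
    G = np.array([g for g, _, _ in cuts])
    h = np.array([fx - g @ xi for g, fx, xi in cuts])
    A_ub = np.hstack([G, -np.ones((len(cuts), 1))])     # variables (x1, x2, r)
    bounds = [(center[0] - delta, center[0] + delta),
              (center[1] - delta, center[1] + delta),
              (None, None)]
    res = linprog(c=[0.0, 0.0, 1.0], A_ub=A_ub, b_ub=-h,
                  bounds=bounds, method="highs")
    x_trial, model_val = res.x[:2], res.x[2]
    f_center, _ = f_and_subgrad(center)
    f_trial, g_trial = f_and_subgrad(x_trial)
    predicted = f_center - model_val                    # decrease promised by the model
    if predicted < 1e-8:
        break                          # model predicts no further decrease within the box
    if f_center - f_trial >= m_accept * predicted:
        center = x_trial               # serious step: move the stability centre
    # otherwise: null step, keep the centre but enrich the model with the new cut
    cuts.append((g_trial, f_trial, x_trial.copy()))

print("approximate minimiser:", center, "f value:", f_and_subgrad(center)[0])
```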