Relaxation Adaptive Memory Programming For The Resource Constrained Project Scheduling Problem
The resource-constrained project scheduling problem (RCPSP) is one of the most intractable problems in operations research; it is NP-hard in the strong sense. Because of this hardness, exact solution methods can only tackle instances of relatively small size. For the larger instances commonly found in real applications, heuristic solution methods are necessary to find near-optimal solutions within acceptable computation time limits. In this study, algorithms based on the relaxation adaptive memory programming (RAMP) method (Rego, 2005) are developed for solving the RCPSP. The RAMP algorithms developed here combine mathematical relaxation, including Lagrangian relaxation and surrogate constraint relaxation, with tabu search and genetic algorithms. Computational tests are performed on an extensive set of benchmark instances. The results demonstrate the capability of the proposed approaches to solve RCPSPs of different sizes and characteristics, and they provide meaningful insights into the potential application of these approaches to other, more complex resource-constrained scheduling problems.
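As a self-contained illustration of the Lagrangian-relaxation ingredient mentioned in this abstract (a toy sketch with invented data, not the RAMP algorithm and not an RCPSP instance), the following relaxes the single constraint of a small 0-1 knapsack and minimizes the resulting dual bound by projected subgradient:

```python
import itertools

# Toy 0-1 knapsack (illustrative data):  max c.x  s.t.  a.x <= b,  x binary.
c = [10, 7, 4, 3]
a = [6, 5, 3, 2]
b = 9

def lagrangian(lam):
    """Evaluate L(lam) = max_x (c - lam*a).x + lam*b; separable over items."""
    x = [1 if ci - lam * ai > 0 else 0 for ci, ai in zip(c, a)]
    val = sum((ci - lam * ai) * xi for ci, ai, xi in zip(c, a, x)) + lam * b
    return val, x

# Projected subgradient method on the dual: minimize L(lam) over lam >= 0.
lam, best = 0.0, float("inf")
for k in range(1, 201):
    val, x = lagrangian(lam)
    best = min(best, val)                         # best dual (upper) bound so far
    g = sum(ai * xi for ai, xi in zip(a, x)) - b  # subgradient of L at lam
    lam = max(0.0, lam + (1.0 / k) * g)

# Exact optimum by enumeration (4 items, 16 candidate vectors).
opt = max(sum(ci * xi for ci, xi in zip(c, x))
          for x in itertools.product([0, 1], repeat=4)
          if sum(ai * xi for ai, xi in zip(a, x)) <= b)

print(opt, round(best, 3))  # best >= opt; the difference is the duality gap
```

For this data the dual optimum is 14.4 while the integer optimum is 14, so a small duality gap remains. A surrogate constraint relaxation, by contrast, keeps the constraints aggregated rather than dualized into the objective.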
Playing with Duality: An Overview of Recent Primal-Dual Approaches for Solving Large-Scale Optimization Problems
Optimization methods are at the core of many problems in signal/image processing, computer vision, and machine learning. It has long been recognized that looking at the dual of an optimization problem may drastically simplify its solution. Deriving efficient strategies that jointly bring into play the primal and the dual problems is, however, a more recent idea that has generated many important new contributions in recent years. These novel developments are grounded in recent advances in convex analysis, discrete optimization, parallel processing, and non-smooth optimization, with an emphasis on sparsity issues. In this paper, we present the principles of primal-dual approaches while giving an overview of numerical methods that have been proposed in different contexts. We show the benefits that can be drawn from primal-dual algorithms for solving both large-scale convex optimization problems and discrete ones, and we provide various application examples to illustrate their usefulness.
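To make the primal-dual idea concrete, here is a minimal sketch (not taken from the paper) of the Chambolle-Pock primal-dual algorithm on a tiny lasso-type problem with a diagonal operator, chosen so the optimum is known in closed form; all data are illustrative:

```python
# Toy lasso  min_x 0.5*||A x - b||^2 + lam*||x||_1  with diagonal A, solved by
# the Chambolle-Pock primal-dual iteration with F(z) = 0.5*||z - b||^2,
# G(x) = lam*||x||_1, K = A:
#   y   <- prox_{sig F*}(y + sig*A xbar)
#   x'  <- prox_{tau G}(x - tau*A^T y)
#   xbar <- 2 x' - x
A = [2.0, 1.0]          # diagonal of A
b = [2.0, -0.5]
lam = 1.0
L = max(A)              # operator norm of diagonal A
sig = tau = 0.4         # step sizes; sig*tau*L^2 = 0.64 <= 1 ensures convergence

def soft(v, t):
    """Prox of t*|.| (soft-thresholding)."""
    return max(v - t, 0.0) + min(v + t, 0.0)

x = [0.0, 0.0]
xbar = list(x)
y = [0.0, 0.0]
for _ in range(1000):
    # prox of sig*F*, where F*(y) = 0.5*||y||^2 + <b, y>:  v -> (v - sig*b)/(1 + sig)
    y = [(yi + sig * Ai * xb - sig * bi) / (1 + sig)
         for yi, Ai, xb, bi in zip(y, A, xbar, b)]
    x_new = [soft(xi - tau * Ai * yi, tau * lam)
             for xi, Ai, yi in zip(x, A, y)]
    xbar = [2 * xn - xo for xn, xo in zip(x_new, x)]
    x = x_new

print([round(v, 4) for v in x])  # the closed-form optimum here is [0.75, 0.0]
```

The problem is separable, so the optimum can be checked by hand: for the first coordinate, 2(2x - 2) + 1 = 0 gives x = 0.75; for the second, the subgradient condition at zero holds, so x = 0.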
Conic Optimization Theory: Convexification Techniques and Numerical Algorithms
Optimization is at the core of control theory and appears in several areas of this field, such as optimal control, distributed control, system identification, robust control, state estimation, model predictive control, and dynamic programming. Recent advances in various topics of modern optimization have also been revamping the area of machine learning. Motivated by the crucial role of optimization theory in the design, analysis, control, and operation of real-world systems, this tutorial paper offers a detailed overview of some major advances in this area, namely conic optimization and its emerging applications. First, we discuss the importance of conic optimization in different areas. Then, we explain seminal results on the design of hierarchies of convex relaxations for a wide range of nonconvex problems. Finally, we study different numerical algorithms for large-scale conic optimization problems.
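As an elementary, self-contained illustration of convexifying a nonconvex problem (McCormick envelopes for a bilinear term, a standard building block rather than anything specific to this tutorial), the sketch below checks the envelope's validity on a grid and compares the relaxation bound with the true optimum of a small constrained problem; all data are invented:

```python
# McCormick envelope of the bilinear term w = x*y on [xL, xU] x [yL, yU]:
# two linear underestimators,
#   w >= xL*y + yL*x - xL*yL      (from (x - xL)*(y - yL) >= 0)
#   w >= xU*y + yU*x - xU*yU      (from (xU - x)*(yU - y) >= 0)
# whose pointwise max lower-bounds x*y everywhere on the box, giving a
# convex relaxation of the nonconvex product.
xL, xU, yL, yU = 0.5, 2.0, 1.0, 3.0

def under(x, y):
    return max(xL * y + yL * x - xL * yL,
               xU * y + yU * x - xU * yU)

n = 200
pts = [(xL + (xU - xL) * i / n, yL + (yU - yL) * j / n)
       for i in range(n + 1) for j in range(n + 1)]

# Validity: the envelope never exceeds the true product on the box.
assert all(under(x, y) <= x * y + 1e-9 for x, y in pts)

# Bound for a constrained nonconvex problem:  min x*y  s.t.  x + y >= 3.
feas = [(x, y) for x, y in pts if x + y >= 3]
true_min = min(x * y for x, y in feas)
relax_min = min(under(x, y) for x, y in feas)
print(round(relax_min, 3), "<=", round(true_min, 3))  # relaxation bounds below
```

On this instance the envelope bound happens to be tight (both values are 1.25 at x = 0.5, y = 2.5); in general the relaxation only provides a lower bound, which hierarchies of relaxations then tighten.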
On generalized surrogate duality in mixed-integer nonlinear programming
The most important ingredient for solving mixed-integer nonlinear programs (MINLPs) to global ε-optimality with spatial branch and bound is a tight, computationally tractable relaxation. For both theoretical and practical reasons, relaxations of MINLPs are usually required to be convex. Nonetheless, current optimization solvers can often successfully handle a moderate presence of nonconvexities, which opens the door to the use of potentially tighter nonconvex relaxations. In this work, we exploit this fact and make use of a nonconvex relaxation obtained via aggregation of constraints: a surrogate relaxation. Such relaxations were actively studied for linear integer programs in the 1970s and 1980s but have been scarcely considered since. We revisit them in an MINLP setting and show the computational benefits and challenges they can present. Additionally, we study a generalization of this relaxation that allows for multiple simultaneous aggregations, and we present the first algorithm capable of computing the best set of aggregations. We propose a multitude of computational enhancements to improve its practical performance and evaluate the algorithm's ability to generate strong dual bounds through extensive computational experiments.
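To illustrate the surrogate idea on a toy instance (invented data, not the paper's MINLP algorithm): aggregating two linear constraints of a binary program with nonnegative weights yields a single-knapsack relaxation that keeps integrality; its optimum is a valid dual bound, and searching over the weights tightens it:

```python
from itertools import product

# Toy binary program (illustrative data):
#   max c.x   s.t.   a1.x <= b1,   a2.x <= b2,   x in {0,1}^4.
c  = [8, 11, 6, 4]
a1 = [5, 7, 4, 3]; b1 = 10        # knapsack constraint
a2 = [1, 1, 1, 1]; b2 = 2         # cardinality constraint

def knapsack_opt(w, cap):
    """max c.x s.t. w.x <= cap over binary x, by enumeration."""
    return max(sum(ci * xi for ci, xi in zip(c, x))
               for x in product([0, 1], repeat=4)
               if sum(wi * xi for wi, xi in zip(w, x)) <= cap)

# Exact optimum with both constraints enforced.
opt = max(sum(ci * xi for ci, xi in zip(c, x))
          for x in product([0, 1], repeat=4)
          if sum(wi * xi for wi, xi in zip(a1, x)) <= b1
          and sum(x) <= b2)

# Surrogate relaxation: aggregate the two constraints with nonnegative integer
# weights (m1, m2) into ONE knapsack, keeping integrality.  Every x feasible
# for the original is feasible for the aggregate, so each bound is valid.
best_bound = float("inf")
for m1 in range(11):
    m2 = 10 - m1
    w = [m1 * w1 + m2 * w2 for w1, w2 in zip(a1, a2)]
    best_bound = min(best_bound, knapsack_opt(w, m1 * b1 + m2 * b2))

print(opt, best_bound)  # best_bound >= opt for every choice of weights
```

On this instance the best aggregation (all weight on the knapsack constraint) already closes the gap; in general the surrogate dual bound dominates the Lagrangian one, but a gap to the integer optimum can remain.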