Cosolver2B: An Efficient Local Search Heuristic for the Travelling Thief Problem
Real-world problems are very difficult to optimize, yet many researchers have been solving benchmark problems that have been extensively investigated over the past decades even though they have very few direct applications. The Traveling Thief Problem (TTP) is an NP-hard optimization problem that aims to provide a more realistic model. TTP particularly targets routing problems under packing/loading constraints, which can be found in supply chain management and transportation. In this paper, TTP is presented and formulated mathematically. A combined local search algorithm is proposed and compared with Random Local Search (RLS) and an Evolutionary Algorithm (EA). The obtained results are quite promising, since new and better solutions were found.
Comment: 12th ACS/IEEE International Conference on Computer Systems and Applications (AICCSA) 2015, November 17-20, 2015.
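The interdependence the abstract alludes to is that packing decisions slow the thief down and therefore change the cost of the route. A minimal sketch of evaluating a TTP solution under the commonly used formulation (profit minus renting rate times travel time, with speed decreasing linearly in knapsack load); the function name and calling convention are illustrative, not from the paper:

```python
def ttp_objective(tour, plan, dist, capacity, v_max, v_min, renting_rate):
    """tour: list of city indices, visited in order and returning to the start.
    plan: dict city -> list of (profit, weight) items picked up there.
    dist[a][b]: distance between cities a and b."""
    w = 0.0            # current knapsack weight
    total_profit = 0.0
    total_time = 0.0
    n = len(tour)
    for i, city in enumerate(tour):
        for p, wt in plan.get(city, []):
            w += wt
            total_profit += p
        assert w <= capacity, "packing plan exceeds knapsack capacity"
        # speed drops linearly from v_max (empty) to v_min (full)
        speed = v_max - (w / capacity) * (v_max - v_min)
        nxt = tour[(i + 1) % n]   # wrap around: return to the start city
        total_time += dist[city][nxt] / speed
    return total_profit - renting_rate * total_time
```

Because the travel time depends on where items are picked up along the tour, neither the routing nor the packing subproblem can be solved in isolation, which is what makes local search on the combined problem interesting.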
Look-ahead with mini-bucket heuristics for MPE
The paper investigates the potential of look-ahead in the context of AND/OR search in graphical models using the Mini-Bucket heuristic for combinatorial optimization tasks (e.g., MAP/MPE or weighted CSPs). We present and analyze the complexity of computing the residual (a.k.a. Bellman update) of the Mini-Bucket heuristic, and show how this can be used to identify which parts of the search space are more likely to benefit from look-ahead and how to bound its overhead. We also rephrase the look-ahead computation as a graphical model, to facilitate structure-exploiting inference schemes. We demonstrate empirically that augmenting Mini-Bucket heuristics with look-ahead is a cost-effective way of increasing the power of Branch-And-Bound search.
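The residual idea can be sketched generically: for a minimization problem with an admissible (lower-bounding) heuristic, a one-step Bellman update expands a node's children to obtain a tighter bound, and the residual measures how much the bound improved. This toy sketch is illustrative of the residual concept only; it does not implement the Mini-Bucket heuristic itself:

```python
def bellman_update(h, children, edge_cost):
    # tighter lower bound at a node, obtained by looking one level ahead
    return min(edge_cost[c] + h[c] for c in children)

def residual(h, node, children, edge_cost):
    # how much one-step look-ahead tightens the heuristic at this node;
    # nodes with large residuals are the ones most likely to benefit
    # from (deeper) look-ahead
    return bellman_update(h, children, edge_cost) - h[node]
```

In this reading, bounding the overhead of look-ahead amounts to spending the extra expansion work only where residuals are expected to be large.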
A New Framework for Network Disruption
Traditional network disruption approaches focus on disconnecting or
lengthening paths in the network. We present a new framework for network
disruption that attempts to reroute flow through critical vertices via vertex
deletion, under the assumption that this will render those vertices vulnerable
to future attacks. We define the load on a critical vertex to be the number of
paths in the network that must flow through the vertex. We present
graph-theoretic and computational techniques to maximize this load, first by removing a single vertex from the network, and then by removing a subset of vertices.
Comment: Submitted for peer review on September 13, 201
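One concrete way to read "paths that must flow through the vertex" is as source-target pairs whose shortest-path distance strictly increases (or becomes infinite) when the vertex is deleted. The sketch below computes the load of a vertex under that assumed interpretation on an unweighted graph; the interpretation and all names are illustrative, not taken from the paper:

```python
from collections import deque
from itertools import permutations

def bfs_dist(adj, src, skip=None):
    """Unweighted shortest-path distances from src, optionally deleting
    the vertex `skip` from the graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w != skip and w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return dist

def load(adj, v):
    """Number of ordered (s, t) pairs forced through v: pairs whose
    distance grows (possibly to infinity) once v is removed."""
    base = {s: bfs_dist(adj, s) for s in adj}
    without = {s: bfs_dist(adj, s, skip=v) for s in adj if s != v}
    count = 0
    for s, t in permutations(adj, 2):
        if v in (s, t):
            continue
        if t in base[s] and without[s].get(t, float("inf")) > base[s][t]:
            count += 1
    return count
```

Maximizing this quantity by deleting other vertices then corresponds to the rerouting objective described above: deletions that funnel more pairs through the critical vertex raise its load.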
Self-adaptive randomized constructive heuristics for the multi-item capacitated lot-sizing problem
Capacitated lot-sizing problems (CLSPs) are important and challenging optimization problems in production planning. Among the many approaches developed for CLSPs, constructive heuristics are known to be the most intuitive and fastest methods for finding good feasible solutions, and are therefore often used as subroutines in building more sophisticated exact and metaheuristic approaches. Classical constructive heuristics, such as period-by-period heuristics and lot elimination heuristics, were first introduced in the 1990s and have since been widely used in solving CLSPs. This paper evaluates the performance of period-by-period and lot elimination heuristics, and improves them using perturbation techniques and self-adaptive methods. We also propose a procedure for automatically adjusting the parameters of the proposed heuristics so that parameter values can be chosen based on the features of individual instances. Experimental results show that the proposed self-adaptive randomized period-by-period constructive heuristics are efficient and can find better solutions in less computational time than tabu search and the lot elimination heuristics. When the proposed constructive heuristic is used in a basic tabu search framework, high-quality solutions with an average optimality gap of 0.88% can be obtained on benchmark instances with 12 periods and 12 items, and an optimality gap within 1.2% for instances with 24 periods and 24 items.
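The self-adaptive randomization idea can be sketched generically: repeatedly rebuild a solution with a randomized constructive procedure whose noise level is adjusted online based on whether recent constructions improved the incumbent. The skeleton below is an illustrative assumption about how such a loop might look, not the paper's actual CLSP heuristic or its adaptation rule:

```python
import random

def self_adaptive_search(construct, evaluate, iters=200, sigma=0.3, seed=0):
    """construct(sigma, rng) -> a candidate solution built greedily but with
    randomized choices whose spread is controlled by sigma;
    evaluate(solution) -> cost to minimize.
    The multiplicative adaptation of sigma is a hypothetical stand-in for
    the paper's self-adaptive parameter-adjustment procedure."""
    rng = random.Random(seed)
    best = construct(sigma, rng)
    best_cost = evaluate(best)
    for _ in range(iters):
        cand = construct(sigma, rng)
        cost = evaluate(cand)
        if cost < best_cost:
            best, best_cost = cand, cost
            sigma = min(1.0, sigma * 1.1)    # success: keep exploring
        else:
            sigma = max(0.01, sigma * 0.98)  # failure: focus the randomization
    return best, best_cost
```

The appeal of this pattern for CLSPs is that each construction is cheap, so many randomized restarts fit in the time budget of a single metaheuristic iteration.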
A New Dantzig-Wolfe Reformulation And Branch-And-Price Algorithm For The Capacitated Lot Sizing Problem With Set Up Times
The textbook Dantzig-Wolfe decomposition for the Capacitated LotSizing Problem (CLSP),as already proposed by Manne in 1958, has animportant structural deficiency. Imposingintegrality constraints onthe variables in the full blown master will not necessarily givetheoptimal IP solution as only production plans which satisfy theWagner-Whitin condition canbe selected. It is well known that theoptimal solution to a capacitated lot sizing problem willnotnecessarily have this Wagner-Whitin property. The columns of thetraditionaldecomposition model include both the integer set up andcontinuous production quantitydecisions. Choosing a specific set upschedule implies also taking the associated Wagner-Whitin productionquantities. We propose the correct Dantzig-Wolfedecompositionreformulation separating the set up and productiondecisions. This formulation gives the samelower bound as Manne'sreformulation and allows for branch-and-price. We use theCapacitatedLot Sizing Problem with Set Up Times to illustrate our approach.Computationalexperiments are presented on data sets available from theliterature. Column generation isspeeded up by a combination of simplexand subgradient optimization for finding the dualprices. The resultsshow that branch-and-price is computationally tractable andcompetitivewith other approaches. Finally, we briefly discuss how thisnew Dantzig-Wolfe reformulationcan be generalized to other mixedinteger programming problems, whereas in theliterature,branch-and-price algorithms are almost exclusivelydeveloped for pure integer programmingproblems.branch-and-price;Lagrange relaxation;Dantzig-Wolfe decomposition;lot sizing;mixed-integer programming
Limited discrepancy AND/OR search and its application to optimization tasks in graphical models
Many combinatorial problems are solved with a depth-first search (DFS) guided by a heuristic, and it is well known that this method is very fragile with respect to heuristic mistakes. One standard way to make DFS more robust is to search by increasing number of discrepancies. This approach has been found useful in several domains where the search structure is a height-bounded OR tree. In this paper we investigate the generalization of discrepancy-based search to AND/OR search trees and propose an extension of the Limited Discrepancy Search (LDS) algorithm. We demonstrate the relevance of our proposal in the context of graphical models. In these problems, which can be solved with either a standard OR search tree or an AND/OR tree, we show the superiority of our approach. For a fixed number of discrepancies, the search space visited by the AND/OR algorithm strictly contains the search space visited by standard LDS, and many more nodes can be visited due to the multiplicative effect of the AND/OR decomposition. Moreover, if the AND/OR tree achieves a significant size reduction with respect to the standard OR tree, the cost of each iteration of the AND/OR algorithm is asymptotically lower than in standard LDS. We report experiments on the min-sum problem in different domains and show that the AND/OR version of LDS usually obtains better solutions given the same CPU time.
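For context, classic OR-tree LDS (the algorithm the paper generalizes to AND/OR trees) limits how many times the search may deviate from the heuristic's first-ranked child along any root-to-leaf path. A minimal sketch under that standard formulation; the tree model and names are illustrative:

```python
def lds(node, k, children, is_goal):
    """Depth-first search that visits leaves reachable with at most k
    discrepancies, where a discrepancy means picking any child other
    than the heuristic's first-ranked one."""
    if is_goal(node):
        return node
    kids = children(node)   # assumed ordered best-first by the heuristic
    if not kids:
        return None
    for rank, child in enumerate(kids):
        cost = 0 if rank == 0 else 1   # following the heuristic is free
        if cost <= k:
            found = lds(child, k - cost, children, is_goal)
            if found is not None:
                return found
    return None
```

Running this for k = 0, 1, 2, ... recovers the usual iterative LDS; the paper's contribution is that on an AND/OR tree the same discrepancy budget covers a strictly larger portion of the underlying search space per iteration.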