
    The AllDifferent Constraint with Precedences

    We propose ALLDIFFPREC, a new global constraint that combines an ALLDIFFERENT constraint with precedence constraints that strictly order given pairs of variables. We identify a number of applications for this global constraint, including instruction scheduling and symmetry breaking. We give an efficient propagation algorithm that enforces bounds consistency on this global constraint. We show how to implement this propagator using a decomposition that extends the bounds-consistency-enforcing decomposition proposed for the ALLDIFFERENT constraint. Finally, we prove that enforcing domain consistency on this global constraint is NP-hard in general.
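    To illustrate the decomposition idea, the following Python fragment is a naive sketch (not the paper's propagator; all names and the simple Hall-interval rule are our own illustrative assumptions) that combines precedence bound tightening with ALLDIFFERENT reasoning over interval domains:

    def propagate_alldiffprec(lb, ub, precedences):
        """lb, ub: dicts var -> integer bounds; precedences: list of (x, y) meaning x < y."""
        changed = True
        while changed:
            changed = False
            # Precedence rule: x must take a strictly smaller value than y.
            for x, y in precedences:
                if lb[y] < lb[x] + 1:
                    lb[y] = lb[x] + 1
                    changed = True
                if ub[x] > ub[y] - 1:
                    ub[x] = ub[y] - 1
                    changed = True
            # Naive Hall-interval rule for the ALLDIFFERENT part.
            bounds = sorted({b for v in lb for b in (lb[v], ub[v])})
            for a in bounds:
                for b in (b for b in bounds if b >= a):
                    inside = [v for v in lb if a <= lb[v] and ub[v] <= b]
                    if len(inside) > b - a + 1:
                        return None                   # more variables than values
                    if len(inside) == b - a + 1:      # Hall interval [a, b]
                        for v in lb:
                            if v in inside:
                                continue
                            if a <= lb[v] <= b:
                                lb[v] = b + 1
                                changed = True
                            if a <= ub[v] <= b:
                                ub[v] = a - 1
                                changed = True
            if any(lb[v] > ub[v] for v in lb):
                return None                           # domain wipe-out: inconsistency
        return lb, ub

    # Example: x1 < x2 with x1, x2 in {1..3} and x3 fixed to 2 forces x1 = 1, x2 = 3.
    print(propagate_alldiffprec({'x1': 1, 'x2': 1, 'x3': 2},
                                {'x1': 3, 'x2': 3, 'x3': 2},
                                [('x1', 'x2')]))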

    Improving the Asymmetric TSP by Considering Graph Structure

    Recent work on cost-based relaxations has improved Constraint Programming (CP) models for the Traveling Salesman Problem (TSP). We provide a short survey of solving the asymmetric TSP with CP. We then suggest new implied propagators based on general graph properties. We experimentally show that such implied propagators bring robustness on pathological instances and highlight the fact that graph structure can significantly improve the behavior of search heuristics. Finally, we show that our approach outperforms the current state of the art. Comment: Technical report.
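    As one example of the kind of graph-structure reasoning such implied propagators build on (a generic check, not one of the paper's specific propagators), a Hamiltonian circuit can only exist if the graph of arcs still allowed by the current domains is strongly connected:

    def strongly_connected(nodes, arcs):
        """arcs: set of directed (u, v) pairs still allowed by the current domains."""
        def reachable(start, succ):
            seen, stack = {start}, [start]
            while stack:
                u = stack.pop()
                for v in succ.get(u, ()):
                    if v not in seen:
                        seen.add(v)
                        stack.append(v)
            return seen

        succ, pred = {}, {}
        for u, v in arcs:
            succ.setdefault(u, []).append(v)
            pred.setdefault(v, []).append(u)
        root = next(iter(nodes))
        # Strongly connected iff every node is reached from root and reaches root.
        return reachable(root, succ) == set(nodes) == reachable(root, pred)

    # Example: no arc ever returns to node 0, so no tour can exist.
    print(strongly_connected({0, 1, 2}, {(0, 1), (1, 2), (0, 2)}))  # False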

    Linear-time filtering algorithms for the disjunctive constraint and a quadratic filtering algorithm for the cumulative not-first not-last

    We present new filtering algorithms for the Disjunctive and Cumulative constraints, each of which improves the complexity of the state-of-the-art algorithms by a factor of log n. We show how to perform Time-Tabling and Detectable Precedences in linear time on the Disjunctive constraint. Furthermore, we present a linear-time Overload Checking for the Disjunctive and Cumulative constraints. Finally, we show how the Not-first/Not-last rule can be enforced in quadratic time for the Cumulative constraint. These algorithms rely on the union-find data structure, which we leverage to introduce a new data structure that we call the time line. This data structure provides constant-time operations that were previously implemented in logarithmic time by the Θ-tree data structure. Experiments show that these new algorithms are competitive even for a small number of tasks and outperform existing algorithms as the number of tasks increases. We also show that the time line can be used to solve specific scheduling problems.
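    The union-find idea underlying the time line can be sketched as follows (the paper's data structure supports richer operations; this fragment, with names of our own choosing, only illustrates the "earliest free slot" query in near-constant amortized time):

    class EarliestSlot:
        def __init__(self, horizon):
            self.parent = list(range(horizon + 1))   # slot -> next candidate slot

        def find(self, t):
            # Union-find with path halving.
            while self.parent[t] != t:
                self.parent[t] = self.parent[self.parent[t]]
                t = self.parent[t]
            return t

        def schedule(self, release):
            """Place a unit-length task at the earliest free slot >= release."""
            slot = self.find(release)
            self.parent[slot] = slot + 1             # slot is now occupied
            return slot

    line = EarliestSlot(horizon=10)
    print([line.schedule(r) for r in [0, 0, 3, 0, 3]])  # [0, 1, 3, 2, 4]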

    Self-decomposable Global Constraints

    Scalability is becoming more and more critical for decision support technologies. In order to address this issue in Constraint Programming, we introduce the family of self-decomposable constraints. These constraints can be satisfied by applying their own filtering algorithms on variable subsets only. We introduce a generic framework that dynamically decomposes propagation by filtering over variable subsets. Our experiments on the CUMULATIVE constraint illustrate the practical relevance of self-decomposition.
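    A rough sketch of the self-decomposition idea (the static subset rule and the toy stand-in "filter" below are our own assumptions; the paper's framework selects subsets dynamically during propagation) is simply to re-apply the constraint's own filtering algorithm on each subset of variables:

    def decomposed_filter(variables, filter_algorithm, subset_key):
        """Apply filter_algorithm to each group of variables sharing a key."""
        groups = {}
        for v in variables:
            groups.setdefault(subset_key(v), []).append(v)
        # Same propagator, smaller scopes; return the accumulated result.
        return sum(filter_algorithm(group) for group in groups.values())

    # Toy usage: len() stands in for a CUMULATIVE propagator that would return a
    # pruning count; tasks are grouped by a coarse time window on their start.
    tasks = [{'est': t, 'dur': 2} for t in range(8)]
    print(decomposed_filter(tasks, len, lambda task: task['est'] // 4))  # 8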

    An Integer Programming Approach to Optimal Basic Block Instruction Scheduling for Single-Issue Processors

    We present a novel integer programming formulation for basic block instruction scheduling on single-issue processors. The problem can be considered a very general sequential task scheduling problem with delayed precedence constraints. Our model is based on the linear ordering problem and has, in contrast to the last IP model proposed, numbers of variables and constraints that are strongly polynomial in the instance size. Combined with improved preprocessing techniques, and given a time limit of ten minutes of CPU and system time, our branch-and-cut implementation is capable of solving all but eleven of the 369,861 basic blocks of the SPEC 2000 integer and floating point benchmarks to proven optimality. This is competitive with the current state-of-the-art constraint programming approach, which has also been evaluated on this test suite.
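    The linear ordering core of such a model can be sketched as follows (a sketch of the generic linear ordering problem only; the paper's formulation additionally encodes issue cycles and the delayed precedences). Binary variables $x_{ij}$ state that instruction $i$ is placed before instruction $j$:

    \begin{align*}
      & x_{ij} + x_{ji} = 1            && \text{for all } i < j \quad \text{(the instructions form a total order)}\\
      & x_{ij} + x_{jk} + x_{ki} \le 2 && \text{for all distinct } i, j, k \quad \text{(transitivity: no directed 3-cycle)}\\
      & x_{ij} = 1                     && \text{for every arc } (i, j) \text{ of the dependence graph}\\
      & x_{ij} \in \{0, 1\}            && \text{for all } i \ne j
    \end{align*}

    For n instructions this generic core already has O(n^2) variables and O(n^3) constraints, i.e. strongly polynomial in the instance size.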

    Efficient algorithms to solve scheduling problems with a variety of optimization criteria

    Constraint programming is a powerful methodology for solving large-scale, practical scheduling problems. Resource-constrained scheduling deals with the temporal allocation of a variety of tasks to a set of resources, where each task consumes a certain amount of resource during its execution. Typically, a desired objective function, such as the total length of a feasible schedule (the makespan), is optimized. Solving a scheduling problem amounts to deciding when each task starts and which resource executes it. In general, scheduling problems are NP-hard; consequently, no known algorithm can solve them in a polynomial number of instructions. Nonetheless, there exist specializations of scheduling problems that are not NP-complete, and such problems can be solved in polynomial time using dedicated algorithms. We study such algorithms for scheduling problems in a variety of contexts. Filtering techniques have been developed and improved over the past years in constraint-based scheduling. The prominence of filtering algorithms lies in their power to shrink the search tree by excluding from the domains values that cannot participate in a feasible solution. We propose improvements and present faster filtering algorithms for classical scheduling problems. Furthermore, we adapt filtering techniques to the case where tasks can be delayed. We also consider distinct properties of industrial scheduling problems and solve more efficiently scheduling problems whose optimization criterion is not necessarily the makespan. For instance, we present polynomial-time algorithms for the case where the amount of available resources fluctuates over time, or where the cost of executing a task at time t depends on t.

    Symmetry Breaking Constraints: Recent Results

    Symmetry is an important issue in many combinatorial problems. One way of dealing with symmetry is to add constraints that eliminate symmetric solutions. We survey recent results in this area, focusing especially on two common and useful cases: symmetry breaking constraints for row and column symmetry, and symmetry breaking constraints for eliminating value symmetry. Comment: To appear in Proceedings of the Twenty-Sixth Conference on Artificial Intelligence (AAAI-12).
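    For row and column symmetry, one commonly used condition is double-lex ordering. The following checker (a sketch of our own, not a propagator and not specific to this survey) accepts only matrices whose rows and whose columns are both in non-decreasing lexicographic order:

    def double_lex(matrix):
        """True iff rows and columns are both lexicographically non-decreasing."""
        rows = [tuple(r) for r in matrix]
        cols = [tuple(c) for c in zip(*matrix)]
        return rows == sorted(rows) and cols == sorted(cols)

    print(double_lex([[1, 0], [0, 1]]))  # False: swapping the rows is lex-smaller
    print(double_lex([[0, 1], [1, 1]]))  # True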