
    Rotation-based formulation for stable matching

    We introduce new CP models for the many-to-many stable matching problem. We use the notion of rotation to give a novel encoding that is linear in the input size of the problem. We give extra filtering rules that maintain arc consistency in quadratic time. Our experimental study on hard instances of sex-equal and balanced stable matching shows the efficiency of one of our propositions compared with the state-of-the-art constraint programming approach.
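    A hedged illustration, not taken from the paper: the rotation-based CP encoding itself is not reproduced here, but as background the following Python sketch shows the classic one-to-one deferred-acceptance (Gale-Shapley) procedure from which stable matchings, and hence rotations, are usually introduced. The toy preference lists are assumptions.

```python
# Minimal background sketch: one-to-one stable matching via deferred
# acceptance (Gale-Shapley). Illustrative only -- the paper's rotation-based
# CP encoding for the many-to-many case is not reproduced here.

def stable_matching(men_prefs, women_prefs):
    """men_prefs/women_prefs: dict mapping person -> ordered preference list."""
    free_men = list(men_prefs)                 # men not yet engaged
    next_proposal = {m: 0 for m in men_prefs}  # index of next woman to propose to
    engaged_to = {}                            # woman -> man

    # Rank table so a woman can compare two suitors in O(1).
    rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}

    while free_men:
        m = free_men.pop()
        w = men_prefs[m][next_proposal[m]]
        next_proposal[m] += 1
        current = engaged_to.get(w)
        if current is None:
            engaged_to[w] = m                  # w was free: accept
        elif rank[w][m] < rank[w][current]:
            engaged_to[w] = m                  # w prefers m: swap partners
            free_men.append(current)
        else:
            free_men.append(m)                 # w rejects m

    return {m: w for w, m in engaged_to.items()}

# Hypothetical toy instance: the stable matching pairs a-x and b-y.
men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["a", "b"], "y": ["b", "a"]}
print(stable_matching(men, women))
```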

    A review of literature on parallel constraint solving

    As multicore computing is now standard, it seems irresponsible for constraints researchers to ignore its implications. Researchers need to address a number of issues to exploit parallelism, such as: investigating which constraint algorithms are amenable to parallelisation; whether to use shared memory or distributed computation; whether to use static or dynamic decomposition; and how to best exploit portfolios and cooperating search. We review the literature, and see that we can sometimes do quite well, some of the time, on some instances, but we are far from a general solution. Yet there seems to be little overall guidance that can be given on how best to exploit multicore computers to speed up constraint solving. We hope at least that this survey will provide useful pointers to future researchers wishing to correct this situation.
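    As a hedged illustration of one of the strategies the survey discusses (solver portfolios), the following Python sketch runs several hypothetical solver configurations on the same instance in parallel and keeps the first result; the solve function is a stand-in, not any particular solver's API.

```python
# Minimal sketch of a parallel solver portfolio: launch several solver
# configurations on the same instance and keep the first result returned.
# The solver below is a stand-in; a real portfolio would wrap CP/SAT solvers.
from concurrent.futures import ProcessPoolExecutor, FIRST_COMPLETED, wait

def solve(instance, config):
    """Hypothetical stand-in for one solver configuration."""
    # ... a real implementation would run a solver on `instance` with `config` ...
    return {"config": config, "solution": sorted(instance)}

def portfolio_solve(instance, configs):
    with ProcessPoolExecutor(max_workers=len(configs)) as pool:
        futures = {pool.submit(solve, instance, c) for c in configs}
        done, not_done = wait(futures, return_when=FIRST_COMPLETED)
        for f in not_done:              # cancel the losers (best effort)
            f.cancel()
        return next(iter(done)).result()

if __name__ == "__main__":
    print(portfolio_solve([3, 1, 2], ["restart-luby", "activity-based", "random"]))
```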

    Tackling Universal Properties of Minimal Trap Spaces of Boolean Networks

    Minimal trap spaces (MTSs) capture subspaces in which the Boolean dynamics is trapped, whatever the update mode. They correspond to the attractors of the most permissive mode. Due to their versatility, the computation of MTSs has recently gained traction, essentially by focusing on their enumeration. In this paper, we address logical reasoning on universal properties of MTSs in the scope of two problems: the reprogramming of Boolean networks, i.e., identifying the permanent freezes of Boolean variables that enforce a given property on all the MTSs, and the synthesis of Boolean networks from universal properties on their MTSs. Both problems reduce to solving the satisfiability of quantified propositional logic formulas with three levels of quantifiers (\exists\forall\exists). We introduce a Counter-Example Guided Abstraction Refinement (CEGAR) procedure to solve these problems efficiently by coupling the resolution of two simpler formulas. We provide a prototype relying on Answer-Set Programming for each formula and show its tractability on a wide range of Boolean models of biological networks. Comment: Accepted at the 21st International Conference on Computational Methods in Systems Biology (CMSB 2023).
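    As a hedged illustration of the trap-space notion (the toy network, variable names, and function are assumptions, not the paper's models or its ASP encoding), the following Python sketch checks whether a given subspace of a small Boolean network is a trap space, i.e., closed under every variable update.

```python
# Illustrative sketch: check whether a subspace of a Boolean network is a
# trap space, i.e. no update can move the dynamics out of the subspace.
# The network and subspaces below are toy assumptions, not the paper's models.
from itertools import product

# Toy Boolean network: each variable maps to its update function.
network = {
    "x": lambda s: s["x"] or s["y"],
    "y": lambda s: s["x"] and s["y"],
    "z": lambda s: not s["z"],
}

def is_trap_space(network, subspace):
    """subspace: dict fixing some variables to True/False; others are free."""
    free = [v for v in network if v not in subspace]
    for values in product([False, True], repeat=len(free)):
        state = dict(subspace, **dict(zip(free, values)))
        # Every fixed variable must keep its fixed value under its update.
        for v, fixed in subspace.items():
            if bool(network[v](state)) != bool(fixed):
                return False
    return True

print(is_trap_space(network, {"x": True, "y": True}))   # True: x and y stay at 1
print(is_trap_space(network, {"x": False, "y": True}))  # False: x can flip to 1
```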

    Decision-Focused Learning: Foundations, State of the Art, Benchmark and Future Opportunities

    Decision-focused learning (DFL) is an emerging paradigm in machine learning that trains a model to optimize decisions, integrating prediction and optimization in an end-to-end system. This paradigm holds the promise to revolutionize decision-making in many real-world applications which operate under uncertainty, where the estimation of unknown parameters within these decision models often becomes a substantial roadblock. This paper presents a comprehensive review of DFL. It provides an in-depth analysis of the various techniques devised to integrate machine learning and optimization models, introduces a taxonomy of DFL methods distinguished by their unique characteristics, and conducts an extensive empirical evaluation of these methods, proposing suitable benchmark datasets and tasks for DFL. Finally, the study provides valuable insights into current and potential future avenues in DFL research. Comment: Experimental Survey and Benchmarking.
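    As a hedged toy illustration of the decision-focused idea (the data and the one-item selection problem are assumptions, not the paper's benchmark), the following Python sketch contrasts prediction error with decision regret.

```python
# Toy illustration (assumptions, not the paper's benchmark): compare
# prediction error with decision regret for a "pick the cheapest item" problem.
import numpy as np

true_costs = np.array([3.0, 5.0, 4.0])

def decision(costs):
    return int(np.argmin(costs))          # optimization step: pick the cheapest item

def regret(pred_costs, true_costs):
    chosen = decision(pred_costs)
    best = decision(true_costs)
    return true_costs[chosen] - true_costs[best]

# Two predictors: A has lower prediction error but leads to a worse decision.
pred_A = np.array([4.1, 4.0, 4.2])        # close to the truth, wrong ranking
pred_B = np.array([1.0, 9.0, 8.0])        # far from the truth, right ranking

for name, pred in [("A", pred_A), ("B", pred_B)]:
    mse = float(np.mean((pred - true_costs) ** 2))
    print(name, "MSE:", round(mse, 2), "regret:", regret(pred, true_costs))
# DFL trains the predictor against regret-like losses instead of MSE,
# so predictor B's decision behaviour is the one it rewards.
```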

    Explanation of the Model Checker Verification Results

    Whenever new requirements are introduced for a system, the correctness and consistency of the system specification must be verified, which is often done manually in industrial settings. One viable option for overcoming the disadvantages of this manual analysis is contract-based design, which can automate the verification process that checks whether the top-level requirements are refined consistently. Verification can thus be performed iteratively to ensure the system's correctness and consistency in the face of any change to the specifications. However, it is still challenging to deploy formal approaches in industry because of their lack of usability and the difficulty of interpreting verification results. For instance, if the model checker identifies an inconsistency during verification, it generates a counterexample while also indicating that the given input specifications are inconsistent. The formidable challenge here is to comprehend the generated counterexample, which is often lengthy, cryptic, and complex. Furthermore, it is the engineer's responsibility to identify the inconsistent specification among a potentially large set of specifications.

    This PhD thesis proposes a counterexample explanation approach for formal methods that simplifies and encourages their use by presenting user-friendly explanations of the verification results. The approach identifies and explains relevant information from the verification result in a form close to a natural language statement: it extracts the inconsistent specifications from the set of specifications, as well as the erroneous states and variables from the counterexample. The counterexample explanation approach is evaluated using two methods: (1) evaluation with different application examples, and (2) a user study in the form of a one-group pretest-posttest experiment.
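    As a hedged, hypothetical illustration of extracting erroneous states and variables from a counterexample (the trace format and variable names are assumptions, not the thesis's implementation), the following Python sketch diffs consecutive states of a counterexample trace and reports which variables change at each step.

```python
# Hypothetical sketch: given a counterexample trace (a list of states, each a
# dict of variable values), report the variables that change between steps.
# The trace format is an assumption; real model checkers use their own formats.

def explain_trace(trace):
    explanations = []
    for step, (prev, curr) in enumerate(zip(trace, trace[1:]), start=1):
        changed = {v: (prev[v], curr[v]) for v in curr if prev.get(v) != curr[v]}
        if changed:
            pretty = ", ".join(f"{v}: {a} -> {b}" for v, (a, b) in changed.items())
            explanations.append(f"step {step}: {pretty}")
    return explanations

# Toy counterexample: the safety variable `door_closed` drops while moving.
trace = [
    {"moving": False, "door_closed": True},
    {"moving": True,  "door_closed": True},
    {"moving": True,  "door_closed": False},
]
for line in explain_trace(trace):
    print(line)
# step 1: moving: False -> True
# step 2: door_closed: True -> False
```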

    Solving arc routing problems for winter road maintenance operations

    For winter road maintenance, a fleet of snowplow trucks is operated by government agencies to remove snow and ice from roadways and to spread materials for anti-icing, de-icing, or increasing friction. Winter road maintenance is essential for providing safe and efficient service to road users (Usman et al., 2010). It is also costly due to the high cost of equipment, crews, and materials. Optimizing winter road maintenance operations could therefore result in significant cost savings, improved safety and mobility, and reduced environmental and social impacts (Salazar-Aguilar et al., 2012).

    The first topic of this study focuses on designing routes for winter maintenance trucks from a single depot. Real-world winter road maintenance constraints, including road segment service cycle time, heterogeneous vehicle capacity, fleet size, and road-vehicle dependency, are taken into consideration. The problem is formulated as a variation of the capacitated arc routing problem (CARP) to minimize total travel distance, and a metaheuristic, a memetic algorithm (MA), is developed to find near-optimal solutions. This is the first study to develop a model that includes all of the listed constraints, to use an MA to solve the routing problem under these constraints, and to develop a route split procedure that satisfies them. In addition, a parallel metaheuristic algorithm is proposed to enhance solution quality and computational efficiency.

    The second topic focuses on designing routes from multiple depots with intermediate facilities. The service boundaries of the depots are redesigned: each truck must start and end at its home depot, but it may reload at other depots or at reload stations (i.e., intermediate facilities). This problem is a variation of the multi-depot CARP with intermediate facilities (MDCARPIF) and includes all constraints employed in the first topic. Since trucks can reload at any station, a constraint restricting the work time of truck drivers is also included. Again, this is the first study to develop a model with all of these constraints, to use an MA to solve it, and to develop a route split procedure that satisfies them.

    The proposed algorithms are applied to real-world problems. Deadhead (travelling without servicing) speed, service speed, and the spreading rate are estimated from a sample of historical winter road maintenance data. Eighteen traffic networks are used as instances for the first topic. The optimized routes in the first topic reduce the deadhead distance by 13.2% compared with the current practice. Compared with the single-core result, parallel computation improved the solution fitness on 2 of the 18 instances tested, with slightly less time consumed. Based on the optimized result of the first topic, the reduction in deadhead distance for the second topic is insignificant, which could be due to the network structure and depot locations of the current operation. A test instance is created to verify the effectiveness of the proposed algorithm; the results show that 10.4% of the deadhead distance can be saved by using the multi-depot scenario with reloading instead of the single-depot scenario on the test instance.
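    As a hedged toy illustration of the study's key metric (the arc lengths, route, and required arcs are assumptions, not the study's network or data), the following Python sketch splits the travel of a single snowplow route into serviced and deadhead distance.

```python
# Toy sketch: split a route's travel into serviced and deadhead distance.
# Arc lengths, the route, and the required arcs are assumptions,
# not the study's network or data.

arc_length = {                       # directed arcs (from, to) -> length in km
    ("depot", "A"): 2.0, ("A", "B"): 3.0, ("B", "A"): 3.0,
    ("A", "C"): 1.5, ("C", "depot"): 2.5,
}
required = {("A", "B"), ("A", "C")}  # arcs that need plowing/spreading

route = [("depot", "A"), ("A", "B"), ("B", "A"), ("A", "C"), ("C", "depot")]

serviced = deadhead = 0.0
done = set()
for arc in route:
    length = arc_length[arc]
    if arc in required and arc not in done:
        serviced += length           # first traversal of a required arc: service it
        done.add(arc)
    else:
        deadhead += length           # repositioning travel, no service
print(f"serviced: {serviced} km, deadhead: {deadhead} km")
# serviced: 4.5 km, deadhead: 7.5 km
```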