
    An Empirical Analysis of Search in GSAT

    We describe an extensive study of search in GSAT, an approximation procedure for propositional satisfiability. GSAT performs greedy hill-climbing on the number of satisfied clauses in a truth assignment. Our experiments provide a more complete picture of GSAT's search than previous accounts. We describe in detail the two phases of search: rapid hill-climbing followed by a long plateau search. We demonstrate that, when applied to randomly generated 3SAT problems, both the mean number of satisfied clauses and the mean branching rate scale in a very simple way with problem size. Our results allow us to make detailed numerical conjectures about the length and average gradient of the hill-climbing phase, and to conjecture that both the average score and the average branching rate decay exponentially during plateau search. We end by showing how these results can be used to direct future theoretical analysis. This work provides a case study of how computer experiments can be used to improve understanding of the theoretical properties of algorithms. Comment: See http://www.jair.org/ for any accompanying file.
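    As a rough illustration of the procedure described above, here is a minimal Python sketch of the GSAT loop: greedy hill-climbing on the number of satisfied clauses, with random restarts. The clause encoding (signed 1-based integers), the parameter defaults, and the tie-breaking rule are assumptions made for illustration, not details taken from the paper.

```python
import random

def num_satisfied(clauses, assignment):
    """Count clauses with at least one true literal (literal = signed 1-based variable index)."""
    return sum(any((lit > 0) == assignment[abs(lit)] for lit in clause) for clause in clauses)

def gsat(clauses, num_vars, max_tries=10, max_flips=1000, rng=random):
    """Greedy hill-climbing on the number of satisfied clauses, with random restarts."""
    for _ in range(max_tries):
        assignment = {v: rng.random() < 0.5 for v in range(1, num_vars + 1)}
        for _ in range(max_flips):
            if num_satisfied(clauses, assignment) == len(clauses):
                return assignment                      # all clauses satisfied
            # Score every possible flip and take the best one (ties broken at random).
            scores = []
            for v in range(1, num_vars + 1):
                assignment[v] = not assignment[v]
                scores.append((num_satisfied(clauses, assignment), v))
                assignment[v] = not assignment[v]
            best = max(score for score, _ in scores)
            v = rng.choice([v for score, v in scores if score == best])
            assignment[v] = not assignment[v]          # greedy (possibly sideways) move
    return None                                        # no satisfying assignment found

# Example: (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
print(gsat([[1, -2], [2, 3], [-1, -3]], num_vars=3))
```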

    Backbone Fragility and the Local Search Cost Peak

    The local search algorithm WSat is one of the most successful algorithms for solving the satisfiability (SAT) problem. It is notably effective at solving hard random 3-SAT instances near the so-called 'satisfiability threshold', but it still shows a peak in search cost near the threshold and large variations in cost over different instances. We make a number of significant contributions to the analysis of WSat on high-cost random instances, using the recently introduced concept of the backbone of a SAT instance. The backbone is the set of literals which are entailed by an instance. We find that the number of solutions predicts the cost well for small-backbone instances but is much less relevant for the large-backbone instances which appear near the threshold and dominate in the overconstrained region. We show a very strong correlation between search cost and the Hamming distance to the nearest solution early in WSat's search. This pattern leads us to introduce a measure of the backbone fragility of an instance, which indicates how persistent the backbone is as clauses are removed. We propose that high-cost random instances for local search are those with very large backbones which are also backbone-fragile. We suggest that the decay in cost beyond the satisfiability threshold is due to increasing backbone robustness (the opposite of backbone fragility). Our hypothesis makes three correct predictions. First, the backbone robustness of an instance is negatively correlated with the local search cost when other factors are controlled for. Second, backbone-minimal instances (which are 3-SAT instances altered so as to be more backbone-fragile) are unusually hard for WSat. Third, the clauses most often unsatisfied during search are those whose deletion has the most effect on the backbone. In understanding the pathologies of local search methods, we hope to contribute to the development of new and better techniques.
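    The backbone, as defined above, is the set of literals fixed to the same value in every satisfying assignment. The brute-force sketch below illustrates that definition only: it enumerates all assignments, so it is feasible just for tiny instances, and the encoding choices are assumptions rather than anything taken from the paper.

```python
from itertools import product

def satisfies(clauses, assignment):
    """assignment: tuple of bools indexed by var-1; literals are signed 1-based integers."""
    return all(any((lit > 0) == assignment[abs(lit) - 1] for lit in clause) for clause in clauses)

def backbone(clauses, num_vars):
    """Literals taking the same value in every satisfying assignment (brute force)."""
    solutions = [a for a in product([False, True], repeat=num_vars) if satisfies(clauses, a)]
    if not solutions:
        return None                                    # unsatisfiable: no backbone defined
    fixed = set()
    for v in range(num_vars):
        values = {sol[v] for sol in solutions}
        if len(values) == 1:                           # variable frozen across all solutions
            fixed.add((v + 1) if values.pop() else -(v + 1))
    return fixed, len(solutions)

# Tiny example: x1 is forced true, x2 is free -> backbone {1}, 2 solutions.
print(backbone([[1], [1, 2]], num_vars=2))
```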

    Scalable Parallel Numerical Constraint Solver Using Global Load Balancing

    We present a scalable parallel solver for numerical constraint satisfaction problems (NCSPs). Our parallelization scheme consists of homogeneous worker solvers, each of which runs on an available core and communicates with the others via the global load balancing (GLB) method. The parallel solver is implemented in X10, which provides an implementation of GLB as a library. In experiments, several NCSPs from the literature were solved, attaining up to a 516-fold speedup using 600 cores of the TSUBAME2.5 supercomputer. Comment: To be presented at the X10'15 Workshop.
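    The paper's solver is written in X10 and relies on its GLB library for distributed work stealing. As a loose stand-in, the Python sketch below farms interval boxes of a toy NCSP (x^2 + y^2 = 1) out to a process pool in a master-worker fashion; the constraint, the splitting rule, and the batching are illustrative assumptions and do not reproduce the paper's load-balancing scheme.

```python
from concurrent.futures import ProcessPoolExecutor
from collections import deque

EPS = 1e-2                                             # stop splitting boxes below this width

def process_box(box):
    """Discard, accept, or split one box for the constraint x^2 + y^2 = 1."""
    (xl, xu), (yl, yu) = box
    # Interval bounds of x^2 + y^2 over the box.
    lo = (0.0 if xl <= 0.0 <= xu else min(xl * xl, xu * xu)) + \
         (0.0 if yl <= 0.0 <= yu else min(yl * yl, yu * yu))
    hi = max(xl * xl, xu * xu) + max(yl * yl, yu * yu)
    if lo > 1.0 or hi < 1.0:
        return ('discard', None)                       # box cannot contain a solution
    if xu - xl < EPS and yu - yl < EPS:
        return ('solution', box)                       # small enough: report as a solution box
    if xu - xl >= yu - yl:                             # split along the wider dimension
        xm = 0.5 * (xl + xu)
        return ('split', [((xl, xm), (yl, yu)), ((xm, xu), (yl, yu))])
    ym = 0.5 * (yl + yu)
    return ('split', [((xl, xu), (yl, ym)), ((xl, xu), (ym, yu))])

def solve_parallel(initial_box, workers=4):
    pending, solutions = deque([initial_box]), []
    with ProcessPoolExecutor(max_workers=workers) as pool:
        while pending:
            batch = [pending.popleft() for _ in range(min(len(pending), 4 * workers))]
            for kind, payload in pool.map(process_box, batch):
                if kind == 'solution':
                    solutions.append(payload)
                elif kind == 'split':
                    pending.extend(payload)
    return solutions

if __name__ == '__main__':
    print(len(solve_parallel(((-2.0, 2.0), (-2.0, 2.0)))), 'solution boxes')
```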

    Random Costs in Combinatorial Optimization

    The random cost problem is the problem of finding the minimum in an exponentially long list of random numbers. By definition, this problem cannot be solved faster than by exhaustive search. It is shown that a classical NP-hard optimization problem, number partitioning, is essentially equivalent to the random cost problem. This explains the poor performance of heuristic approaches to the number partitioning problem and allows us to calculate the probability distributions of the optimum and sub-optimum costs. Comment: 4 pages, RevTeX, 2 figures (eps), submitted to PR
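    To make the comparison concrete, the sketch below contrasts the exhaustive 2^n search for the optimum partition with the classic greedy heuristic. The instance size and number range are arbitrary choices for illustration; the point, consistent with the random-cost picture above, is that the greedy difference typically lies far above the true optimum.

```python
import random
from itertools import product

def best_partition_cost(numbers):
    """Minimum |sum(A) - sum(B)| over all 2^n two-way partitions (exhaustive search)."""
    return min(abs(sum(s * x for s, x in zip(signs, numbers)))
               for signs in product((-1, 1), repeat=len(numbers)))

def greedy_partition_cost(numbers):
    """Greedy heuristic: place each number, largest first, into the currently lighter subset."""
    a = b = 0
    for x in sorted(numbers, reverse=True):
        if a <= b:
            a += x
        else:
            b += x
    return abs(a - b)

nums = [random.randrange(1, 2**24) for _ in range(18)]
print('optimum:', best_partition_cost(nums), '  greedy:', greedy_partition_cost(nums))
```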

    Phase Transition in the Number Partitioning Problem

    Number partitioning is an NP-complete problem of combinatorial optimization. A statistical mechanics analysis reveals the existence of a phase transition that separates the easy-to-solve from the hard-to-solve instances and that reflects the pseudo-polynomiality of number partitioning. The phase diagram and the value of the typical ground-state energy are calculated. Comment: minor changes (references, typos and discussion of results).
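    The control parameter of this transition is, roughly, the number of bits per input number divided by the number of items n. The small experiment below estimates the probability that a random instance admits a perfect partition as that ratio varies; the parameters are chosen only so that the exhaustive check stays cheap, and the code is an illustration rather than the paper's statistical-mechanics calculation.

```python
import random
from itertools import product

def has_perfect_partition(numbers):
    """True if some two-way split has difference 0 (or 1 when the total is odd)."""
    best = min(abs(sum(s * x for s, x in zip(signs, numbers)))
               for signs in product((-1, 1), repeat=len(numbers)))
    return best <= 1

n, trials = 12, 30
for bits in range(3, 19, 3):                           # ratio bits/n from 0.25 up to 1.5
    hits = sum(has_perfect_partition([random.randrange(1, 2**bits) for _ in range(n)])
               for _ in range(trials))
    print(f'bits/n = {bits/n:.2f}   P(perfect partition) ~ {hits/trials:.2f}')
```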

    On The Complexity and Completeness of Static Constraints for Breaking Row and Column Symmetry

    We consider a common type of symmetry where we have a matrix of decision variables with interchangeable rows and columns. A simple and efficient method to deal with such row and column symmetry is to post symmetry-breaking constraints like DOUBLELEX and SNAKELEX. We provide a number of positive and negative results on posting such symmetry-breaking constraints. On the positive side, we prove that we can compute in polynomial time a unique representative of an equivalence class in a matrix model with row and column symmetry if the number of rows (or of columns) is bounded, as well as in a number of other special cases. On the negative side, we show that whilst DOUBLELEX and SNAKELEX are often effective in practice, they can leave a large number of symmetric solutions in the worst case. In addition, we prove that propagating DOUBLELEX completely is NP-hard. Finally, we consider how to break row, column and value symmetry, correcting a result in the literature about the safeness of combining different symmetry-breaking constraints. We end with the first experimental study of how much symmetry is left by DOUBLELEX and SNAKELEX on some benchmark problems. Comment: To appear in the Proceedings of the 16th International Conference on Principles and Practice of Constraint Programming (CP 2010).
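    Operationally, posting DOUBLELEX constrains a matrix of decision variables so that its rows are lexicographically ordered top-to-bottom and its columns lexicographically ordered left-to-right. The sketch below expresses this as a simple filter over complete 0/1 assignments; a real CP solver would instead propagate the constraint during search, and the tiny brute-force enumeration is purely for illustration.

```python
from itertools import product

def lex_leq(a, b):
    """True iff sequence a is lexicographically <= sequence b."""
    return tuple(a) <= tuple(b)

def double_lex(matrix):
    """DOUBLELEX: rows lex-ordered top-to-bottom and columns lex-ordered left-to-right."""
    rows = [tuple(r) for r in matrix]
    cols = [tuple(c) for c in zip(*matrix)]
    return (all(lex_leq(rows[i], rows[i + 1]) for i in range(len(rows) - 1)) and
            all(lex_leq(cols[j], cols[j + 1]) for j in range(len(cols) - 1)))

# Enumerate all 0/1 matrices of a tiny model and keep those surviving the DOUBLELEX filter.
n_rows, n_cols = 2, 3
matrices = [[list(bits[i * n_cols:(i + 1) * n_cols]) for i in range(n_rows)]
            for bits in product((0, 1), repeat=n_rows * n_cols)]
kept = [m for m in matrices if double_lex(m)]
print(f'{len(matrices)} assignments, {len(kept)} satisfy DOUBLELEX')
```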

    Symmetry breaking in numeric constraint problems

    Symmetry-breaking constraints in the form of inequalities between variables have been proposed for a few kinds of solution symmetries in numeric CSPs. We show that, for the variable symmetries among those, the proposed inequalities are just a special case of a relaxation of the well-known LEX constraints extensively used for discrete CSPs. We discuss the merits of this relaxation and present experimental evidence of its practical interest.
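    The relationship claimed above can be illustrated as follows: for a variable symmetry sigma, the full LEX constraint compares a point with its image under sigma component by component, while a single inequality on the leading component is a relaxation of it (every point satisfying LEX also satisfies the inequality, but not conversely). The sketch below is a generic illustration of that containment under my own encoding assumptions, not the specific relaxation proposed in the paper.

```python
def lex_leq(x, y, tol=0.0):
    """Full LEX constraint x <=_lex y on real-valued vectors."""
    for a, b in zip(x, y):
        if a < b - tol:
            return True
        if a > b + tol:
            return False
    return True                                        # equal prefixes: x <=_lex y holds

def relaxed_lex_leq(x, y):
    """Relaxation keeping only the inequality on the leading component."""
    return x[0] <= y[0]

# Variable symmetry sigma swapping the first two variables of the point x.
def apply_sigma(x):
    return [x[1], x[0], *x[2:]]

x = [0.3, 0.7, 0.1]
print(lex_leq(x, apply_sigma(x)), relaxed_lex_leq(x, apply_sigma(x)))
# Any point satisfying the full LEX constraint also satisfies the relaxed inequality
# (if x[0] > y[0] then LEX already fails), so the relaxation is sound but prunes less.
```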

    Phase Transition in Multiprocessor Scheduling

    The problem of distributing workload on a parallel computer so as to minimize the overall runtime is known as the Multiprocessor Scheduling Problem. It is NP-hard, but like many other NP-hard problems, the average hardness of random instances displays an "easy-hard" phase transition. The transition in Multiprocessor Scheduling can be analyzed using elementary notions from crystallography (Bravais lattices) and statistical mechanics (Potts vectors). The analysis reveals the control parameter of the transition and its critical value, including finite-size corrections. The transition is identified in the performance of practical scheduling algorithms. Comment: 6 pages, RevTeX.
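    For background on the problem itself (rather than the phase-transition analysis), the sketch below contrasts the classic longest-processing-time-first greedy rule with exhaustive search over all assignments on a tiny random instance. The instance parameters are arbitrary, and the code makes no attempt to reproduce the statistical-mechanics analysis described above.

```python
import heapq
import random
from itertools import product

def lpt_makespan(jobs, m):
    """Longest-Processing-Time-first list scheduling on m identical processors."""
    loads = [0] * m
    heapq.heapify(loads)
    for t in sorted(jobs, reverse=True):
        lightest = heapq.heappop(loads)                # least-loaded processor so far
        heapq.heappush(loads, lightest + t)
    return max(loads)

def optimal_makespan(jobs, m):
    """Exhaustive search over all m^n assignments (only viable for tiny instances)."""
    best = sum(jobs)
    for assignment in product(range(m), repeat=len(jobs)):
        loads = [0] * m
        for job, proc in zip(jobs, assignment):
            loads[proc] += job
        best = min(best, max(loads))
    return best

jobs = [random.randrange(1, 100) for _ in range(10)]
print('LPT makespan:', lpt_makespan(jobs, 3), '  optimal makespan:', optimal_makespan(jobs, 3))
```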

    Phase transition for cutting-plane approach to vertex-cover problem

    We study the vertex-cover problem, which is an NP-hard optimization problem and a prototypical model exhibiting phase transitions on random graphs, e.g., Erdős-Rényi (ER) random graphs. These phase transitions coincide with changes of the solution space structure, e.g., for the ER ensemble at connectivity c = e ≈ 2.7183, from replica-symmetric to replica-symmetry-broken. For the vertex-cover problem, the typical complexity of exact branch-and-bound algorithms, which proceed by exploring the landscape of feasible configurations, also changes from "easy" to "hard" close to this phase transition. In this work, we consider an algorithm with a completely different strategy: the problem is mapped onto a linear programming problem augmented by a cutting-plane approach, so the algorithm operates in a space outside the space of feasible configurations until the final step, where a solution is found. Here we show that this type of algorithm also exhibits an "easy-hard" transition around c = e, which strongly indicates that the typical hardness of a problem is fundamental to the problem and not due to a specific representation of it. Comment: 4 pages, 3 figures.
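    The sketch below illustrates the general idea of operating outside the space of feasible configurations: solve the LP relaxation of vertex cover with SciPy's linprog, find violated odd-cycle (here, triangle) inequalities at the fractional point, and re-solve with those cuts added. The cut family, the toy graph, and the single-round loop are illustrative assumptions; the paper's actual cutting-plane procedure for random graphs is not reproduced here.

```python
import numpy as np
from itertools import combinations
from scipy.optimize import linprog

def lp_vertex_cover(n, edges, extra_cuts=()):
    """LP relaxation: min sum(x) s.t. x_u + x_v >= 1 per edge, 0 <= x <= 1,
    plus extra cutting planes given as (coeffs, rhs) meaning coeffs @ x >= rhs."""
    A_ub, b_ub = [], []
    for u, v in edges:
        row = np.zeros(n)
        row[u] = row[v] = -1.0                         # x_u + x_v >= 1  ->  -x_u - x_v <= -1
        A_ub.append(row)
        b_ub.append(-1.0)
    for coeffs, rhs in extra_cuts:
        A_ub.append(-np.asarray(coeffs, dtype=float))  # coeffs @ x >= rhs -> -coeffs @ x <= -rhs
        b_ub.append(-rhs)
    res = linprog(c=np.ones(n), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, 1)] * n, method='highs')
    return res.x, res.fun

def triangle_cuts(n, edges, x, tol=1e-6):
    """Odd-cycle inequalities for triangles, x_u + x_v + x_w >= 2, violated at the point x."""
    edge_set = {frozenset(e) for e in edges}
    cuts = []
    for u, v, w in combinations(range(n), 3):
        if ({frozenset((u, v)), frozenset((v, w)), frozenset((u, w))} <= edge_set
                and x[u] + x[v] + x[w] < 2 - tol):
            coeffs = np.zeros(n)
            coeffs[[u, v, w]] = 1.0
            cuts.append((coeffs, 2.0))
    return cuts

# On a triangle the plain LP gives the half-integral point (1/2, 1/2, 1/2) of value 1.5;
# adding the violated triangle inequality lifts the bound to the integer optimum 2.
edges = [(0, 1), (1, 2), (0, 2)]
x, val = lp_vertex_cover(3, edges)
x2, val2 = lp_vertex_cover(3, edges, triangle_cuts(3, edges, x))
print(round(val, 2), '->', round(val2, 2))
```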