58 research outputs found

    Effectiveness of pre- and inprocessing for CDCL-based SAT solving

    Applying pre- and inprocessing techniques to simplify CNF formulas both before and during search can considerably improve the performance of modern SAT solvers. These algorithms mostly aim at reducing the number of clauses, literals, and variables in the formula. However, to be worthwhile, their additional runtime must not exceed the runtime saved during the subsequent SAT solver execution. In this paper we investigate the efficiency and practicability of selected simplification algorithms for CDCL-based SAT solving. We first analyze them in terms of their expected impact on the CNF formula and on SAT solving overall. Testing them on real-world and combinatorial SAT instances, we show which techniques and combinations of techniques yield a desirable speedup and which should be avoided.
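    A minimal sketch of one such clause-reducing technique, subsumption elimination, is shown below. It is an illustration of the general idea only, not code from the paper; the function name and the DIMACS-style integer-literal clause representation are assumptions.

```python
# Illustrative sketch (not from the paper): clause subsumption, one of the
# simplification techniques that reduce the number of clauses in a CNF formula.
# A clause C subsumes a clause D if every literal of C also occurs in D; the
# subsumed clause D is then redundant and can be removed.

def subsumption_eliminate(clauses):
    """Remove subsumed clauses; clauses are iterables of int literals (DIMACS style)."""
    clauses = [frozenset(c) for c in clauses]
    # Checking short clauses first makes it more likely that they subsume others.
    clauses.sort(key=len)
    kept = []
    for cand in clauses:
        if not any(prev <= cand for prev in kept):  # prev subset of cand => cand is subsumed
            kept.append(cand)
    return [sorted(c, key=abs) for c in kept]

if __name__ == "__main__":
    cnf = [[1, 2, 3], [1, 2], [-3, 4], [-3, 4, 5]]
    print(subsumption_eliminate(cnf))  # [[1, 2], [-3, 4]]
```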

    Design, implementation and evaluation of a distributed CDCL framework

    The primary subject of this dissertation is practically solving instances of the Boolean satisfiability problem (SAT) that arise from industrial applications. The invention of the conflict-driven clause-learning (CDCL) algorithm led to enormous progress in this field. CDCL has been augmented with effective pre- and inprocessing techniques that further boost its performance. While a considerable amount of work has been done on applying shared-memory parallelism to enhance the performance of CDCL, solving SAT on distributed architectures has been studied less thoroughly. In this work, we develop a distributed, CDCL-based framework for SAT solving. This framework consists of three main components: (1) an implementation of the CDCL algorithm that we have written from scratch, (2) a novel parallel SAT algorithm that builds upon this CDCL implementation, and (3) a collection of parallel simplification techniques for SAT instances. We call our resulting framework satUZK; our parallel solving algorithm is called the distributed divide-and-conquer (DDC) algorithm. The DDC algorithm employs a parallel lookahead procedure to dynamically partition the search space. Load balancing is used to ensure that all computational resources are utilized during lookahead. This procedure results in a divide-and-conquer tree that is distributed over all processors. Individual threads are routed through this tree until they arrive at unsolved leaf vertices. Upon arrival, either the lookahead procedure is invoked again or the leaf vertex is solved via CDCL. Several extensions to the DDC algorithm are proposed. These include clause sharing and a scheme to locally adjust the LBD score relative to the current search-tree vertex; LBD is a measure of the usefulness of clauses that participate in a CDCL search. We evaluate our DDC algorithm empirically and benchmark it against the best distributed SAT algorithms. In this experiment, our DDC algorithm is faster than other state-of-the-art distributed solvers and solves at least as many instances. In addition to running a parallel algorithm for SAT solving, we also consider parallel simplification. Here, we first develop a theoretical foundation that allows us to prove the correctness of parallel simplification techniques. Using this as a basis, we examine established simplification algorithms for their parallelizability. It turns out that several well-known simplification techniques can be parallelized efficiently. We provide parallel implementations of these techniques and test their effectiveness in empirical experiments. This evaluation finds several combinations of simplification techniques that can solve instances which could not be solved by the DDC algorithm alone.
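    The divide-and-conquer splitting described above can be illustrated with a small sketch. This is not the satUZK/DDC implementation: the branching score (a plain occurrence count standing in for a real lookahead measure), the helper names, and the depth-limited recursion are all assumptions made for illustration.

```python
# Illustrative sketch (not the satUZK/DDC implementation): divide-and-conquer
# partitioning of a SAT instance.  A branching variable is chosen by a crude
# lookahead-style score, and the formula is split into two subproblems by
# assuming the variable true and false.  Leaves of the resulting tree would
# be handed to independent CDCL workers.
from collections import Counter

def assign(clauses, lit):
    """Simplify a clause set under the assumption that `lit` is true."""
    out = []
    for c in clauses:
        if lit in c:
            continue                              # clause satisfied, drop it
        out.append([l for l in c if l != -lit])   # falsified literal removed
    return out

def pick_branch_var(clauses):
    """Stand-in for a real lookahead score: most frequently occurring variable."""
    counts = Counter(abs(l) for c in clauses for l in c)
    return counts.most_common(1)[0][0] if counts else None

def partition(clauses, depth):
    """Return (cube, subformula) pairs for the leaves of the divide-and-conquer tree."""
    var = pick_branch_var(clauses)
    if depth == 0 or var is None:
        return [([], clauses)]
    leaves = []
    for lit in (var, -var):
        for cube, sub in partition(assign(clauses, lit), depth - 1):
            leaves.append(([lit] + cube, sub))
    return leaves

if __name__ == "__main__":
    cnf = [[1, 2], [-1, 3], [-2, -3], [1, -3]]
    for cube, sub in partition(cnf, depth=2):
        print(cube, sub)   # each leaf would be solved by a CDCL worker
```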

    GPU Enabled Automated Reasoning

    Recognition and Exploitation of Gate Structure in SAT Solving

    In theoretical computer science, the SAT problem is the archetypal representative of the class of NP-complete problems, which is why efficient SAT solving is generally considered impossible. In practice, however, astonishing results are often achieved: some applications generate problems with millions of variables that modern SAT solvers can solve in reasonable time. The success of SAT solving in practice is due to current implementations of the Conflict Driven Clause-Learning (CDCL) algorithm, whose performance largely depends on the heuristics it uses, which implicitly exploit the structure of the instances generated in industrial practice. In this work we present a new generic algorithm for efficiently recognizing the gate structure in CNF encodings of SAT instances, as well as three approaches in which we exploit this structure explicitly. Our contributions also include the implementation of these approaches in our SAT solver Candy and the development of a tool for the distributed management of benchmark instances and their attributes, the Global Benchmark Database (GBD).
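    A minimal sketch of how gate structure can be recovered from a Tseitin-style encoding is given below. It is not the algorithm from the thesis or from the Candy solver; it covers only binary AND gates, and the helper names and integer-literal clause representation are assumptions.

```python
# Illustrative sketch (not the algorithm from the thesis or the Candy solver):
# recognizing binary AND gates in a Tseitin-encoded CNF.  The definition
# o <-> (a AND b) is encoded as the clauses (-o a), (-o b) and (o -a -b);
# finding that pattern recovers gate structure from the flat clause set.

def find_and_gates(clauses):
    clause_set = {frozenset(c) for c in clauses}
    gates = []
    for c in clauses:
        if len(c) != 3:
            continue
        # Try each literal of the ternary clause as the gate output o.
        for o in c:
            inputs = [-l for l in c if l != o]      # candidate gate inputs a, b
            if all(frozenset({-o, a}) in clause_set for a in inputs):
                gates.append((o, tuple(inputs)))     # o <-> (a AND b)
    return gates

if __name__ == "__main__":
    # Tseitin encoding of 3 <-> (1 AND 2) plus an unrelated clause.
    cnf = [[-3, 1], [-3, 2], [3, -1, -2], [1, 2, 4]]
    print(find_and_gates(cnf))   # [(3, (1, 2))]
```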

    Proceedings of SAT Competition 2020 : Solver and Benchmark Descriptions

    Non peer reviewed

    On the Hardness of SAT with Community Structure

    Recent attempts to explain the effectiveness of Boolean satisfiability (SAT) solvers based on conflict-driven clause learning (CDCL) on large industrial benchmarks have focused on the concept of community structure. Specifically, industrial benchmarks have been empirically found to have good community structure, and experiments seem to show a correlation between such structure and the efficiency of CDCL. However, in this paper we establish hardness results suggesting that community structure is not sufficient to explain the success of CDCL in practice. First, we formally characterize a property shared by a wide class of metrics capturing community structure, including "modularity". Next, we show that SAT remains NP-hard when restricted to instances with good community structure according to any metric with this property. Finally, we study a class of random instances generated from the "pseudo-industrial" community attachment model of Giráldez-Cru and Levy. We prove that, with high probability, instances from this model that have relatively few communities but are still highly modular require exponentially long resolution proofs and so are hard for CDCL. We also present experimental evidence that our result continues to hold for instances with many more communities. This indicates that actual industrial instances easily solved by CDCL may have some other relevant structure not captured by the community attachment model. (Full version of a SAT 2016 paper; 23 pages.)
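    For reference, the modularity referred to above is, in this line of work, the standard Newman modularity of a partition into communities, computed on a graph derived from the CNF formula (typically the variable-interaction graph; which graph this particular paper uses is an assumption here):

```latex
% Newman modularity of a partition of a graph's vertices into communities c(v):
% A is the adjacency matrix, k_v the degree of vertex v, m the number of edges,
% and \delta the Kronecker delta.
Q = \frac{1}{2m} \sum_{u,v} \left( A_{uv} - \frac{k_u k_v}{2m} \right) \delta\bigl(c(u), c(v)\bigr)
```

    Higher values of Q indicate a stronger community structure of the graph.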

    Proceedings of SAT Competition 2017 : Solver and Benchmark Descriptions
