
    Evaluating QBF Solvers: Quantifier Alternations Matter

    We present an experimental study of the effects of quantifier alternations on the evaluation of quantified Boolean formula (QBF) solvers. The number of quantifier alternations in a QBF in prenex conjunctive normal form (PCNF) is directly related to the theoretical hardness of the respective QBF satisfiability problem in the polynomial hierarchy. We show empirically that the performance of solvers based on different solving paradigms varies substantially with the number of alternations in PCNFs. In related theoretical work, quantifier alternations have become the focus of understanding the strengths and weaknesses of various QBF proof systems implemented in solvers. Our results motivate the development of methods to evaluate orthogonal solving paradigms by taking quantifier alternations into account. This is necessary to showcase the broad range of existing QBF solving paradigms for practical QBF applications. Moreover, we highlight the potential of combining different approaches and QBF proof systems in solvers.
    Comment: preprint of a paper to be published at CP 2018, LNCS, Springer, including appendix
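
    Since the study reports solver behaviour per alternation count, here is a minimal sketch of how the number of quantifier alternations can be read off a PCNF given in the QDIMACS format. The sketch is ours, not the paper's tooling, and the helper name is an assumption.

```python
# Count quantifier alternations in the prefix of a QDIMACS file.
# Each switch between an 'e' (existential) and an 'a' (universal)
# block counts as one alternation. Illustrative sketch only.
def count_alternations(qdimacs_path: str) -> int:
    alternations = 0
    previous = None
    with open(qdimacs_path) as f:
        for line in f:
            line = line.strip()
            if line.startswith(("c", "p")):    # comments and the problem line
                continue
            if line.startswith(("a ", "e ")):  # a quantifier block of the prefix
                quantifier = line[0]
                if previous is not None and quantifier != previous:
                    alternations += 1
                previous = quantifier
            else:
                break                          # matrix clauses start here
    return alternations
```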

    ASlib: A Benchmark Library for Algorithm Selection

    The task of algorithm selection involves choosing an algorithm from a set of algorithms on a per-instance basis in order to exploit the varying performance of algorithms over a set of instances. The algorithm selection problem is attracting increasing attention from researchers and practitioners in AI. Years of fruitful applications in a number of domains have resulted in a large amount of data, but the community lacks a standard format or repository for this data. This situation makes it difficult to share and compare different approaches effectively, as is done in other, more established fields, and it unnecessarily hinders new researchers who want to work in this area. To address this problem, we introduce a standardized format for representing algorithm selection scenarios and a repository that contains a growing number of data sets from the literature. Our format has been designed to express a wide variety of different scenarios. To demonstrate the breadth and power of our platform, we describe a set of example experiments that build and evaluate algorithm selection models through a common interface. The results show the potential of algorithm selection to achieve significant performance improvements across a broad range of problems and algorithms.
    Comment: Accepted to be published in the Artificial Intelligence Journal
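
    As a toy illustration of the per-instance selection task the library is built around (not the ASlib tooling itself; the data below is synthetic and the model choice is an arbitrary assumption), one can train a classifier that maps instance features to the algorithm with the lowest observed runtime:

```python
# Minimal per-instance algorithm selection sketch with synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
features = rng.random((200, 5))              # 200 instances, 5 features each
runtimes = rng.random((200, 3))              # runtimes of 3 candidate algorithms
best_algorithm = runtimes.argmin(axis=1)     # label: fastest algorithm per instance

selector = RandomForestClassifier(n_estimators=100, random_state=0)
selector.fit(features[:150], best_algorithm[:150])

chosen = selector.predict(features[150:])
# Compare the selector's total runtime to the single best solver on the test split.
selected_runtime = runtimes[np.arange(150, 200), chosen].sum()
single_best = runtimes[150:].sum(axis=0).min()
print(f"selector: {selected_runtime:.2f}  vs. single best solver: {single_best:.2f}")
```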

    DepQBF 6.0: A Search-Based QBF Solver Beyond Traditional QCDCL

    We present the latest major release, version 6.0, of the quantified Boolean formula (QBF) solver DepQBF, which is based on QCDCL. QCDCL is an extension of the conflict-driven clause learning (CDCL) paradigm implemented in state-of-the-art propositional satisfiability (SAT) solvers. The Q-resolution calculus (QRES) is the QBF proof system underlying QCDCL. QCDCL solvers can produce QRES proofs of QBFs in prenex conjunctive normal form (PCNF) as a byproduct of the solving process. In contrast to traditional QCDCL based on QRES, DepQBF 6.0 implements a variant of QCDCL that is based on a generalization of QRES. This generalization adds a set of additional axioms while leaving the original Q-resolution rules unchanged; it enables QCDCL to potentially produce exponentially shorter proofs than the traditional variant. We present an overview of the features implemented in DepQBF and report on experimental results which demonstrate the effectiveness of generalized QRES in QCDCL.
    Comment: 12 pages + appendix; to appear in the proceedings of CADE-26, LNCS, Springer, 2017
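
    For readers unfamiliar with QRES, the following sketch shows the two core rules of the traditional calculus: resolution over an existential pivot, followed by universal reduction. It is our illustration of the standard rules only; it does not show the additional axioms of DepQBF 6.0, and the clause and prefix representations are assumptions.

```python
# Illustrative sketch of traditional Q-resolution (QRES). Not DepQBF code;
# the prefix is represented by quantification levels, which is an assumption.
def universal_reduction(clause, level, is_existential):
    """Drop universal literals quantified after every existential literal."""
    max_exist = max((level[abs(l)] for l in clause if is_existential[abs(l)]),
                    default=-1)
    return frozenset(l for l in clause
                     if is_existential[abs(l)] or level[abs(l)] <= max_exist)

def q_resolve(c1, c2, pivot, level, is_existential):
    """Resolve c1 (contains pivot) with c2 (contains -pivot) on an existential
    pivot; returns None if the resolvent would contain complementary literals."""
    assert is_existential[pivot] and pivot in c1 and -pivot in c2
    resolvent = (c1 - {pivot}) | (c2 - {-pivot})
    if any(-l in resolvent for l in resolvent):
        return None
    return universal_reduction(resolvent, level, is_existential)

# Prefix  exists x1 . forall u2 . exists x3  (levels 1, 2, 3):
level = {1: 1, 2: 2, 3: 3}
is_existential = {1: True, 2: False, 3: True}
print(q_resolve(frozenset({1, 2, 3}), frozenset({-3}), 3, level, is_existential))
# frozenset({1}): literal 2 is removed by universal reduction, since no
# existential literal in the resolvent is quantified after it.
```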

    Size, Cost and Capacity: A Semantic Technique for Hard Random QBFs

    As a natural extension of the SAT problem, an array of proof systems for quantified Boolean formulas (QBF) have been proposed, many of which extend a propositional proof system to handle universal quantification. By formalising the construction of the QBF proof system obtained from a propositional proof system by adding universal reduction (Beyersdorff, Bonacina & Chew, ITCS'16), we present a new technique for proving proof-size lower bounds in these systems. The technique relies only on two semantic measures: the cost of a QBF and the capacity of a proof. By examining the capacity of proofs in several QBF systems, we are able to use the technique to obtain lower bounds based on cost alone. As applications of the technique, we first prove exponential lower bounds for a new family of simple QBFs representing equality. The main application is in proving exponential lower bounds, with high probability, for a class of randomly generated QBFs, the first 'genuine' lower bounds of this kind, which apply to the QBF analogues of resolution, Cutting Planes, and Polynomial Calculus. Finally, we employ the technique to give a simple proof of hardness for a prominent family of QBFs.
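
    In our paraphrase, the technique boils down to an inequality relating the two measures; the precise statement and its side conditions are in the paper, and the notation below is ours:

```latex
% Paraphrase of the lower-bound scheme (our notation, not the paper's):
% refutations \pi of a false QBF \Phi in the systems considered satisfy
\[
  |\pi| \;\geq\; \frac{\mathrm{cost}(\Phi)}{\mathrm{capacity}(\pi)},
\]
% so a QBF family of high cost, in a system where every proof has low
% capacity, forces proofs of exponential size.
```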

    Understanding and Extending Incremental Determinization for 2QBF

    Incremental determinization is a recently proposed algorithm for solving quantified Boolean formulas with one quantifier alternation. In this paper, we formalize incremental determinization as a set of inference rules to help understand the design space of similar algorithms. We then present additional inference rules that extend incremental determinization in two ways. The first extension integrates the popular CEGAR principle, and the second extension allows us to analyze different cases in isolation. The experimental evaluation demonstrates that the extensions significantly improve performance.
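
    As background for the CEGAR extension, the following is a sketch of the classical expansion-based CEGAR loop for a 2QBF of the form exists X forall Y . phi with a CNF matrix. It is not the incremental-determinization algorithm of the paper; the function name, the clause encoding, and the use of the pysat package are assumptions made for illustration.

```python
# Classical CEGAR loop for  exists X forall Y . phi  (phi in CNF, given as a
# list of integer-literal clauses). Assumes clauses are not tautological.
# Illustrative sketch only; not the incremental-determinization algorithm.
from pysat.solvers import Glucose3

def solve_2qbf(x_vars, y_vars, clauses):
    x_set, y_set = set(x_vars), set(y_vars)
    abstraction = Glucose3()                     # candidate generator over X
    while True:
        if not abstraction.solve():
            return False                         # no candidate X-assignment remains
        model = abstraction.get_model() or []
        tau = {v: False for v in x_vars}
        tau.update({abs(l): l > 0 for l in model if abs(l) in x_set})

        # Counterexample check: phi[X <- tau] is falsifiable over Y iff some
        # clause has all of its X-literals falsified by tau.
        mu = None
        for clause in clauses:
            if all(tau[abs(l)] != (l > 0) for l in clause if abs(l) in x_set):
                mu = {v: False for v in y_vars}
                mu.update({abs(l): l < 0 for l in clause if abs(l) in y_set})
                break
        if mu is None:
            return True                          # tau satisfies phi for every Y

        # Refinement: add phi[Y <- mu], a CNF over X, to the abstraction.
        for clause in clauses:
            if any(mu[abs(l)] == (l > 0) for l in clause if abs(l) in y_set):
                continue                         # clause already satisfied by mu
            reduced = [l for l in clause if abs(l) in x_set]
            if not reduced:
                return False                     # no X satisfies this clause under mu
            abstraction.add_clause(reduced)

# exists x1 forall y2 . (x1 or y2) and (x1 or -y2)  is true (set x1 to true)
print(solve_2qbf([1], [2], [[1, 2], [1, -2]]))   # True
```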

    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, so every new need required a new protocol built from scratch. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design---especially the software-defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and present a survey of its applications to networking.
    Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials

    Backdoors to Normality for Disjunctive Logic Programs

    Over the last two decades, propositional satisfiability (SAT) has become one of the most successful and widely applied techniques for the solution of NP-complete problems. The aim of this paper is to investigate theoretically how SAT can be utilized for the efficient solution of problems that are harder than NP or co-NP. In particular, we consider the fundamental reasoning problems in propositional disjunctive answer set programming (ASP), Brave Reasoning and Skeptical Reasoning, which ask whether a given atom is contained in at least one or in all answer sets, respectively. Both problems are located at the second level of the Polynomial Hierarchy and are thus assumed to be harder than NP or co-NP. One cannot transform these two reasoning problems into SAT in polynomial time, unless the Polynomial Hierarchy collapses. We show that certain structural aspects of disjunctive logic programs can be utilized to break through this complexity barrier, using new techniques from Parameterized Complexity. In particular, we exhibit transformations from Brave and Skeptical Reasoning to SAT that run in time O(2^k n^2), where k is a structural parameter of the instance and n the input size. In other words, the reduction is fixed-parameter tractable for parameter k. As the parameter k we take the size of a smallest backdoor with respect to the class of normal (i.e., disjunction-free) programs. Such a backdoor is a set of atoms that, when deleted, makes the program normal. In consequence, the combinatorial explosion that is expected when transforming a problem from the second level of the Polynomial Hierarchy to the first level can be confined to the parameter k, while the running time of the reduction is polynomial in the input size n, where the order of the polynomial is independent of k.
    Comment: A short version will appear in the Proceedings of the 27th AAAI Conference on Artificial Intelligence (AAAI'13). A preliminary version of the paper was presented at the workshop Answer Set Programming and Other Computing Paradigms (ASPOCP 2012), 5th International Workshop, September 4, 2012, Budapest, Hungary
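
    Operationally, checking whether a given set of atoms is such a deletion backdoor is straightforward: delete the atoms from every rule and test that each remaining head contains at most one atom. The sketch below is our illustration; the rule representation is an assumption, not the paper's notation.

```python
# Check that a set of atoms B is a deletion backdoor to normality:
# after deleting the atoms in B, every rule head has at most one atom,
# i.e. the program is disjunction-free. Illustrative sketch only.
def is_normality_backdoor(rules, backdoor):
    backdoor = set(backdoor)
    for head, _body in rules:
        if len([a for a in head if a not in backdoor]) > 1:
            return False
    return True

# Example: the disjunctive rule  a | b :- c.  becomes normal after deleting {b}.
rules = [(["a", "b"], ["c"])]
print(is_normality_backdoor(rules, {"b"}))   # True
print(is_normality_backdoor(rules, set()))   # False
```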