
    The combinatorics of minimal unsatisfiability: connecting to graph theory

    Minimally Unsatisfiable CNFs (MUs) are unsatisfiable CNFs where removing any clause destroys unsatisfiability. MUs are the building blocks of unsatisfiability, and our understanding of them can be very helpful in answering various algorithmic and structural questions relating to unsatisfiability. In this thesis we study MUs from a combinatorial point of view, with the aim of extending the understanding of the structure of MUs. We show that some important classes of MUs are very closely related to known classes of digraphs, and using arguments from logic and graph theory we characterise these MUs.

    Two main concepts in this thesis are isomorphism of CNFs and the implication digraph of 2-CNFs (at most two literals per disjunction). Isomorphism of CNFs involves renaming the variables and flipping the literals. The implication digraph of a 2-CNF F has both arcs (¬a → b) and (¬b → a) for every binary clause (a ∨ b) in F.

    In the first part we introduce a novel connection between MUs and Minimal Strong Digraphs (MSDs): strongly connected digraphs where removing any arc destroys the strong connectedness. We introduce the new class DFM of special MUs, which are in close correspondence to MSDs. The known relation between 2-CNFs and implication digraphs is used, but in a simpler and more direct way, namely that we have a canonical choice of one of the two arcs. As an application of this new framework we provide short and intuitive new proofs for two important but isolated characterisations of nonsingular MUs (every literal occurs at least twice), both of which originally had ingenious but complicated proofs: characterising 2-MUs (minimally unsatisfiable 2-CNFs), and characterising MUs with deficiency 2 (two more clauses than variables).

    In the second part, we provide a fundamental addition to the study of 2-CNFs, which have efficient algorithms for many interesting problems, namely a full classification of 2-MUs and a polytime isomorphism decision for this class. We show that implication digraphs of 2-MUs are “Weak Double Cycles” (WDCs): big cycles of small cycles (with possible overlaps). Combining logical and graph-theoretical methods, we prove that WDCs have at most one skew-symmetry (a self-inverse, fixed-point-free anti-symmetry reversing the direction of arcs). It follows that the isomorphisms between 2-MUs are exactly the isomorphisms between their implication digraphs (since digraphs with a given skew-symmetry are the same as 2-CNFs). This reduces the classification of 2-MUs to the classification of a nice class of digraphs.

    Finally, in the outlook we discuss further applications, including an alternative framework for enumerating some special Minimally Unsatisfiable Sub-clause-sets (MUSs).
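    The implication digraph construction mentioned in the abstract is concrete enough to sketch in code. The snippet below is a minimal illustration, not taken from the thesis, assuming a DIMACS-style encoding where literals are non-zero integers and negation is sign flip; it builds the arcs (¬a → b) and (¬b → a) for each binary clause of a classic minimally unsatisfiable 2-CNF.

```python
# Minimal sketch (not from the thesis): literals are DIMACS-style
# non-zero integers, negation is sign flip, a 2-CNF is a list of pairs.

def implication_digraph(clauses):
    """Return the arc set of the implication digraph of a 2-CNF:
    for every binary clause (a v b) add the arcs (-a -> b) and (-b -> a)."""
    arcs = set()
    for a, b in clauses:
        arcs.add((-a, b))
        arcs.add((-b, a))
    return arcs

# F = (x1 v x2) & (-x1 v x2) & (x1 v -x2) & (-x1 v -x2):
# a minimally unsatisfiable 2-CNF with deficiency 2 (4 clauses, 2 variables).
F = [(1, 2), (-1, 2), (1, -2), (-1, -2)]
for arc in sorted(implication_digraph(F)):
    print(arc)
# Note the skew-symmetry: mapping every literal l to -l reverses each arc.
```

    The canonical choice of one of the two arcs per clause, as used in the thesis for the DFM/MSD correspondence, would keep exactly one element of each such mirrored pair.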

    Heuristics for the refinement of assumptions in generalized reactivity formulae

    Reactive synthesis is concerned with automatically generating implementations from formal specifications. These specifications are typically written in the language of generalized reactivity (GR(1)), a subset of linear temporal logic capable of expressing the most common industrial specification patterns, and describe the requirements on the behavior of a system under assumptions about the environment in which the system is to be deployed. Oftentimes no implementation exists that guarantees the required behavior under all possible environments, typically due to missing assumptions; this is usually referred to as unrealizability. To address this issue, new assumptions need to be added to complete the specification, a problem known as assumptions refinement. Since the space of candidate assumptions is intractably large, searching for the best solutions is inherently hard. In particular, new methods are needed to (i) increase the effectiveness of the search procedures, measured as the ratio between the number of solutions found and the number of refinements explored; and (ii) improve the quality of the results, defined as the weakness of the solutions. In this thesis we propose a set of heuristics to meet these goals, and a methodology to assess and compare assumptions refinement methods based on quantitative metrics. The heuristics take the form of algorithms to generate candidate refinements during the search, and quantitative measures to assess the quality of the candidates. We first discuss a heuristic method to generate assumptions that target the cause of unrealizability. This is done by selecting candidate refinement formulas based on Craig interpolation. We provide a formal underpinning of the technique and evaluate it in terms of our new metric of effectiveness, as defined above, whose value is improved with respect to the state of the art. We demonstrate this on a set of popular benchmarks from embedded software. We then provide a formal, quantitative characterization of the permissiveness of environment assumptions in the form of a weakness measure. We prove that the partial order induced by this measure is consistent with the one induced by implication. The key advantage of this measure is that it allows for prioritizing candidate solutions, as we show experimentally. Lastly, we propose a notion of refinements that are minimal with respect to the observed counterstrategies. We demonstrate that exploring minimal refinements produces weaker solutions and reduces the amount of computation needed to explore each refinement. However, this may come at the cost of reducing the effectiveness of the search. To counteract this effect, we propose a hybrid search approach in which both minimal and non-minimal refinements are explored.
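    To make the search structure described in the abstract concrete, the sketch below shows a hypothetical skeleton of a counterstrategy-guided refinement search; the helper callables (synthesize, counterstrategy, candidates), the add_assumption method, and the depth bound are illustrative assumptions, not the thesis's algorithms or any particular synthesis tool's API.

```python
# Hypothetical skeleton of counterstrategy-guided assumptions refinement.
# All helpers are assumptions for illustration, not an existing tool's API:
#   synthesize(spec)        -> an implementation, or None if unrealizable
#   counterstrategy(spec)   -> an environment counterstrategy witnessing
#                              unrealizability
#   candidates(spec, cs)    -> candidate assumption formulas (e.g. obtained
#                              via interpolation) that rule out cs
#   spec.add_assumption(f)  -> a new spec with assumption f added

def refine_assumptions(spec, synthesize, counterstrategy, candidates,
                       max_depth=3):
    """Breadth-first search over candidate assumption refinements."""
    frontier = [(spec, 0)]
    solutions = []
    while frontier:
        current, depth = frontier.pop(0)
        if synthesize(current) is not None:
            solutions.append(current)   # realizable: record this refinement
            continue
        if depth >= max_depth:
            continue                    # stop exploring this branch
        cs = counterstrategy(current)
        for phi in candidates(current, cs):
            frontier.append((current.add_assumption(phi), depth + 1))
    return solutions
```

    The heuristics discussed in the abstract would plug into such a loop at the candidate-generation step (interpolation-based formulas, minimality with respect to the observed counterstrategy) and at the ordering of the frontier (prioritization by the weakness measure).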

    Evaluation of XPath Queries against XML Streams

    XML is nowadays the de facto standard for electronic data interchange on the Web. Available XML data ranges from small Web pages to ever-growing repositories of, e.g., biological and astronomical data, and even to rapidly changing and possibly unbounded streams, as used in Web data integration and publish-subscribe systems. Motivated by the ubiquity of XML data, the basic task of XML querying is becoming of great theoretical and practical importance. Recent years have witnessed efforts by both practitioners and theoreticians towards defining an appropriate XML query language. At the core of this common effort lies a navigational approach to information localization in XML data, embodied in a practical and simple query language called XPath. This work brings together the two aforementioned “worlds”, i.e., XPath query evaluation and XML data streams, and shows both the theoretical and the practical relevance of this fusion. Its relevance cannot be subsumed by traditional database management systems, because the latter are not designed for rapid and continuous loading of individual data items, and do not directly support the continuous queries that are typical for stream applications. The first central contribution of this work consists in the definition and theoretical investigation of three term rewriting systems that rewrite queries with reverse predicates, like parent or ancestor, into equivalent forward queries, i.e., queries without reverse predicates. Our rewriting approach is vital to the evaluation of queries with reverse predicates against unbounded XML streams, because neither the storage of past fragments of the stream nor several stream traversals, as required by the evaluation of reverse predicates, are affordable. Beyond their declared main purpose of providing equivalences between queries with reverse predicates and forward queries, the applications of our rewriting systems shed light on other query language properties, like the expressivity of some of its fragments, query minimization, or even the complexity of query evaluation. For example, using these systems, one can rewrite any graph query into an equivalent forward forest query. The second main contribution consists in a streamed and progressive evaluation strategy for forward queries against XML streams. The evaluation is specified using compositions of so-called stream processing functions, and is implemented using networks of deterministic pushdown transducers. The complexity of this evaluation strategy is polynomial in both the query size and the data size for forward forest queries and even for a large fragment of graph queries. The third central contribution consists in two real monitoring applications that directly use the results of this work: the monitoring of processes running on UNIX computers, and a system for graphically providing real-time traffic and travel information, as broadcast within ubiquitous radio signals.
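    As a toy illustration of the streamed evaluation of forward queries (a minimal sketch under simplifying assumptions, not the thesis's networks of pushdown transducers), the code below answers the forward query //a/b in a single pass over an XML stream, keeping only a stack of open element tags; the query shape and the names are illustrative. A reverse query such as //b/parent::a would first be rewritten into the equivalent forward query //a[b] before such a one-pass evaluation.

```python
import io
import xml.etree.ElementTree as ET

def stream_eval_a_slash_b(xml_stream, outer="a", inner="b"):
    """Single-pass evaluation of the forward query //outer/inner:
    count every <inner> element whose parent is an <outer> element.
    A minimal sketch using a stack of open tags, not the thesis's
    pushdown-transducer networks."""
    stack = []        # tags of the currently open elements
    matches = 0
    for event, elem in ET.iterparse(xml_stream, events=("start", "end")):
        if event == "start":
            if elem.tag == inner and stack and stack[-1] == outer:
                matches += 1          # a match for //outer/inner
            stack.append(elem.tag)
        else:                         # "end" event
            stack.pop()
            elem.clear()              # keep memory bounded on long streams
    return matches

doc = b"<r><a><b/><c><b/></c></a><b/></r>"
print(stream_eval_a_slash_b(io.BytesIO(doc)))   # 1: only the first <b> matches
```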

    35th Symposium on Theoretical Aspects of Computer Science: STACS 2018, February 28-March 3, 2018, Caen, France


    36th International Symposium on Theoretical Aspects of Computer Science: STACS 2019, March 13-16, 2019, Berlin, Germany
