
    Minimization for Generalized Boolean Formulas

    The minimization problem for propositional formulas is an important optimization problem in the second level of the polynomial hierarchy. In general, the problem is Sigma-2-complete under Turing reductions, but restricted versions are tractable. We study the complexity of minimization for formulas in two established frameworks for restricted propositional logic: the Post framework, allowing arbitrarily nested formulas over a set of Boolean connectors, and the constraint setting, allowing generalizations of CNF formulas. In the Post case, we obtain a dichotomy result: minimization is solvable in polynomial time or coNP-hard. This result also applies to Boolean circuits. For CNF formulas, we obtain new minimization algorithms for a large class of formulas, and give strong evidence that we have covered all polynomial-time cases.
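    As a small illustration of the coNP-hard core of the problem (a generic sketch, not the paper's method), the code below checks whether a candidate shorter formula is equivalent to the original by exhausting all assignments; the formulas and names are hypothetical.

```python
from itertools import product

def equivalent(f, g, variables):
    """Check propositional equivalence by exhausting all 2^n assignments.

    f and g are callables mapping an assignment dict to a bool.  This
    brute-force check is the coNP core of minimization: a candidate
    shorter formula is acceptable only if it passes this test.
    """
    for bits in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, bits))
        if f(assignment) != g(assignment):
            return False  # counterexample assignment found
    return True

# Hypothetical example: (x and y) or (x and not y) minimizes to x.
original = lambda a: (a["x"] and a["y"]) or (a["x"] and not a["y"])
candidate = lambda a: a["x"]
print(equivalent(original, candidate, ["x", "y"]))  # True
```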

    Faster Query Answering in Probabilistic Databases using Read-Once Functions

    A Boolean expression is in read-once form if each of its variables appears exactly once. When the variables denote independent events in a probability space, the probability of the event denoted by the whole expression in read-once form can be computed in polynomial time (whereas the general problem for arbitrary expressions is #P-complete). Known approaches to checking the read-once property seem to require putting these expressions in disjunctive normal form. In this paper, we tell a better story for a large subclass of Boolean event expressions: those that are generated by conjunctive queries without self-joins on tuple-independent probabilistic databases. We first show that given a tuple-independent representation and the provenance graph of an SPJ query plan without self-joins, we can, without using the DNF of a result event expression, efficiently compute its co-occurrence graph. From this, the read-once form can already, if it exists, be computed efficiently using existing techniques. Our second and key contribution is a complete, efficient, and simple-to-implement algorithm for computing the read-once forms (whenever they exist) directly, using a new concept, that of the co-table graph, which can be significantly smaller than the co-occurrence graph. Comment: Accepted in ICDT 201
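    To illustrate why read-once forms are useful (a generic sketch, not the paper's algorithm), the following computes the probability of a read-once expression over independent events in a single pass over the expression tree; the tuple encoding and variable names are assumptions for the example.

```python
def prob(expr, p):
    """Probability that a read-once Boolean expression is true.

    expr is a nested tuple: ("var", name), ("not", e), ("and", e1, e2, ...),
    or ("or", e1, e2, ...).  p maps variable names to independent marginal
    probabilities.  Because each variable occurs exactly once, the children
    of an AND/OR node denote independent events, so the whole computation
    is a single linear-time pass over the tree.
    """
    op = expr[0]
    if op == "var":
        return p[expr[1]]
    if op == "not":
        return 1.0 - prob(expr[1], p)
    if op == "and":
        result = 1.0
        for child in expr[1:]:
            result *= prob(child, p)
        return result
    if op == "or":
        result = 1.0
        for child in expr[1:]:
            result *= 1.0 - prob(child, p)
        return 1.0 - result
    raise ValueError(f"unknown operator {op!r}")

# Hypothetical example: (x AND y) OR z over independent tuple events.
e = ("or", ("and", ("var", "x"), ("var", "y")), ("var", "z"))
print(prob(e, {"x": 0.5, "y": 0.4, "z": 0.1}))  # 1 - (1 - 0.2)(1 - 0.1) = 0.28
```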

    Learning definite Horn formulas from closure queries

    A definite Horn theory is a set of n-dimensional Boolean vectors whose characteristic function is expressible as a definite Horn formula, that is, as a conjunction of definite Horn clauses. The class of definite Horn theories is known to be learnable under different query learning settings, such as learning from membership and equivalence queries or learning from entailment. We propose yet another type of query: the closure query. Closure queries are a natural extension of membership queries and also a variant, appropriate in the context of definite Horn formulas, of the so-called correction queries. We present an algorithm that learns conjunctions of definite Horn clauses in polynomial time, using closure and equivalence queries, and show how it relates to the canonical Guigues–Duquenne basis for implicational systems. We also show how the different query models mentioned relate to each other, either by giving full-fledged reductions by means of query simulation (where possible), or by showing their connections in the context of particular algorithms that use them for learning definite Horn formulas.
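    A closure query returns the closure of the queried variable set under the hidden target theory. As a rough illustration (not the paper's learning algorithm), the sketch below computes such a closure for a known set of definite Horn clauses by forward chaining; the clause representation and names are hypothetical choices.

```python
def closure(attributes, horn_clauses):
    """Close a set of variables under definite Horn clauses by forward chaining.

    horn_clauses is a list of (body, head) pairs, e.g. ({"a", "b"}, "c")
    for the clause a & b -> c.  A closure query, as used in the paper,
    returns exactly this closure of the queried set under the (hidden)
    target theory; here we simply compute it for a known theory.
    """
    closed = set(attributes)
    changed = True
    while changed:
        changed = False
        for body, head in horn_clauses:
            if body <= closed and head not in closed:
                closed.add(head)   # clause fires, add its head
                changed = True
    return closed

# Hypothetical target theory: a -> b, b & c -> d.
theory = [({"a"}, "b"), ({"b", "c"}, "d")]
print(closure({"a", "c"}, theory))  # {'a', 'b', 'c', 'd'}
```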

    Generalized fault tree analysis combined with state analysis


    Optimal and Heuristic Algorithms to Synthesize Lattices of Four-Terminal Switches

    In this work, we study the implementation of Boolean functions with nano-crossbar arrays where each crosspoint behaves as a four-terminal switch controlled by a Boolean literal. These types of arrays are commonly called switching lattices. We propose optimal and heuristic algorithms that minimize lattice sizes to implement a given Boolean function. The algorithms are mainly constructed on a technique that finds the Boolean functions of lattices having independent inputs. This technique works recursively by using transition matrices representing columns and rows of the lattice. It performs symbolic manipulation of Boolean literals as opposed to using truth tables, which allows us to successfully find Boolean functions having up to 81 variables, corresponding to a 9×9 lattice. Given the Boolean function of a lattice of a certain size, we check whether a given target function can be implemented at that lattice size by formulating the problem as a satisfiability problem. This process is repeated until a desired solution is found. Additionally, we fix the previously proposed algorithm that is claimed to be optimal; the fixed version guarantees optimal sizes. Finally, we perform synthesis trials on standard benchmark circuits to evaluate the proposed algorithms, considering lattice sizes and runtimes in comparison with three recently proposed algorithms. This work is supported by the EU-H2020-RISE project NANOxCOMP #691178 and the TUBITAK-Career project #113E760.
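    For context (a generic sketch, not the paper's synthesis algorithms), a four-terminal switching lattice is usually evaluated as follows: a cell is ON when its controlling literal is true, and the lattice outputs 1 exactly when the ON cells contain a 4-connected path from the top row to the bottom row. The code below implements this evaluation; the lattice encoding, names, and example are assumptions.

```python
def evaluate_lattice(lattice, assignment):
    """Evaluate the Boolean function realized by a four-terminal switching lattice.

    lattice is a 2D list of literals such as "x" or "!x"; assignment maps
    variable names to bools.  A cell is ON when its literal is true, and the
    lattice outputs 1 exactly when the ON cells contain a 4-connected path
    from the top row to the bottom row.
    """
    rows, cols = len(lattice), len(lattice[0])

    def on(r, c):
        lit = lattice[r][c]
        value = assignment[lit.lstrip("!")]
        return (not value) if lit.startswith("!") else value

    # Depth-first search over ON cells, starting from the top row.
    stack = [(0, c) for c in range(cols) if on(0, c)]
    seen = set(stack)
    while stack:
        r, c = stack.pop()
        if r == rows - 1:
            return True  # reached the bottom row
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen and on(nr, nc):
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False

# Hypothetical 2x2 lattice whose two columns realize (x AND y) OR (z AND w).
lat = [["x", "z"], ["y", "w"]]
print(evaluate_lattice(lat, {"x": True, "y": True, "z": False, "w": False}))  # True
```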

    Polynomial growth of concept lattices, canonical bases and generators: extremal set theory in Formal Concept Analysis

    We prove that there exist three distinct, comprehensive classes of (formal) contexts with polynomially many concepts, namely: contexts which are nowhere dense, of bounded breadth, or highly convex. Already present in G. Birkhoff's classic monograph is the notion of the breadth of a lattice; it equals the number of atoms of a largest Boolean suborder. Even though it is natural to define the breadth of a context as being that of its concept lattice, this idea had not been exploited before. We do this and establish many equivalences. Amongst them, it is shown that the breadth of a context equals the size of its largest minimal generator, its largest contranominal-scale subcontext, as well as the Vapnik-Chervonenkis dimension of both its system of extents and of intents. The polynomiality of the aforementioned classes is proven via upper bounds (also known as majorants) for the number of maximal bipartite cliques in bipartite graphs. These are results obtained by various authors in the last decades. The fact that they yield statements about formal contexts is a reward for investigating how two established fields interact, specifically Formal Concept Analysis (FCA) and graph theory. We considerably improve the breadth bound. The improvement is twofold: besides giving a much tighter expression, we prove that it limits the number of minimal generators. This is strictly more general than upper bounding the quantity of concepts. Indeed, it automatically implies a bound on these, as well as on the number of proper premises. A corollary is that this improved result is a bound for the number of implications in the canonical basis too. With respect to the quantity of concepts, this sharper majorant is shown to be best possible. This fact is established by constructing contexts whose concept lattices exhibit exactly that many elements. These structures are termed, respectively, extremal contexts and extremal lattices. The usual procedure of taking the standard context allows one to work interchangeably with either one of these two extremal structures. Extremal lattices are equivalently defined as finite lattices which have as many elements as possible, under the condition that they obey two upper limits: one for the number of join-irreducibles, the other for the breadth. Subsequently, these structures are characterized in two ways. Our first characterization is done from the lattice perspective. Initially, we construct extremal lattices by the iterated operation of finding smaller, extremal subsemilattices and duplicating their elements. Then, it is shown that every extremal lattice must be obtained through a recursive application of this construction principle. A byproduct of this contribution is that extremal lattices are always meet-distributive. Although this approach is revealing, it leaves open relevant combinatorial questions in its vicinity. Most notably, the number of meet-irreducibles of extremal lattices escapes control when this construction is carried out. Aiming to get a grip on the number of meet-irreducibles, we prove an alternative characterization of these structures. This second approach is based on implication logic, and exposes an interesting link between the numbers of proper premises, pseudo-extents and concepts. A guiding idea in this scenario is to use implications to construct lattices.
It turns out that constructing extremal structures with this method is simpler, in the sense that a recursive application of the construction principle is not needed. Moreover, we obtain with ease a general, explicit formula for the Whitney numbers of extremal lattices. This reveals that they are unimodal, too. Like the first, this second construction method is shown to be characteristic. A particular case of the construction is able to force, with precision, a high number (in the sense of "exponentially many") of meet-irreducibles. This occasional explosion of meet-irreducibles motivates a generalization of the notion of extremal lattices, obtained by considering a more refined partition of the class of all finite lattices. In this finer-grained setting, each extremal class consists of lattices with bounded breadth, bounded number of join-irreducibles, and bounded number of meet-irreducibles. The generalized problem of finding the maximum number of concepts proves to be challenging. Instead of attempting to classify these structures completely, we pose questions inspired by Turán's seminal result in extremal combinatorics. Most prominently: do extremal lattices (in this more general sense) have the maximum permitted breadth? We show a general statement in this setting: for every choice of limits (breadth, number of join-irreducibles and meet-irreducibles), we produce some extremal lattice with the maximum permitted breadth. The tools which underpin all the intuitions in this scenario are hypergraphs and exact set covers. In a rather unexpected but interesting turn of events, we obtain for free a simple and interesting theorem about the general existence of "rich" subcontexts. Precisely: every context contains an object/attribute pair which, once removed, results in a context with at least half the original number of concepts.
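    As a small, generic illustration of the objects being counted (not part of the thesis itself), the sketch below enumerates the formal concepts of a toy context by brute force; the names and the example context are hypothetical.

```python
from itertools import combinations

def concepts(objects, attributes, incidence):
    """Enumerate all formal concepts of a small context by brute force.

    incidence is a set of (object, attribute) pairs.  A concept is a pair
    (extent, intent) closed under the two derivation operators; every
    intent arises as the common attributes of some object subset, so
    iterating over all object subsets (exponential, fine for toy data)
    finds every concept.
    """
    def intent(objs):
        return frozenset(a for a in attributes
                         if all((o, a) in incidence for o in objs))

    def extent(attrs):
        return frozenset(o for o in objects
                         if all((o, a) in incidence for a in attrs))

    found = set()
    for r in range(len(objects) + 1):
        for objs in combinations(objects, r):
            b = intent(objs)
            found.add((extent(b), b))
    return found

# Hypothetical toy context with 3 objects and 3 attributes.
G, M = ["g1", "g2", "g3"], ["a", "b", "c"]
I = {("g1", "a"), ("g1", "b"), ("g2", "b"), ("g2", "c"), ("g3", "c")}
for extent_set, intent_set in sorted(concepts(G, M, I), key=lambda p: len(p[0])):
    print(sorted(extent_set), sorted(intent_set))
```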