
    Computing Possible and Certain Answers over Order-Incomplete Data

    This paper studies the complexity of query evaluation for databases whose relations are partially ordered; the problem commonly arises when combining or transforming ordered data from multiple sources. We focus on queries in a useful fragment of SQL, namely positive relational algebra with aggregates, whose bag semantics we extend to the partially ordered setting. Our semantics leads to the study of two main computational problems: the possibility and certainty of query answers. We show that these problems are respectively NP-complete and coNP-complete, but identify tractable cases depending on the query operators or input partial orders. We further introduce a duplicate elimination operator and study its effect on the complexity results. Comment: 55 pages, 56 references. Extended journal version of arXiv:1707.07222. Up to the stylesheet, page/environment numbering, and possible minor publisher-induced changes, this is the exact content of the journal paper that will appear in Theoretical Computer Science.
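    As a toy illustration of the two decision problems (a brute-force sketch, not the paper's algorithm), one can enumerate the linear extensions of a small partially ordered relation and ask whether a candidate ordered answer is realized by some extension (possibility) or by all of them (certainty), treating the query as one that simply returns the relation in its order. The relation and the candidate below are invented for illustration.

```python
from itertools import permutations

def linear_extensions(items, order):
    """All total orders of `items` consistent with the partial order
    given as a set of (smaller, larger) pairs."""
    exts = []
    for perm in permutations(items):
        pos = {x: i for i, x in enumerate(perm)}
        if all(pos[a] < pos[b] for a, b in order):
            exts.append(perm)
    return exts

def possible(candidate, items, order):
    """Is `candidate` the answer under at least one linear extension?"""
    return any(tuple(candidate) == ext for ext in linear_extensions(items, order))

def certain(candidate, items, order):
    """Is `candidate` the answer under every linear extension?"""
    return all(tuple(candidate) == ext for ext in linear_extensions(items, order))

# Toy ordered relation: a < c and b < c, with a and b incomparable.
items = ["a", "b", "c"]
order = {("a", "c"), ("b", "c")}
print(possible(("a", "b", "c"), items, order))  # True: one of the two extensions
print(certain(("a", "b", "c"), items, order))   # False: ("b", "a", "c") is also valid
```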

    Static Analysis of Deterministic Negotiations

    Negotiation diagrams are a model of concurrent computation akin to workflow Petri nets. Deterministic negotiation diagrams, equivalent to the much studied and used free-choice workflow Petri nets, are surprisingly amenable to verification. Soundness (a property close to deadlock-freedom) can be decided in PTIME. Further, other fundamental questions, like computing summaries or the expected cost, can also be solved in PTIME for sound deterministic negotiation diagrams, while they are PSPACE-complete in the general case. In this paper we generalize and explain these results. We extend the classical "meet-over-all-paths" (MOP) formulation of static analysis problems to our concurrent setting, and introduce Mazurkiewicz-invariant analysis problems, which encompass the questions above and new ones. We show that any Mazurkiewicz-invariant analysis problem can be solved in PTIME for sound deterministic negotiations whenever it is in PTIME for sequential flow-graphs---even though the flow-graph of a deterministic negotiation diagram can be exponentially larger than the diagram itself. This gives a common explanation for the low complexity of all the analysis questions studied so far. Finally, we show that classical gen/kill analyses are also instances of our framework, and obtain a PTIME algorithm for detecting anti-patterns in free-choice workflow Petri nets. Our result is based on a novel decomposition theorem, of independent interest, showing that sound deterministic negotiation diagrams can be hierarchically decomposed into (possibly overlapping) smaller sound diagrams. Comment: To appear in the Proceedings of LICS 2017, IEEE Computer Society.
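    For the sequential baseline the paper builds on, here is a minimal sketch of a classical gen/kill analysis (reaching definitions) on a flow-graph, computed by the standard iterative fixpoint, which coincides with the MOP solution for such distributive frameworks. The graph and the gen/kill sets are made up for illustration.

```python
# Reaching definitions as a gen/kill analysis on a sequential flow-graph.
# Each node carries gen/kill sets of definition labels; `preds` gives control flow.
preds = {"entry": [], "n1": ["entry"], "n2": ["n1", "n3"], "n3": ["n2"], "exit": ["n2"]}
gen   = {"entry": set(), "n1": {"d1"}, "n2": {"d2"}, "n3": {"d3"}, "exit": set()}
kill  = {"entry": set(), "n1": {"d3"}, "n2": set(),  "n3": {"d1"}, "exit": set()}

def reaching_definitions(preds, gen, kill):
    """Iterate IN[n] = union of OUT[p] over predecessors,
    OUT[n] = gen[n] | (IN[n] - kill[n]) until a fixpoint is reached."""
    out = {n: set() for n in preds}
    changed = True
    while changed:
        changed = False
        for n in preds:
            inn = set().union(*(out[p] for p in preds[n])) if preds[n] else set()
            new_out = gen[n] | (inn - kill[n])
            if new_out != out[n]:
                out[n] = new_out
                changed = True
    return out

print(reaching_definitions(preds, gen, kill))
```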

    Querying the Guarded Fragment

    Evaluating a Boolean conjunctive query Q against a guarded first-order theory F is equivalent to checking whether "F and not Q" is unsatisfiable. This problem is relevant to the areas of database theory and description logic. Since Q may not be guarded, well-known results about the decidability, complexity, and finite-model property of the guarded fragment do not obviously carry over to conjunctive query answering over guarded theories, and the problem had been left open in general. By investigating finite guarded bisimilar covers of hypergraphs and relational structures, and by substantially generalising Rosati's finite chase, we prove for guarded theories F and (unions of) conjunctive queries Q that (i) Q is true in each model of F iff Q is true in each finite model of F and (ii) determining whether F implies Q is 2EXPTIME-complete. We further show the following results: (iii) the existence of polynomial-size conformal covers of arbitrary hypergraphs; (iv) a new proof of the finite model property of the clique-guarded fragment; (v) the small model property of the guarded fragment with optimal bounds; (vi) a polynomial-time solution to the canonisation problem modulo guarded bisimulation, which yields (vii) a capturing result for guarded bisimulation invariant PTIME. Comment: This is an improved and extended version of the paper of the same title presented at LICS 2010.
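    Result (i) reduces entailment over all models to entailment over finite models; on any concrete finite model, checking a Boolean conjunctive query is a homomorphism search, as in this brute-force sketch. The model, query, and relation names below are invented for illustration.

```python
from itertools import product

def holds(model, query):
    """Brute-force check of a Boolean conjunctive query over a finite model.
    `model` maps relation names to sets of tuples over a finite domain;
    `query` is a list of atoms (relation, (variables...))."""
    domain = {c for tuples in model.values() for t in tuples for c in t}
    variables = sorted({v for _, args in query for v in args})
    for assignment in product(domain, repeat=len(variables)):
        env = dict(zip(variables, assignment))
        if all(tuple(env[v] for v in args) in model[rel] for rel, args in query):
            return True  # some assignment satisfies every atom
    return False

# Toy model and query: do some x, y satisfy R(x, y) and S(y)?
model = {"R": {("a", "b"), ("b", "c")}, "S": {("c",)}}
query = [("R", ("x", "y")), ("S", ("y",))]
print(holds(model, query))  # True via x=b, y=c
```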

    Rewrite based Verification of XML Updates

    We consider problems of access control for updates of XML documents. In the context of XML programming, types can be viewed as hedge automata, and static type checking amounts to verifying that a program always converts valid source documents into valid output documents. Given a set of update operations, we are particularly interested in checking safety properties such as preservation of document types along any sequence of updates. We are also interested in the related policy consistency problem, that is, detecting whether a sequence of authorized operations can simulate a forbidden one. We reduce these questions to type checking problems, solved by computing variants of hedge automata characterizing the set of ancestors and descendants of the initial document type for the closure of parameterized rewrite rules.
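    A drastically simplified stand-in for the type-preservation question (using a DTD-like constraint on child labels rather than full hedge automata): validate a document, apply an update operation, and re-validate. All labels and the update operation below are hypothetical.

```python
# A much-simplified stand-in for hedge-automata types: each element label
# constrains the child labels it may have (DTD-like, order ignored).
# Illustrates the safety question: does an update keep the document in its type?
TYPE = {
    "doc":     lambda kids: kids.count("section") >= 1 and set(kids) <= {"section"},
    "section": lambda kids: set(kids) <= {"para"},
    "para":    lambda kids: kids == [],
}

def valid(tree):
    label, children = tree
    return TYPE[label]([c[0] for c in children]) and all(valid(c) for c in children)

def insert_child(tree, path, child):
    """Update operation: insert `child` under the node reached by `path`."""
    label, children = tree
    if not path:
        return (label, children + [child])
    i = path[0]
    return (label, children[:i] + [insert_child(children[i], path[1:], child)] + children[i + 1:])

doc = ("doc", [("section", [("para", [])])])
print(valid(doc))                                    # True
print(valid(insert_child(doc, [0], ("para", []))))   # True: still a valid section
print(valid(insert_child(doc, [], ("para", []))))    # False: "para" not allowed under "doc"
```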

    When Can We Answer Queries Using Result-Bounded Data Interfaces?

    We consider answering queries where the underlying data is available only over limited interfaces which provide lookup access to the tuples matching a given binding, but may restrict the number of output tuples returned. Interfaces imposing such "result bounds" are common in accessing data via the web. Given a query over a set of relations as well as some integrity constraints that relate the queried relations to the data sources, we examine the problem of deciding if the query is answerable over the interfaces; that is, whether there exists a plan that returns all answers to the query, assuming the source data satisfies the integrity constraints. The first component of our analysis of answerability is a reduction to a query containment problem with constraints. The second component is a set of "schema simplification" theorems capturing limitations on how interfaces with result bounds can be useful to obtain complete answers to queries. These results also help to show decidability for the containment problem that captures answerability, for many classes of constraints. The final component in our analysis of answerability is a "linearization" method, showing that query containment with certain guarded dependencies -- including those that emerge from answerability problems -- can be reduced to query containment for a well-behaved class of linear dependencies. Putting these components together, we get a detailed picture of how to check answerability over result-bounded services. Comment: 45 pages, 2 tables, 43 references. Complete version with proofs of the PODS'18 paper. The main text of this paper is almost identical to the PODS'18 version except that we have fixed some small mistakes. Relative to the earlier arXiv version, many errors were corrected, and some terminology has changed.
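    A toy sketch of what a result bound does to completeness: a lookup access that binds one attribute but returns at most two matching tuples can hide answers, so a plan built from such an access alone need not return all answers. The data, the bound, and the names below are invented for illustration.

```python
# Toy illustration of a result-bounded access: a lookup on a source that
# returns the tuples matching a binding, but at most `bound` of them.
DATA = [("acme", "paris"), ("acme", "lyon"), ("acme", "nice"), ("beta", "rome")]

def lookup(company, bound=2):
    """Access method: bind the first attribute, get back at most `bound` matches."""
    matches = [t for t in DATA if t[0] == company]
    return matches[:bound]

# A plan that asks "all cities of acme" through this interface cannot be
# complete: the result bound hides one of the three matching tuples.
print(lookup("acme"))                                             # two tuples only
print(len(lookup("acme")) == sum(t[0] == "acme" for t in DATA))   # False: incomplete
```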

    Worst-case Optimal Query Answering for Greedy Sets of Existential Rules and Their Subclasses

    The need for an ontological layer on top of data, associated with advanced reasoning mechanisms able to exploit the semantics encoded in ontologies, has been acknowledged both in the database and knowledge representation communities. We focus in this paper on the ontological query answering problem, which consists of querying data while taking ontological knowledge into account. More specifically, we establish complexities of the conjunctive query entailment problem for classes of existential rules (also called tuple-generating dependencies, Datalog+/- rules, or forall-exists-rules). Our contribution is twofold. First, we introduce the class of greedy bounded-treewidth sets (gbts) of rules, which covers guarded rules and their most well-known generalizations. We provide a generic algorithm for query entailment under gbts, which is worst-case optimal for combined complexity with or without bounded predicate arity, as well as for data complexity and query complexity. Second, we classify several gbts classes, whose complexity was unknown, with respect to combined complexity (with both unbounded and bounded predicate arity) and data complexity, to obtain a comprehensive picture of the complexity of existential rule fragments that are based on diverse guardedness notions. Upper bounds are provided by showing that the proposed algorithm is optimal for all of them.
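    A minimal sketch of the entailment machinery underneath such results (not the paper's worst-case optimal algorithm): a naive restricted chase for existential rules with one body atom and one head atom, inventing labelled nulls for existentially quantified positions. The rule syntax, rules, and facts below are hypothetical.

```python
from itertools import count

# Existential rules (TGDs) restricted here to one body atom and one head atom.
# Head variables absent from the body are existential and receive fresh nulls.
RULES = [
    (("Person", ("x",)), ("HasParent", ("x", "y"))),   # every person has a parent
    (("HasParent", ("x", "y")), ("Person", ("y",))),   # every parent is a person
]

def chase(facts, rules, max_rounds=3):
    """Restricted chase, cut off after `max_rounds` (it need not terminate)."""
    facts = set(facts)
    fresh = (f"_n{i}" for i in count())
    for _ in range(max_rounds):
        new = set()
        for (brel, bargs), (hrel, hargs) in rules:
            for rel, args in facts:
                if rel != brel:
                    continue
                env = dict(zip(bargs, args))
                # Fire only if no existing fact already satisfies the head.
                already = any(r == hrel and all(env.get(v, a) == a for v, a in zip(hargs, fargs))
                              for r, fargs in facts)
                if not already:
                    new.add((hrel, tuple(env[v] if v in env else next(fresh) for v in hargs)))
        if not new:
            break
        facts |= new
    return facts

print(sorted(chase({("Person", ("alice",))}, RULES)))
```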

    Rewrite Closure and CF Hedge Automata

    We introduce an extension of hedge automata called bidimensional context-free hedge automata. The class of unranked ordered tree languages they recognize is shown to be preserved by rewrite closure with inverse-monadic rules. We also extend the parameterized rewriting rules used for modeling the W3C XQuery Update Facility in previous works with the possibility of inserting a new parent node above a given node. We show that the rewrite closures of hedge automata languages with these extended rewriting systems are context-free hedge languages.
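    For readers unfamiliar with the base model, here is a minimal sketch of ordinary (regular) hedge automaton membership on unranked trees, with horizontal languages given as regular expressions over single-character states; it illustrates only the classical model, not the bidimensional context-free extension, and all labels and rules are invented.

```python
import re

# Minimal hedge automaton: states are assigned bottom-up, and for each
# (label, state) a regular expression over child states describes the
# allowed horizontal language.
RULES = [
    # (label, regex over the word of child states, resulting state)
    ("para", r"", "P"),
    ("section", r"P+", "S"),
    ("doc", r"S+", "D"),
]

def run(tree):
    """Return the state reached on `tree`, or None if no rule applies."""
    label, children = tree
    child_states = [run(c) for c in children]
    if any(s is None for s in child_states):
        return None
    word = "".join(child_states)
    for lab, regex, state in RULES:
        if lab == label and re.fullmatch(regex, word):
            return state
    return None

doc = ("doc", [("section", [("para", []), ("para", [])]), ("section", [("para", [])])])
print(run(doc))  # 'D': the document is accepted
```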

    Tractability and the computational mind

    We overview logical and computational explanations of the notion of tractability as applied in cognitive science. We start by introducing the basics of mathematical theories of complexity: computability theory, computational complexity theory, and descriptive complexity theory. Computational philosophy of mind often identifies mental algorithms with computable functions. However, with the development of programming practice it has become apparent that for some computable problems finding effective algorithms is hardly possible. Some problems require too many computational resources, e.g., time or memory, to be practically computable. Computational complexity theory is concerned with the amount of resources required for the execution of algorithms and, hence, the inherent difficulty of computational problems. An important goal of computational complexity theory is to categorize computational problems via complexity classes, and in particular, to identify efficiently solvable problems and draw a line between tractability and intractability. We survey how complexity can be used to study the computational plausibility of cognitive theories. We especially emphasize the methodological and mathematical assumptions behind applying complexity theory in cognitive science. We pay special attention to examples of applying the logical and computational complexity toolbox in different domains of cognitive science. We focus mostly on theoretical and experimental research in psycholinguistics and social cognition.
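    A back-of-the-envelope illustration of the tractability divide discussed here: at an assumed rate of 10^9 steps per second, polynomial step counts stay practical while exponential ones explode at modest input sizes. The numbers are only indicative.

```python
# Steps needed by a polynomial (n^2) versus an exponential (2^n) algorithm,
# converted to seconds at an assumed 10^9 steps per second.
RATE = 10**9
for n in (10, 30, 50, 70):
    poly, expo = n**2, 2**n
    print(f"n={n:>2}: n^2 -> {poly / RATE:.1e} s,  2^n -> {expo / RATE:.1e} s")
```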