
    The complexity of acyclic conjunctive queries revisited

    In this paper, we consider first-order logic over unary functions and study the complexity of the evaluation problem for conjunctive queries described by such formulas. A natural notion of query acyclicity for this language is introduced, and we study the complexity of a large number of variants and generalizations of acyclic query problems in that context (Boolean or non-Boolean, with or without inequalities, comparisons, etc.). Our main results show that all those problems are \textit{fixed-parameter linear}, i.e., they can be evaluated in time $f(|Q|)\cdot|\textbf{db}|\cdot|Q(\textbf{db})|$, where $|Q|$ is the size of the query $Q$, $|\textbf{db}|$ the size of the database, $|Q(\textbf{db})|$ the size of the output, and $f$ is some function whose value depends on the specific variant of the query problem (in some cases, $f$ is the identity function). Our results have two kinds of consequences. First, they can easily be translated into the relational (i.e., classical) setting. Previously known bounds for some query problems are improved, and new tractable cases are exhibited. Among others, as an immediate corollary, we improve a result of \cite{PapadimitriouY-99} by showing that any (relational) acyclic conjunctive query with inequalities can be evaluated in time $f(|Q|)\cdot|\textbf{db}|\cdot|Q(\textbf{db})|$. A second consequence of our method is that it provides a very natural descriptive approach to the complexity of well-known algorithmic problems. A number of examples (such as acyclic subgraph problems, multidimensional matching, etc.) are considered, for which new insights into their complexity are given.
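
    The abstract does not spell out the evaluation algorithm. As a point of reference for the relational setting it mentions, the sketch below shows the classical semi-join reduction at the heart of Yannakakis-style evaluation of acyclic joins, one standard route to bounds of the form $f(|Q|)\cdot|\textbf{db}|\cdot|Q(\textbf{db})|$; the relation names, join tree and dict-based tuple encoding are illustrative assumptions, not the paper's construction.

```python
# Illustrative sketch (not the paper's algorithm): semi-join reduction for an
# acyclic join, Yannakakis-style.  Relations are lists of dicts (tuples); the
# join tree is given as child->parent edges ordered bottom-up.

def semijoin(R, S):
    """Keep the tuples of R that agree with some tuple of S on their shared attributes."""
    shared = set(R[0]) & set(S[0]) if R and S else set()
    keys = {tuple(t[a] for a in sorted(shared)) for t in S}
    return [t for t in R if tuple(t[a] for a in sorted(shared)) in keys]

def full_reduce(relations, tree_edges):
    """Two passes of semi-joins along the join tree remove all dangling tuples."""
    # bottom-up pass: parent := parent semi-join child
    for child, parent in tree_edges:
        relations[parent] = semijoin(relations[parent], relations[child])
    # top-down pass: child := child semi-join parent
    for child, parent in reversed(tree_edges):
        relations[child] = semijoin(relations[child], relations[parent])
    return relations

# Toy acyclic query R(a,b) JOIN S(b,c) JOIN T(c,d) with join tree R - S - T rooted at T.
db = {
    "R": [{"a": 1, "b": 2}, {"a": 3, "b": 9}],
    "S": [{"b": 2, "c": 5}],
    "T": [{"c": 5, "d": 7}],
}
print(full_reduce(db, [("R", "S"), ("S", "T")]))
```

    After both passes every surviving tuple participates in at least one answer, so the full join can then be enumerated with delay proportional to the output size.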

    Quantified Constraints in Twenty Seventeen

    I present a survey of recent advances in the algorithmic and computational complexity theory of non-Boolean Quantified Constraint Satisfaction Problems, incorporating some of the more modern research directions.

    Abstract Learning Frameworks for Synthesis

    We develop abstract learning frameworks (ALFs) for synthesis that embody the principles of CEGIS (counter-example guided inductive synthesis) strategies, which have become widely applicable in recent years. Our framework defines a general abstract model of iterative learning, based on a hypothesis space that captures the synthesized objects, a sample space that forms the space on which induction is performed, and a concept space that abstractly defines the semantics of the learning process. We show that a variety of synthesis algorithms in the current literature can be embedded in this general framework. While studying these embeddings, we also generalize some of the synthesis problems that these instances address, resulting in new ways of looking at synthesis problems through learning. We also investigate convergence issues for the general framework and exhibit three recipes for convergence in finite time. The first two recipes generalize techniques for convergence used by existing synthesis engines. The third is a more involved technique of which we know of no existing instantiation; we instantiate it for concrete synthesis problems.
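
    To make the kind of loop these frameworks abstract over concrete, here is a minimal CEGIS-style sketch; the toy hypothesis space (small integer linear functions), the specification, and the `synthesize`/`verify` oracles are illustrative placeholders, not the paper's formalization.

```python
# Hypothetical sketch of a counter-example guided inductive synthesis (CEGIS) loop,
# the kind of iterative learner that abstract learning frameworks generalize.
# Hypothesis space: linear functions f(x) = a*x + b with small integer coefficients.
# Specification (known only to the verifier): f(2) == 7 and f(5) == 13.

from itertools import product

SPEC = [(2, 7), (5, 13)]           # (input, required output) pairs

def synthesize(samples):
    """Learner: return some hypothesis consistent with the samples seen so far."""
    for a, b in product(range(-10, 11), repeat=2):
        if all(a * x + b == y for x, y in samples):
            return (a, b)
    return None                    # hypothesis space exhausted

def verify(hyp):
    """Verifier: return None if hyp meets the spec, else a counter-example."""
    a, b = hyp
    for x, y in SPEC:
        if a * x + b != y:
            return (x, y)
    return None

def cegis():
    samples = []                   # sample space: counter-examples gathered so far
    while True:
        hyp = synthesize(samples)
        if hyp is None:
            return None            # no hypothesis satisfies the specification
        cex = verify(hyp)
        if cex is None:
            return hyp             # verified: synthesis succeeded
        samples.append(cex)        # learn from the counter-example and iterate

print(cegis())                     # prints (2, 3), i.e. f(x) = 2*x + 3
```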

    Counting Problems on Quantum Graphs: Parameterized and Exact Complexity Classifications

    Quantum graphs, as defined by Lovász in the late 60s, are formal linear combinations of simple graphs with finite support. They allow for the complexity analysis of the problem of computing finite linear combinations of homomorphism counts, the latter of which constitute the foundation of the structural hardness theory for parameterized counting problems: the framework of parameterized counting complexity was introduced by Flum and Grohe, and by McCartin, in 2002, and forms a hybrid between the classical field of computational counting as founded by Valiant in the late 70s and the paradigm of parameterized complexity theory due to Downey and Fellows, which originated in the early 90s. The problem of computing homomorphism numbers of quantum graphs subsumes general motif counting problems, and its complexity-theoretic implications have only recently become apparent, with the breakthrough of Curticapean, Dell and Marx on the parameterized subgraph counting problem in 2017. We study the problems of counting partially injective and edge-injective homomorphisms, counting induced subgraphs, as well as counting answers to existential first-order queries. We establish novel combinatorial, algebraic and even topological properties of quantum graphs that allow us to provide exhaustive parameterized and exact complexity classifications, including necessary, sufficient and mostly explicit tractability criteria, for all of these problems.
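
    As a concrete illustration of the central object (not taken from the thesis), the sketch below evaluates a quantum graph on a host graph by brute-force homomorphism counting; the graph encoding and the example quantum graph are assumptions for illustration only.

```python
# Illustrative sketch: evaluating a quantum graph (a finite linear combination of
# graphs) on a host graph G via brute-force homomorphism counting.  Graphs are
# (vertex list, edge list) pairs; only suitable for tiny patterns.

from itertools import product

def hom_count(H, G):
    """Number of homomorphisms from pattern H to host G (brute force)."""
    VH, EH = H
    VG, EG = G
    edges_G = {frozenset(e) for e in EG}
    count = 0
    for image in product(VG, repeat=len(VH)):
        phi = dict(zip(VH, image))
        if all(frozenset((phi[u], phi[v])) in edges_G for u, v in EH):
            count += 1
    return count

def quantum_hom_count(quantum_graph, G):
    """A quantum graph is a list of (coefficient, graph) pairs; its homomorphism
    count is the corresponding linear combination of ordinary hom counts."""
    return sum(c * hom_count(H, G) for c, H in quantum_graph)

# Toy example: the quantum graph 1*K2 - 1*K1 evaluated on a triangle.
K1 = ([0], [])
K2 = ([0, 1], [(0, 1)])
triangle = ([0, 1, 2], [(0, 1), (1, 2), (0, 2)])
print(hom_count(K2, triangle))                            # 6 homomorphisms of an edge
print(quantum_hom_count([(1, K2), (-1, K1)], triangle))   # 6 - 3 = 3
```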

    Time-Aware Probabilistic Knowledge Graphs

    The emergence of open information extraction as a tool for constructing and expanding knowledge graphs has aided the growth of temporal data in knowledge graphs such as YAGO, NELL and Wikidata. While YAGO and Wikidata maintain the valid time of facts, NELL records the time point at which a fact was retrieved from some Web corpus. Collectively, these knowledge graphs (KGs) store facts extracted from Wikipedia and other sources. Due to the imprecise nature of the extraction tools used to build and expand KGs such as NELL, the facts in these KGs are weighted with a confidence value representing the likelihood that a fact is correct. Additionally, NELL can be considered a transaction-time KG because every fact is associated with its extraction date. On the other hand, YAGO and Wikidata use the valid-time model because they maintain facts together with their validity time (temporal scope). In this paper, we propose a bitemporal model (combining the transaction-time and valid-time models) for maintaining and querying bitemporal probabilistic knowledge graphs. We study coalescing and the scalability of marginal and MAP inference. Moreover, we show that the complexity of reasoning tasks in atemporal probabilistic KGs carries over to the bitemporal setting. Finally, we report the evaluation results of the proposed model.
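
    For illustration only (the paper's formal model is not reproduced here), a bitemporal probabilistic fact can be pictured as a record carrying a valid-time interval, a transaction time and a confidence weight; the field names and the query below are hypothetical.

```python
# Hypothetical sketch of a bitemporal probabilistic fact: a valid-time interval
# (when the fact holds in the modelled world), a transaction time (when the
# extractor recorded it), and a confidence weight from the extraction tool.

from dataclasses import dataclass

@dataclass
class BitemporalFact:
    subject: str
    predicate: str
    obj: str
    valid_from: int        # valid-time interval [valid_from, valid_to], e.g. years
    valid_to: int
    transaction_time: int  # when the fact was extracted/recorded
    confidence: float      # weight assigned by the (imprecise) extractor

def query(kg, predicate, at_valid_time, as_of, threshold=0.5):
    """Facts with the given predicate that were known by `as_of`, hold at
    `at_valid_time`, and pass the confidence threshold."""
    return [f for f in kg
            if f.predicate == predicate
            and f.transaction_time <= as_of
            and f.valid_from <= at_valid_time <= f.valid_to
            and f.confidence >= threshold]

kg = [
    BitemporalFact("Ada", "worksAt", "AcmeLabs", 2015, 2019, transaction_time=2016, confidence=0.9),
    BitemporalFact("Ada", "worksAt", "Initech", 2020, 2023, transaction_time=2021, confidence=0.4),
]
print(query(kg, "worksAt", at_valid_time=2018, as_of=2022))   # returns the AcmeLabs fact
```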