
    Updates by Reasoning about States

    It has been argued that some form of control must be introduced in order to perform update operations in deductive databases. Indeed, many approaches rely on a procedural semantics of rule-based languages and often perform updates as side effects. Depending on the evaluation procedure, updates are generally performed in the body (top-down evaluation) or in the head of rules (bottom-up evaluation). We demonstrate that updates can be specified in a purely declarative manner using standard model-based semantics, without relying on procedural aspects of program evaluation. The key idea is to incorporate states as first-class objects into the language; this is the source of the additional expressiveness needed to define updates. We introduce the update language Statelog+-, discuss various domains of application, and outline how the perfect model semantics of Statelog+- programs can be computed.
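    A minimal Python sketch of the key idea described above: facts carry an explicit state index, and an update is just an ordinary rule deriving facts for the next state from the current one. The relation names, the frame/insertion rules, and the single-step function are illustrative assumptions, not the Statelog+- syntax or its perfect-model computation.

```python
# Illustrative only: facts are tagged with an explicit state index, and an
# "update" is an ordinary rule deriving facts of state s+1 from state s.

def step(db, state, inserts, deletes):
    """Derive the facts of state+1 from the facts of state.

    db      : set of (state, predicate, arg) tuples
    inserts : set of (predicate, arg) requested insertions
    deletes : set of (predicate, arg) requested deletions
    """
    nxt = set()
    # frame rule: every fact not explicitly deleted persists into the next state
    for (s, pred, arg) in db:
        if s == state and (pred, arg) not in deletes:
            nxt.add((state + 1, pred, arg))
    # insertion rule: requested facts hold in the next state
    for (pred, arg) in inserts:
        nxt.add((state + 1, pred, arg))
    return db | nxt


# tiny usage example with hypothetical relations
db0 = {(0, "emp", "ann"), (0, "emp", "bob")}
db1 = step(db0, 0, inserts={("emp", "eva")}, deletes={("emp", "bob")})
print(sorted(f for f in db1 if f[0] == 1))
# [(1, 'emp', 'ann'), (1, 'emp', 'eva')]
```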

    Complex transitive closure queries on a fragmented graph

    In this paper we study the reformulation of transitive closure queries on a fragmented graph. We split a query into several subqueries, each requiring only a fragment of the graph. We prove this reformulation to be correct for shortest path and bill-of-materials queries. Here we describe the reformulation for an abstract graph; elsewhere we have described an actual implementation of our approach and some promising simulation results. We view the study of distributed computation of transitive closure queries as a result of the trend towards distributed computation. First, selections were distributed to fragments of a relation; then fragmentation was used to compute joins in a distributed way; and now we are studying distributed computation of transitive closure queries. This should result in a deeper insight into the use and possible benefit of parallelism. Our work may be used in ordinary distributed databases as well as in advanced multiprocessor database machines, such as PRISMA. Although this research was started to make efficient use of distributed computation, it turns out to be beneficial in a central environment as well. This is due to the introduction of extra selections, stemming from an appropriate fragmentation, which leads to extra focus on relevant data.
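    A short Python sketch of the general shape of such a reformulation, not the paper's actual method: reachability (transitive closure) over an edge set split into fragments, where each fragment's closure can be computed locally and a final closure over the union of the partial results gives the full answer. The fragmentation and the naive fixpoint loop are assumptions made for illustration.

```python
# Illustrative only: TC(F1 ∪ F2) = TC(TC(F1) ∪ TC(F2)), so per-fragment
# closures can be computed independently (e.g. at different sites) and then
# combined by one more closure over the partial results.

def transitive_closure(edges):
    closure = set(edges)
    changed = True
    while changed:                       # naive fixpoint iteration
        new = {(a, d) for (a, b) in closure for (c, d) in closure if b == c}
        changed = not new <= closure
        closure |= new
    return closure

fragment1 = {(1, 2), (2, 3)}             # hypothetical fragmentation of the edge set
fragment2 = {(3, 4), (4, 5)}

local = transitive_closure(fragment1) | transitive_closure(fragment2)
full = transitive_closure(local)         # combination step over the partial results
assert full == transitive_closure(fragment1 | fragment2)
print((1, 5) in full)                    # True: 1 -> 2 -> 3 -> 4 -> 5
```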

    A survey of parallel execution strategies for transitive closure and logic programs

    An important feature of database technology of the nineties is the use of parallelism for speeding up the execution of complex queries. This technology is being tested in several experimental database architectures and a few commercial systems for conventional select-project-join queries. In particular, hash-based fragmentation is used to distribute data to disks under the control of different processors in order to perform selections and joins in parallel. With the development of new query languages, and in particular with the definition of transitive closure queries and of more general logic programming queries, the new dimension of recursion has been added to query processing. Recursive queries are complex; at the same time, their regular structure is particularly suited for parallel execution, and parallelism may give a high efficiency gain. We survey the approaches to parallel execution of recursive queries that have been presented in the recent literature. We observe that research on parallel execution of recursive queries is separated into two distinct subareas, one focused on the transitive closure of Relational Algebra expressions, the other focused on the optimization of more general Datalog queries. Though the subareas seem radically different because of the approach and formalism used, they have many common features. This is not surprising, because most typical Datalog queries can be solved by means of the transitive closure of simple algebraic expressions. We first analyze the relationship between the transitive closure of expressions in Relational Algebra and Datalog programs. We then review sequential methods for evaluating transitive closure, distinguishing iterative and direct methods. We address the parallelization of these methods by discussing various forms of parallelization. Data fragmentation plays an important role in obtaining parallel execution; we describe hash-based and semantic fragmentation. Finally, we consider Datalog queries and present general methods for parallel rule execution; we recognize the similarities between these methods and the methods reviewed previously, when the former are applied to linear Datalog queries. We also provide a quantitative analysis that shows the impact of the initial data distribution on the performance of the methods.
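    To make the combination of an iterative method with hash-based fragmentation concrete, here is a hedged Python sketch: semi-naive iteration for transitive closure, with tuples hash-partitioned on the join attribute so that each bucket could in principle be joined by a separate processor. The bucket count, the use of Python's built-in hash, and the single-machine loop are assumptions of this sketch, not a method from the survey.

```python
# Illustrative only: semi-naive transitive closure with hash-based
# fragmentation on the join attribute.

from collections import defaultdict

NBUCKETS = 4

def partition(pairs, pos):
    """Hash-partition a set of pairs on the attribute at position pos."""
    buckets = defaultdict(set)
    for t in pairs:
        buckets[hash(t[pos]) % NBUCKETS].add(t)
    return buckets

def semi_naive_tc(edges):
    closure = set(edges)
    delta = set(edges)
    edges_by_src = partition(edges, 0)                 # base edges hashed on source
    while delta:
        new = set()
        for k, frag in partition(delta, 1).items():    # delta hashed on target
            # matching tuples land in the same bucket, so each pair
            # (frag, edges_by_src[k]) could be joined by its own processor
            new |= {(a, d) for (a, b) in frag for (c, d) in edges_by_src[k] if b == c}
        delta = new - closure                          # semi-naive: keep only new tuples
        closure |= delta
    return closure

print(sorted(semi_naive_tc({(1, 2), (2, 3), (3, 4)})))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```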

    Comparison of methods for logic-query implementation

    A logic query Q is a triple ⟨G, LP, D⟩, where G is the query goal, LP is a logic program without function symbols, and D is a set of facts, possibly stored as tuples of a relational database. The answers of Q are all facts that can be inferred from LP ∪ D and unify with G. A logic query is bound if some argument of the query goal is a constant; it is canonical strongly linear (a CSL query) if LP contains exactly one recursive rule and this rule is linear, i.e., only one recursive predicate occurs in its body. In this paper, the problem of finding the answers of a bound CSL query is studied with the aim of comparing for efficiency some well-known methods for implementing logic queries: the eager method, the counting method, and the magic-set method. It is shown that the above methods can be expressed as algorithms for finding particular paths in a directed graph associated with the query. Within this graphical formalism, a worst-case complexity analysis of the three methods is performed. It turns out that the counting method has the best upper bound for noncyclic queries. On the other hand, since the counting method is not safe if queries are cyclic, the method is extended to safely implement queries of this kind as well.
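    A small Python sketch of the graph view mentioned above, under simplifying assumptions: for a bound linear query such as ancestor(c, X) over a hypothetical base relation parent(X, Y), restricting evaluation to the part of the graph reachable from the constant amounts to a graph search from that node. This is only the path-finding intuition, not the counting or magic-set method itself; the predicate names and facts are invented for the example.

```python
# Illustrative only: answering a bound linear query as reachability from the
# bound constant, so that only facts relevant to the binding are touched.

from collections import defaultdict, deque

parent = [("ann", "bob"), ("bob", "cal"), ("bob", "dee"), ("eva", "fay")]

def bound_ancestor(constant, facts):
    """Answer ancestor(constant, X) as reachability from `constant`."""
    succ = defaultdict(list)
    for x, y in facts:
        succ[x].append(y)
    answers, frontier = set(), deque([constant])
    while frontier:                      # breadth-first walk of relevant facts
        node = frontier.popleft()
        for child in succ[node]:
            if child not in answers:     # each relevant fact is visited once
                answers.add(child)
                frontier.append(child)
    return answers

print(sorted(bound_ancestor("ann", parent)))   # ['bob', 'cal', 'dee']
```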

    Perspektiven: Persistente Objekte mit anwendungsspezifischer Struktur und Funktionalität (Perspectives: Persistent Objects with Application-Specific Structure and Functionality)

    Flexibility and adaptability of software components have become heavily discussed topics in software engineering in recent years. To achieve a higher rate of reuse than was possible in the past, concepts are sought that support the subsequent, non-anticipated adaptation of modules and components to changed requirements. Object-oriented design and its realisation in object-oriented programming languages are a step in this direction, in that the concepts of encapsulation and dynamic binding support the dynamic exchange of implementations at application run time. In practice, however, traditional object-oriented programming languages require a multitude of interlocking design patterns to meet the demand for subsequent adaptability. This work proposes language concepts for strongly typed object-oriented programming languages that support the subsequent "tailoring" of classes and their already existing instances. Components and their interfaces thus become easily extensible; at the same time, these mechanisms also allow existing implementations to be modified and enriched for new circumstances.
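    A hedged Python sketch of the intended effect only: Python is dynamically typed, so this can demonstrate what "tailoring a class so that already existing instances pick up the new behaviour" looks like, but not the statically typed language constructs the work actually proposes. The class, method and values are invented for the example.

```python
# Illustrative only: extending a class after the fact; pre-existing instances
# see the new behaviour.  This is NOT the proposed strongly typed mechanism.

class Account:
    def __init__(self, balance):
        self.balance = balance

savings = Account(100)          # instance created before the adaptation

# later, non-anticipated adaptation: add behaviour to the existing class
def apply_interest(self, rate):
    self.balance *= 1 + rate

Account.apply_interest = apply_interest

savings.apply_interest(0.05)    # the pre-existing instance gains the method
print(savings.balance)          # 105.0
```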

    RFID: Prospects for Europe: Item-level Tagging and Public Transportation

    This report, which is part of the COMPLETE series of studies, investigates the current and future competitiveness of the European industry in RFID applications in general and in two specific cases: item-level tagging and public transportation. It analyses its constituent technologies, drivers and barriers to growth, actual and potential markets and economic impacts, and the industrial position and innovative capabilities, and it concludes with policy implications.

    Modelling Outcomes of Collaboration in Building Information Modelling Through Gaming Theory Lenses

    Construction project performance is vulnerable to process fragmentation and weak frameworks for sustaining objectivity and value integration between stakeholders, including clients, involved in the project development processes. For centuries, conventional construction processes have endured the challenges associated with this phenomenon. Several industry reports have suggested this situation is responsive to effective communication, collaboration, thorough integration and a passion for objectivity in data sharing and information management between key players. While entity-based computer-aided design (CAD) lacks the framework to facilitate an effective result in this direction, Building Information Modelling (BIM) has shown the potential for major improvements over the limitations of manual and CAD design methods. Three Game Theory models (Prisoner’s dilemma, Pareto Optima and Hawk-dove) have been proposed to mirror certain implications of players’ actions in a BIM environment. Through all the gaming lenses used, the study suggests that stakeholders and the industry will only benefit when BIM is fully adopted. It has been established that when BIM is partially adopted, the compliant party is likely to benefit more, while the non-compliant party may not necessarily gain the same benefits. The study concluded that BIM is of substantial value to the industry; the industry cannot afford the consequences of failing to adopt BIM and allied innovations in an era where digital technology is revolutionising other industries. Recommendations are made on areas for further research.
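    To make the game-theoretic framing concrete, here is a small Python sketch of a two-player adoption game. The payoff numbers are pure assumptions, chosen only to mirror the qualitative claims above (full adoption is best for both parties; under partial adoption the compliant party fares better than the non-compliant one); they are not taken from the study.

```python
# Illustrative only: hypothetical payoffs for a 2-player BIM-adoption game.
# The numbers carry no empirical weight.

PAYOFFS = {                               # (row move, column move) -> (row payoff, column payoff)
    ("adopt", "adopt"):         (5, 5),
    ("adopt", "not_adopt"):     (3, 1),
    ("not_adopt", "adopt"):     (1, 3),
    ("not_adopt", "not_adopt"): (2, 2),
}

def best_response(opponent_strategy, player):
    """Best reply of `player` (0 = row, 1 = column) to a fixed opponent move."""
    options = ["adopt", "not_adopt"]
    if player == 0:
        return max(options, key=lambda s: PAYOFFS[(s, opponent_strategy)][0])
    return max(options, key=lambda s: PAYOFFS[(opponent_strategy, s)][1])

# with these payoffs, adopting is each party's best reply to an adopting partner,
# so mutual adoption is a stable (and payoff-maximising) outcome
print(best_response("adopt", player=0), best_response("adopt", player=1))
# adopt adopt
```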

    Integrating affordable housing and sustainable housing: bridging two merit goods in Australia

    Interest among planning and policy makers in environmentally sustainable housing has risen in recent years as a response to the global goal of attaining sustainable development. In Australia, there has long been concern that the market might under-provide affordable housing and, more recently, concerns have been raised over the capacity of the market to provide sustainable housing. Governments in Australia have intervened through subsidies, tax incentives and more direct forms of support for the provision of affordable and sustainable housing. Providing environmentally sustainable housing is thus perceived to be a “merit good” in Australia, that is, a good that has social merit but is underprovided by markets. Contemporary housing policy debate in Australia has emphasised the need to respond to a growing housing affordability challenge. Affordable housing might also be seen to be a merit good in Australia. Nevertheless, there has been a reluctance to consider housing sustainability in the same context as housing affordability. This chapter addresses the debate over affordable and sustainable housing in Australia by drawing on lessons from the Ecocents Living research project to suggest a conceptual basis for understanding the issues at hand. Ecocents Living is a project that seeks to integrate the concepts of affordable and sustainable housing into a model to guide industrial implementation of sustainable and affordable housing. It is argued that the concepts of sustainable housing and affordable housing have synergies that warrant consideration, and the further development of an embryonic model for integrating sustainable and affordable housing is offered in this chapter.

    George Zillante, Stephen Pullen, Lou Wilson, Kathryn Davidson, Nicholas Chileshe, Jian Zuo, Michael Arma