252,070 research outputs found

    Reverse engineering to achieve maintainable WWW sites

    The growth of the World Wide Web and the accelerated development of web sites and associated web technologies have resulted in a variety of maintenance problems. The maintenance problems associated with web sites and the WWW are examined. It is argued that web sites and the WWW currently lack both the data abstractions and the structures that could facilitate maintenance. A system that analyses existing web sites and extracts duplicated content and style is described here. In designing the system, existing reverse engineering techniques have been applied, and a case is made for further application of these techniques in order to prepare sites for their inevitable evolution in the future.
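
    To make the idea of extracting duplicated content concrete, here is a minimal, hypothetical sketch (not the system described in the abstract): it collects text blocks from a set of HTML pages and reports blocks that recur across pages as candidates for factoring into shared templates or stylesheets. The function and page names are illustrative assumptions.

```python
# Hypothetical sketch: detect text blocks duplicated across several HTML pages,
# as candidates for factoring into a shared template. Not the paper's system.
from collections import defaultdict
from html.parser import HTMLParser


class BlockCollector(HTMLParser):
    """Collects non-trivial text chunks from an HTML document."""

    def __init__(self):
        super().__init__()
        self.blocks = []

    def handle_data(self, data):
        text = " ".join(data.split())
        if len(text) > 30:          # ignore short fragments
            self.blocks.append(text)


def find_duplicated_blocks(pages):
    """Map each text block to the set of page names it occurs in."""
    occurrences = defaultdict(set)
    for name, html in pages.items():
        collector = BlockCollector()
        collector.feed(html)
        for block in collector.blocks:
            occurrences[block].add(name)
    # keep only blocks shared by at least two pages
    return {b: p for b, p in occurrences.items() if len(p) > 1}


if __name__ == "__main__":
    pages = {  # illustrative inputs
        "index.html": "<p>Copyright 2001 Example Corp. All rights reserved on this site.</p>",
        "about.html": "<p>Copyright 2001 Example Corp. All rights reserved on this site.</p>",
    }
    for block, where in find_duplicated_blocks(pages).items():
        print(f"{sorted(where)}: {block[:60]}")
```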

    Datalog± Ontology Consolidation

    Knowledge bases in the form of ontologies are receiving increasing attention because they allow a clear representation of the available knowledge, which includes both the knowledge itself and the constraints imposed on it by the domain or the users. In particular, Datalog± ontologies are attractive because of their decidability properties and their ability to deal with the massive amounts of data found in real-world environments; however, as is the case with many other ontological languages, their application in collaborative environments often leads to inconsistency-related issues. In this paper we introduce the notion of incoherence for Datalog± ontologies, in terms of the satisfiability of sets of constraints, and show how, under specific conditions, incoherence leads to inconsistent Datalog± ontologies. The main contribution of this work is a novel approach to restore both consistency and coherence in Datalog± ontologies. The proposed approach is based on kernel contraction, and restoration is performed by the application of incision functions that select formulas to delete. Nevertheless, instead of working over the minimal incoherent/inconsistent sets found in the ontologies, our operators produce incisions over non-minimal structures called clusters. We present a construction for consolidation operators, along with the properties they are expected to satisfy. Finally, we establish the relation between the construction and the properties by means of a representation theorem. Although this proposal is presented for the consolidation of Datalog± ontologies, the operators can be applied to other types of ontological languages, such as Description Logics, making them apt for use in collaborative environments like the Semantic Web.
    Authors: Deagustini, Cristhian Ariel David; Martinez, Maria Vanina; Falappa, Marcelo Alejandro; Simari, Guillermo Ricardo (Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Científico Tecnológico Conicet - Bahía Blanca, Instituto de Ciencias e Ingeniería de la Computación; Universidad Nacional del Sur, Departamento de Ciencias e Ingeniería de la Computación; Argentina)
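
    A toy illustration of the cluster-based incision idea, under strong simplifying assumptions: denial constraints are given as pairs of mutually exclusive atoms, and the incision policy is deliberately naive. This is not the paper's Datalog± operators; it only shows how overlapping minimal conflicts can be merged into clusters before an incision selects formulas to delete from each whole cluster.

```python
# Toy sketch of cluster-based incision (not the paper's Datalog± operators).
# Each violated denial constraint is a "kernel"; overlapping kernels are merged
# into clusters, and an incision function picks atoms to delete per cluster.

def kernels(facts, denial_constraints):
    """Return violated constraints as frozensets of conflicting atoms."""
    return [frozenset(pair) for pair in denial_constraints
            if set(pair) <= facts]


def clusters(kernel_sets):
    """Merge kernels that share atoms into maximal connected groups."""
    merged = []
    for k in kernel_sets:
        overlapping = [c for c in merged if c & k]
        for c in overlapping:
            merged.remove(c)
            k = k | c
        merged.append(k)
    return merged


def incision(cluster):
    """Naive incision policy: keep one atom, delete the rest of the cluster."""
    keep = min(cluster)
    return cluster - {keep}


def consolidate(facts, denial_constraints):
    to_delete = set()
    for c in clusters(kernels(facts, denial_constraints)):
        to_delete |= incision(c)
    return facts - to_delete


if __name__ == "__main__":
    # illustrative facts and denial constraints (pairs that cannot co-exist)
    facts = {"employee(bob)", "contractor(bob)", "retired(bob)"}
    denials = [("employee(bob)", "contractor(bob)"),
               ("contractor(bob)", "retired(bob)")]
    print(consolidate(facts, denials))
```

    The naive incision above over-deletes; in a real consolidation operator the incision function would be chosen to satisfy minimality-style postulates, as the abstract's representation theorem suggests.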

    Evaluation of the Children’s Commissioning Support Resource. Final report

    It is clear that the CCSR has not fully realised its initial goals, and a range of factors is preventing it from doing so. The interlinked nature of the issues influencing its success means that it is difficult at this stage to set a specific path towards improved effectiveness. It is clear, however, that the first and major step required is to develop much clearer ownership, leadership and direction.

    Coherent Integration of Databases by Abductive Logic Programming

    We introduce an abductive method for the coherent integration of independent data sources. The idea is to compute a list of data facts that should be inserted into the amalgamated database or retracted from it in order to restore its consistency. This method is implemented by an abductive solver, called Asystem, that applies SLDNFA-resolution on a meta-theory relating the different, possibly contradicting, input databases. We also give a purely model-theoretic analysis of the possible ways to 'recover' consistent data from an inconsistent database, in terms of those models of the database that exhibit as little inconsistent information as reasonably possible. This allows us to characterize the 'recovered databases' in terms of the 'preferred' (i.e., most consistent) models of the theory. The outcome is an abduction-based application that is sound and complete with respect to a corresponding model-based, preferential semantics, and -- to the best of our knowledge -- is more expressive (thus more general) than any other implementation of coherent integration of databases.
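
    The following hedged sketch is not the Asystem/SLDNFA implementation; it only illustrates the idea of computing insert/retract sets that restore consistency, for toy constraints of the form "if this fact is present, that fact must be present too". All fact and rule names are assumptions made for the example.

```python
# Illustrative toy (not the Asystem solver): enumerate small sets of facts to
# insert or retract so that every simple "body => head" constraint holds.
from itertools import product


def violations(facts, rules):
    """Rules are (body, head) pairs: if body is present, head must be too."""
    return [(b, h) for b, h in rules if b in facts and h not in facts]


def repairs(facts, rules):
    """Each violation is fixed by retracting the body fact or inserting the
    head fact; keep only choices that leave no violations, smallest first."""
    viols = violations(facts, rules)
    if not viols:
        return [(frozenset(), frozenset())]
    results = set()
    choices_per_violation = [(("retract", b), ("insert", h)) for b, h in viols]
    for choice in product(*choices_per_violation):
        retract = frozenset(f for kind, f in choice if kind == "retract")
        insert = frozenset(f for kind, f in choice if kind == "insert")
        if not violations((facts - retract) | insert, rules):
            results.add((insert, retract))
    return sorted(results, key=lambda r: len(r[0]) + len(r[1]))


if __name__ == "__main__":
    # two sources amalgamated into one fact set, plus a simple constraint
    facts = {"manager(bob)", "salary(bob,100)"}
    rules = [("manager(bob)", "has_office(bob)")]
    for insert, retract in repairs(facts, rules):
        print("insert:", set(insert), "retract:", set(retract))
```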

    Archiving the Relaxed Consistency Web

    The historical, cultural, and intellectual importance of archiving the web has been widely recognized. Today, all countries with a high Internet penetration rate have established high-profile archiving initiatives to crawl and archive fast-disappearing web content for long-term use. As web technologies evolve, established web archiving techniques face challenges. This paper focuses on the potential impact of relaxed-consistency web design on crawler-driven web archiving. Relaxed-consistency websites may disseminate, albeit ephemerally, inaccurate and even contradictory information. If captured and preserved in web archives as historical records, such information will degrade the overall archival quality. To assess the extent of such quality degradation, we build a simplified feed-following application and simulate its operation with synthetic workloads. The results indicate that a non-trivial portion of a relaxed-consistency web archive may contain observable inconsistency, and that the inconsistency window may extend significantly longer than that observed at the data store. We discuss the nature of such quality degradation and propose a few possible remedies.
    Comment: 10 pages, 6 figures, CIKM 201
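
    A simplified, assumption-laden simulation in the spirit of the paper's synthetic workload, not its actual experiment: writes become visible after a random replication lag, a crawler snapshots the feed on a fixed interval, and we count snapshots that capture stale state and how long that staleness remains observable in the archive. The lag distribution and crawl interval are purely illustrative parameters.

```python
# Hypothetical simulation sketch (not the paper's setup): compare how long an
# inconsistency is observable at the data store versus in the crawled archive.
import random

random.seed(0)


def simulate(num_writes=1000, crawl_interval=5.0, mean_lag=2.0):
    stale = 0
    store_windows, archive_windows = [], []
    t = 0.0
    for _ in range(num_writes):
        write_time = t
        visible_time = write_time + random.expovariate(1.0 / mean_lag)
        next_crawl = (int(write_time / crawl_interval) + 1) * crawl_interval
        if next_crawl < visible_time:
            # crawler captured the feed before the write became visible,
            # so the archived copy disagrees with the data store
            stale += 1
            store_windows.append(visible_time - write_time)
            # the stale copy stays in the archive until the first crawl that
            # happens after the write has become visible
            refresh_crawl = (int(visible_time / crawl_interval) + 1) * crawl_interval
            archive_windows.append(refresh_crawl - write_time)
        t += random.expovariate(1.0)  # time until the next write

    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return stale / num_writes, avg(store_windows), avg(archive_windows)


if __name__ == "__main__":
    frac, store_w, archive_w = simulate()
    print(f"stale snapshots: {frac:.1%}")
    print(f"avg inconsistency window at store:   {store_w:.2f}s")
    print(f"avg inconsistency window in archive: {archive_w:.2f}s")
```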

    Semantics for Probabilistic Inference

    A number of writers (Joseph Halpern and Fahiem Bacchus among them) have offered semantics for formal languages in which inferences concerning probabilities can be made. Our concern is different. This paper provides a formalization of nonmonotonic inferences in which the conclusion is supported only to a certain degree. Such inferences are clearly 'invalid', since they must allow the falsity of a conclusion even when the premises are true. Nevertheless, such inferences can be characterized both syntactically and semantically. The 'premises' of probabilistic arguments are sets of statements (as in a database or knowledge base); the conclusions are categorical statements in the language. We provide standards both for this form of inference, for which high probability is required, and for an inference in which the conclusion is qualified by an intermediate interval of support.
    Comment: Appears in Proceedings of the Eighth Conference on Uncertainty in Artificial Intelligence (UAI1992)
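
    As a rough, hypothetical illustration, not the paper's formal semantics: statistical statements give interval-valued support for a conclusion about an individual, and the categorical conclusion is accepted only when the lower bound of the most specific applicable interval clears a threshold. The acceptance policy and all names below are assumptions; the example merely shows the nonmonotonic flavour (learning "penguin" defeats "flies").

```python
# Hedged sketch: interval-valued support for a categorical conclusion, accepted
# only when the lower bound of the most specific interval is high enough.
from dataclasses import dataclass


@dataclass(frozen=True)
class Statistic:
    """'The proportion of reference_class members that are target is in [lo, hi]'."""
    reference_class: str
    target: str
    lo: float
    hi: float


def support(kb, memberships, individual, target):
    """Intervals supporting target(individual), one per applicable statistic."""
    classes = memberships.get(individual, set())
    return [(s.lo, s.hi) for s in kb
            if s.target == target and s.reference_class in classes]


def accept(kb, memberships, individual, target, threshold=0.9):
    intervals = support(kb, memberships, individual, target)
    if not intervals:
        return False, None
    # naive policy: prefer the narrowest applicable interval (most specific)
    lo, hi = min(intervals, key=lambda iv: iv[1] - iv[0])
    return lo >= threshold, (lo, hi)


if __name__ == "__main__":
    kb = [Statistic("bird", "flies", 0.90, 0.99),
          Statistic("penguin", "flies", 0.00, 0.05)]
    memberships = {"tweety": {"bird"}, "opus": {"bird", "penguin"}}
    print(accept(kb, memberships, "tweety", "flies"))  # accepted, (0.90, 0.99)
    print(accept(kb, memberships, "opus", "flies"))    # rejected, (0.00, 0.05)
```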