47,216 research outputs found

    Coherent Integration of Databases by Abductive Logic Programming

    We introduce an abductive method for the coherent integration of independent data sources. The idea is to compute a list of data facts that should be inserted into the amalgamated database, or retracted from it, in order to restore its consistency. The method is implemented by an abductive solver, called Asystem, that applies SLDNFA-resolution to a meta-theory relating the different, possibly contradicting, input databases. We also give a purely model-theoretic analysis of the possible ways to 'recover' consistent data from an inconsistent database, in terms of those models of the database that exhibit as little inconsistent information as possible. This allows us to characterize the 'recovered databases' in terms of the 'preferred' (i.e., most consistent) models of the theory. The outcome is an abduction-based application that is sound and complete with respect to a corresponding model-based preferential semantics and, to the best of our knowledge, is more expressive (thus more general) than any other implementation of coherent database integration.
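
    To make the repair idea concrete, the sketch below (in Java) shows a purely illustrative, brute-force version of it: two sources disagree on an employee's salary, violating a hypothetical "one salary per employee" integrity constraint, and the smallest sets of facts whose retraction restores consistency are enumerated. The relation, the facts, and the exhaustive search are assumptions made for illustration only; the Asystem described above computes such repairs (insertions as well as retractions) abductively via SLDNFA-resolution rather than by enumeration.

        import java.util.*;

        // Illustrative only: minimal repair of an inconsistent amalgamated database
        // by retracting as few facts as possible (requires Java 16+ for records).
        public class RepairSketch {
            // A hypothetical salary fact as it might appear in the amalgamated database.
            record Fact(String employee, int salary) {}

            // Integrity constraint: every employee has at most one salary.
            static boolean consistent(Set<Fact> db) {
                Map<String, Set<Integer>> salaries = new HashMap<>();
                for (Fact f : db) {
                    salaries.computeIfAbsent(f.employee(), e -> new HashSet<>()).add(f.salary());
                }
                return salaries.values().stream().allMatch(s -> s.size() <= 1);
            }

            public static void main(String[] args) {
                Set<Fact> amalgamated = new HashSet<>(List.of(
                        new Fact("ann", 1000),   // from source 1
                        new Fact("ann", 1200),   // from source 2, contradicts source 1
                        new Fact("bob", 900)));  // no conflict

                List<Fact> all = new ArrayList<>(amalgamated);
                List<Set<Fact>> repairs = new ArrayList<>();
                // Brute force: test every subset of facts as a candidate retraction set.
                for (int mask = 0; mask < (1 << all.size()); mask++) {
                    Set<Fact> retract = new HashSet<>();
                    for (int i = 0; i < all.size(); i++) {
                        if ((mask & (1 << i)) != 0) retract.add(all.get(i));
                    }
                    Set<Fact> repaired = new HashSet<>(amalgamated);
                    repaired.removeAll(retract);
                    if (consistent(repaired)) repairs.add(retract);
                }
                // Keep only the smallest retraction sets: the repairs that lose as little data as possible.
                int min = repairs.stream().mapToInt(Set::size).min().orElse(0);
                repairs.stream().filter(r -> r.size() == min)
                       .forEach(r -> System.out.println("retract " + r));
            }
        }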

    OpenJML: Software verification for Java 7 using JML, OpenJDK, and Eclipse

    OpenJML is a tool for checking the code and specifications of Java programs. We describe our experience building the tool on the foundation of JML, OpenJDK and Eclipse, as well as on many advances in specification-based software verification. The implementation demonstrates the value of integrating specification tools directly into the software development IDE and of automating as many tasks as possible. The tool, though still in progress, has now been used for several college-level courses on software specification and verification and for small-scale studies of existing Java programs.
    Comment: In Proceedings F-IDE 2014, arXiv:1404.578
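
    To give a flavour of what such specifications look like, the sketch below shows a small, hypothetical Java class annotated with JML contracts of the kind OpenJML can check; the Account class, its invariant, and its pre- and postconditions are illustrative assumptions, not an example taken from the paper.

        // Illustrative only: a JML-annotated class; the specifications live in special comments.
        public class Account {
            private /*@ spec_public @*/ int balance;  // field made visible to specifications

            //@ public invariant balance >= 0;

            //@ requires amount > 0;
            //@ ensures balance == \old(balance) + amount;
            public void deposit(int amount) {
                balance += amount;
            }

            //@ requires 0 < amount && amount <= balance;
            //@ ensures balance == \old(balance) - amount;
            public void withdraw(int amount) {
                balance -= amount;
            }
        }

    OpenJML can check such contracts statically, by discharging generated verification conditions with an SMT solver, or at run time as executable assertions.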

    Logistics outsourcing and 3PL selection: A case study in an automotive supply chain

    Outsourcing logistics functions to third-party logistics (3PL) providers has been a source of competitive advantage for many companies. Companies cite greater flexibility, operational efficiency, improved customer service levels, and a sharper focus on their core businesses among the advantages of engaging the services of 3PL providers. There are few complete and structured methodologies for selecting a 3PL provider. This paper discusses how one such methodology, the Analytic Hierarchy Process (AHP), is used in an automotive supply chain for export parts to redesign the logistics operations and to select a global logistics service provider.
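
    As a rough illustration of the mechanics behind AHP, the sketch below derives priority weights for three selection criteria from a pairwise comparison matrix using the widely used row geometric mean approximation; the criteria and the judgement values are hypothetical and are not taken from the case study.

        // Illustrative only: AHP priority weights from a pairwise comparison matrix.
        public class AhpSketch {
            public static void main(String[] args) {
                String[] criteria = {"cost", "service level", "flexibility"};  // hypothetical criteria
                // judgements[i][j]: how much more important criterion i is than criterion j
                double[][] judgements = {
                        {1.0, 3.0, 5.0},
                        {1.0 / 3.0, 1.0, 2.0},
                        {1.0 / 5.0, 1.0 / 2.0, 1.0}
                };
                int n = criteria.length;
                double[] weights = new double[n];
                double sum = 0.0;
                for (int i = 0; i < n; i++) {
                    double product = 1.0;
                    for (int j = 0; j < n; j++) {
                        product *= judgements[i][j];
                    }
                    weights[i] = Math.pow(product, 1.0 / n);  // geometric mean of row i
                    sum += weights[i];
                }
                for (int i = 0; i < n; i++) {
                    weights[i] /= sum;  // normalise so the weights add up to 1
                    System.out.printf("%s: %.3f%n", criteria[i], weights[i]);
                }
            }
        }

    In a full AHP study the same procedure is repeated to score each candidate 3PL provider against every criterion, and a consistency ratio is computed to check that the pairwise judgements are not too contradictory.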

    Integrating multiple criteria decision analysis in participatory forest planning

    Forest planning in a participatory context often involves multiple stakeholders with conflicting interests. A promising approach for handling these complex situations is to integrate participatory planning and multiple criteria decision analysis (MCDA). The objective of this paper is to analyze the strengths and weaknesses of such an integrated approach, focusing on how the use of MCDA has influenced the participatory process. The paper outlines a model for a participatory MCDA process with five steps: stakeholder analysis, structuring of the decision problem, generation of alternatives, elicitation of preferences, and ranking of alternatives. This model was applied in a case study of a planning process for the urban forest in Lycksele, Sweden. In interviews with stakeholders, criteria for four different social groups were identified. Stakeholders also identified specific areas important to them and explained which activities the areas were used for and which forest management they wished for there. Existing forest data were combined with information from the interviews to create a map in which the urban forest was divided into zones of different management classes. Three alternative strategic forest plans were produced based on the zonal map. The stakeholders stated their preferences individually using the Analytic Hierarchy Process in inquiry forms, and a ranking of alternatives and a consistency ratio were determined for each stakeholder. Rankings of alternatives were aggregated, first for each social group using the arithmetic mean, and then into an overall ranking calculated from the group rankings using the weighted arithmetic mean. The participatory MCDA process in Lycksele is assessed against five social goals: incorporating public values into decisions, improving the substantive quality of decisions, resolving conflict among competing interests, building trust in institutions, and educating and informing the public. The results and assessment of the case study support the integration of participatory planning and MCDA as a viable option for handling complex forest-management situations. Key issues related to the MCDA methodology that need further exploration were identified: 1) the handling of place-specific criteria, 2) the development of alternatives, 3) the aggregation of individual preferences into a common preference, and 4) the application and evaluation of the integrated approach in real case studies.
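
    The two-stage aggregation described above can be illustrated with a short sketch: individual AHP priority vectors over the three alternative plans are averaged within each social group using the arithmetic mean, and the group results are then combined with a weighted arithmetic mean. All stakeholder priorities and group weights below are hypothetical, and only two groups with two stakeholders each are shown.

        // Illustrative only: aggregating individual AHP priorities into group and overall rankings.
        public class GroupAggregation {
            public static void main(String[] args) {
                // priorities[group][stakeholder][alternative], each row summing to 1 (hypothetical values)
                double[][][] priorities = {
                        {{0.5, 0.3, 0.2}, {0.6, 0.2, 0.2}},   // social group A, two stakeholders
                        {{0.2, 0.5, 0.3}, {0.3, 0.4, 0.3}}    // social group B, two stakeholders
                };
                double[] groupWeights = {0.6, 0.4};            // hypothetical weights of the groups
                int alternatives = 3;
                double[] overall = new double[alternatives];

                for (int g = 0; g < priorities.length; g++) {
                    // Arithmetic mean of the stakeholders' priorities within the group.
                    double[] groupMean = new double[alternatives];
                    for (double[] stakeholder : priorities[g]) {
                        for (int a = 0; a < alternatives; a++) {
                            groupMean[a] += stakeholder[a] / priorities[g].length;
                        }
                    }
                    // Weighted arithmetic mean across the groups.
                    for (int a = 0; a < alternatives; a++) {
                        overall[a] += groupWeights[g] * groupMean[a];
                    }
                }
                for (int a = 0; a < alternatives; a++) {
                    System.out.printf("Alternative %d: %.3f%n", a + 1, overall[a]);
                }
            }
        }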

    Inconsistency Handling in Multiperspective Specifications


    Multi-perspective requirements engineering for networked business systems: a framework for pattern composition

    How business and software analysts explore, document, and negotiate requirements for enterprise systems is critical to the benefits their organizations will eventually derive. In this paper, we present a framework for the analysis and redesign of networked business systems. It is based on libraries of patterns that are derived from existing Internet businesses. The framework includes three perspectives: Economic value, Business processes, and Application communication, each of which applies a goal-oriented method to compose patterns. By means of consistency relationships between the perspectives, we demonstrate the usefulness of the patterns as a lightweight approach to the exploration of business ideas.

    A systematic review of data quality issues in knowledge discovery tasks

    The volume of data keeps growing because organizations continuously capture data in order to improve their decision-making processes. The most fundamental challenge is to explore these large volumes of data and extract useful knowledge for future actions through knowledge discovery tasks; nevertheless, much of the data is of poor quality. We present a systematic review of data quality issues in knowledge discovery tasks and a case study applied to the agricultural disease known as coffee rust.