
    Connected Components and Disjunctive Existential Rules

    In this paper, we explore conjunctive query rewriting, focusing on queries containing universally quantified negation within the framework of disjunctive existential rules. We address the undecidability of the existence of a finite and complete UCQ-rewriting and the identification of finite unification sets (fus) of rules. We introduce new rule classes, connected linear rules and connected domain restricted rules, that exhibit the fus property for existential rules. Additionally, we propose disconnected disjunction to achieve the fus property when the introduced rule fragments are extended to disjunctive existential rules. We present ECOMPLETO, a system for efficient query rewriting with disjunctive existential rules, capable of handling UCQs with universally quantified negation. Our experiments demonstrate ECOMPLETO's consistent ability to produce finite UCQ-rewritings and describe its performance on different ontologies and queries. (Comment: 23 pages, 4 figures)
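    The core operation the abstract refers to, rewriting a query backward through a rule, can be illustrated with a toy sketch. This is a deliberately simplified, hypothetical example (plain Datalog-style rules, no existentials, no disjunction, no negation), far simpler than what ECOMPLETO handles; predicate and constant names are invented.

```python
# One backward-rewriting step: replace a query atom that unifies with a
# rule head by the rule body under the computed substitution.
# Convention (for this sketch only): uppercase strings are variables.

def unify(head, atom):
    """Match a rule head against a query atom; return a substitution or None."""
    pred_h, args_h = head
    pred_a, args_a = atom
    if pred_h != pred_a or len(args_h) != len(args_a):
        return None
    subst = {}
    for h, a in zip(args_h, args_a):
        if h.isupper():                      # head variable
            if subst.get(h, a) != a:
                return None
            subst[h] = a
        elif h != a:                         # clashing constants
            return None
    return subst

def apply_subst(atom, subst):
    pred, args = atom
    return (pred, tuple(subst.get(t, t) for t in args))

def rewrite_step(query, rule_body, rule_head):
    """All one-step rewritings of `query` (a list of atoms) using one rule."""
    rewritings = []
    for i, atom in enumerate(query):
        subst = unify(rule_head, atom)
        if subst is not None:
            rest = query[:i] + query[i + 1:]
            rewritings.append(rest + [apply_subst(b, subst) for b in rule_body])
    return rewritings

# Rule: hasPart(X, Y) -> connected(X, Y); query: connected(a, Z)
rule_body = [("hasPart", ("X", "Y"))]
rule_head = ("connected", ("X", "Y"))
print(rewrite_step([("connected", ("a", "Z"))], rule_body, rule_head))
```

    A full UCQ-rewriting procedure iterates such steps to a fixpoint; the fus property discussed in the paper is exactly the guarantee that this iteration terminates with a finite set.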

    Effective Use Methods for Continuous Sensor Data Streams in Manufacturing Quality Control

    This work outlines an approach for managing sensor data streams of continuous numerical data in product manufacturing settings, emphasizing statistical process control, low computational and memory overhead, and saving the information necessary to reduce the impact of nonconformance to quality specifications. While there is extensive literature, knowledge, and documentation about standard data sources and databases, the high volume and velocity of sensor data streams often make traditional analysis infeasible. To that end, an overview of data stream fundamentals is essential. An analysis of commonly used stream preprocessing and load-shedding methods follows, succeeded by a discussion of aggregation procedures. Stream storage and querying systems are the next topics. Further, existing machine learning techniques for data streams are presented, with a focus on regression. Finally, the work describes a novel methodology for managing sensor data streams in which data stream management systems save and record aggregate data from small time intervals, along with the individual measurements from the stream that are nonconforming. The aggregates shall be continually entered into control charts and regressed on. To conserve memory, old data shall be periodically reaggregated at higher levels.
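    The methodology described above can be sketched in a few lines: aggregate the stream over small windows, retain only the individual measurements that violate the specification limits, and merge old aggregates into coarser ones. The window size and specification limits below are hypothetical parameters for illustration, not values from the work.

```python
# Sketch of window aggregation with retention of nonconforming readings.
from dataclasses import dataclass

@dataclass
class Aggregate:
    count: int
    total: float
    lo: float
    hi: float

    @property
    def mean(self):
        return self.total / self.count

def process_stream(values, window=4, lsl=9.0, usl=11.0):
    """Aggregate fixed-size windows; keep raw values outside [lsl, usl]."""
    aggregates, nonconforming = [], []
    for i in range(0, len(values), window):
        chunk = values[i:i + window]
        aggregates.append(Aggregate(len(chunk), sum(chunk), min(chunk), max(chunk)))
        nonconforming += [v for v in chunk if not (lsl <= v <= usl)]
    return aggregates, nonconforming

def reaggregate(aggs):
    """Merge window aggregates into one coarser aggregate to free memory."""
    return Aggregate(
        sum(a.count for a in aggs),
        sum(a.total for a in aggs),
        min(a.lo for a in aggs),
        max(a.hi for a in aggs),
    )

readings = [10.1, 9.8, 10.4, 12.3, 9.9, 10.0, 8.6, 10.2]
aggs, bad = process_stream(readings)
merged = reaggregate(aggs)
```

    The window means would feed the control charts and regression models, while the retained nonconforming readings preserve the detail needed for root-cause analysis.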

    A Tutorial on Prototyping Internet of Things Devices and Systems: A Gentle Introduction to Technology that Shapes Our Lives

    The Internet of Things, which has been quietly building and evolving over the past decade, now impacts many aspects of society, including homes, battlefields, and medical communities. Research in information systems has traditionally concentrated on exploring the impacts of such technology rather than on how to actually create systems using it. Although research in design science could especially contribute to the Internet of Things, this type of research from the Information Systems community has been sparse. The most likely cause is the knowledge barrier to learning and understanding this kind of technology development. Recognizing the importance of the continued evolution of the Internet of Things, this paper provides a basic tutorial on how to construct Internet of Things prototypes. The paper is intended to educate Information Systems scholars on how to build their own Internet of Things prototypes so they can conduct technical research in this area and instruct their students on how to do the same.

    Management of Inconsistencies in Data Integration

    Data integration aims at providing a unified view over data coming from various sources. One of the most challenging tasks for data integration is handling the inconsistencies that appear in the integrated data in an efficient and effective manner. In this chapter, we provide a survey on techniques introduced for handling inconsistencies in data integration, focusing on two groups. The first group contains techniques for computing consistent query answers, and includes mechanisms for the compact representation of repairs, query rewriting, and logic programs. The second group contains techniques focusing on the resolution of inconsistencies. This includes methodologies for computing similarity between atomic values as well as similarity between groups of data, collective techniques, scaling to large datasets, and dealing with uncertainty that is related to inconsistencies.
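    The first group of techniques rests on the notion of repairs: maximal consistent versions of the integrated data, with a consistent answer being one returned on every repair. A brute-force toy illustration for a single key constraint follows; the relation, constraint, and query are invented, and real systems avoid enumerating repairs explicitly.

```python
# Toy consistent query answering under one key constraint.
from itertools import product

def repairs(tuples, key_index=0):
    """All repairs: pick exactly one tuple per key value."""
    groups = {}
    for t in tuples:
        groups.setdefault(t[key_index], []).append(t)
    return [set(choice) for choice in product(*groups.values())]

def consistent_answers(tuples, query):
    """Answers that hold in every repair of the inconsistent relation."""
    answer_sets = [query(r) for r in repairs(tuples)]
    return set.intersection(*answer_sets)

# emp(name, dept) with key 'name'; bob's department is inconsistent.
emp = [("alice", "sales"), ("bob", "sales"), ("bob", "hr")]
q = lambda rel: {name for (name, dept) in rel if dept == "sales"}
print(consistent_answers(emp, q))
```

    Here "alice" is a consistent answer because she appears in the sales answer of both repairs, while "bob" is not, since one repair places him in hr. The surveyed query-rewriting and logic-program techniques compute the same semantics without materializing the (potentially exponential) set of repairs.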

    Designing Data Spaces

    This open access book provides a comprehensive view on data ecosystems and platform economics, from methodical and technological foundations up to reports from practical implementations and applications in various industries. To this end, the book is structured in four parts: Part I, “Foundations and Contexts”, provides a general overview of building, running, and governing data spaces and an introduction to the IDS and GAIA-X projects. Part II, “Data Space Technologies”, subsequently details various implementation aspects of IDS and GAIA-X, including, e.g., data usage control, the usage of blockchain technologies, and semantic data integration and interoperability. Next, Part III describes various “Use Cases and Data Ecosystems” from application areas such as agriculture, healthcare, industry, energy, and mobility. Part IV eventually offers an overview of several “Solutions and Applications”, e.g. products and experiences from companies like Google, SAP, Huawei, T-Systems, Innopay, and many more. Overall, the book provides professionals in industry with an encompassing overview of the technological and economic aspects of data spaces, based on the International Data Spaces and Gaia-X initiatives. It presents implementations and business cases and gives an outlook on future developments. In doing so, it aims at proliferating the vision of a social data market economy based on data spaces which embrace trust and data sovereignty.

    Pseudo-contractions as Gentle Repairs

    Updating a knowledge base to remove an unwanted consequence is a challenging task. Some of the original sentences must be either deleted or weakened in such a way that the sentence to be removed is no longer entailed by the resulting set. On the other hand, it is desirable that the existing knowledge be preserved as much as possible, minimising the loss of information. Several approaches to this problem can be found in the literature. In particular, when the knowledge is represented by an ontology, two different families of frameworks have been developed in the past decades with numerous ideas in common but with little interaction between the communities: applications of AGM-like Belief Change and justification-based Ontology Repair. In this paper, we investigate the relationship between pseudo-contraction operations and gentle repairs. Both aim to avoid the complete deletion of sentences when replacing them with weaker versions is enough to prevent the entailment of the unwanted formula. We show the correspondence between concepts on both sides and investigate under which conditions they are equivalent. Furthermore, we propose a unified notation for the two approaches, which might contribute to the integration of the two areas.
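    The shared goal of the two families can be stated compactly. Writing $B$ for the knowledge base, $\alpha$ for the unwanted consequence, $\div$ for the operation, and $\mathrm{Cn}$ for logical consequence, both pseudo-contraction and gentle repair aim at the standard success condition while relaxing the classical requirement that the result be a subset of $B$ (this is the textbook AGM-style formulation, not notation taken from the paper itself):

```latex
% Success: the unwanted consequence is no longer entailed
% (assuming \alpha is not a tautology, i.e. \alpha \notin \mathrm{Cn}(\emptyset))
\alpha \notin \mathrm{Cn}(B \div \alpha)

% Classical contraction additionally demands inclusion:
B \div \alpha \subseteq B

% Pseudo-contractions and gentle repairs relax inclusion so that
% weakened versions of removed sentences may be kept:
B \div \alpha \subseteq \mathrm{Cn}(B)
```

    The relaxed inclusion is what permits replacing a sentence by a strictly weaker one instead of deleting it outright, which is the behaviour both frameworks are designed to capture.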

    Entity linkage for heterogeneous, uncertain, and volatile data

    [no abstract]