3,513 research outputs found

    LiteMat: a scalable, cost-efficient inference encoding scheme for large RDF graphs

    The number of linked data sources and the size of the linked open data graph keep growing every day. As a consequence, semantic RDF services are increasingly confronted with various "big data" problems, and query processing in the presence of inferences is one of them. For instance, to complete the answer set of SPARQL queries, RDF database systems evaluate semantic RDFS relationships (subPropertyOf, subClassOf) through time-consuming query rewriting algorithms or space-consuming data materialization solutions. To reduce the memory footprint and ease the exchange of large datasets, these systems generally apply a dictionary approach that compresses triples by replacing resource identifiers (IRIs), blank nodes and literals with integer values. In this article, we present a structured resource identification scheme that uses a clever encoding of concept and property hierarchies to efficiently evaluate the most common RDFS entailment rules while minimizing triple materialization and query rewriting. We show how this encoding can be computed by a scalable parallel algorithm and implemented directly on top of the Apache Spark framework. The efficiency of our encoding scheme is demonstrated by an evaluation conducted over both synthetic and real-world datasets. Comment: 8 pages, 1 figure
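
    To make the idea concrete, the following is a minimal sketch of prefix-based hierarchy encoding in the spirit of the abstract (an illustration only, not the paper's exact scheme): each class receives an integer identifier whose binary prefix is inherited from its superclass, so rdfs:subClassOf checks reduce to bit operations and retrieving all instances of a class becomes a scan over an identifier range. The helper names and the toy hierarchy are assumptions made for the example.

        def encode_hierarchy(children, root):
            """Assign (code, bits) pairs to a tree-shaped class hierarchy."""
            codes = {root: (1, 1)}                      # root gets code 0b1
            stack = [root]
            while stack:
                parent = stack.pop()
                pcode, pbits = codes[parent]
                kids = children.get(parent, [])
                width = max(1, len(kids).bit_length())  # bits for the local rank
                for rank, child in enumerate(kids, start=1):
                    codes[child] = (pcode << width | rank, pbits + width)
                    stack.append(child)
            return codes

        def is_subclass(codes, sub, sup):
            """sub rdfs:subClassOf sup iff sup's code is a binary prefix of sub's."""
            scode, sbits = codes[sub]
            pcode, pbits = codes[sup]
            return sbits >= pbits and (scode >> (sbits - pbits)) == pcode

        # Toy hierarchy: Agent <- {Person, Organization}, Person <- {Student}
        codes = encode_hierarchy({"Agent": ["Person", "Organization"],
                                  "Person": ["Student"]}, root="Agent")
        assert is_subclass(codes, "Student", "Agent")        # transitive subclass
        assert not is_subclass(codes, "Organization", "Person")

    With identifiers of this kind, a query over a superclass only needs a prefix comparison or an identifier-range scan, which is what allows both the materialization of inferred triples and extensive query rewriting to be avoided.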

    Difficult forms: critical practices of design and research

    As a kind of 'criticism from within', conceptual and critical design inquire into what design is about – how the market operates, what is considered 'good design', and how the design and development of technology typically works. Tracing the relations of conceptual and critical design to (post-)critical architecture and anti-design, we discuss a series of issues related to the operational and intellectual basis for 'critical practice', and how these might open up a new kind of development of the conceptual and theoretical frameworks of design. Rather than prescribing a practice on the basis of theoretical considerations, these critical practices seem to build an intellectual basis for design from its own modes of operation: a kind of theoretical development that happens through, and from within, design practice rather than by means of external descriptions or analyses of its practices and products.

    Subset reasoning for event-based systems

    In highly dynamic domains such as the Internet of Things (IoT), smart industries, smart manufacturing, pervasive health or social media, data is being continuously generated. By combining this generated data with background knowledge and performing expressive reasoning upon this combination, meaningful decisions can be made. This continuously generated data typically originates from multiple heterogeneous sources. Ontologies are ideal for modeling the domain and facilitate the integration of this heterogeneous data with background knowledge, while expressive ontology reasoning makes it possible to infer implicit facts and enables intelligent decision making. The data produced in these domains is often volatile, and time-critical systems, such as IoT Nurse Call systems, require timely processing of the produced IoT data. However, there is still a mismatch between volatile data and expressive ontology reasoning, since data often arrives faster than expressive reasoning can process it. For this reason, we present an approximation technique that extracts a subset of the data in order to speed up the reasoning process. We demonstrate this technique in a Nurse Call proof of concept, where the locations of the nurses are tracked and the most suited nurse is selected when a patient launches a call, and in an extension of an existing benchmark. We managed to speed up the reasoning process up to 10 times for small datasets and up to more than 1000 times for large datasets.
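
    As an illustration of the general subset-extraction idea (a sketch under assumed details, not necessarily the paper's exact algorithm), the snippet below starts from the entities mentioned in an incoming event, such as a nurse call, and collects only the facts reachable within a few hops; the reasoner is then run on this much smaller subset together with the static background knowledge. Triples are represented as plain (subject, predicate, object) tuples for the example.

        from collections import deque

        def extract_subset(triples, seeds, max_hops=2):
            """Return the triples reachable from `seeds` within `max_hops` hops."""
            # Index triples by subject and object for fast neighbourhood lookups.
            by_node = {}
            for t in triples:
                s, _, o = t
                by_node.setdefault(s, []).append(t)
                by_node.setdefault(o, []).append(t)
            frontier = deque((seed, 0) for seed in seeds)
            visited = set(seeds)
            subset = set()
            while frontier:
                node, depth = frontier.popleft()
                if depth >= max_hops:
                    continue
                for t in by_node.get(node, []):
                    subset.add(t)
                    for neighbour in (t[0], t[2]):
                        if neighbour not in visited:
                            visited.add(neighbour)
                            frontier.append((neighbour, depth + 1))
            return subset

        # Usage: reason over the extracted subset plus background knowledge
        # instead of the full, continuously growing dataset, e.g.
        # relevant = extract_subset(all_triples, seeds={"call:42", "patient:7"})

    The hop limit and the relevance criterion are assumptions for the example; a subset that is too small may miss facts needed for a correct decision, which is why the abstract presents the approach as an approximation technique.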

    Semantic data integration and knowledge graph creation at scale

    Contrary to data, knowledge is often abstract. Concrete knowledge can be achieved through the inclusion of semantics in data models, highlighting the role of data integration. The massively growing amount of data in recent years has increased the demand for scaling up data management techniques; materialized data integration, a.k.a. knowledge graph creation, falls into that category. In this thesis, we investigate efficient methods and techniques for materializing data integration. We formalize the process of materializing data integration and formally define the characteristics of a materialized data integration system that merges data operators and sources. Owing to this formalism, both layers of data integration, i.e., data-level and schema-level integration, are formalized in the context of mapping assertions. We explore optimization opportunities for improving the materialization of data integration systems. We identify three angles, covering intra- and inter-mapping assertions, from which the materialization can be improved, and accordingly propose source-based, mapping-based, and inter-mapping-assertion groups of optimization techniques. We utilize these techniques in three real-world projects and illustrate how applying them contributes to meeting the projects' objectives. Furthermore, we study the parameters impacting the performance of the materialization of data integration. Relying on parameters reported in the literature and on parameters presumed to have an impact, we build four groups of testbeds. We empirically study the performance of these testbeds, in terms of execution time, in the presence and absence of our proposed techniques, and observe that the savings can be up to 75%. Lastly, we contribute to facilitating the definition of declarative data integration systems. We propose two sets of data operation functions in the Function Ontology (FnO): the first performs entity alignment by resorting to an entity and relation linking tool, while the second consists of domain-specific functions that align genomic entities by harmonizing their representations. Finally, we introduce a tool equipped with a user interface that facilitates the definition of declarative mapping rules by allowing users to explore the data sources and the unified schema while defining their correspondences.
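
    As a rough illustration of one of these ideas (an assumed, simplified representation of mapping assertions, not the thesis's actual formalism or engine), the sketch below groups mapping assertions that share a data source so that each source is scanned only once during materialization, one flavour of the source- and inter-mapping-assertion optimizations mentioned above. The file name, IRIs and column names are hypothetical.

        import csv
        from collections import defaultdict

        # A mapping assertion: which source to read, how to build the subject IRI,
        # and which (predicate IRI, source column) pairs to emit as triples.
        ASSERTIONS = [
            {"source": "patients.csv",                        # hypothetical file
             "subject": "http://example.org/patient/{id}",
             "pred_obj": [("http://example.org/name", "name")]},
            {"source": "patients.csv",
             "subject": "http://example.org/patient/{id}",
             "pred_obj": [("http://example.org/birthDate", "birth_date")]},
        ]

        def materialize(assertions):
            """Yield (subject, predicate, object) triples, scanning each source once."""
            by_source = defaultdict(list)
            for assertion in assertions:
                by_source[assertion["source"]].append(assertion)
            for source, group in by_source.items():
                with open(source, newline="") as f:           # one pass per source
                    for row in csv.DictReader(f):
                        for assertion in group:
                            subject = assertion["subject"].format(**row)
                            for predicate, column in assertion["pred_obj"]:
                                yield (subject, predicate, row[column])

    Grouping by source trades a little bookkeeping for avoiding repeated scans of the same input, which is the kind of execution-time saving the evaluation above measures.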

    Security that matters: critical infrastructure and objects of protection

    Critical infrastructure protection is prominently concerned with objects that appear indispensable for the functioning of social and political life. However, the analysis of material objects in discussions of critical infrastructure protection has remained largely within the remit of managerial responses, which see matter as simply passive, a blank slate. In security studies, critical approaches have focused on social and cultural values, forms of life, technologies of risk or structures of neoliberal globalization. This article engages with the role of "things", or of materiality, for theories of securitization. Drawing on the materialist feminism of Karen Barad, it shows how critical infrastructure in Europe neither is an empty receptacle of discourse nor has "essential" characteristics; rather, it emerges out of material-discursive practices. Understanding the securitization of critical infrastructure protection as a process of materialization allows for a reconceptualization of how security matters and of its effects.