
    A study of existing Ontologies in the IoT-domain

    Several domains have adopted the increasing use of IoT-based devices to collect sensor data for generating abstractions and perceptions of the real world. This sensor data is multi-modal and heterogeneous in nature. This heterogeneity induces interoperability issues while developing cross-domain applications, thereby restricting the possibility of reusing sensor data to develop new applications. As a solution to this, semantic approaches have been proposed in the literature to tackle problems related to interoperability of sensor data. Several ontologies have been proposed to handle different aspects of IoT-based sensor data collection, ranging from discovering the IoT sensors for data collection to applying reasoning on the collected sensor data for drawing inferences. In this paper, we survey these existing semantic ontologies to provide an overview of the recent developments in this field. We highlight the fundamental ontological concepts (e.g., sensor-capabilities and context-awareness) required for an IoT-based application, and survey the existing ontologies which include these concepts. Based on our study, we also identify the shortcomings of currently available ontologies, which serves as a stepping stone to state the need for a common unified ontology for the IoT domain.
    Comment: Submitted to Elsevier JWS SI on Web semantics for the Internet/Web of Thing
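    As an illustration of the kind of modelling these ontologies enable, the sketch below describes a sensor and one of its observations with the W3C SOSA vocabulary (one of the ontologies such surveys typically cover) using rdflib. The sensor, observed property and reading are hypothetical and not drawn from the paper.

```python
# Minimal sketch: an IoT sensor and one observation described with the
# W3C SOSA vocabulary via rdflib. Instance names and values are illustrative.
from rdflib import Graph, Literal, Namespace, RDF, XSD

SOSA = Namespace("http://www.w3.org/ns/sosa/")
EX = Namespace("http://example.org/iot/")  # hypothetical namespace

g = Graph()
g.bind("sosa", SOSA)
g.bind("ex", EX)

# A temperature sensor and the property it observes
g.add((EX.tempSensor1, RDF.type, SOSA.Sensor))
g.add((EX.tempSensor1, SOSA.observes, EX.airTemperature))
g.add((EX.airTemperature, RDF.type, SOSA.ObservableProperty))

# One observation made by that sensor
g.add((EX.obs1, RDF.type, SOSA.Observation))
g.add((EX.obs1, SOSA.madeBySensor, EX.tempSensor1))
g.add((EX.obs1, SOSA.observedProperty, EX.airTemperature))
g.add((EX.obs1, SOSA.hasSimpleResult, Literal(21.5, datatype=XSD.double)))
g.add((EX.obs1, SOSA.resultTime,
       Literal("2023-05-01T12:00:00Z", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))
```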

    Semantics for incident identification and resolution reports

    In order to achieve a safe and systematic treatment of security protocols, organizations release a number of technical briefings describing how to detect and manage security incidents. A critical issue is that this document set may suffer from semantic deficiencies, mainly due to ambiguity or differing levels of granularity in description and analysis. One approach to this problem is to use semantic methodologies to provide better knowledge externalization from incident-management protocols. In this article, we propose a method based on semantic techniques for both analyzing and specifying (meta)security requirements on the protocols used to resolve security incidents. This would allow specialists to obtain better documentation of their intangible knowledge about these protocols.
    Ministerio de Economía y Competitividad TIN2013-41086-

    A MAUT approach for reusing domain ontologies on the basis of the NeOn Methodology

    Knowledge resource reuse has become a popular approach within the ontology engineering field, mainly because it can speed up the ontology development process, saving time and money and promoting the application of good practices. The NeOn Methodology provides guidelines for reuse. These guidelines include the selection of the most appropriate knowledge resources for reuse in ontology development. This is a complex decision-making problem where different conflicting objectives, like the reuse cost, understandability, integration workload and reliability, have to be taken into account simultaneously. GMAA is a PC-based decision support system based on an additive multi-attribute utility model that is intended to allay the operational difficulties involved in the Decision Analysis methodology. The paper illustrates how it can be applied to select multimedia ontologies for reuse to develop a new ontology in the multimedia domain. It also demonstrates that the sensitivity analyses provided by GMAA are useful tools for making a final recommendation.
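    To make the additive model concrete, the sketch below scores hypothetical candidate ontologies on the four conflicting objectives mentioned above and ranks them by weighted utility. The weights and scores are made up for illustration; this is a simplified sketch of an additive multi-attribute utility model, not GMAA itself.

```python
# Simplified additive multi-attribute utility model of the kind GMAA is based on.
# Weights, attributes and candidate scores are purely illustrative.

# Attribute weights (sum to 1); "cost"-type attributes are marked so that
# lower raw scores translate into higher utilities.
weights = {"reuse_cost": 0.30, "understandability": 0.25,
           "integration_workload": 0.25, "reliability": 0.20}
lower_is_better = {"reuse_cost", "integration_workload"}

# Hypothetical candidate ontologies scored on a 0-10 scale per attribute.
candidates = {
    "OntologyA": {"reuse_cost": 3, "understandability": 8,
                  "integration_workload": 4, "reliability": 7},
    "OntologyB": {"reuse_cost": 6, "understandability": 6,
                  "integration_workload": 2, "reliability": 9},
}

def utility(scores):
    """Additive utility: weighted sum of per-attribute utilities in [0, 1]."""
    total = 0.0
    for attr, w in weights.items():
        u = scores[attr] / 10.0   # normalise raw score to [0, 1]
        if attr in lower_is_better:
            u = 1.0 - u           # invert cost-type attributes
        total += w * u
    return total

for name, scores in sorted(candidates.items(),
                           key=lambda kv: utility(kv[1]), reverse=True):
    print(f"{name}: overall utility = {utility(scores):.3f}")
```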

    The Landscape of Ontology Reuse Approaches

    Ontology reuse aims to foster interoperability and facilitate knowledge reuse. Several approaches are typically evaluated by ontology engineers when bootstrapping a new project. However, current practices are often motivated by subjective, case-by-case decisions, which hamper the definition of a recommended behaviour. In this chapter we argue that to date there are no effective solutions for supporting developers' decision-making process when deciding on an ontology reuse strategy. The objective is twofold: (i) to survey current approaches to ontology reuse, presenting motivations, strategies, benefits and limits, and (ii) to analyse two representative approaches and discuss their merits.

    Adding value to Linked Open Data using a multidimensional model approach based on the RDF Data Cube vocabulary

    Most organisations using Open Data currently focus on data processing and analysis. However, although Open Data may be available online, these data are generally of poor quality, thus discouraging others from contributing to and reusing them. This paper describes an approach to publish statistical data from public repositories by using Semantic Web standards published by the W3C, such as RDF and SPARQL, in order to facilitate the analysis of multidimensional models. We have defined a framework based on the entire lifecycle of data publication, including a novel step of Linked Open Data assessment and the use of external repositories as a knowledge base for data enrichment. As a result, users are able to interact with the data generated according to the RDF Data Cube vocabulary, which makes it possible for general users to avoid the complexity of SPARQL when analysing data. The use case was applied to the Barcelona Open Data platform and revealed the benefits of the application of our approach, such as helping in the decision-making process.
    This work was supported in part by the Spanish Ministry of Science, Innovation and Universities through the Project ECLIPSE-UA under grant RTI2018-094283-B-C32.
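    The snippet below sketches what a single statistical observation looks like when published with the RDF Data Cube vocabulary, using rdflib. The dataset, dimension and measure URIs are hypothetical and not taken from the Barcelona use case.

```python
# Minimal sketch of one qb:Observation published with the W3C RDF Data Cube
# vocabulary. Dataset, dimension and measure URIs are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF, XSD

QB = Namespace("http://purl.org/linked-data/cube#")
EX = Namespace("http://example.org/stats/")  # hypothetical namespace

g = Graph()
g.bind("qb", QB)
g.bind("ex", EX)

g.add((EX.dataset_population, RDF.type, QB.DataSet))

# One observation: population of a district in a given year (made-up values)
g.add((EX.obs_2020_d1, RDF.type, QB.Observation))
g.add((EX.obs_2020_d1, QB.dataSet, EX.dataset_population))
g.add((EX.obs_2020_d1, EX.refPeriod, Literal("2020", datatype=XSD.gYear)))
g.add((EX.obs_2020_d1, EX.refArea, EX.district1))
g.add((EX.obs_2020_d1, EX.population, Literal(102500, datatype=XSD.integer)))

print(g.serialize(format="turtle"))
```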

    Propagating Data Policies: a User Study

    When publishing data, data licences are used to specify the actions that are permitted or prohibited, and the duties that target data consumers must comply with. However, in complex environments such as a smart city data portal, multiple data sources are constantly being combined, processed and redistributed. In such a scenario, deciding which policies apply to the output of a process based on the licences attached to its input data is a difficult, knowledge-intensive task. In this paper, we evaluate how automatic reasoning upon semantic representations of policies and of data flows could support decision making on policy propagation. We report on the results of a user study designed to assess both the accuracy and the utility of such a policy-propagation tool, in comparison to a manual approach.
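    To make the decision problem concrete, the sketch below expresses one simple propagation rule: the output of a process inherits every prohibition and duty attached to its inputs, and keeps only the permissions that all inputs grant. This is an illustrative rule of thumb, not the reasoning used by the tool evaluated in the study, and the licence terms shown are invented.

```python
# Illustrative policy-propagation rule (not the evaluated tool's actual logic):
# the output of a process inherits every prohibition and duty from its inputs,
# and keeps only the permissions granted by all inputs.

def propagate(input_policies):
    """input_policies: list of dicts with 'permissions', 'prohibitions', 'duties' sets."""
    permissions = set.intersection(*(p["permissions"] for p in input_policies))
    prohibitions = set.union(*(p["prohibitions"] for p in input_policies))
    duties = set.union(*(p["duties"] for p in input_policies))
    return {"permissions": permissions - prohibitions,
            "prohibitions": prohibitions,
            "duties": duties}

# Hypothetical licences attached to two datasets combined by a process
sensor_feed = {"permissions": {"reproduce", "distribute", "derive"},
               "prohibitions": set(),
               "duties": {"attribute"}}
census_data = {"permissions": {"reproduce", "derive"},
               "prohibitions": {"commercialize"},
               "duties": {"attribute", "share-alike"}}

print(propagate([sensor_feed, census_data]))
# -> permissions: {'reproduce', 'derive'}; prohibitions: {'commercialize'};
#    duties: {'attribute', 'share-alike'}
```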

    Km4City Ontology Building vs Data Harvesting and Cleaning for Smart-city Services

    Presently, a very large number of public and private data sets are available from local governments. In most cases, they are not semantically interoperable, and a huge human effort would be needed to create an integrated ontology and knowledge base for a smart city. No smart-city ontology has yet been standardized, and considerable research is still needed to identify models that can support data reconciliation, manage the complexity, and allow reasoning over the data. In this paper, a system is proposed for ingesting and reconciling data on smart-city aspects such as the road graph, services available along the roads, and traffic sensors. The system manages a large volume of data coming from a variety of sources, covering both static and dynamic data. These data are mapped to a smart-city ontology, called KM4City (Knowledge Model for City), and stored in an RDF store, where they are available to applications via SPARQL queries, enabling new services for users through dedicated applications of public administrations and enterprises. The paper presents the process adopted to produce the ontology, the big-data architecture used to feed the knowledge base from open and private data, and the mechanisms adopted for data verification, reconciliation and validation. Some examples of possible uses of the resulting coherent big-data knowledge base, accessible through the RDF store and related services, are also offered. The article also presents the work performed on reconciliation algorithms and their comparative assessment and selection.
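    As an example of how an application could consume such a knowledge base, the sketch below issues a SPARQL query to an RDF store's endpoint with SPARQLWrapper. The endpoint URL and the property names are placeholders, not the actual Km4City vocabulary, which is defined by the ontology itself.

```python
# Sketch of an application querying a smart-city RDF store over SPARQL.
# Endpoint URL and property names are placeholders, not the real Km4City terms.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://example.org/km4city/sparql")  # hypothetical endpoint
endpoint.setQuery("""
    PREFIX ex: <http://example.org/km4city#>
    SELECT ?service ?name
    WHERE {
        ?service a ex:Service ;
                 ex:name ?name ;
                 ex:onRoad ex:ViaRoma .
    }
    LIMIT 10
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["service"]["value"], "-", binding["name"]["value"])
```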