6,174 research outputs found

    A formal foundation for ontology alignment interaction models

    No full text
    Ontology alignment foundations are hard to find in the literature. The abstract nature of the topic and the diverse means of practice make it difficult to capture in a universal formal foundation. We argue that such a lack of formality hinders further development and convergence of practices and, in particular, prevents us from achieving greater levels of automation. In this article we present a formal foundation for ontology alignment based on interaction models between heterogeneous agents on the Semantic Web. We use the mathematical notion of information flow in a distributed system to ground our three hypotheses for enabling semantic interoperability, and we use a motivating example throughout the article: how to progressively align two ontologies of research quality assessment through meaning coordination. We conclude the article by presenting such an ontology-alignment interaction model in an executable specification language.
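
    A minimal, hedged sketch of the meaning-coordination idea described above, assuming a toy model in which each agent describes a term by a set of feature strings and two terms are aligned when their features overlap sufficiently. The Agent class, the coordinate function, and the example vocabularies are invented for illustration; they stand in for, and greatly simplify, the information-flow formalism developed in the article.

        # Toy meaning coordination between two agents (hypothetical names).
        class Agent:
            def __init__(self, name, vocabulary):
                self.name = name
                # term -> set of features the agent associates with it
                self.vocabulary = vocabulary

            def describe(self, term):
                return self.vocabulary.get(term, set())

        def coordinate(asker, responder, threshold=0.5):
            """Progressively build an alignment: for each of the asker's
            terms, accept the responder's term with the highest overlap."""
            alignment = {}
            for term, features in asker.vocabulary.items():
                best, best_score = None, 0.0
                for other_term in responder.vocabulary:
                    other = responder.describe(other_term)
                    score = len(features & other) / max(len(features | other), 1)
                    if score > best_score:
                        best, best_score = other_term, score
                if best_score >= threshold:
                    alignment[term] = (best, best_score)
            return alignment

        # Two toy vocabularies about research quality assessment, echoing
        # the article's motivating example (the concrete terms are invented).
        a = Agent("A", {"JournalPaper": {"peer-reviewed", "archival"},
                        "Citation": {"reference", "impact"}})
        b = Agent("B", {"Article": {"peer-reviewed", "archival", "serial"},
                        "Cite": {"reference", "impact", "count"}})
        print(coordinate(a, b))  # both terms align with overlap 2/3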

    Negotiation in Database Schema Integration

    Get PDF
    Databases are playing an increasingly important role in organizations. Timely, accurate access to information has become a critical component of gaining competitive advantage. Data availability is commonly perceived as a critical success factor for an organization's long-term survival, and day-to-day operations can be crippled by failure of the database system to satisfy user requirements. However, a number of emerging issues complicate organizations' ability to provide comprehensive and reliable access to disparate information resources. Further, data accessibility is often compromised by the typically high cost of addressing these issues in practice. Examples of such issues that have emerged in the past decade include the proliferation of, and investment in, autonomous databases within organizations, heterogeneity among the data models and database management systems employed, the increasingly important role of distributed systems, and the increasing complexity and knowledge-intensive nature of integrating database schemas. All these factors contribute to the increasing importance of developing feasible options for providing interoperability among existing databases, and therefore of pursuing research in the area of database schema integration. Indeed, this research focuses specifically on the knowledge requirement problems involved in integrating the schemas of existing databases in order to provide interoperability and transparent access to disparate information resources without the investment involved in complete systems redesign.

    Reasoning Services for the Semantic Grid

    Get PDF
    The Grid aims to support secure, flexible and coordinated resource sharing by providing a middleware platform for advanced distributed computing. Consequently, the Grid's infrastructural machinery aims to allow collections of resources of any kind (computing, storage, data sets, digital libraries, scientific instruments, people, etc.) to easily form Virtual Organisations (VOs) that cross organisational boundaries in order to work together to solve a problem. A Grid depends on understanding the available resources, their capabilities, how to assemble them and how best to exploit them. Thus Grid middleware and the Grid applications it supports thrive on the metadata that describes resources in all their forms, the VOs, the policies that drive them and so on, together with the knowledge to apply that metadata intelligently.

    Enabling query technologies for the semantic sensor web

    Get PDF
    Sensor networks are increasingly being deployed in the environment for many different purposes. The observations that they produce are made available with heterogeneous schemas, vocabularies and data formats, making it difficult to share and reuse the data for purposes other than those for which the networks were originally set up. The authors propose an ontology-based approach for providing data access and query capabilities over streaming data sources, allowing users to express their needs at a conceptual level, independent of implementation and language-specific details. In this article, the authors describe the theoretical foundations and technologies that enable exposing semantically enriched sensor metadata, querying sensor observations through SPARQL extensions (using query rewriting and data translation techniques based on mapping languages), and managing both pull and push delivery modes.
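
    To make the rewriting idea concrete, here is a minimal, hedged sketch in Python of how a conceptual, ontology-level triple pattern might be translated into a source-level streaming query using a mapping table. The mapping format, the WINDOW syntax, and the sosa:-style predicate names are simplified inventions in the spirit of SPARQL streaming extensions and mapping languages, not the authors' actual query language.

        # Hypothetical mapping from ontology predicates to source schemas:
        # predicate -> (source table, subject column, object column).
        MAPPINGS = {
            "sosa:observes":        ("station_feed", "sensor_id", "phenomenon"),
            "sosa:hasSimpleResult": ("station_feed", "sensor_id", "value"),
        }

        def rewrite(conceptual_pattern, window_seconds):
            """Rewrite one ontology-level triple pattern into a query over
            the concrete source, restricted to a sliding time window."""
            _subject, predicate, _object = conceptual_pattern
            table, subject_col, object_col = MAPPINGS[predicate]
            return (f"SELECT {subject_col}, {object_col} FROM {table} "
                    f"WINDOW RANGE {window_seconds}s")

        # A user asks for observation results over the last 10 minutes at
        # the conceptual level; the rewriter emits the source-level query.
        print(rewrite(("?sensor", "sosa:hasSimpleResult", "?value"), 600))
        # SELECT sensor_id, value FROM station_feed WINDOW RANGE 600s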

    A core ontological model for semantic sensor web infrastructures

    Get PDF
    Semantic Sensor Web infrastructures use ontology-based models to represent the data that they manage; however, up to now these ontological models have not been able to represent all the characteristics of distributed, heterogeneous, and web-accessible sensor data. This paper describes a core ontological model for Semantic Sensor Web infrastructures that covers these characteristics and that has been built with a focus on reusability. This ontological model is composed of different modules that deal, on the one hand, with infrastructure data and, on the other hand, with data from a specific domain, namely the coastal flood emergency planning domain. The paper also presents a set of guidelines, followed during the development of the ontological model, for satisfying a common set of requirements related to modelling domain-specific features of interest and properties. In addition, the paper includes the results of an exhaustive evaluation of the developed ontologies along different aspects (i.e., vocabulary, syntax, structure, semantics, representation, and context).
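
    As an illustration of the modular layout described above, the following sketch builds the import structure with rdflib (an assumed tooling choice; the paper does not prescribe it). A domain module for coastal flood emergency planning reuses a generic infrastructure module via owl:imports instead of redefining its terms. The URIs are placeholders.

        from rdflib import Graph, URIRef
        from rdflib.namespace import OWL, RDF

        g = Graph()
        infra = URIRef("http://example.org/ontology/sensor-infrastructure")
        domain = URIRef("http://example.org/ontology/coastal-flood-planning")

        # Declare both modules as ontologies; the domain-specific module
        # imports the infrastructure module rather than duplicating it.
        g.add((infra, RDF.type, OWL.Ontology))
        g.add((domain, RDF.type, OWL.Ontology))
        g.add((domain, OWL.imports, infra))

        print(g.serialize(format="turtle"))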

    Ambient-aware continuous care through semantic context dissemination

    Get PDF
    Background: The ultimate ambient-intelligent care room contains numerous sensors and devices to monitor the patient, sense and adjust the environment, and support the staff. This sensor-based approach results in a large amount of data, which can be processed by current and future applications, e.g., task management and alerting systems. Today, nurses are responsible for coordinating all these applications and the information they supply, which reduces the added value and slows down the adoption rate. The aim of the presented research is the design of a pervasive and scalable framework that is able to optimize continuous care processes by intelligently reasoning on this large amount of heterogeneous care data. Methods: The developed Ontology-based Care Platform (OCarePlatform) consists of modular components that each perform a specific reasoning task. Consequently, they can easily be replicated and distributed. Complex reasoning is achieved by combining the results of different components. To ensure that the components only receive information that is of interest to them at that time, they are able to dynamically generate and register filter rules with a Semantic Communication Bus (SCB). This SCB semantically filters all the heterogeneous care data according to the registered rules by using a continuous care ontology. The SCB can be distributed, and a cache can be employed to ensure scalability. Results: A prototype implementation is presented, consisting of a new-generation nurse call system supported by a localization component and a home automation component. The amount of data that is filtered and the performance of the SCB are evaluated by testing the prototype in a living lab. The delay introduced by processing the filter rules is negligible when 10 or fewer rules are registered. Conclusions: The OCarePlatform allows disseminating relevant care data to the different applications and additionally supports composing complex applications from a set of smaller independent components. This way, the platform significantly reduces the amount of information that needs to be processed by the nurses. The delay resulting from processing the filter rules is linear in the number of rules. Distributed deployment of the SCB and the use of a cache allow further improvement of these performance results.
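
    The following is a minimal sketch of the filter-rule idea behind the SCB: components register predicates over incoming care events, and the bus forwards an event only to the components whose rule matches it. The event format, the rule representation, and the component names are hypothetical simplifications of the ontology-based semantic filtering the paper describes.

        from typing import Callable, Dict, List, Tuple

        Event = Dict[str, object]

        class SemanticBus:
            """Toy stand-in for the Semantic Communication Bus."""

            def __init__(self):
                self._rules: List[Tuple[str, Callable[[Event], bool]]] = []

            def register(self, component: str, rule: Callable[[Event], bool]):
                # Components dynamically register their filter rules.
                self._rules.append((component, rule))

            def publish(self, event: Event):
                # Filter the event against every registered rule.
                for component, rule in self._rules:
                    if rule(event):
                        print(f"forward {event} -> {component}")

        bus = SemanticBus()
        # The nurse-call component only cares about call-button events.
        bus.register("nurse_call", lambda e: e.get("type") == "call_button")
        # The localization component only cares about position updates.
        bus.register("localization", lambda e: e.get("type") == "position")

        bus.publish({"type": "call_button", "room": 12})   # -> nurse_call
        bus.publish({"type": "temperature", "value": 21})  # no rule matches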

    When Things Matter: A Data-Centric View of the Internet of Things

    Full text link
    With the recent advances in radio-frequency identification (RFID), low-cost wireless sensor devices, and Web technologies, the Internet of Things (IoT) approach has gained momentum in connecting everyday objects to the Internet and facilitating machine-to-human and machine-to-machine communication with the physical world. While IoT offers the capability to connect and integrate both digital and physical entities, enabling a whole new class of applications and services, several significant challenges need to be addressed before these applications and services can be fully realized. A fundamental challenge centers around managing IoT data, typically produced in dynamic and volatile environments, which is not only extremely large in scale and volume but also noisy and continuous. This article surveys the main techniques and state-of-the-art research efforts in IoT from a data-centric perspective, including data stream processing, data storage models, complex event processing, and searching in IoT. Open research issues for IoT data management are also discussed.
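
    As a small illustration of one data-centric technique the survey covers, the sketch below implements a toy complex-event-processing check over an IoT event stream: raise an alert when a door-open event follows a temperature spike within a time window. The pattern and all event names are invented for illustration.

        from collections import deque

        WINDOW = 30  # seconds

        def detect(events):
            """Alert when 'door_open' follows 'temp_high' within WINDOW."""
            recent = deque()  # timestamps of recent temp_high events
            alerts = []
            for ts, kind in events:
                while recent and ts - recent[0] > WINDOW:
                    recent.popleft()  # expire spikes outside the window
                if kind == "temp_high":
                    recent.append(ts)
                elif kind == "door_open" and recent:
                    alerts.append(ts)
            return alerts

        stream = [(0, "temp_high"), (12, "door_open"), (100, "door_open")]
        print(detect(stream))  # [12]; the event at t=100 has no prior spike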