
    The Distributed Ontology Language (DOL): Use Cases, Syntax, and Extensibility

    The Distributed Ontology Language (DOL) is currently being standardized within the OntoIOp (Ontology Integration and Interoperability) activity of ISO/TC 37/SC 3. It aims at providing a unified framework for (1) ontologies formalized in heterogeneous logics, (2) modular ontologies, (3) links between ontologies, and (4) annotation of ontologies. This paper presents the current state of DOL's standardization. It focuses on use cases in which distributed ontologies enable interoperability and reusability. We demonstrate relevant features of the DOL syntax and semantics and explain how these integrate into existing knowledge engineering environments. Comment: Terminology and Knowledge Engineering Conference (TKE), 2012-06-20 to 2012-06-21, Madrid, Spain
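    DOL's concrete syntax is defined in the standard itself and is not reproduced here. Purely as an illustration of the notions the abstract lists (ontology modules in possibly different logics, links between them, and annotations), the Python sketch below models them as plain data structures; every class, field, and module name in it is hypothetical and is not part of DOL.

```python
# Illustrative sketch only (not DOL syntax): a plain-Python model of the
# notions DOL standardizes -- ontology modules in possibly different logics,
# links between them, and annotations. All names here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class OntologyModule:
    name: str                      # e.g. "Buildings"
    logic: str                     # e.g. "OWL2-DL", "CommonLogic"
    symbols: set = field(default_factory=set)


@dataclass
class Link:
    source: str                    # name of the source module
    target: str                    # name of the target module
    kind: str                      # e.g. "alignment", "import", "interpretation"
    symbol_map: dict = field(default_factory=dict)


@dataclass
class DistributedOntology:
    modules: dict = field(default_factory=dict)
    links: list = field(default_factory=list)
    annotations: dict = field(default_factory=dict)

    def add_link(self, link: Link) -> None:
        # A link may only relate symbols declared in its two end modules.
        src, tgt = self.modules[link.source], self.modules[link.target]
        for s, t in link.symbol_map.items():
            assert s in src.symbols and t in tgt.symbols, "unknown symbol in link"
        self.links.append(link)


# Usage: two modules formalized in different logics, related by an alignment.
d = DistributedOntology()
d.modules["Buildings"] = OntologyModule("Buildings", "OWL2-DL", {"Building", "Room"})
d.modules["FacilityCL"] = OntologyModule("FacilityCL", "CommonLogic", {"Facility"})
d.add_link(Link("Buildings", "FacilityCL", "alignment", {"Building": "Facility"}))
```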

    Ontology-Based Data Access and Integration

    An ontology-based data integration (OBDI) system is an information management system consisting of three components: an ontology, a set of data sources, and the mapping between the two. The ontology is a conceptual, formal description of the domain of interest to a given organization (or a community of users), expressed in terms of relevant concepts, attributes of concepts, relationships between concepts, and logical assertions characterizing the domain knowledge. The data sources are the repositories accessible by the organization where data concerning the domain are stored. In the general case, such repositories are numerous and heterogeneous, each one managed and maintained independently of the others. The mapping is a precise specification of the correspondence between the data contained in the data sources and the elements of the ontology. The main purpose of an OBDI system is to allow information consumers to query the data using the elements of the ontology as predicates. In the special case where the organization manages a single data source, the term ontology-based data access (OBDA) system is used.
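    The three-component structure described above can be sketched with a deliberately simplified example: the ontology is reduced to bare concept names, the data source is an in-memory SQLite table, and the mapping assigns each concept an SQL query whose rows are read as instances of that concept. This is an assumption-laden toy, not the formal OBDA framework the paper describes; all names are illustrative.

```python
# Minimal OBDI/OBDA sketch under simplifying assumptions: concepts are strings,
# the data source is an in-memory SQLite table, and the mapping unfolds a
# concept-level query into SQL over the source. Names are illustrative only.
import sqlite3

# (1) The data source: a relational table managed independently of the ontology.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE staff (id INTEGER, name TEXT, role TEXT)")
db.executemany("INSERT INTO staff VALUES (?, ?, ?)",
               [(1, "Ada", "engineer"), (2, "Bob", "manager")])

# (2) The ontology: here just concept names; a real system would use logical axioms.
ontology_concepts = {"Employee", "Manager"}

# (3) The mapping: the correspondence between source data and ontology elements.
mapping = {
    "Employee": "SELECT id, name FROM staff",
    "Manager":  "SELECT id, name FROM staff WHERE role = 'manager'",
}


def query_concept(concept: str):
    """Answer a query posed in ontology terms by unfolding the mapping into SQL."""
    if concept not in ontology_concepts:
        raise ValueError(f"unknown concept: {concept}")
    return db.execute(mapping[concept]).fetchall()


print(query_concept("Manager"))   # [(2, 'Bob')]
```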

    Semantic reasoning for intelligent emergency response applications

    Emergency response applications require the processing of large amounts of data, generated by a diverse set of sensors and devices, in order to provide an accurate and concise view of the situation at hand. The adoption of semantic technologies allows for the definition of a formal domain model and for intelligent data processing and reasoning on this model based on generated device and sensor measurements. This paper presents a novel approach to emergency response applications, such as fire fighting, integrating a formal semantic domain model into an event-based decision support system that supports reasoning on this model. The developed model consists of several generic ontologies describing concepts and properties that can be applied to diverse context-aware applications; these are extended with emergency-response-specific ontologies. Additionally, inference on the model performed by a reasoning engine is dynamically synchronized with the rest of the architectural components, which allows events to be triggered automatically when predefined conditions are met. The proposed ontology and reasoning methodology are validated on two scenarios: (i) the construction of an emergency response incident and corresponding scenario, and (ii) monitoring the state of a fire fighter during an emergency response.
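    The paper does not prescribe particular tooling; as a hedged illustration of how a predefined condition over semantic sensor data could trigger an event, the sketch below asserts measurements as RDF triples with rdflib and evaluates a SPARQL condition over them. The vocabulary, threshold, and namespace are invented for the example and are not taken from the paper's architecture.

```python
# Hedged sketch: sensor measurements as RDF facts, a predefined condition as a
# SPARQL query, and a simulated event fired for every match. The vocabulary
# (ex:heartRate, the 160 bpm threshold) is hypothetical.
from rdflib import Graph, Namespace, Literal

EX = Namespace("http://example.org/erc#")
g = Graph()

# Sensor measurements asserted as RDF facts.
g.add((EX.firefighter1, EX.heartRate, Literal(185)))
g.add((EX.firefighter1, EX.ambientTemperature, Literal(78)))

# A predefined condition expressed over the model:
# "any firefighter whose heart rate exceeds 160 bpm".
CONDITION = """
PREFIX ex: <http://example.org/erc#>
SELECT ?ff ?hr WHERE {
    ?ff ex:heartRate ?hr .
    FILTER(?hr > 160)
}
"""


def check_and_trigger(graph: Graph) -> None:
    """Evaluate the condition and trigger a (simulated) event for each match."""
    for ff, hr in graph.query(CONDITION):
        print(f"EVENT: high heart rate alarm for {ff} (hr={hr})")


check_and_trigger(g)
```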

    Semantic process mining tools: core building blocks

    Process mining aims at discovering new knowledge based on information hidden in event logs. Two important enablers for such analysis are powerful process mining techniques and the omnipresence of event logs in today's information systems. Most information systems supporting (structured) business processes (e.g. ERP, CRM, and workflow systems) record events in some form (e.g. transaction logs, audit trails, and database tables). Process mining techniques use event logs for all kinds of analysis, e.g. auditing, performance analysis, and process discovery. Although current process mining techniques and tools are quite mature, the analysis they support is somewhat limited because it is based purely on the labels in the logs. This means that these techniques cannot benefit from the actual semantics behind those labels, which could cater for more accurate and robust analysis. Existing analysis techniques are purely syntax-oriented, i.e., much time is spent on filtering, translating, interpreting, and modifying event logs for a particular question. This paper presents the core building blocks necessary to enable semantic process mining techniques and tools. Although the approach is highly generic, we focus on a particular process mining technique and show how it can be extended and implemented in the ProM framework.
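    As a rough illustration of the semantic-annotation idea described above (log labels linked to ontology concepts so that analysis can abstract over raw labels), the Python sketch below answers a frequency question at the concept level. The labels, concepts, subsumption hierarchy, and log are invented for the example; the paper's actual building blocks are realized in the ProM framework, not in this code.

```python
# Toy sketch of semantic annotation of an event log: labels link to ontology
# concepts, and a small subsumption hierarchy lets questions be answered at
# the concept level. All labels, concepts, and traces are hypothetical.
from collections import Counter

# Concept -> parent concept (a tiny ontology as a subsumption hierarchy).
subsumes = {"ApproveGold": "Approval", "ApproveSilver": "Approval",
            "Approval": "Decision", "Reject": "Decision"}

# Label -> ontology concept (the semantic annotation of the log).
annotation = {"appr_g": "ApproveGold", "appr_s": "ApproveSilver", "rej": "Reject"}


def ancestors(concept: str) -> set:
    """All concepts that subsume the given concept, including itself."""
    result = {concept}
    while concept in subsumes:
        concept = subsumes[concept]
        result.add(concept)
    return result


# A toy event log: one list of activity labels per case/trace.
log = [["appr_g", "rej"], ["appr_s"], ["appr_g"]]

# Concept-level frequency analysis: how often does *any* kind of Approval occur?
counts = Counter(c for trace in log for label in trace
                 for c in ancestors(annotation[label]))
print(counts["Approval"])   # 3, even though no event is labelled "Approval"
```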