377 research outputs found

    A formal modeling approach to ontology engineering

    Ph.D. (Doctor of Philosophy) thesis

    Architectural Refinement in HETS

    The main objective of this work is to bring a number of improvements to the Heterogeneous Tool Set HETS, from both a theoretical and an implementation point of view. In the first part of the thesis we present a number of recent extensions of the tool, among which are declarative specifications of logics, generalized theoroidal comorphisms, heterogeneous colimits, and the integration of the logic of the term rewriting system Maude. In the second part we concentrate on the CASL architectural refinement language, which we equip with a notion of refinement tree and with calculi for checking the correctness and consistency of refinements. Soundness and completeness of these calculi are also investigated. Finally, we present the integration of the VSE refinement method in HETS as an institution comorphism; thus, the proof management component of HETS remains unmodified.

    Reasoning for the description logic ALC with link keys

    Data interlinking is a critical task for widening and enhancing linked open data. One way to tackle data interlinking is to use link keys, which generalise keys to the case of two RDF datasets described using different ontologies. Link keys specify pairs of properties to compare in order to find same-as links between instances of two classes of two different datasets. Hence, they can be used for finding links. Link keys can also be considered as logical axioms, just like keys, ontologies and ontology alignments. We introduce the logic ALC+LK, extending the description logic ALC with link keys. It can be used to reason with link keys and to infer entailed link keys that may be more useful for a particular data interlinking task. We show that link key entailment can be reduced to consistency checking without introducing the negation of link keys. For deciding the consistency of an ALC+LK ontology, we introduce a new tableau-based algorithm. Unlike the classical completion rules, those concerning link keys apply to pairs of individuals that are not directly related. We show that this algorithm is sound, complete and always terminates.
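    As a schematic illustration (the property, class, and dataset names below are hypothetical, and the notation is only an approximation of the paper's), a link key asserts that instances of two classes that share values on each listed property pair denote the same entity:

```latex
% A link key over hypothetical classes Livre (dataset 1) and Book (dataset 2):
%   ( {<auteur, creator>, <titre, title>}  linkkey  <Livre, Book> )
% Under one common semantics ("share at least one value per property pair"),
% it corresponds to the first-order implication:
\forall x\,\forall y\;\Bigl(
  \mathit{Livre}(x) \wedge \mathit{Book}(y)
  \wedge \exists z_1\,\bigl(\mathit{auteur}(x,z_1) \wedge \mathit{creator}(y,z_1)\bigr)
  \wedge \exists z_2\,\bigl(\mathit{titre}(x,z_2) \wedge \mathit{title}(y,z_2)\bigr)
  \;\rightarrow\; x = y
\Bigr)
```

    Questions such as whether a candidate link key of this shape is entailed by an ALC+LK ontology are what the paper reduces to consistency checking and decides with its tableau algorithm.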

    Query Rewriting and Optimization for Ontological Databases

    Ontological queries are evaluated against a knowledge base consisting of an extensional database and an ontology (i.e., a set of logical assertions and constraints that derive new intensional knowledge from the extensional database), rather than directly on the extensional database. The evaluation and optimization of such queries is an intriguing new problem for database research. In this paper, we discuss two important aspects of this problem: query rewriting and query optimization. Query rewriting consists of the compilation of an ontological query into an equivalent first-order query against the underlying extensional database. We present a novel query rewriting algorithm for rather general types of ontological constraints which is well suited for practical implementations. In particular, we show how a conjunctive query against a knowledge base expressed using linear and sticky existential rules, that is, members of the recently introduced Datalog+/- family of ontology languages, can be compiled into a union of conjunctive queries (UCQ) against the underlying database. Ontological query optimization, in this context, attempts to improve this rewriting process so as to produce small and cost-effective UCQ rewritings for an input query. (Comment: arXiv admin note: text overlap with arXiv:1312.5914 by other authors.)
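    As a minimal illustration of the rewriting idea (the rule and predicate names are invented for this sketch; the paper's algorithm handles far more general linear and sticky rule sets):

```latex
% Linear existential rule (TGD) in the ontology -- names invented for this sketch:
\forall x\,\bigl(\mathit{Employee}(x) \rightarrow \exists y\,\mathit{WorksIn}(x,y)\bigr)
% Input conjunctive query over the knowledge base:
q(x) \leftarrow \mathit{WorksIn}(x,y)
% The atom WorksIn(x,y) unifies with the rule head, and the existential variable y
% is not shared with any other atom, so one rewriting step adds the rule body as an
% alternative. The resulting UCQ is evaluated directly over the extensional database:
q(x) \leftarrow \mathit{WorksIn}(x,y)
\qquad\cup\qquad
q(x) \leftarrow \mathit{Employee}(x)
```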

    A Formal Account of the Open Provenance Model

    On the Web, where resources such as documents and data are published, shared, transformed, and republished, provenance is a crucial piece of metadata that would allow users to place their trust in the resources they access. The Open Provenance Model (OPM) is a community data model for provenance that is designed to facilitate the meaningful interchange of provenance information between systems. Underpinning OPM is a notion of directed graph, where nodes represent data products and processes involved in past computations, and edges represent dependencies between them; it is complemented by graphical inference rules allowing new dependencies to be derived. Until now, however, the OPM model was a purely syntactical endeavor. The present paper extends OPM graphs with an explicit distinction between precise and imprecise edges. Then a formal semantics for the thus enriched OPM graphs is proposed, by viewing OPM graphs as temporal theories on the temporal events represented in the graph. The original OPM inference rules are scrutinized in view of the semantics and found to be sound but incomplete. An extended set of graphical rules is provided and proved to be complete for inference. The paper concludes with applications of the formal semantics to inferencing in OPM graphs, operators on OPM graphs, and a formal notion of refinement among OPM graphs.
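    A minimal Python sketch of the kind of provenance graph and edge-derivation step the paper formalizes; the completion rule used here is hypothetical and purely illustrative, since determining which rules are actually sound and complete is exactly what the paper establishes:

```python
# Illustrative sketch only -- not the paper's formalism or its actual rule set.
from dataclasses import dataclass

@dataclass(frozen=True)
class Edge:
    kind: str      # e.g. "used", "wasGeneratedBy", "wasDerivedFrom"
    src: str       # source node id (an artifact or a process)
    dst: str       # target node id
    precise: bool  # the paper distinguishes precise from imprecise edges

@dataclass
class ProvenanceGraph:
    edges: set[Edge]

    def derive(self) -> set[Edge]:
        """Apply one *hypothetical* completion rule to a fixpoint: if artifact a
        wasGeneratedBy process p and p used artifact b, add an imprecise
        wasDerivedFrom edge from a to b."""
        new = set(self.edges)
        changed = True
        while changed:
            changed = False
            generated = [(e.src, e.dst) for e in new if e.kind == "wasGeneratedBy"]
            used = [(e.src, e.dst) for e in new if e.kind == "used"]
            for artifact, process in generated:
                for p, source in used:
                    edge = Edge("wasDerivedFrom", artifact, source, precise=False)
                    if p == process and edge not in new:
                        new.add(edge)
                        changed = True
        return new

g = ProvenanceGraph({
    Edge("wasGeneratedBy", "report", "compile", True),
    Edge("used", "compile", "data.csv", True),
})
print(g.derive())  # also contains Edge("wasDerivedFrom", "report", "data.csv", False)
```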

    Horn fragments of the Halpern-Shoham Interval Temporal Logic

    We investigate the satisfiability problem for Horn fragments of the Halpern-Shoham interval temporal logic depending on the type (box or diamond) of the interval modal operators, the type of the underlying linear order (discrete or dense), and the type of semantics for the interval relations (reflexive or irreflexive). For example, we show that satisfiability of Horn formulas with diamonds is undecidable for any type of linear order and semantics. By contrast, satisfiability of Horn formulas with boxes is tractable over both discrete and dense orders under the reflexive semantics and over dense orders under the irreflexive semantics, but becomes undecidable over discrete orders under the irreflexive semantics. Satisfiability of binary Horn formulas with both boxes and diamonds is always undecidable under the irreflexive semantics.
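    Schematically, and hedging on the exact definitions used in the paper, the Horn formulas in question are conjunctions of clauses built from positive temporal literals:

```latex
% Positive temporal literals, where R ranges over Allen's interval relations:
\lambda \;::=\; \top \mid p \mid \langle R \rangle\,\lambda \mid [R]\,\lambda
% Horn clauses, prefixed by the universal modality [U]; a Horn formula is a
% conjunction of literals and clauses of this shape:
[U]\bigl(\lambda_1 \wedge \dots \wedge \lambda_k \rightarrow \lambda\bigr)
% The "box" fragment allows only [R] in positive literals; the "diamond"
% fragment allows only <R>.
```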

    Context-Aware Modeling Using Semantic Web and Z Notation

    Surveys of user context modeling have shown that the semantic web is one of the most promising approaches to representing and structuring the contextual information captured from a user's surrounding environment in a context-aware application. A benefit of using a semantic web language is that it enables an application to reason over the user's contextual information in order to gain knowledge of the user's behavior. However, given its notation format, a semantic web language is suited to the implementation level, i.e., to being consumed by the application at run time. A context-aware application is part of a distributed computing system, and in such systems the language used for specification should be distinguished from the one used for implementation and run time; this is known as separation of modeling languages. Likewise, for those concerned with the specification of context models in context-aware applications, the specification language should be distinguished from the implementation language. This thesis proposes the use of a formal specification technique to develop a generic context ontology model of user behavior at the Computer and Information Sciences Department, Universiti Teknologi PETRONAS. Initially, the context ontology was written in the OWL semantic web language; it was then mapped onto a formal specification language, namely Z notation. As a result, a specification of the context ontology and its consistency checking were developed and verified beyond the semantic web language environment. An inconsistency in the context model was detected during the verification of the Z model which could not be revealed by current OWL DL reasoners. Context-aware designers might benefit from the formal specification of the context ontology, as they can fully use formal verification techniques to check the correctness of the context ontology. The modeling approach in this thesis thus complements the context ontology development process, where checking and refinement are performed beyond the semantic web reasoner.
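    As a hedged sketch of the general idea (the class names are hypothetical and the thesis defines its own mapping), OWL classes can be modelled in Z-style set theory as subsets of a global type of individuals, so that OWL axioms become set-theoretic constraints amenable to formal verification:

```latex
% Hypothetical OWL axioms (informal reading)    Corresponding Z-style constraints
%   Lecturer SubClassOf Staff                     Lecturer \subseteq Staff
%   Lecturer DisjointWith Student                 Lecturer \cap Student = \emptyset
\mathit{Lecturer} \subseteq \mathit{Staff}
\qquad\qquad
\mathit{Lecturer} \cap \mathit{Student} = \varnothing
% Asserting an individual to be both a Lecturer and a Student then violates the
% second constraint, which a consistency check of the Z specification can report.
```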

    An Abstract Formal Basis for Digital Crowds

    Crowdsourcing, together with its related approaches, has become very popular in recent years. All crowdsourcing processes involve the participation of a digital crowd: a large number of people who access a single Internet platform or shared service. In this paper we explore the possibility of applying formal methods, typically used for the verification of software and hardware systems, to analysing the behaviour of a digital crowd. More precisely, we provide a formal description language for specifying digital crowds. We represent digital crowds in which the agents do not directly communicate with each other. We further show how this specification can provide the basis for sophisticated formal methods, in particular formal verification. (Comment: 32 pages, 4 figures.)