
    A modal logic for reasoning on consistency and completeness of regulations

    In this paper, we deal with regulations that may exist in multi-agent systems in order to regulate agent behaviour, and we discuss two properties of regulations, namely consistency and completeness. After defining what consistency and completeness mean, we propose a way to consistently complete incomplete regulations. In this contribution, we extend previous work and consider regulations expressed in a first-order modal deontic logic.
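    The two properties can be illustrated with a toy propositional model (the context variables, action names, and rules below are invented for illustration, not taken from the paper): consistency means no situation makes the same action both obligatory and forbidden, and completeness means every situation receives some verdict.

```python
from itertools import product

# Toy model of a regulation set. Each rule maps a situation (an assignment
# of boolean context variables) to a deontic verdict for one action:
# "O" = obligatory, "F" = forbidden. All names here are illustrative.

VARS = ("is_admin", "after_hours")

def situations():
    """Enumerate every possible situation over the context variables."""
    for values in product([False, True], repeat=len(VARS)):
        yield dict(zip(VARS, values))

# A rule is (guard, action, verdict); the guard decides applicability.
rules = [
    (lambda s: s["after_hours"], "access_logs", "F"),
    (lambda s: s["is_admin"],    "access_logs", "O"),
]

def verdicts(situation, action):
    return {v for guard, a, v in rules if a == action and guard(situation)}

def consistent(action):
    """No situation may make an action both obligatory and forbidden."""
    return all(verdicts(s, action) != {"O", "F"} for s in situations())

def complete(action):
    """Every situation must receive at least one verdict for the action."""
    return all(verdicts(s, action) for s in situations())
```

    This rule set is neither consistent (an admin working after hours gets both verdicts) nor complete (a non-admin during office hours gets none); "consistently completing" such a regulation means adding verdicts for the uncovered situations without creating new conflicts.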

    Distributed First Order Logic

    Distributed First Order Logic (DFOL) was introduced more than ten years ago with the purpose of formalising distributed knowledge-based systems, where knowledge about heterogeneous domains is scattered into a set of interconnected modules. DFOL formalises the knowledge contained in each module by means of first-order theories, and the interconnections between modules by means of special inference rules called bridge rules. Despite their restricted form in the original DFOL formulation, bridge rules have influenced several works in the areas of heterogeneous knowledge integration, modular knowledge representation, and schema/ontology matching. This, in turn, has fostered extensions and modifications of the original DFOL that have never been systematically described and published. This paper tackles the lack of a comprehensive description of DFOL by providing a systematic account of a completely revised and extended version of the logic, together with a sound and complete axiomatisation of a general form of bridge rules based on Natural Deduction. The resulting DFOL framework is then proposed as a clear formal tool for the representation of and reasoning about distributed knowledge and bridge rules.
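    The core mechanism can be sketched very simply: modules hold local facts, and a bridge rule lets a target module derive a fact whenever a source module holds another. The module names, facts, and rules below are invented, and real DFOL works with first-order theories rather than ground atoms; this only shows the propagation-to-fixpoint idea.

```python
# Minimal sketch of bridge-rule propagation between modules: each module
# holds a local set of (ground) facts, and a bridge rule lets the target
# module derive psi whenever the source module holds phi.

modules = {
    "geo":  {"river(danube)"},
    "wiki": {"article(danube)"},
    "kb":   set(),
}

# Bridge rules: (source_module, source_fact, target_module, target_fact)
bridges = [
    ("geo", "river(danube)",    "kb",   "waterway(danube)"),
    ("kb",  "waterway(danube)", "wiki", "topic(geography)"),
]

def propagate(modules, bridges):
    """Apply bridge rules until no module gains a new fact (a fixpoint)."""
    changed = True
    while changed:
        changed = False
        for src, phi, dst, psi in bridges:
            if phi in modules[src] and psi not in modules[dst]:
                modules[dst].add(psi)
                changed = True
    return modules

propagate(modules, bridges)
```

    Note that the second bridge rule only fires because the first one has already extended module "kb": conclusions derived via bridge rules can themselves feed further bridge rules, which is why the computation iterates to a fixpoint.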

    Ontology mapping: the state of the art

    Ontology mapping is seen as a solution provider in today's landscape of ontology research. As the number of ontologies that are made publicly available and accessible on the Web increases steadily, so does the need for applications to use them. A single ontology is no longer enough to support the tasks envisaged by a distributed environment like the Semantic Web. Multiple ontologies need to be accessed from several applications. Mapping could provide a common layer from which several ontologies could be accessed and hence exchange information in a semantically sound manner. Developing such mappings has been the focus of a variety of works originating from diverse communities over a number of years. In this article we comprehensively review and present these works. We also provide insights on the pragmatics of ontology mapping and elaborate on a theoretical approach for defining ontology mapping.
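    A deliberately naive sketch of the simplest signal many mapping systems start from, lexical similarity between concept labels: real systems surveyed in such reviews combine lexical, structural, and semantic evidence, and the ontology labels below are made up.

```python
from difflib import SequenceMatcher

# Naive label-based mapper between two ontologies' concept labels.
# Labels below are invented for illustration.

onto_a = ["Person", "Organisation", "Publication"]
onto_b = ["Human", "Organization", "Paper", "Person"]

def best_match(label, candidates, threshold=0.8):
    """Return the most similar candidate label, or None below threshold."""
    scored = [(SequenceMatcher(None, label.lower(), c.lower()).ratio(), c)
              for c in candidates]
    score, match = max(scored)
    return match if score >= threshold else None

mapping = {a: best_match(a, onto_b) for a in onto_a}
```

    The threshold illustrates the precision/recall trade-off such systems face: "Organisation"/"Organization" maps despite the spelling variant, while "Publication"/"Paper" is missed because their labels share little text even though the concepts are related, which is exactly where structural and semantic techniques take over.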

    Possibilistic Information Flow Control for Workflow Management Systems

    In workflows and business processes, there are often security requirements on both the data, i.e. confidentiality and integrity, and the process, e.g. separation of duty. Graphical notations exist for specifying both workflows and associated security requirements. We present an approach for formally verifying that a workflow satisfies such security requirements. For this purpose, we define the semantics of a workflow as a state-event system and formalise security properties in a trace-based way, i.e. on an abstract level without depending on details of enforcement mechanisms such as Role-Based Access Control (RBAC). This formal model then allows us to build upon well-known verification techniques for information flow control. We describe how a compositional verification methodology for possibilistic information flow can be adapted to verify that a specification of a distributed workflow management system satisfies security requirements on both data and processes.
    Comment: In Proceedings GraMSec 2014, arXiv:1404.163
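    The trace-based style of property can be illustrated with a tiny possibilistic check (this is a generic noninterference-flavoured property, not the paper's exact definitions; event names are invented): a system is a set of possible event traces, events split into confidential (High) and observable (Low), and the property requires that deleting the High events from any possible trace yields another possible trace, so a Low observer cannot tell whether High activity occurred.

```python
# Tiny trace-based possibilistic information flow check: a system is a
# set of possible event traces; "purging" a trace removes the High
# (confidential) events, leaving what a Low observer could see.

HIGH = {"approve_salary"}   # illustrative confidential event

def purge(trace):
    """Remove confidential events, keeping only the Low projection."""
    return tuple(e for e in trace if e not in HIGH)

def noninterferent(traces):
    """Every trace's Low projection must itself be a possible trace."""
    return all(purge(t) in traces for t in traces)

secure = {
    (),
    ("login",),
    ("login", "view_report"),
    ("login", "approve_salary", "view_report"),
}

# Here the Low projection ("login",) of the only trace containing the
# High event is NOT a possible trace, so Low observations reveal it.
leaky = {
    (),
    ("login", "approve_salary"),
}
```

    The check is possibilistic because it reasons about which observations are *possible*, not how probable they are, which matches the abstract, mechanism-independent level at which the paper formalises its security properties.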

    Initial Draft of a Possible Declarative Semantics for the Language Xcerpt

    This article introduces a preliminary declarative semantics for a subset of the language Xcerpt (so-called grouping-stratifiable programs) in the form of a classical (Tarski-style) model theory, adapted to the specific requirements of Xcerpt’s constructs (e.g. the various aspects of incompleteness in query terms, grouping constructs in rule heads, etc.). Most importantly, the model theory uses term simulation as a replacement for term equality to handle incomplete term specifications, and an extended notion of substitutions in order to properly convey the semantics of grouping constructs. Based upon this model theory, a fixpoint semantics is also described, leading to a first notion of forward-chaining evaluation of Xcerpt programs.
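    The intuition behind replacing equality with simulation can be sketched as a recursive matching relation (a strong simplification of Xcerpt's actual constructs, which distinguish ordered/unordered and total/partial specifications; the terms below are invented): a query term matches a data term if the labels agree and every query child is simulated by some distinct data child, so a query may omit subterms.

```python
# Sketch of term simulation: terms are (label, [children]) pairs.
# Unlike equality, simulation lets the query leave out subterms.

def simulates(query, data):
    """True if `query` is simulated by `data` (labels agree, and each
    query child is simulated by some distinct data child)."""
    qlabel, qkids = query
    dlabel, dkids = data
    if qlabel != dlabel:
        return False
    used = set()
    for qk in qkids:
        hit = next((i for i, dk in enumerate(dkids)
                    if i not in used and simulates(qk, dk)), None)
        if hit is None:
            return False
        used.add(hit)
    return True

book = ("book", [("title", []), ("author", []), ("year", [])])
query = ("book", [("author", [])])   # incomplete specification
```

    Here `query` simulates into `book` even though it mentions only one of the three subterms; under plain term equality the same query would fail, which is exactly why the model theory swaps equality for simulation.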

    An extended ontology-based context model and manipulation calculus for dynamic web service processes

    Services are offered in an execution context that is determined by how a provider provisions the service and how the user consumes it. The need for more flexibility requires the provisioning and consumption aspects to be addressed at runtime. We propose an ontology-based context model providing a framework for service provisioning and consumption aspects and techniques for managing context constraints for Web service processes where dynamic context concerns can be monitored and validated at service process run-time. We discuss the contextualization of dynamically relevant aspects of Web service processes as our main goal, i.e. capture aspects in an extended context model. The technical contributions of this paper are a context model ontology for dynamic service contexts and an operator calculus for integrated and coherent context manipulation, composition and reasoning. The context model ontology formalizes dynamic aspects of Web services and facilitates reasoning. We present the context ontology in terms of four core dimensions - functional, QoS, domain and platform - which are internally interconnected.
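    The runtime monitoring idea can be sketched over the four dimensions the abstract names: a context is a record over functional, QoS, domain, and platform aspects, and constraints are predicates re-evaluated as the context changes. The concrete fields, values, and limits below are invented for illustration.

```python
# Illustrative runtime validation of context constraints for a service
# process. Field names and thresholds are hypothetical.

context = {
    "functional": {"operation": "bookFlight"},
    "qos":        {"latency_ms": 120, "availability": 0.999},
    "domain":     {"region": "EU"},
    "platform":   {"protocol": "https"},
}

constraints = {
    "latency under 200ms": lambda c: c["qos"]["latency_ms"] < 200,
    "encrypted transport": lambda c: c["platform"]["protocol"] == "https",
    "EU data residency":   lambda c: c["domain"]["region"] == "EU",
}

def violations(context, constraints):
    """Return the names of constraints the current context violates."""
    return [name for name, check in constraints.items()
            if not check(context)]
```

    Re-running `violations` after each context change is the simplest form of the monitoring the paper describes; the paper's operator calculus additionally covers composing and manipulating such contexts, which this sketch omits.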

    Towards the ontology-based consolidation of production-centric standards

    Production-centric international standards are intended to serve as an important route towards information sharing across manufacturing decision support systems. As a consequence of textual-based definitions of concepts acknowledged within these standards, their inability to fully interoperate becomes an issue especially since a multitude of standards are required to cover the needs of extensive domains such as manufacturing industries. To help reinforce the current understanding to support the consolidation of production-centric standards for improved information sharing, this article explores the specification of well-defined core concepts which can be used as a basis for capturing tailored semantic definitions. The potentials of two heavyweight ontological approaches, notably Common Logic (CL) and the Web Ontology Language (OWL) as candidates for the task, are also exposed. An important finding regarding these two methods is that while an OWL-based approach shows capabilities towards applications which may require flexible hierarchies of concepts, a CL-based method represents a favoured contender for scoped and facts-driven manufacturing applications.

    Object Histories as a Foundation for an Active OODB

    Several links exist between active and temporal databases. These are summarised by the observation that rules are triggered by a specified evolution of the database. In this paper, we discuss the relation between active and temporal databases using DEGAS, an object-based active database programming language. To achieve full active database functionality, a DEGAS object records its complete history. Hence, all data needed for a temporal database supporting a single temporal dimension is provided. Furthermore, the semantics of the active behaviour of DEGAS are defined straightforwardly in terms of the object history. Finally, we discuss the advantages and disadvantages of extending DEGAS with a second time dimension (to achieve full temporal functionality) from an active database perspective.
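    The key idea, defining active-rule semantics over a complete object history, can be sketched as follows (DEGAS's actual rule language is far richer; the object, events, and rule below are invented): the object appends every event to an append-only history, and a rule fires when a pattern over that history holds.

```python
# Sketch of active behaviour defined over an object's recorded history:
# every event is appended, and rules are patterns over the history.

class HistoryObject:
    def __init__(self):
        self.history = []   # complete, append-only record of events

    def apply(self, event, **data):
        self.history.append((event, data))
        self._run_rules()

    def _run_rules(self):
        # Illustrative rule: after three failed logins, record a lockout.
        fails = [e for e, _ in self.history if e == "login_failed"]
        locked = any(e == "locked" for e, _ in self.history)
        if len(fails) >= 3 and not locked:
            self.history.append(("locked", {}))

acct = HistoryObject()
for _ in range(3):
    acct.apply("login_failed")
```

    Because the rule's trigger and the temporal record are the same structure, the history doubles as the data for a single-dimension temporal database, which is precisely the link between active and temporal functionality the paper exploits.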

    Computer support for protocol-based treatment of cancer

    Cancer treatment is often carried out within protocol-based clinical trials. An oncology clinic may take part in many trials, each of which requires data to be collected for monitoring efficacy and toxicity of treatment. Subsequently, this data is analysed statistically to evaluate clinical objectives of the trial. To be scientifically valid, such analysis must be based on data that is both complete and correct. This is one motivating factor for introducing computer support for trial management. Further motivation is provided by concern that treatment is consistent with the protocol and the well-being of the patient. The complexity of many protocols, the life-threatening nature of cancer and the toxicity of treatment side-effects emphasise the safety-critical nature of oncology. The OaSiS system provides decision support for the protocol-based treatment of cancer patients with emphasis on the safety aspects of the advice it gives. It offers a highly graphical interface, employs integrity constraint checking techniques from logic databases to monitor compliance with a protocol and is implemented in PROLOG. The paper describes the main features of OaSiS and indicates work in progress and planned extensions.
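    The integrity-constraint style of compliance checking can be sketched in a few lines (OaSiS itself is written in Prolog over a logic database; the record fields, drug name, and dose limit below are entirely invented for illustration and are not clinical guidance): constraints are predicates over the treatment record, and a violation list is reported rather than silently rejected.

```python
# Hedged sketch of integrity-constraint checking over a treatment record.
# All field names and limits are hypothetical.

patient = {
    "consent_recorded": True,
    "cycle": 2,
    "doses_this_cycle": [("drugA", 50), ("drugA", 50)],
}

MAX_CYCLE_DOSE = {"drugA": 120}   # hypothetical per-cycle limit

def check_constraints(rec):
    """Return a list of protocol-constraint violations (empty if compliant)."""
    problems = []
    if not rec["consent_recorded"]:
        problems.append("consent missing")
    totals = {}
    for drug, dose in rec["doses_this_cycle"]:
        totals[drug] = totals.get(drug, 0) + dose
    for drug, total in totals.items():
        if total > MAX_CYCLE_DOSE.get(drug, float("inf")):
            problems.append(f"{drug} exceeds cycle limit")
    return problems
```

    Reporting violations as data, rather than refusing the update outright, matches the decision-support stance described for OaSiS: the clinician sees why an action conflicts with the protocol and stays in control.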