88 research outputs found

    Completing the Is-a Structure in Description Logics Ontologies


    A Language-Independent and Formal Approach to Pattern-Based Modelling with Support for Composition and Analysis

    Context: Patterns are used in different disciplines as a way to record expert knowledge for problem solving in specific areas. Their systematic use in Software Engineering promotes the quality, standardization, reusability and maintainability of software artefacts. The full realisation of their power is, however, hindered by the lack of a standard formalization of the notion of pattern.

    Objective: Our goal is to provide a language-independent formalization of the notion of pattern, so that it can be applied to different modelling languages and tools, together with generic methods for pattern discovery, instantiation, composition, and conflict analysis.

    Method: For this purpose, we present a new visual, formal, language-independent approach to the specification of patterns. The approach is formulated in a general way, based on graphs and category theory, and allows the specification of patterns in terms of (nested) variable submodels, constraints on their allowed variance, and inter-pattern synchronization across several diagrams (e.g. class and sequence diagrams for UML design patterns).

    Results: We provide a formal notion of pattern satisfaction by models and propose mechanisms to suggest model transformations so that models become consistent with the patterns. We define methods for pattern composition and conflict analysis. We illustrate our proposal on UML design patterns, and discuss its generality and applicability to different types of patterns, e.g. workflow patterns, enterprise integration patterns and interaction patterns.

    Conclusion: The approach has proven powerful enough to formalize patterns from different domains, providing methods to analyse conflicts and dependencies that are usually expressed only in textual form. Its language independence makes it suitable for integration in meta-modelling tools and for use in Model-Driven Engineering.

    This work has been supported by the Visiting Professor Programmes of "Sapienza" University of Rome and its Department of Computer Science, the R&D programme of the Community of Madrid (S2009/TIC-1650, project "e-Madrid"), the CAM-UC3M project "EXPLORE" (CCG08-UC3M/TIC-4487), the Spanish Ministry of Science and Innovation project "METEORIC" (TIN2008-02081), and mobility grants JC2009-00015 and PR2009-0019.
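    The abstract's notion of pattern satisfaction can be made concrete with a toy example. Below is a minimal, hypothetical Python sketch that reads "a model satisfies a pattern" as "some injective mapping embeds the pattern graph into the model graph". This is a deliberate simplification of the paper's category-theoretic formalisation (which also covers nested variable submodels and variance constraints), and all names are illustrative.

```python
# Minimal sketch: pattern satisfaction as subgraph matching, assuming a
# model and a pattern are both simple directed graphs given as edge sets.
# Illustrative only; not the paper's actual categorical construction.
from itertools import permutations

def satisfies(model_edges, pattern_edges):
    """Return True if some injective node mapping embeds the pattern
    into the model (a crude stand-in for pattern satisfaction)."""
    m_nodes = sorted({n for e in model_edges for n in e})
    p_nodes = sorted({n for e in pattern_edges for n in e})
    for image in permutations(m_nodes, len(p_nodes)):
        phi = dict(zip(p_nodes, image))
        if all((phi[a], phi[b]) in model_edges for a, b in pattern_edges):
            return True
    return False

# A tiny Observer-like pattern: a subject notifies an observer.
model = {("Clock", "Display"), ("Display", "Screen")}
pattern = {("Subject", "Observer")}
print(satisfies(model, pattern))  # True: Clock/Display can play the roles
```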

    Blending under deconstruction


    Tools and techniques for analysing the impact of information security

    PhD thesis. The discipline of information security is employed by organisations to protect the confidentiality, integrity and availability of information, often communicated in the form of information security policies. A policy expresses rules, constraints and procedures to guard against adversarial threats and to reduce risk by instigating desired, secure behaviour in those people who interact with information legitimately. To stay aligned with a dynamic threat landscape, evolving business requirements, regulation updates and new technologies, a policy must undergo periodic review and change.

    Chief Information Security Officers (CISOs) are the main decision makers on information security policies within an organisation. Making informed policy modifications involves analysing, and therefore predicting, the impact of those changes on the success rate of business processes, which are often expressed as workflows. Security adds a burden to completing a workflow: adding a new security constraint may reduce the success rate, or even eliminate it if the workflow is always forced to terminate early, which increases the chances of employees bypassing or violating the security policy; removing an existing security constraint may increase the success rate but may also increase the risk to security. A lack of impact analysis tools and methodologies aimed at CISOs means impact analysis is currently a largely manual and ambiguous procedure. The analysis can be overwhelming, time consuming and error prone, and can yield unclear results, especially when workflows are complex, involve a large workforce and carry diverse security requirements.

    This thesis considers the provision of tools and more formal techniques, specific to CISOs, that help them analyse the impact that modifying a security policy has on the success rate of a workflow. More precisely, these tools and techniques are designed to efficiently compare the impact of two versions of a security policy applied to the same workflow, one before and one after a policy modification. The work focuses on two types of security impact analysis.

    The first is quantitative: it provides a measure of the success rate of a security-constrained workflow that must be executed by employees who may be absent at runtime. This part of the work quantifies workflow resiliency, the expected success rate of a workflow when the availability of employees is treated as probabilistic. New aspects of quantitative resiliency are introduced in the form of workflow metrics, along with risk management techniques for workflows whose resiliency falls below acceptable levels. Defining these techniques has led to exploring the reduction of resiliency computation time and to analysing resiliency in workflows with choice.

    The second area of focus is more qualitative: it facilitates analysis of how people are likely to behave in response to security and how that behaviour can impact the success rate of a workflow at the task level. Large amounts of information on human behavioural factors in a security setting exist across disparate sources; this information can be aligned with security standards and structured within a single ontology to form a knowledge base. Consultations with two CISOs were conducted, and their responses drove the implementation of two new tools, one graphical and the other Web-oriented, allowing CISOs and human factors experts to record and incorporate their knowledge directly within the ontology. The ontology can be used by CISOs to assess the potential impact of changes made to a security policy and to help devise behavioural controls to manage that impact. The two consulted CISOs also carried out an evaluation of the Web-oriented tool.
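    As a rough illustration of the quantitative analysis described above, here is a minimal sketch under a deliberately simplified model: a workflow succeeds iff every task has at least one authorised user available, and users are available independently with known probabilities. The thesis's resiliency metrics are richer than this (handling task ordering, choice and runtime assignment), and all names here are hypothetical.

```python
# Minimal sketch of quantitative workflow resiliency under a simplified
# model: success iff every task has some authorised user present, with
# independent per-user availability probabilities.
from itertools import product

def resiliency(tasks, authorised, availability):
    """Expected success rate over all availability scenarios.
    tasks: list of task names
    authorised: task -> set of users permitted by the security policy
    availability: user -> probability the user is present at runtime"""
    users = sorted(availability)
    total = 0.0
    for present in product([False, True], repeat=len(users)):
        scenario = dict(zip(users, present))
        p = 1.0
        for u in users:
            p *= availability[u] if scenario[u] else 1 - availability[u]
        if all(any(scenario[u] for u in authorised[t]) for t in tasks):
            total += p
    return total

auth = {"draft": {"alice", "bob"}, "approve": {"carol"}}
avail = {"alice": 0.9, "bob": 0.8, "carol": 0.7}
before = resiliency(["draft", "approve"], auth, avail)
# Policy change: restrict drafting to alice only, then compare impact.
after = resiliency(["draft", "approve"],
                   {"draft": {"alice"}, "approve": {"carol"}}, avail)
print(round(before, 3), round(after, 3))  # 0.686 vs 0.63
```

    Comparing the two numbers is the kind of before/after impact comparison a CISO could make when deciding whether to tighten an authorisation policy.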

    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    This thesis considers an application of a temporal theory to describing and modelling the patient journey in a hospital accident and emergency (A&E) department. The aim is to introduce a generic yet dynamic method applicable to any setting, including healthcare, since constructing a consistent process model can be instrumental in streamlining healthcare issues.

    Current process modelling techniques used in healthcare, such as flowcharts, unified modelling language activity diagrams (UML AD) and business process modelling notation (BPMN), are intuitive but imprecise: they cannot fully capture the complexity of activity types or the full extent of temporal constraints to a degree where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed for their applicability to modelling processes in the healthcare domain. Moreover, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which have their own limitations, e.g. the finish-start barrier. It is imperative to be able to specify temporal constraints between the starts and/or ends of processes, e.g. that the beginning of a process A precedes the start (or end) of a process B; the existing approaches provide no mechanism for handling such temporal situations. A formal representation would assist in effective knowledge representation and quality enhancement concerning a process; it would also help uncover the complexities of a system and support modelling it in a consistent way, which is not possible with the existing techniques.

    These issues are addressed in this thesis by proposing a framework that provides a knowledge base for accurately modelling patient flows, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system: with the aid of the inference mechanism of the temporal theory presented here, exhaustive temporal constraints derived from the components of the proposed axiomatic system serve as the knowledge base. The methodological framework adopts a model-theoretic approach, in which a theory is developed and considered as a model, while a corresponding instance is considered as its application. This approach assists in identifying the core components of the system and their precise operation in a real-life domain suited to the process modelling issues specified in this thesis. To this end, I have evaluated the modelling standards for their most-used terminologies and constructs, identifying their key components and generalising the critical terms of the process modelling standards on the basis of their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The resulting catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, a resolution-based theorem proof is used to exhibit the structural features of the theory (model) and to establish that it is sound and complete. Once soundness and completeness are established, the theory is instantiated by mapping its core components to their corresponding instances.

    Additionally, a formal graphical tool termed the point graph (PG) is used to visualise instances of the proposed axiomatic system. PG facilitates the modelling and scheduling of patient flows, and enables existing models to be analysed for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*), based on the semantics given by the axiomatic system. A real-life case, the trauma patient pathway of the King's College Hospital accident and emergency (A&E) department, is used to validate the framework. It is divided into three patient flows depicting the journey of a patient with significant trauma: arriving at A&E, undergoing a procedure and subsequently being discharged. The department's staff relied upon UML AD and BPMN to model the patient flows; an evaluation of their representation is presented to expose the shortfalls of these modelling standards for modelling patient flows. The final step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling capabilities.
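    To make the point-based reading concrete, here is a minimal sketch that assumes each process contributes a begin point and an end point, that constraints are strict precedences between points, and that a set of constraints is consistent iff the resulting point graph is acyclic. This illustrates the flavour of reasoning over point graphs; it is not the thesis's full PITL axiomatic system, and all process names are hypothetical.

```python
# Minimal sketch of point-based temporal reasoning: processes as
# begin/end points, precedence edges between points, and consistency
# checked as acyclicity of the point graph (via depth-first search).

def begin(p): return (p, "begin")
def end(p): return (p, "end")

def consistent(constraints):
    """constraints: set of (x, y) meaning point x strictly precedes y.
    A cycle would force a point to precede itself, i.e. inconsistency."""
    succ = {}
    for x, y in constraints:
        succ.setdefault(x, set()).add(y)
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {}
    def dfs(v):
        colour[v] = GREY
        for w in succ.get(v, ()):
            c = colour.get(w, WHITE)
            if c == GREY or (c == WHITE and dfs(w)):
                return True  # back edge: cycle found
        colour[v] = BLACK
        return False
    return not any(dfs(v) for v in list(succ)
                   if colour.get(v, WHITE) == WHITE)

# Triage must finish before treatment starts; each process spans time.
flow = {(begin("triage"), end("triage")),
        (end("triage"), begin("treatment")),
        (begin("treatment"), end("treatment"))}
print(consistent(flow))                                          # True
print(consistent(flow | {(end("treatment"), begin("triage"))}))  # False
```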

    Business Policy Modeling and Enforcement in Relational Database Systems

    Database systems maintain the integrity of stored information by ensuring that modifications to the database comply with constraints designed by the administrators. As the number of users and applications sharing a common database increases, so does the complexity of the set of constraints that originate from higher-level business processes. The lack of a systematic mechanism for integrating and reasoning about a diverse set of evolving, and potentially interfering, policies manifested as database-level constraints makes corporate policy management within relational systems a chaotic process.

    In this thesis we present a systematic method of mapping a broad set of process-centric business policies onto database-level constraints. We exploit the observation that the state of a database represents the union of the states of every ongoing business process, and thus establish a bijective relationship between progression in individual business processes and changes in the database state space. We propose graphical notations that are equivalent to integrity constraints specified in linear temporal logic of the past. Furthermore, we demonstrate how this notation can accommodate a wide array of workflow patterns, can allow multiple policy makers to implement their own process-centric constraints independently using their own logical policy models, and can model-check these constraints within the database system to detect potentially conflicting constraints across several different business processes.

    A major contribution of this thesis is that it bridges several different areas of research, including database systems, temporal logics, model checking, and business workflow/policy management, to propose an accessible method of integrating, enforcing, and reasoning about the consequences of process-centric constraints embedded in database systems. As a result, the task of ensuring that a database continuously complies with evolving business rules governed by hundreds of processes, traditionally handled by an army of database programmers regularly updating triggers and batch procedures, is made easier, more manageable, and more predictable.
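    The flavour of enforcing a past-time temporal constraint over database states can be sketched as follows, assuming each state is a snapshot of relevant facts and the history of states is available. The rule shown ("an order may be shipped only if it was once approved") is hypothetical, and a real system would compile such rules into triggers rather than replaying the whole history.

```python
# Minimal sketch of checking a past-LTL business constraint over the
# history of database states, each state modelled as a dict of facts.
# Hypothetical rule: shipped -> ONCE approved.
def once(history, pred):
    """Past-LTL ONCE: pred held in some state up to and including now."""
    return any(pred(state) for state in history)

def check_ship(history):
    now = history[-1]
    if now.get("order_status") == "shipped":
        return once(history, lambda s: s.get("order_status") == "approved")
    return True  # the constraint only triggers on shipping

trace = [{"order_status": "created"},
         {"order_status": "approved"},
         {"order_status": "shipped"}]
print(check_ship(trace))                  # True: approval is in the past
print(check_ship(trace[:1] + trace[2:]))  # False: shipped, never approved
```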

    Next-generation information systems for genomics

    NIH Grant no. HG00739.

    The advent of next-generation sequencing technologies is transforming biology by enabling individual researchers to sequence the genomes of individual organisms or cells on a massive scale. To realise the translational potential of this technology we will need advanced information systems to integrate and interpret this deluge of data. These systems must be capable of extracting the location and function of genes and biological features from genomic data, requiring the coordinated parallel execution of multiple bioinformatics analyses and intelligent synthesis of the results. The resulting databases must be structured to allow complex biological knowledge to be recorded in a computable way, which requires the development of logic-based knowledge structures called ontologies. To visualise and manipulate the results, new graphical interfaces and knowledge acquisition tools are required. Finally, to help understand complex disease processes, these information systems must be able to integrate, and make inferences over, multiple data sets derived from numerous sources.

    Results: Here I describe the research, design and implementation of some of the components of such a next-generation information system. I first describe the automated pipeline system used for the annotation of the Drosophila genome and its application in genomic research. This was succeeded by the development of a flexible graph-oriented database system called Chado, which relies on ontologies for structuring data and knowledge. I also describe research to develop, restructure and enhance a number of biological ontologies, adding a layer of logical semantics that increases the computability of these key knowledge sources. The resulting database and ontology collection can be accessed through a suite of tools. Finally, I describe how genome analysis, ontology-based database representation and powerful tools can be combined to make inferences about genotype-phenotype relationships within and across species.

    Conclusion: The large volumes of complex data generated by high-throughput genomic and systems biology technologies threaten to overwhelm us unless we can devise better computing tools to assist with their analysis. Ontologies are key technologies, but many existing ontologies are not interoperable or lack the features that would make them computable. Here I have shown how concerted ontology, tool and database development can be applied to make inferences of value to translational research.
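    As a small illustration of the ontology-based inference mentioned above, the sketch below propagates a gene's annotations up a toy is-a hierarchy (the transitive closure over is-a links). The term and gene names are illustrative, not real ontology identifiers, and this is only the simplest kind of inference such a system would support.

```python
# Minimal sketch of ontology-based inference: a gene annotated to a term
# is implicitly annotated to every ancestor term reachable via is-a.
def ancestors(term, is_a):
    """All terms reachable from `term` via is-a links."""
    seen, stack = set(), [term]
    while stack:
        for parent in is_a.get(stack.pop(), ()):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

is_a = {"wing development": ["appendage development"],
        "appendage development": ["developmental process"]}
direct = {"gene_X": {"wing development"}}
inferred = {g: terms | set().union(*(ancestors(t, is_a) for t in terms))
            for g, terms in direct.items()}
print(inferred["gene_X"])
# wing development, appendage development, developmental process
```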

    A Logic-Based Framework for Web Access Control Policies

    With the widespread use of web services, there is a need for adequate security and privacy support to protect the sensitive information these services provide. As a result, there has been great interest in access control policy languages that accommodate large, open, distributed and heterogeneous environments like the Web. XACML has emerged as a popular access control language, but because of its rich expressiveness and informal semantics it suffers from (a) a lack of understanding of its formal properties, and (b) a lack of automated, compile-time services that can detect errors in expressive, distributed and heterogeneous policies.

    In this dissertation, I present a logic-based framework for XACML that addresses these issues. One component of the framework is a Datalog-based mapping for XACML v3.0 that provides a theoretical foundation for the language, namely a concise logic-based semantics and complexity results for full XACML and various fragments. Additionally, the mapping uncovers close relationships between XACML and other logic-based languages such as the Flexible Authorization Framework. The second component provides a practical foundation for the static analysis of expressive XACML policies: analysis services that detect semantic errors in, or differences between, policies before they are deployed. To provide these services, I present a mapping from XACML to the Web Ontology Language (OWL), the standardized language for representing the semantics of information on the Web; in particular, I focus on OWL-DL, a logic-based sublanguage of OWL. Finally, to demonstrate the practicality of using OWL-DL reasoners as policy analyzers, I have implemented an OWL-based XACML analyzer and performed extensive empirical evaluation using both real-world and synthetic policy sets.
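    The logic-based reading of access control policies can be illustrated with a minimal sketch: rules as (effect, condition) pairs and a deny-overrides combining algorithm as evaluation. This mirrors the flavour of mapping a policy language onto a logic with fixed conflict-resolution semantics; it is not the dissertation's actual Datalog or OWL-DL encoding, and the rules shown are hypothetical.

```python
# Minimal sketch of rule-based access control evaluation with a
# deny-overrides combining algorithm, in the spirit of XACML.
RULES = [
    # (effect, condition) pairs; conditions are predicates on a request.
    ("Permit", lambda r: r["role"] == "doctor" and r["resource"] == "record"),
    ("Deny",   lambda r: r["action"] == "delete"),
]

def evaluate(request, rules=RULES):
    """Deny-overrides: any matching Deny wins; else Permit; else NotApplicable."""
    effects = {eff for eff, cond in rules if cond(request)}
    if "Deny" in effects:
        return "Deny"
    if "Permit" in effects:
        return "Permit"
    return "NotApplicable"

print(evaluate({"role": "doctor", "resource": "record", "action": "read"}))    # Permit
print(evaluate({"role": "doctor", "resource": "record", "action": "delete"}))  # Deny
```

    Static analysis of the kind the dissertation describes would, for example, detect that the two rules above overlap (a doctor deleting a record matches both) before the policy is deployed.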