
    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    This thesis considers an application of a temporal theory to describe and model the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic yet dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare processes. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD), and business process modelling notation (BPMN), are intuitive but imprecise. They cannot fully capture the complexity of the activities involved or the full extent of the temporal constraints to a degree where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain. Additionally, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g. the finish-start barrier. It is imperative to specify the temporal constraints between the start and/or end of a process, e.g., the beginning of a process A precedes the start (or end) of a process B. However, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help in uncovering the complexities of a system and assist in modelling it in a consistent way, which is not possible with the existing modelling techniques. This thesis addresses the above issues by proposing a framework that provides a knowledge base for modelling patient flows accurately, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, the exhaustive temporal constraints derived from the components of the proposed axiomatic system serve as a knowledge base. The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while the corresponding instance is considered as its application. This approach assists in identifying the core components of the system and their precise operation, representing a real-life domain suited to the process modelling issues specified in this thesis. Thus, I have evaluated the modelling standards for their most-used terminologies and constructs to identify their key components. This also assists in the generalisation of the critical terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, resolution theorem proving is used to show the structural features of the theory (model) and to establish that it is sound and complete. After establishing that the theory is sound and complete, the next step is to provide the instantiation of the theory. This is achieved by mapping the core components of the theory to their corresponding instances.
Additionally, a formal graphical tool termed point graph (PG) is used to visualise the cases of the proposed axiomatic system. PG facilitates the modelling and scheduling of patient flows and enables the analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*) based on the semantics presented by the axiomatic system. A real-life case, from the trauma patient pathway of the King’s College Hospital accident and emergency (A&E) department, is considered to validate the framework. It is divided into three patient flows depicting the journey of a patient with significant trauma who arrives at A&E, undergoes a procedure and is subsequently discharged. The department's staff relied upon UML AD and BPMN to model the patient flows. An evaluation of their representation is presented to show the shortfalls of the modelling standards in modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
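As a rough illustration of the kind of point/interval reasoning the framework is built on, the Python sketch below (with hypothetical names, not taken from the thesis's actual PITL axiomatisation) models processes as intervals delimited by start and end points and checks a few relations, including the "start of A precedes start of B" constraint mentioned above and an overlap relation that a finish-start-only technique such as CPM/PERT cannot express directly.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Interval:
    """A process occupying a stretch of time, delimited by two points."""
    name: str
    start: float  # point at which the process begins
    end: float    # point at which the process ends

    def __post_init__(self):
        if self.end < self.start:
            raise ValueError(f"{self.name}: end point precedes start point")


def starts_before_start(a: Interval, b: Interval) -> bool:
    """The beginning of process a precedes the start of process b."""
    return a.start < b.start


def starts_before_end(a: Interval, b: Interval) -> bool:
    """The beginning of process a precedes the end of process b."""
    return a.start < b.end


def meets(a: Interval, b: Interval) -> bool:
    """a finishes exactly when b starts: the finish-start dependency
    that CPM/PERT-style scheduling is restricted to."""
    return a.end == b.start


def overlaps(a: Interval, b: Interval) -> bool:
    """a and b run concurrently for some time, a relation a pure
    finish-start formalism cannot state directly."""
    return a.start < b.end and b.start < a.end


# Toy patient-flow fragment: triage may begin before registration has ended.
registration = Interval("registration", start=0, end=10)
triage = Interval("triage", start=5, end=20)

assert starts_before_start(registration, triage)
assert overlaps(registration, triage)
assert not meets(registration, triage)
```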

    The INCF Digital Atlasing Program: Report on Digital Atlasing Standards in the Rodent Brain

    The goal of the INCF Digital Atlasing Program is to provide the vision and direction necessary to make the rapidly growing collection of multidimensional data of the rodent brain (images, gene expression, etc.) widely accessible to, and usable by, the international research community. The Digital Brain Atlasing Standards Task Force was formed in May 2008 to investigate the state of rodent brain digital atlasing and to formulate standards, guidelines, and policy recommendations.

Our first objective has been the preparation of a detailed document that includes the vision and a specific description of the infrastructure, systems and methods capable of serving the scientific goals of the community, as well as the practical issues involved in achieving those goals. This report builds on the 1st INCF Workshop on Mouse and Rat Brain Digital Atlasing Systems (Boline et al., 2007, _Nature Precedings_, doi:10.1038/npre.2007.1046.1) and includes a more detailed analysis of both the current state and the desired state of digital atlasing, along with specific recommendations for achieving these goals.

    An ontology co-design method for the co-creation of a continuous care ontology

    Ontology engineering methodologies tend to emphasize the role of the knowledge engineer or to require a very active role of domain experts. In this paper, a participatory ontology engineering method is described that holds the middle ground between these two 'extremes'. After thorough ethnographic research, an interdisciplinary group of domain experts closely interacted with ontology engineers and social scientists in a series of workshops. Once a preliminary ontology had been developed, a dynamic care request system was built using the ontology. Additional workshops were organized involving a broader group of domain experts to ensure the applicability of the ontology across continuous care settings. The proposed method succeeded in actively engaging domain experts in constructing the ontology without overburdening them. Its applicability is illustrated by presenting the co-created continuous care ontology. The lessons learned during the design and execution of the approach are also presented.

    Towards defining semantic foundations for purpose-based privacy policies

    We define a semantic model for purpose, based on which purpose-based privacy policies can be meaningfully expressed and enforced in a business system. The model is based on the intuition that the purpose of an action is determined by its situation among other inter-related actions. Actions and their relationships can be modeled in the form of an action graph, which is based on the business processes in a system. Accordingly, a modal logic and a corresponding model checking algorithm are developed for the formal expression of purpose-based policies and for verifying whether a particular system complies with them. It is also shown through various examples how typical purpose-based policies, as well as some new policy types, can be expressed and checked using our model.
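As a hedged illustration of the action-graph intuition (not the paper's modal logic or model-checking algorithm), the Python sketch below approximates the purpose of a data-collection action by the set of actions reachable from it in the business process, and checks a simple policy stating which purposes those downstream actions may carry. All action and purpose names are invented for the example.

```python
from collections import defaultdict, deque


class ActionGraph:
    """Actions as nodes, control/data flow between actions as edges."""

    def __init__(self):
        self.edges = defaultdict(set)   # action -> successor actions
        self.labels = {}                # action -> declared purpose label

    def add_action(self, action, purpose):
        self.labels[action] = purpose
        self.edges.setdefault(action, set())

    def add_flow(self, src, dst):
        self.edges[src].add(dst)

    def reachable(self, action):
        """All actions downstream of `action` in the process."""
        seen, queue = set(), deque([action])
        while queue:
            current = queue.popleft()
            for nxt in self.edges[current]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen


def complies(graph, collection_action, allowed_purposes):
    """A simple purpose-based policy: every action downstream of the point
    where data is collected must carry an allowed purpose label."""
    return all(graph.labels[a] in allowed_purposes
               for a in graph.reachable(collection_action))


# Toy business process.
g = ActionGraph()
for action, purpose in [("collect_email", "registration"),
                        ("send_receipt", "order_processing"),
                        ("send_ads", "marketing")]:
    g.add_action(action, purpose)
g.add_flow("collect_email", "send_receipt")
g.add_flow("collect_email", "send_ads")

# False: the collected data also flows into a marketing action.
print(complies(g, "collect_email", {"order_processing"}))
```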

    Knowledge-Intensive Processes: Characteristics, Requirements and Analysis of Contemporary Approaches

    Engineering of knowledge-intensive processes (KiPs) is far from being mastered, since they are genuinely knowledge- and data-centric and require substantial flexibility at both design time and run time. In this work, starting from an analysis of the scientific literature in the area of KiPs and from three real-world domains and application scenarios, we provide a precise characterization of KiPs. Furthermore, we devise some general requirements related to KiPs management and execution. These requirements contribute to the definition of an evaluation framework to assess current system support for KiPs. To this end, we present a critical analysis of a number of existing process-oriented approaches by discussing their efficacy against the requirements.

    Compliance of Semantic Constraints - A Requirements Analysis for Process Management Systems

    Key to the use of process management systems (PrMS) in practice is their ability to facilitate the implementation, execution, and adaptation of business processes while still being able to ensure error-free process executions. Mechanisms have been developed to prevent errors at the syntactic level, such as deadlocks. In many application domains, however, processes also have to comply with business-level rules and policies (i.e., semantic constraints). Hence, in order to ensure error-free executions at the semantic level, PrMS need control mechanisms for validating and ensuring compliance with semantic constraints throughout the process lifecycle. In this paper, we discuss fundamental requirements for comprehensive support of semantic constraints in PrMS. Moreover, we provide a survey of existing approaches and discuss to what extent they meet the requirements and which challenges still have to be tackled. Finally, we show how the challenge of lifetime compliance can be addressed by integrating design-time and runtime process validation.
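To make the notion of semantic constraints concrete, the short Python sketch below checks two illustrative business-level rules (a precedence rule and a mutual-exclusion rule) against a single execution trace. The constraint forms, activity names, and trace format are assumptions made for illustration; they are not taken from the surveyed PrMS approaches.

```python
def requires_precedence(trace, later, earlier):
    """Semantic constraint: every occurrence of `later` must be
    preceded by at least one occurrence of `earlier`."""
    seen_earlier = False
    for activity in trace:
        if activity == earlier:
            seen_earlier = True
        elif activity == later and not seen_earlier:
            return False
    return True


def mutually_exclusive(trace, a, b):
    """Semantic constraint: activities a and b must never both occur
    within the same case."""
    return not (a in trace and b in trace)


# One completed case, recorded as an ordered list of executed activities.
trace = ["admit_patient", "order_drug_x", "administer_drug_x", "discharge"]

checks = [
    requires_precedence(trace, "administer_drug_x", "order_drug_x"),
    mutually_exclusive(trace, "administer_drug_x", "administer_drug_y"),
]
print(all(checks))  # True: this trace satisfies both constraints
```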

    Research in Business Process Management: A bibliometric analysis

    Business Process Management (BPM) contains several growing subtopics such as process mining, process flexibility and process compliance. BPM is also highly relevant for numerous related fields, such as Business Intelligence, ERP systems or Knowledge Management. The growing number of publications and the variety of topics in BPM make it useful to apply bibliometric methods to this scientific field. With bibliometric methods, topical clusters, essential authors and the relationships between them can be discovered. In this work, the BibTechMon software from the Austrian Institute of Technology is utilized to perform the bibliometric analyses. As a novelty for work with BibTechMon, data from Google Scholar is used as the basis of the analyses. The nature of Google Scholar data differs significantly from that of other scientific databases. These differences lead to changes in how the bibliometric analyses can be performed. After assessing these changes, several bibliometric analyses of the BPM field and related fields are performed. As a result of these analyses, diverse topical clusters in BPM and its related fields could be discovered. Additionally, important authors for each cluster and for the BPM field as a whole were determined. In order to evaluate the results of the bibliometric analyses, I conducted an interview on BPM with Professor Reichert, who is an active researcher in the field. Subsequently, his statements are compared with the results of the bibliometric analyses, and the match between the two is assessed.
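As a minimal illustration of the keyword co-occurrence idea underlying such bibliometric clustering (not the BibTechMon implementation and not real Google Scholar data), the Python sketch below counts how often pairs of keywords appear together in a handful of invented BPM paper titles; densely co-occurring pairs would hint at topical clusters.

```python
from collections import Counter
from itertools import combinations

# Invented example titles standing in for a bibliographic corpus.
titles = [
    "process mining for business process compliance",
    "flexibility in business process management systems",
    "process mining and knowledge management",
    "compliance checking in ERP systems",
]

# Small, hand-picked keyword vocabulary for the illustration.
keywords = {"process mining", "compliance", "flexibility",
            "knowledge management", "erp"}


def extract(title):
    """Return the vocabulary keywords occurring in a title."""
    return {kw for kw in keywords if kw in title.lower()}


# Count how often pairs of keywords co-occur in the same title; frequent
# pairs (e.g. "process mining" + "compliance") suggest topical clusters.
cooccurrence = Counter()
for title in titles:
    for pair in combinations(sorted(extract(title)), 2):
        cooccurrence[pair] += 1

for pair, count in cooccurrence.most_common():
    print(pair, count)
```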