
    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    This thesis considers an application of a temporal theory to describe and model the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic but dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare processes. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD), and business process modelling notation (BPMN), are intuitive but imprecise. They cannot fully capture the complexities of the types of activities and the full extent of temporal constraints to an extent where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain. Additionally, current modelling standards do not offer any formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g. the finish-start barrier. It is imperative to be able to specify temporal constraints between the starts and/or ends of processes, e.g. that the start of a process A precedes the start (or end) of a process B. However, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help uncover the complexities of a system and assist in modelling it in a consistent way, which is not possible with the existing modelling techniques. This thesis addresses the above issues by proposing a framework that provides a knowledge base for accurately modelling patient flows, based on point interval temporal logic (PITL), which treats points and intervals as primitives.
These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, the exhaustive set of temporal constraints derived from the proposed axiomatic system's components serves as a knowledge base. The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and treated as a model, while the corresponding instance is treated as its application. This approach assists in identifying the core components of the system and their precise operation, representing a real-life domain suited to the process modelling issues specified in this thesis. I have therefore evaluated the modelling standards for their most-used terminologies and constructs to identify their key components. This also assists in generalising the critical terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The resulting catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, a resolution theorem proof is used to show the structural features of the theory (model) and to establish that it is sound and complete. After establishing soundness and completeness, the next step is to provide an instantiation of the theory. This is achieved by mapping the core components of the theory to their corresponding instances. Additionally, a formal graphical tool termed the point graph (PG) is used to visualise cases of the proposed axiomatic system. The PG facilitates modelling and scheduling of patient flows and enables the analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL.
Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*), based on the semantics presented by the axiomatic system. A real-life case, the trauma patient pathway from the King's College Hospital accident and emergency (A&E) department, is considered to validate the framework. It is divided into three patient flows depicting the journey of a patient with significant trauma: arriving at A&E, undergoing a procedure, and subsequently being discharged. The department's staff relied upon UML AD and BPMN to model these patient flows. An evaluation of their representation is presented to show the shortfalls of the modelling standards in modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
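The point-and-interval idea the abstract describes can be illustrated informally. The following is a minimal Python sketch, not the thesis's formal PITL axiomatisation: process names and timings are hypothetical, processes are modelled as intervals, their start and end points are the primitives, and constraints relate those points directly, which lets it express relations (such as overlap) that a pure finish-start model cannot.

```python
class Process:
    """A process modelled as an interval whose start and end points are primitives."""
    def __init__(self, name, start, end):
        assert start < end, "an interval's start point must precede its end point"
        self.name, self.start, self.end = name, start, end

def precedes(p, q):
    """Point-level relation: point p strictly precedes point q."""
    return p < q

# Three steps of a hypothetical A&E patient flow (times in minutes).
triage    = Process("triage",    0, 10)
xray      = Process("x-ray",     15, 30)
discharge = Process("discharge", 45, 50)
flow = [triage, xray, discharge]

# The finish-start constraints that CPM/PERT can express ...
assert all(precedes(a.end, b.start) for a, b in zip(flow, flow[1:]))

# ... but point-based constraints can also express overlap, which a pure
# finish-start model cannot: a consultation that overlaps the x-ray.
consult = Process("consultation", 20, 40)
assert precedes(xray.start, consult.start) and precedes(consult.start, xray.end)
```

Because every constraint is a relation between two points, the same machinery covers both strict sequencing and partial overlap without special-casing either.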

    Modelling the pacemaker in event-B: towards methodology for reuse

    The cardiac pacemaker is one of the system modelling problems posed to the Formal Methods community by the Grand Challenge for Dependable Systems Evolution [JOW:06]. The pacemaker is an intricate safety-critical system that supports and moderates the dysfunctional heart's intrinsic electrical control system. This paper focusses on (i) the problem (requirements) domain specification and its mapping to solution (implementation) domain models, (ii) the significant commonality of behaviour between its many operating modes, emphasising the potential for reuse, and (iii) the development and verification of models. We introduce the problem and model three of the operating modes in the problem domain using a state machine notation. We then map each of these models into a solution domain state machine notation, designed as shorthand for a refinement-based solution domain development in the Event-B formal language and its RODIN toolkit.
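To give a feel for the kind of operating-mode behaviour being modelled, here is an illustrative guarded-transition state machine for a single VVI-like mode (ventricular pacing inhibited by a sensed intrinsic beat). The mode choice, timing value, and event names are assumptions for illustration only, not taken from the paper's Event-B development.

```python
LRI = 1000  # lower-rate interval in ms (hypothetical value)

class VVIMode:
    """Pace the ventricle only if no intrinsic beat is sensed within LRI."""
    def __init__(self):
        self.since_last_beat = 0  # ms since the last sensed or paced beat
        self.log = []

    def tick(self, ms, sensed):
        """Advance time by ms; sense inhibits pacing, timeout triggers it."""
        self.since_last_beat += ms
        if sensed:
            self.log.append("sense")   # intrinsic beat: pacing inhibited
            self.since_last_beat = 0
        elif self.since_last_beat >= LRI:
            self.log.append("pace")    # timeout: deliver a pacing pulse
            self.since_last_beat = 0

m = VVIMode()
m.tick(800, sensed=True)    # intrinsic beat arrives: no pulse needed
m.tick(1000, sensed=False)  # no beat within the LRI: pacemaker paces
assert m.log == ["sense", "pace"]
```

Several pacing modes share this sense-or-pace skeleton and differ mainly in guards and timing parameters, which is the commonality-for-reuse point the paper emphasises.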

    Model-driven performance evaluation for service engineering

    Service engineering and service-oriented architecture, as an integration and platform technology, are a recent approach to software systems integration. Software quality aspects such as performance are of central importance for the integration of heterogeneous, distributed service-based systems. Empirical performance evaluation is the process of measuring and calculating performance metrics of the implemented software. We present an approach for the empirical, model-based performance evaluation of services and service compositions in the context of model-driven service engineering. Temporal database theory is utilised for the empirical performance evaluation of model-driven developed service systems.
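Empirical performance evaluation, as described above, boils down to collecting timestamped measurements of service invocations and computing metrics from them. A minimal sketch under stated assumptions: the stand-in service, the metric choice, and the (timestamp, duration) record shape are all illustrative, not the paper's actual machinery, though the timestamped records are the kind of data temporal database theory would manage.

```python
import statistics
import time

def measure(service, calls):
    """Invoke the service repeatedly, recording (timestamp, duration) pairs."""
    samples = []
    for _ in range(calls):
        t0 = time.perf_counter()
        service()
        samples.append((time.time(), time.perf_counter() - t0))
    return samples

def mean_response_time(samples):
    """Average duration across the recorded samples."""
    return statistics.mean(d for _, d in samples)

samples = measure(lambda: sum(range(1000)), calls=5)  # stand-in "service"
assert len(samples) == 5
assert mean_response_time(samples) >= 0.0
```

Keeping the raw timestamped samples, rather than only the aggregate, is what allows later temporal queries such as "mean response time during a given interval".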

    Quality-aware model-driven service engineering

    Service engineering and service-oriented architecture, as an integration and platform technology, are a recent approach to software systems integration. Quality aspects ranging from interoperability to maintainability to performance are of central importance for the integration of heterogeneous, distributed service-based systems. Architecture models can substantially influence the quality attributes of the implemented software systems. Besides the benefits of explicit architectures for maintainability and reuse, architectural constraints such as styles, reference architectures and architectural patterns can influence observable software properties such as performance. Empirical performance evaluation is the process of measuring and evaluating the performance of implemented software. We present an approach for addressing the quality of services and service-based systems at the model level in the context of model-driven service engineering. The focus on architecture-level models is a consequence of the black-box character of services.

    Reconciling a component and process view

    In many cases we need to represent, at the same abstraction level, not only the components of a system but also the processes within it; if different frameworks are used for these two representations, the system model becomes hard to read and understand. We suggest a solution that closes this gap and reconciles the component and process views of system representation: a formal framework that offers the advantage of solving design problems for large-scale component systems. Comment: Preprint, 7th International Workshop on Modeling in Software Engineering (MiSE) at ICSE 201

    Semantic-based decision support for remote care of dementia patients

    This paper investigates the challenges in developing a semantic-based dementia care decision support system based on non-intrusive monitoring of the patient's behaviour. Semantic-based approaches are well suited to modelling context-aware scenarios such as dementia care systems, where dynamic observations of the patient's behaviour (occupant movement, equipment use) need to be analysed against semantic knowledge about the patient's condition (illness history, medical advice, known symptoms) in an integrated knowledge base. However, our research findings establish that a major challenge lies in the ability of semantic technologies to reason over the complex, interrelated events emanating from the behaviour-monitoring sensors in order to infer knowledge that assists medical advice. We address this problem by introducing a new approach that relies on propositional calculus modelling to segregate complex events that are amenable to semantic reasoning from events that require pre-processing outside the semantic engine before they can be reasoned upon. The event pre-processing activity also controls the timing of triggering the reasoning process, further improving the efficiency of the inference process. Using regression analysis, we evaluate the response time as the number of monitored patients increases and conclude that the overhead incurred on the response time of the prototype decision support system remains tolerable.
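The segregation idea can be sketched as a routing step: events satisfying a simple propositional test go straight to the semantic reasoner, while the rest are sent for pre-processing first. The predicate and the event fields below are illustrative assumptions, not the paper's actual event model.

```python
def is_atomic(event):
    """Propositional-style test (hypothetical): an event is amenable to
    direct semantic reasoning if it involves a single sensor reading
    with no temporal span."""
    return event["sensors"] == 1 and event["duration_s"] == 0

def route(events):
    """Split events into those sent to the reasoner and those needing
    pre-processing outside the semantic engine."""
    to_reasoner, to_preprocessing = [], []
    for e in events:
        (to_reasoner if is_atomic(e) else to_preprocessing).append(e)
    return to_reasoner, to_preprocessing

events = [
    {"id": "door_open", "sensors": 1, "duration_s": 0},
    {"id": "wandering", "sensors": 3, "duration_s": 120},  # composite event
]
direct, preprocess = route(events)
assert [e["id"] for e in direct] == ["door_open"]
assert [e["id"] for e in preprocess] == ["wandering"]
```

Routing composite events away from the reasoner, and only triggering reasoning once pre-processing completes, is what keeps the inference workload bounded as more patients are monitored.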

    An Improved Model for Relativistic Solar Proton Acceleration applied to the 2005 January 20 and Earlier Events

    This paper presents results on modelling the ground level response of the higher-energy protons for the 2005 January 20 ground level enhancement (GLE). This event, known as GLE 69, produced the highest intensity of relativistic solar particles since the famous event of 1956 February 23. The location of the recent X-ray and gamma-ray emission (N14 W61) was near the Sun-Earth connecting magnetic field lines, providing the opportunity to observe the acceleration source directly from Earth. We restrict our analysis to protons of energy greater than 450 MeV to avoid complications arising from transport processes that can affect the propagation of low-energy protons. In light of this revised approach we have reinvestigated two previous GLEs: those of 2000 July 14 (GLE 59) and 2001 April 15 (GLE 60). Within the limitations of the spectral forms employed, we find that from the peak (06:55 UT) to the decline (07:30 UT) phases of GLE 69, neutron monitor observations from 450 MeV to 10 GeV are best fitted by the Gallegos-Cruz & Perez-Peraza stochastic acceleration model. In contrast, the Ellison & Ramaty spectra did not fit the neutron monitor observations as well. This result suggests that for GLE 69 a stochastic process cannot be discounted as a mechanism for relativistic particle acceleration, particularly during the initial stages of this solar event. For GLE 59 we find evidence that more than one acceleration mechanism was present, consistent with shock and stochastic acceleration processes dominating at different times of the event. For GLE 60 we find that the Ellison & Ramaty spectra represent the neutron monitor observations better than the stochastic acceleration spectra. The results for GLEs 59 and 60 agree with our previous work. Comment: 42 pages, 10 figures, 10 tables, published in ApJ, August 200
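For context on one of the two spectral forms compared above: the Ellison & Ramaty shock-acceleration spectrum has the widely used form J(E) = A E^(-gamma) exp(-E/E0), a power law with an exponential rollover at energy E0. The sketch below evaluates that form; the parameter values are illustrative only, not the fitted values from this study.

```python
import math

def ellison_ramaty(E_gev, A=1.0, gamma=3.0, E0_gev=1.0):
    """Proton intensity J(E) = A * E**-gamma * exp(-E/E0), arbitrary units.
    Parameter values here are hypothetical, not fit results."""
    return A * E_gev ** (-gamma) * math.exp(-E_gev / E0_gev)

# The rollover steepens the spectrum beyond a pure power law: at 10 GeV
# (with E0 = 1 GeV) the intensity is suppressed by a factor exp(-10)
# relative to the bare E**-gamma trend.
power_law_only = 10.0 ** (-3.0)
assert ellison_ramaty(10.0) == power_law_only * math.exp(-10.0)
```

This rollover behaviour at the highest energies is precisely where the neutron monitor data (450 MeV to 10 GeV) discriminate between the shock and stochastic spectral shapes.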

    Audio-visual foreground extraction for event characterization

    This paper presents a new method for integrating audio and visual information for scene analysis in a typical surveillance scenario, using only one camera and one monaural microphone. Visual information is analysed by a standard visual background/foreground (BG/FG) modelling module, enhanced with a novelty detection stage and coupled with an audio BG/FG modelling scheme. The audio-visual association is performed on-line by exploiting the concept of synchrony. Experimental tests on the classification and clustering of events demonstrate the potential of the proposed approach, also in comparison with the results obtained using the single modalities.
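The synchrony concept can be sketched simply: an audio foreground event is associated with the visual foreground region whose activity co-occurs with it most strongly in time. The sketch below scores synchrony as normalised co-activation of binary per-frame foreground flags; this is an illustrative stand-in, not the paper's actual association algorithm, and the frame data are invented.

```python
def synchrony(audio_fg, visual_fg):
    """Fraction of frames where audio and visual foreground co-occur,
    relative to frames where either is active (a Jaccard-style score)."""
    both = sum(a and v for a, v in zip(audio_fg, visual_fg))
    either = sum(a or v for a, v in zip(audio_fg, visual_fg))
    return both / either if either else 0.0

# Per-frame foreground flags for one audio channel and two visual blobs.
audio  = [0, 1, 1, 1, 0, 0, 1, 1]
blob_a = [0, 1, 1, 0, 0, 0, 1, 1]   # moves when the sound occurs
blob_b = [1, 0, 0, 0, 1, 1, 0, 0]   # active only in silent frames

# The audio event is associated with the blob scoring higher synchrony.
assert synchrony(audio, blob_a) > synchrony(audio, blob_b)
```

Scoring association this way needs no calibration between the camera and the microphone, which is what makes an on-line, single-sensor-pair setup feasible.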