Technique for improving care integration models
Recent developments in technology and improved lifestyles have had a positive impact on prolonging human life, contributing to a growing elderly population. As a consequence, many countries (particularly developed ones) have begun to experience higher proportions of elderly people (over 65). This has generated a need for elderly care that necessitates the integration of health and social care to accommodate their complex needs. A number of modelling methods have been employed to assist those concerned with health and social care, albeit separately. The literature so far has identified several techniques that have been employed to model care integration. However, the literature also suggests that some challenges persist when modelling integrated care. It can be argued that these techniques are not capable of handling the complexities associated with the requirements of integrated systems. This paper examines why, despite the fact that many models of integrated care have been developed, problems still exist. Based on the literature, the problems arise from the unsuitable techniques used to model integrated care (IC) systems, as most of the developed models rely on a single technique. Therefore, a new technique to improve the care integration model is suggested.
Data-driven modelling of biological multi-scale processes
Biological processes involve a variety of spatial and temporal scales. A
holistic understanding of many biological processes therefore requires
multi-scale models which capture the relevant properties on all these scales.
In this manuscript we review mathematical modelling approaches used to describe
the individual spatial scales and how they are integrated into holistic models.
We discuss the relation between spatial and temporal scales and the implication
of that on multi-scale modelling. Based upon this overview over
state-of-the-art modelling approaches, we formulate key challenges in
mathematical and computational modelling of biological multi-scale and
multi-physics processes. In particular, we consider the availability of
analysis tools for multi-scale models and model-based multi-scale data
integration. We provide a compact review of methods for model-based data
integration and model-based hypothesis testing. Furthermore, novel approaches
and recent trends are discussed, including computation time reduction using
reduced order and surrogate models, which contribute to the solution of
inference problems. We conclude the manuscript by providing a few ideas for the
development of tailored multi-scale inference methods.
Comment: This manuscript will appear in the Journal of Coupled Systems and Multiscale Dynamics (American Scientific Publishers).
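The surrogate-model idea mentioned in the abstract above (replacing an expensive simulator with a cheap approximation inside an inference loop) can be sketched in a few lines. This is a minimal illustration under assumed choices: the "expensive" model is a stand-in function, the surrogate is a cubic polynomial fit, and inference is a grid-based least-squares search; none of these specifics come from the paper.

```python
import numpy as np

# Stand-in for an expensive multi-scale forward model (hypothetical).
def expensive_model(theta):
    return np.exp(-theta) + theta

# Build a cheap polynomial surrogate from a few expensive evaluations.
train_theta = np.linspace(0.0, 3.0, 8)
train_out = np.array([expensive_model(t) for t in train_theta])
surrogate = np.poly1d(np.polyfit(train_theta, train_out, deg=3))

# Use the surrogate (not the expensive model) inside the inference loop:
# grid-based least-squares fit of theta to a single noisy observation.
observed = expensive_model(1.7) + 0.01
grid = np.linspace(0.0, 3.0, 3001)
theta_hat = grid[np.argmin((surrogate(grid) - observed) ** 2)]
print(theta_hat)
```

The inference loop never calls `expensive_model`, which is the source of the computation-time reduction the abstract refers to; the trade-off is the surrogate's approximation error, which biases the estimate slightly.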
Advances in computational modelling for personalised medicine after myocardial infarction
Myocardial infarction (MI) is a leading cause of premature morbidity and mortality worldwide. Determining which patients will experience heart failure and sudden cardiac death after an acute MI is notoriously difficult for clinicians. The extent of heart damage after an acute MI is informed by cardiac imaging, typically using echocardiography or, sometimes, cardiac magnetic resonance (CMR). These scans provide complex data sets that are only partially exploited by clinicians in daily practice, implying potential for improved risk assessment. Computational modelling of left ventricular (LV) function can bridge the gap towards personalised medicine using cardiac imaging in post-MI patients. Several novel biomechanical parameters have theoretical prognostic value and may be useful in reflecting the biomechanical effects of novel preventive therapy for adverse remodelling post-MI. These parameters include myocardial contractility (regional and global), stiffness and stress. Further, the parameters can be delineated spatially to correspond with infarct pathology and the remote zone. While these parameters hold promise, there are challenges in translating MI modelling into clinical practice, including model uncertainty, validation and verification, as well as time-efficient processing. More research is needed to (1) simplify imaging with CMR in post-MI patients, while preserving diagnostic accuracy and patient tolerance, and (2) assess and validate novel biomechanical parameters against established prognostic biomarkers, such as LV ejection fraction and infarct size. Accessible software packages with minimal user interaction are also needed. Translating benefits to patients will be achieved through a multidisciplinary approach including clinicians, mathematicians, statisticians and industry partners.
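To give a sense of the wall-stress parameter mentioned above: a common first-order textbook estimate is the thin-walled Laplace relation, sigma = P * r / (2 * h). This is only an illustration with ballpark values; the models the abstract describes use full finite-element mechanics, not this formula.

```python
# First-order LV wall-stress estimate via the thin-walled Laplace relation.
# Illustrative only: real post-MI models resolve regional stress with
# finite-element mechanics rather than a single spherical approximation.
def laplace_wall_stress(pressure_kpa, radius_mm, wall_thickness_mm):
    # sigma = P * r / (2 * h); units cancel to give stress in kPa.
    return pressure_kpa * radius_mm / (2.0 * wall_thickness_mm)

# Ballpark end-systolic values: P = 16 kPa (~120 mmHg), r = 25 mm, h = 10 mm.
stress = laplace_wall_stress(16.0, 25.0, 10.0)
print(stress)  # 20.0 (kPa)
```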
A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows
This thesis considers an application of a temporal theory to describe and model the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic but dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare issues. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD) and the business process modelling notation (BPMN), are intuitive but imprecise. They cannot fully capture the complexities of the types of activities and the full extent of temporal constraints to an extent where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain.
Additionally, current modelling standards do not offer any formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g. the finish-start barrier. It is imperative to specify the temporal constraints between the start and/or end of a process, e.g. the beginning of a process A precedes the start (or end) of a process B. However, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help in uncovering the complexities of a system and assist in modelling it in a consistent way, which is not possible with the existing modelling techniques.
The above issues are addressed in this thesis by proposing a framework that provides a knowledge base to model patient flows for accurate representation, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, exhaustive temporal constraints derived from the proposed axiomatic system's components serve as a knowledge base.
The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while the corresponding instance is considered as its application. Using this approach assists in identifying the core components of the system and their precise operation, representing a real-life domain deemed suitable to the process modelling issues specified in this thesis. Thus, I have evaluated the modelling standards for their most-used terminologies and constructs to identify their key components. This also assists in the generalisation of the critical terms (of process modelling standards) based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, a resolution-based theorem proof is used to show the structural features of the theory (model) and to establish that it is sound and complete.
After establishing that the theory is sound and complete, the next step is to provide an instantiation of the theory. This is achieved by mapping the core components of the theory to their corresponding instances. Additionally, a formal graphical tool termed the point graph (PG) is used to visualise the cases of the proposed axiomatic system. The PG facilitates modelling and scheduling of patient flows and enables the analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*), based on the semantics presented by the axiomatic system.
A real-life case (from the King's College Hospital accident and emergency (A&E) department's trauma patient pathway) is considered to validate the framework. It is divided into three patient flows to depict the journey of a patient with significant trauma: arriving at A&E, undergoing a procedure and subsequently being discharged. The department's staff relied upon UML AD and BPMN to model the patient flows. An evaluation of their representation is presented to show the shortfalls of the modelling standards in modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
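The point-based reasoning the thesis describes can be illustrated with a toy consistency check: each activity contributes start/end points, "before" constraints form a directed graph, and a cycle signals an inconsistent model. This is a minimal sketch in the spirit of a point graph; the names and the cycle-detection shortcut are illustrative assumptions, not the thesis's actual PG formalism or inference mechanism.

```python
from collections import defaultdict

def consistent(points, before):
    """Return True iff the 'a precedes b' constraints admit an ordering,
    i.e. the precedence graph over the points is acyclic."""
    graph = defaultdict(list)
    for a, b in before:                 # edge a -> b means a precedes b
        graph[a].append(b)
    WHITE, GREY, BLACK = 0, 1, 2       # unvisited / on stack / done
    colour = {p: WHITE for p in points}

    def has_cycle(p):
        colour[p] = GREY
        for q in graph[p]:
            if colour[q] == GREY or (colour[q] == WHITE and has_cycle(q)):
                return True
        colour[p] = BLACK
        return False

    return not any(colour[p] == WHITE and has_cycle(p) for p in points)

pts = ["triage.start", "triage.end", "scan.start", "scan.end"]
ok = consistent(pts, [("triage.start", "triage.end"),
                      ("triage.end", "scan.start"),
                      ("scan.start", "scan.end")])
bad = consistent(pts, [("triage.start", "triage.end"),
                       ("triage.end", "scan.start"),
                       ("scan.start", "scan.end"),
                       ("scan.end", "triage.start")])   # closes a cycle
print(ok, bad)  # True False
```

The second constraint set makes the scan finish before triage starts while also requiring the opposite order, which is exactly the kind of inconsistency a reasoning mechanism over points can surface automatically.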
A survey of simulation techniques in commerce and defence
Despite the developments in Modelling and Simulation (M&S) tools and techniques over the past years, there has been a gap in M&S research and practice in healthcare: no toolkit exists to assist modellers and simulation practitioners with selecting an appropriate set of techniques. This study is a preliminary step towards this goal. This paper presents some results from a systematic literature survey on applications of M&S in the commerce and defence domains that could inspire improvements in healthcare. Interim results show that in the commercial sector Discrete-Event Simulation (DES) has been the most widely used technique, with System Dynamics (SD) in second place. However, in the defence sector, SD has gained relatively more attention. SD has been found quite useful for qualitative and soft-factors analysis. From both surveys it becomes clear that there is a growing trend towards using hybrid M&S approaches.
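The DES technique the survey above identifies as dominant amounts to processing a time-ordered event list. A minimal skeleton, under assumed simplifications (single FIFO server, deterministic arrival and service times rather than sampled ones), looks like this:

```python
import heapq

def simulate(arrivals, service_time):
    """Single-server FIFO queue: pop events in time order, record waits."""
    events = [(t, "arrive") for t in arrivals]
    heapq.heapify(events)               # the DES event list
    server_free_at = 0.0
    waits = []
    while events:
        t, kind = heapq.heappop(events)
        if kind == "arrive":
            start = max(t, server_free_at)   # wait if the server is busy
            waits.append(start - t)
            server_free_at = start + service_time
    return waits

print(simulate([0.0, 1.0, 2.0], 1.5))  # [0.0, 0.5, 1.0]
```

A real healthcare study would sample interarrival and service times from fitted distributions and add resources (beds, staff) as further event types, but the event-list loop stays the same.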
Genetic and Neuroanatomical Support for Functional Brain Network Dynamics in Epilepsy
Focal epilepsy is a devastating neurological disorder that affects an
overwhelming number of patients worldwide, many of whom prove resistant to
medication. The efficacy of current innovative technologies for the treatment
of these patients has been stalled by the lack of accurate and effective
methods to fuse multimodal neuroimaging data to map anatomical targets driving
seizure dynamics. Here we propose a parsimonious model that explains how
large-scale anatomical networks and shared genetic constraints shape
inter-regional communication in focal epilepsy. In extensive ECoG recordings
acquired from a group of patients with medically refractory focal-onset
epilepsy, we find that ictal and preictal functional brain network dynamics can
be accurately predicted from features of brain anatomy and geometry, patterns
of white matter connectivity, and constraints complicit in patterns of gene
coexpression, all of which are conserved across healthy adult populations.
Moreover, we uncover evidence that markers of non-conserved architecture,
potentially driven by idiosyncratic pathology of single subjects, are most
prevalent in high frequency ictal dynamics and low frequency preictal dynamics.
Finally, we find that ictal dynamics are better predicted by white matter
features and more poorly predicted by geometry and genetic constraints than
preictal dynamics, suggesting that the functional brain network dynamics
manifest in seizures rely on - and may directly propagate along - underlying
white matter structure that is largely conserved across humans. Broadly, our
work offers insights into the generic architectural principles of the human
brain that impact seizure dynamics, and could be extended to further our
understanding, models, and predictions of subject-level pathology and response
to intervention
Seizure-onset mapping based on time-variant multivariate functional connectivity analysis of high-dimensional intracranial EEG: a Kalman filter approach
The visual interpretation of intracranial EEG (iEEG) is the standard method used in complex epilepsy surgery cases to map the regions of seizure onset targeted for resection. Still, visual iEEG analysis is labor-intensive and biased due to interpreter dependency. Multivariate parametric functional connectivity measures using adaptive autoregressive (AR) modeling of the iEEG signals based on the Kalman filter algorithm have been used successfully to localize the electrographic seizure onsets. Due to their high computational cost, these methods have been applied to a limited number of iEEG time-series (< 60). The aim of this study was to test two Kalman filter implementations, a well-known multivariate adaptive AR model (Arnold et al. 1998) and a simplified, computationally efficient derivation of it, for their potential application to connectivity analysis of high-dimensional (up to 192 channels) iEEG data. When used on simulated seizures together with a multivariate connectivity estimator, the partial directed coherence, the two AR models were compared for their ability to reconstitute the designed seizure signal connections from noisy data. Next, focal seizures from iEEG recordings (73-113 channels) in three patients rendered seizure-free after surgery were mapped with the outdegree, a graph-theory index of outward directed connectivity. Simulation results indicated high levels of mapping accuracy for the two models in the presence of low-to-moderate noise cross-correlation. Accordingly, both AR models correctly mapped the real seizure onset to the resection volume. This study supports the possibility of conducting fully data-driven multivariate connectivity estimations on high-dimensional iEEG datasets using the Kalman filter approach
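The Kalman-filter recursion behind adaptive AR modelling can be shown in a scalar toy version: track a slowly drifting AR(1) coefficient a_t in x_t = a_t * x_{t-1} + e_t. This is a deliberately simplified sketch; the study above uses multivariate adaptive AR models over up to 192 iEEG channels, and the noise variances here are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) signal whose coefficient drifts slowly over time.
n = 2000
true_a = np.linspace(0.2, 0.8, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = true_a[t] * x[t - 1] + rng.normal(scale=0.1)

# Kalman filter: state = AR coefficient, observation x_t = a_t * x_{t-1} + e_t.
a_hat, p = 0.0, 1.0        # state estimate and its variance
q, r = 1e-4, 0.1 ** 2      # assumed process / measurement noise variances
est = np.zeros(n)
for t in range(1, n):
    p += q                              # predict: variance grows
    h = x[t - 1]                        # time-varying observation "matrix"
    k = p * h / (h * h * p + r)         # Kalman gain
    a_hat += k * (x[t] - h * a_hat)     # correct with the innovation
    p *= (1.0 - k * h)
    est[t] = a_hat

print(est[-1])
```

The per-sample update cost is what makes the simplified derivations discussed in the abstract attractive: in the multivariate case the state holds a full coefficient matrix per model order, and the update cost grows rapidly with channel count.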
Comparison of panel codes for aerodynamic analysis of airfoils
The purpose of this study is to create an overview of the currently most used panel codes for computing the aerodynamic characteristics of 2D airfoils. It provides a description of the basic principles of the panel method, a comparison of various implementations, and an evaluation of their capabilities (accuracy, applicability) for typical tasks. Three different panel codes were used in this thesis: Xfoil, JavaFoil and XFLR5. The thesis was enriched by measurements in a wind tunnel.