Knowledge-Intensive Processes: Characteristics, Requirements and Analysis of Contemporary Approaches
Engineering of knowledge-intensive processes (KiPs) is far from being mastered, since they are genuinely knowledge- and data-centric and require substantial flexibility at both design time and run time. In this work, starting from an analysis of the scientific literature on KiPs and from three real-world domains and application scenarios, we provide a precise characterization of KiPs. Furthermore, we devise some general requirements related to KiP management and execution. These requirements contribute to the definition of an evaluation framework for assessing current system support for KiPs. To this end, we present a critical analysis of a number of existing process-oriented approaches, discussing their efficacy against the requirements.
Scalable Bayesian modeling, monitoring and analysis of dynamic network flow data
Traffic flow count data in networks arise in many applications, such as
automobile or aviation transportation, certain directed social network
contexts, and Internet studies. Using an example of Internet browser traffic
flow through site-segments of an international news website, we present
Bayesian analyses of two linked classes of models which, in tandem, allow fast,
scalable and interpretable Bayesian inference. We first develop flexible
state-space models for streaming count data, able to adaptively characterize
and quantify network dynamics efficiently in real-time. We then use these
models as emulators of more structured, time-varying gravity models that allow
formal dissection of network dynamics. This yields interpretable inferences on
traffic flow characteristics, and on dynamics in interactions among network
nodes. Bayesian monitoring theory defines a strategy for sequential model
assessment and adaptation in cases when network flow data deviates from
model-based predictions. Exploratory and sequential monitoring analyses of
evolving traffic on a network of web site-segments in e-commerce demonstrate
the utility of this coupled Bayesian emulation approach to analysis of
streaming network count data.
Comment: 29 pages, 16 figures
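As a rough illustration of the sequential filtering and monitoring ideas behind such dynamic count models, the sketch below implements a discount-weighted gamma-Poisson filter — a standard textbook simplification, not the paper's coupled emulation models; the discount factor `delta` and the four-standard-deviation monitoring threshold are illustrative choices.

```python
import numpy as np

def gamma_poisson_filter(counts, delta=0.95, alpha0=1.0, beta0=1.0):
    """Sequentially update a gamma posterior for a slowly varying
    Poisson rate, discounting past information by `delta` each step
    (a common device in Bayesian dynamic count models)."""
    alpha, beta = alpha0, beta0
    rates, flags = [], []
    for y in counts:
        # discount: inflate uncertainty so the rate can drift over time
        alpha *= delta
        beta *= delta
        # crude sequential monitor: flag counts far outside the
        # one-step-ahead predictive mean (predictive is negative binomial)
        pred_mean = alpha / beta
        pred_var = pred_mean * (1 + 1 / beta)
        flags.append(abs(y - pred_mean) > 4 * np.sqrt(pred_var))
        # conjugate update with the newly observed count
        alpha += y
        beta += 1
        rates.append(alpha / beta)
    return np.array(rates), np.array(flags)

# simulated traffic counts with an abrupt level shift at t = 50
rng = np.random.default_rng(0)
y = rng.poisson(lam=np.r_[np.full(50, 10.0), np.full(50, 25.0)])
rates, flags = gamma_poisson_filter(y)
print(rates[-1])  # the filtered rate adapts toward the post-shift level
```

The discount factor sets an effective memory of roughly 1/(1 - delta) observations, which is what lets the filter track regime changes like the level shift above.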
Understanding Complex Systems: From Networks to Optimal Higher-Order Models
To better understand the structure and function of complex systems,
researchers often represent direct interactions between components in complex
systems with networks, assuming that indirect influence between distant
components can be modelled by paths. Such network models assume that actual
paths are memoryless. That is, the way a path continues as it passes through a
node does not depend on where it came from. Recent studies of data on actual
paths in complex systems question this assumption and instead indicate that
memory in paths does have considerable impact on central methods in network
science. A growing research community working with so-called higher-order
network models addresses this issue, seeking to take advantage of information
that conventional network representations disregard. Here we summarise the
progress in this area and outline remaining challenges calling for more
research.
Comment: 8 pages, 4 figures
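The effect of memory in paths can be seen in a few lines: fit first-order (memoryless) and second-order transition counts to the same path data. The node labels and path counts below are made up for illustration.

```python
from collections import Counter

# Toy paths: travellers entering via "a" tend to exit toward "d",
# travellers entering via "b" tend to exit toward "e".
paths = [("a", "c", "d")] * 40 + [("b", "c", "e")] * 40 \
      + [("a", "c", "e")] * 10 + [("b", "c", "d")] * 10

first_order = Counter()   # counts for P(next | current)
second_order = Counter()  # counts for P(next | previous, current)
for p in paths:
    for i in range(len(p) - 1):
        first_order[(p[i], p[i + 1])] += 1
        if i >= 1:
            second_order[((p[i - 1], p[i]), p[i + 1])] += 1

# Memoryless model: from node c, both exits look equally likely.
print(first_order[("c", "d")], first_order[("c", "e")])              # 50 50
# Second-order model: the exit strongly depends on where we came from.
print(second_order[(("a", "c"), "d")], second_order[(("a", "c"), "e")])  # 40 10
```

A first-order network model would send random walkers from c to d and e with equal probability, erasing the correlation that the second-order counts preserve — exactly the information loss that higher-order models aim to avoid.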
Portinari: A Data Exploration Tool to Personalize Cervical Cancer Screening
Socio-technical systems play an important role in public health screening
programs to prevent cancer. Cervical cancer incidence has significantly
decreased in countries that developed systems for organized screening engaging
medical practitioners, laboratories and patients. The system automatically
identifies individuals at risk of developing the disease and invites them for a
screening exam or a follow-up exam conducted by medical professionals. A triage
algorithm in the system aims to reduce unnecessary screening exams for
individuals at low-risk while detecting and treating individuals at high-risk.
Despite the general success of screening, the triage algorithm is a
one-size-fits-all approach that is not personalized to the patient. This can
easily be observed in historical data from screening exams. Often patients rely
on personal factors to determine that they are either at high risk or not at
risk at all and take action at their own discretion. Can exploring patient
trajectories help hypothesize personal factors leading to their decisions? We
present Portinari, a data exploration tool to query and visualize future
trajectories of patients who have undergone a specific sequence of screening
exams. The web-based tool contains (a) a visual query interface, (b) a backend
graph database of events in patients' lives, and (c) trajectory visualization
using Sankey diagrams. We use Portinari to explore diverse trajectories of patients
following the Norwegian triage algorithm. The trajectories demonstrated
variable degrees of adherence to the triage algorithm and allowed
epidemiologists to hypothesize about the possible causes.
Comment: Conference paper published at ICSE 2017, Buenos Aires, in the Software Engineering in Society track. 10 pages, 5 figures
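A minimal sketch of the kind of query such a tool supports: given a sequence of screening events, tally what happens next across all matching patients — the counts a Sankey-style view would render. The event labels and the `future_trajectories` helper are hypothetical illustrations, not Portinari's actual data model or API.

```python
from collections import Counter

# Each patient history is an ordered list of screening outcomes
# (made-up event labels for illustration).
histories = {
    "p1": ["normal", "normal", "low_grade", "follow_up"],
    "p2": ["normal", "low_grade", "high_grade", "treatment"],
    "p3": ["normal", "low_grade", "normal"],
    "p4": ["normal", "low_grade", "high_grade", "treatment"],
}

def future_trajectories(histories, query):
    """Return the distribution of events immediately following every
    occurrence of the `query` sequence across all patient histories."""
    n = len(query)
    nxt = Counter()
    for events in histories.values():
        for i in range(len(events) - n):
            if events[i:i + n] == query:
                nxt[events[i + n]] += 1
    return nxt

# What follows a "normal" result and then a "low_grade" result?
print(future_trajectories(histories, ["normal", "low_grade"]))
# Counter({'high_grade': 2, 'normal': 1, 'follow_up': 1})
```

In the real tool this role is played by queries against the backend graph database; the sketch only shows the trajectory-counting idea.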
Security and confidentiality approach for the Clinical E-Science Framework (CLEF)
CLEF is an MRC sponsored project in the E-Science programme that aims to
establish policies and infrastructure for the next generation of integrated clinical and
bioscience research. One of the major goals of the project is to provide a
pseudonymised repository of histories of cancer patients that can be accessed by
researchers. Robust mechanisms and policies are needed to ensure that patient
privacy and confidentiality are preserved while delivering a repository of such
medically rich information for the purposes of scientific research. This paper
summarises the overall approach adopted by CLEF to meet data protection
requirements, including the data flows and pseudonymisation mechanisms that are
currently being developed. Intended constraints and monitoring policies that will
apply to research interrogation of the repository are also outlined. Once evaluated, it
is hoped that the CLEF approach can serve as a model for other distributed
electronic health record repositories to be accessed for research.
A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows
This thesis considers an application of a temporal theory to describe and model the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic yet dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare issues. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD), and business process modelling notation (BPMN), are intuitive but imprecise. They cannot fully capture the complexities of the types of activities and the full extent of temporal constraints to an extent where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain.
Additionally, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g., the finish-start barrier. It is imperative to specify the temporal constraints between the start and/or end of a process, e.g., that the beginning of a process A precedes the start (or end) of a process B. However, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help in uncovering the complexities of a system and assist in modelling it in a consistent way, which is not possible with the existing modelling techniques.
The above issues are addressed in this thesis by proposing a framework that provides a knowledge base for modelling patient flows accurately, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, exhaustive temporal constraints derived from the proposed axiomatic system's components serve as a knowledge base.
The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while the corresponding instance is considered as its application. Using this approach assists in identifying the core components of the system and their precise operation, representing a real-life domain deemed suitable to the process modelling issues specified in this thesis. Thus, I have evaluated the modelling standards for their most-used terminologies and constructs to identify their key components. This also assists in the generalisation of the critical terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, resolution theorem-proving is used to examine the structural features of the theory (model) and to establish that it is sound and complete.
After establishing that the theory is sound and complete, the next step is to provide an instantiation of the theory. This is achieved by mapping the core components of the theory to their corresponding instances. Additionally, a formal graphical tool termed the point graph (PG) is used to visualise the cases of the proposed axiomatic system. PG facilitates modelling and scheduling patient flows and enables the analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*), based on the semantics presented by the axiomatic system.
A real-life case (from the King's College Hospital accident and emergency (A&E) department's trauma patient pathway) is considered to validate the framework. It is divided into three patient flows to depict the journey of a patient with significant trauma: arriving at A&E, undergoing a procedure, and subsequently being discharged. Hospital staff relied upon UML AD and BPMN to model the patient flows. An evaluation of their representations is presented to show the shortfalls of these modelling standards in modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
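The point-interval style of reasoning the thesis builds on can be illustrated in miniature: each process occupies an interval delimited by two time points, and temporal constraints compare those points, so relations beyond the finish-start barrier (such as "the start of A precedes the start of B") are directly expressible. This is a toy sketch under those assumptions, not the thesis's axiomatic system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A process interval delimited by two time points."""
    start: float
    end: float
    def __post_init__(self):
        assert self.start < self.end, "a proper interval has start < end"

def precedes(a, b):
    """Finish-start: A ends strictly before B starts."""
    return a.end < b.start

def overlaps(a, b):
    """A starts first and ends inside B."""
    return a.start < b.start < a.end < b.end

def start_precedes_start(a, b):
    """Point-level constraint that relaxes the finish-start barrier."""
    return a.start < b.start

# hypothetical A&E activities, for illustration only
triage = Interval(0, 3)
xray = Interval(2, 5)
print(precedes(triage, xray))             # False: triage has not finished
print(overlaps(triage, xray))             # True
print(start_precedes_start(triage, xray)) # True: expressible despite overlap
```

Because the constraints are defined on points rather than whole intervals, a scheduler can admit orderings (triage starts before the x-ray, though the two overlap) that a pure finish-start formalism like CPM would rule out.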