Mayflower - Explorative Modeling of Scientific Workflows with BPEL
Using workflows for scientific calculations, experiments and simulations has been a success story in many cases. Unfortunately, most existing scientific workflow systems implement proprietary, non-standardized workflow languages and do not take advantage of the achievements of conventional business workflow technology. It is only natural to combine these two research branches in order to harness the strengths of both. In this demonstration, we present Mayflower, a workflow environment that enables scientists to model workflows on the fly using extended business workflow technology. It supports the typical trial-and-error approach scientists follow when developing their experiments, computations or simulations, and provides scientists with all crucial characteristics of workflow technology. Additionally, and of benefit to business stakeholders, Mayflower further simplifies workflow development and debugging.
Advancements and Challenges in Object-Centric Process Mining: A Systematic Literature Review
Recent years have seen the emergence of object-centric process mining
techniques. Born as a response to the limitations of traditional process mining
in analyzing event data from prevalent information systems like CRM and ERP,
these techniques aim to tackle the deficiency, convergence, and divergence
issues seen in traditional event logs. Despite the promise, the adoption in
real-world process mining analyses remains limited. This paper embarks on a
comprehensive literature review of object-centric process mining, providing
insights into the current status of the discipline and its historical
trajectory.
A visual analysis of the process of process modeling
The construction of business process models has become an important requisite
in the analysis and optimization of processes. The success of the analysis and
optimization efforts heavily depends on the quality of the models. Therefore, a
research domain emerged that studies the process of process modeling. This
paper contributes to this research by presenting a way of visualizing the
different steps a modeler undertakes to construct a process model, in a
so-called Process of Process Modeling Chart. The graphical representation
lowers the cognitive efforts to discover properties of the modeling process,
which facilitates the research and the development of theory, training and tool
support for improving model quality. The paper contains an extensive overview
of applications of the tool that demonstrate its usefulness for research and
practice and discusses the observations from the visualization in relation to
other work. The visualization was evaluated through a qualitative study that
confirmed its usefulness and added value compared to the Dotted Chart, by which
the visualization was inspired.
Repairing Alignments of Process Models
Process mining represents a collection of data-driven techniques that support the analysis, understanding and improvement of business processes. A core branch of process mining is conformance checking, i.e., assessing to what extent a business process model conforms to observed business process execution data. Alignments are the de facto standard instrument to compute such conformance statistics. However, computing alignments is a combinatorial problem and hence extremely costly. At the same time, many process models share a similar structure and/or a great deal of behavior. For collections of such models, computing alignments from scratch is inefficient, since large parts of the alignments are likely to be the same. This paper presents a technique that exploits process model similarity and repairs existing alignments by updating those parts that do not fit a given process model. The technique effectively reduces the size of the combinatorial alignment problem, and hence decreases computation time significantly. Moreover, the potential loss of optimality is limited and stays within acceptable bounds.
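To make the notion of an alignment concrete, the following is a minimal illustrative sketch, not the paper's repair technique: it aligns an observed trace against a strictly sequential model (a plain list of activities) using edit-distance dynamic programming, where a synchronous move is free and a log-only or model-only move costs one. Real alignments are computed over Petri nets; the sequential restriction and all names here are assumptions for illustration.

```python
# Minimal alignment-cost sketch for a strictly sequential process model.
# Hypothetical simplification: real process models are Petri nets, not lists.
def align(trace, model):
    # Classic edit-distance DP: cost 0 for a synchronous move on matching
    # activities, cost 1 for a move on log only or on model only.
    n, m = len(trace), len(model)
    cost = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        cost[i][0] = i                     # only log moves remain
    for j in range(1, m + 1):
        cost[0][j] = j                     # only model moves remain
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sync = 0 if trace[i - 1] == model[j - 1] else float("inf")
            cost[i][j] = min(cost[i - 1][j] + 1,        # move on log only
                             cost[i][j - 1] + 1,        # move on model only
                             cost[i - 1][j - 1] + sync) # synchronous move
    return cost[n][m]

print(align(["a", "b", "d"], ["a", "b", "c", "d"]))  # 1: model-only move on "c"
```

A cost of 0 means the trace fits the model perfectly; the repair idea in the paper amounts to reusing such a table (or its Petri-net analogue) across similar models instead of recomputing it from scratch.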
Process mining meets model learning: Discovering deterministic finite state automata from event logs for business process analysis
Within the process mining field, Deterministic Finite State Automata (DFAs) are largely employed as foundational mechanisms to perform formal reasoning tasks over the information contained in event logs, such as conformance checking, compliance monitoring and cross-organization process analysis, to name a few. To support the above use cases, in this paper we investigate how to leverage Model Learning (ML) algorithms for the automated discovery of DFAs from event logs. DFAs can be used as a fundamental building block to support not only the development of process analysis techniques, but also the implementation of instruments supporting other phases of the Business Process Management (BPM) lifecycle, such as business process design and enactment. The quality of the discovered DFAs is assessed with respect to customized definitions of fitness, precision, generalization, and a standard notion of DFA simplicity. Finally, we use these metrics to benchmark ML algorithms against real-life and synthetically generated datasets, with the aim of studying their performance and investigating their suitability for the development of BPM tools.
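To illustrate how a discovered DFA supports such reasoning, here is a minimal sketch of replaying traces against a DFA, in the spirit of conformance checking; the class, state names, and the example language are assumptions for illustration, not the paper's implementation.

```python
# Tiny DFA representation and trace replay, as one might use an automaton
# discovered from an event log to check whether a trace conforms.
class DFA:
    def __init__(self, start, accepting, delta):
        self.start = start            # initial state
        self.accepting = accepting    # set of accepting states
        self.delta = delta            # dict: (state, activity) -> state

    def accepts(self, trace):
        state = self.start
        for activity in trace:
            key = (state, activity)
            if key not in self.delta: # no transition defined: trace deviates
                return False
            state = self.delta[key]
        return state in self.accepting

# Hypothetical DFA for the language "a b* c"
# (e.g. register, zero or more checks, then archive).
dfa = DFA(start=0, accepting={2},
          delta={(0, "a"): 1, (1, "b"): 1, (1, "c"): 2})
print(dfa.accepts(["a", "b", "b", "c"]))  # True: trace conforms
print(dfa.accepts(["a", "c", "b"]))       # False: "b" occurs after final "c"
```

Fitness and precision metrics of the kind the paper defines can then be computed by replaying every trace of a log through the automaton and aggregating how many are accepted or deviate.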
An Approach for Modeling and Coordinating Process Interactions
In any enterprise, different entities collaborate to achieve common business objectives. The processes used to reach these objectives have relations and, therefore, depend on each other. Their proper coordination within a process-aware information system requires coping with heterogeneous granularity of processes, unclear process relations, and increased process model complexity due to the integration of coordination constraints into process models. This paper presents the concept of coordination processes, which constitute a means to handle the interactions among a multitude of interdependent processes running asynchronously to each other. In particular, coordination processes leverage the clear identification of process relations, a defined granularity for processes, and the abstraction from details of the individual processes in order to provide a robust framework, enabling proper coordination support for interdependent processes.