Trace alignment in process mining: Opportunities for process diagnostics
Abstract. Process mining techniques attempt to extract non-trivial knowledge and interesting insights from event logs. Process mining provides a welcome extension of the repertoire of business process analysis techniques and has been adopted in various commercial BPM systems (BPM|one, Futura Reflect, ARIS PPM, Fujitsu, etc.). Unfortunately, traditional process discovery algorithms have problems dealing with less-structured processes. The resulting models are difficult to comprehend or even misleading. Therefore, we propose a new approach based on trace alignment. The goal is to align traces in such a way that event logs can be explored easily. Trace alignment can be used in a preprocessing phase, where the event log is investigated or filtered, and in later phases, where detailed questions need to be answered. Hence, it complements existing process mining techniques that focus on discovery and conformance checking.
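To illustrate the basic idea behind trace alignment (a minimal pairwise sketch only; the paper develops multiple trace alignment with its own scoring), the following Python fragment aligns two traces with Needleman-Wunsch-style dynamic programming, inserting gaps so that shared activities line up. The example traces, scores and gap symbol are illustrative assumptions.

```python
# Hypothetical sketch: pairwise trace alignment via Needleman-Wunsch-style
# dynamic programming. Match/mismatch/gap scores are illustrative assumptions,
# not the scoring used in the paper.

def align_traces(trace_a, trace_b, match=2, mismatch=-1, gap=-1):
    n, m = len(trace_a), len(trace_b)
    # score[i][j] = best score for aligning trace_a[:i] with trace_b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if trace_a[i - 1] == trace_b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)

    # Traceback: rebuild the two aligned rows, inserting '-' for gaps.
    aligned_a, aligned_b = [], []
    i, j = n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + (
                match if trace_a[i - 1] == trace_b[j - 1] else mismatch):
            aligned_a.append(trace_a[i - 1]); aligned_b.append(trace_b[j - 1])
            i, j = i - 1, j - 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            aligned_a.append(trace_a[i - 1]); aligned_b.append('-')
            i -= 1
        else:
            aligned_a.append('-'); aligned_b.append(trace_b[j - 1])
            j -= 1
    return aligned_a[::-1], aligned_b[::-1]

print(align_traces(list("abcde"), list("abde")))
# (['a', 'b', 'c', 'd', 'e'], ['a', 'b', '-', 'd', 'e'])
```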
Enhancing workflow-nets with data for trace completion
The growing adoption of IT systems for modeling and executing (business) processes or services has driven scientific investigation towards techniques and tools that support more complex forms of process analysis. Many of them, such as conformance checking, process alignment, mining and enhancement, rely on complete observation of past (tracked and logged) executions. In many real cases, however, the lack of human or IT support for all the steps of process execution, as well as information hiding and abstraction of model and data, results in incomplete log information about both data and activities. This paper tackles the issue of automatically repairing traces with missing information by notably considering not only activities but also the data manipulated by them. Our technique recasts the problem as a reachability problem and provides an encoding in an action language that allows virtually any state-of-the-art planner to be used to return solutions.
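As a rough illustration of the reachability view (ignoring the data dimension that is central to the paper), the sketch below searches a toy workflow net for a complete firing sequence that contains an incomplete observed trace as a subsequence; the net, the markings and the example trace are invented for illustration.

```python
# Simplified sketch of trace repair as reachability: given a Petri-net-like
# workflow model (no data, unlike the paper), find a complete firing sequence
# from the initial to the final marking that embeds the observed, incomplete
# trace as a subsequence. Net, markings and example trace are assumptions.
from collections import deque

# Transitions: name -> (consumed places, produced places)
NET = {
    "register": ({"start"}, {"p1"}),
    "check":    ({"p1"},    {"p2"}),
    "decide":   ({"p2"},    {"p3"}),
    "notify":   ({"p3"},    {"end"}),
}
INITIAL, FINAL = frozenset({"start"}), frozenset({"end"})

def repair(observed):
    """Return a complete firing sequence embedding `observed`, or None."""
    # State = (current marking, number of observed events already matched).
    start = (INITIAL, 0)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        (marking, matched), seq = queue.popleft()
        if marking == FINAL and matched == len(observed):
            return seq
        for t, (pre, post) in NET.items():
            if pre <= marking:                       # transition enabled
                new_marking = frozenset((marking - pre) | post)
                # The fired transition either matches the next observed event
                # or is treated as a repaired (missing) event.
                new_matched = matched + 1 if (
                    matched < len(observed) and observed[matched] == t) else matched
                state = (new_marking, new_matched)
                if state not in seen:
                    seen.add(state)
                    queue.append((state, seq + [t]))
    return None

print(repair(["register", "decide"]))
# ['register', 'check', 'decide', 'notify']  -- 'check' and 'notify' are repaired
```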
On Secure Workflow Decentralisation on the Internet
Decentralised workflow management systems are a new research area in which most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects of such systems, we follow a security-driven approach and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by the more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. The aim is to equip email users with a set of tools to manage the semantics of a message exchange, its contents, its participants and their roles in the exchange, in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols and considers how these protocols could be used to implement a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how they provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.
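As a small, hedged illustration of one such security building block (not one of the surveyed distributed protocols themselves), the following sketch signs and verifies a workflow hand-off message with Ed25519 using the Python cryptography package; the message format and field names are assumptions.

```python
# Illustrative building block only (the paper surveys full distributed security
# protocols): signing a workflow hand-off message with Ed25519 so the next
# participant in a decentralised exchange can verify who produced it.
# Requires the `cryptography` package; the message format is an assumption.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Sender side: sign the serialised work item before mailing it on.
sender_key = Ed25519PrivateKey.generate()
message = json.dumps({
    "workflow": "purchase-order", "task": "approve", "assignee": "reviewer@example.org",
}, sort_keys=True).encode()
signature = sender_key.sign(message)

# Receiver side: verify against the sender's published public key.
public_key = sender_key.public_key()
try:
    public_key.verify(signature, message)
    print("signature valid: hand-off accepted")
except InvalidSignature:
    print("signature invalid: hand-off rejected")
```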
Conformance checking using activity and trace embeddings
Conformance checking describes process mining techniques used to compare an event log and a corresponding process model. In this paper, we propose an entirely new approach to conformance checking based on neural network-based embeddings. These embeddings are vector representations of every activity/task present in the model and log, obtained via act2vec, a Word2vec-based model. Our novel conformance checking approach applies the Word Mover's Distance to the activity embeddings of traces in order to measure fitness and precision. In addition, we investigate a more efficiently calculated lower bound of the former metric, i.e., the Iterative Constrained Transfers measure. An alternative method using trace2vec, a Doc2vec-based model, to train and compare vector representations of the process instances themselves is also introduced. These methods are tested in different settings and compared to other conformance checking techniques, showing promising results.
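A rough sketch of the act2vec idea using gensim, not the authors' implementation: traces are treated as sentences of activity labels, Word2Vec learns one embedding per activity, and Word Mover's Distance compares log traces against model traces. The toy log, the model traces and all hyperparameters are assumptions; a fitness-like score would be built on top of these distances.

```python
# Hypothetical act2vec-style sketch using gensim (not the authors' code).
# Requires gensim; recent gensim versions need the POT package for wmdistance
# (older versions used pyemd).
from gensim.models import Word2Vec

# Toy event log: each trace is a sequence of activity labels (assumed data).
log_traces = [
    ["register", "check", "decide", "notify"],
    ["register", "check", "recheck", "decide", "notify"],
    ["register", "decide", "notify"],
]
# Traces replayable on the (hypothetical) process model.
model_traces = [
    ["register", "check", "decide", "notify"],
]

# Train activity embeddings on all available traces (the act2vec idea).
w2v = Word2Vec(
    sentences=log_traces + model_traces,
    vector_size=16,   # small dimension for a toy example
    window=3,
    min_count=1,
    sg=1,             # skip-gram
    epochs=200,
    seed=42,
)

# Word Mover's Distance between a log trace and a model trace: the cheaper it
# is to "move" one trace's activities onto the other's in embedding space, the
# closer the two behaviours are. A fitness-like score could aggregate, per log
# trace, the distance to the closest model trace.
for trace in log_traces:
    d = min(w2v.wv.wmdistance(trace, m) for m in model_traces)
    print(trace, "->", round(d, 3))
```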
Know What You Stream: Generating Event Streams from CPN Models in ProM 6
Abstract. The field of process mining is concerned with supporting the analysis, improvement and understanding of business processes. A range of promising techniques have been proposed for process mining tasks such as process discovery and conformance checking. However, there are challenges, originally stemming from the area of data mining, that have not been investigated extensively in the context of process mining. In particular, the incorporation of data stream mining techniques with respect to process mining has received little attention. In this paper, we present new developments that build on top of previous work related to the integration of data streams within the process mining framework ProM. We have developed means to use Coloured Petri Net (CPN) models as a basis for event stream generation. The newly introduced functionality greatly enhances the use of event streams in the context of process mining, as it allows us to be actively aware of the originating model of the event stream under analysis.
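The following is only a minimal Python illustration of the general idea, not the ProM/CPN implementation: several cases are simulated from the same known control flow and their (case id, activity) events are interleaved into one stream, so the originating model of the stream is known by construction. The model and the case count are illustrative assumptions.

```python
# Minimal illustration (not the ProM/CPN implementation) of turning a known
# model into an event stream: simulate several cases and emit their
# (case_id, activity) events in an interleaved order.
import random

# A trivial control flow standing in for the CPN model: a sequence with one choice.
def simulate_case(rng):
    trace = ["register", "check"]
    trace.append(rng.choice(["approve", "reject"]))
    trace.append("notify")
    return trace

def event_stream(num_cases=3, seed=1):
    rng = random.Random(seed)
    # Pending events per case, consumed one at a time in a random interleaving.
    pending = {f"case-{i}": simulate_case(rng) for i in range(num_cases)}
    while pending:
        case_id = rng.choice(sorted(pending))
        yield case_id, pending[case_id].pop(0)
        if not pending[case_id]:
            del pending[case_id]

for case_id, activity in event_stream():
    print(case_id, activity)
```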
Towards process instances building for spaghetti processes
Abstract. Process mining techniques aim at building a process model starting from an event log generated during the execution of the process. Classical process mining approaches have problems when dealing with spaghetti processes, i.e. processes with little or no structure, since they obtain very chaotic models. As a remedy, in previous work we proposed a methodology aimed at supporting the analysis of a spaghetti process by means of its most relevant subprocesses. Such an approach exploits graph-mining techniques, thus requiring the reconstruction of the set of process instances starting from the sequential traces stored in the event log. In the present work, we discuss the main problems related to process instance building in spaghetti contexts and introduce a proposal for extending a process instance building technique to address such issues.
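A hypothetical sketch of the instance-building step (not the authors' technique): a causal relation is derived from directly-follows counts over the whole log, and each sequential trace is turned into an instance graph by linking every event to the closest earlier event of a causally preceding activity.

```python
# Hypothetical sketch of instance-graph building: derive a causal relation from
# directly-follows counts over the whole log, then turn each sequential trace
# into a graph by adding an edge from every event to the closest earlier event
# whose activity is a causal predecessor. Illustration only.
from collections import Counter

def directly_follows(log):
    df = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

def causal_relation(df):
    # a -> b is causal if a is directly followed by b more often than the reverse.
    return {(a, b) for (a, b), n in df.items() if n > df.get((b, a), 0)}

def instance_graph(trace, causal):
    # Nodes are event indices; edges link each event to its latest causal predecessor(s).
    edges = set()
    for j, b in enumerate(trace):
        for a_label in {a for (a, x) in causal if x == b}:
            preds = [i for i in range(j) if trace[i] == a_label]
            if preds:
                edges.add((preds[-1], j))
    return edges

log = [
    ["a", "b", "c", "d"],
    ["a", "c", "b", "d"],   # b and c are concurrent, so no b->c or c->b causality
]
causal = causal_relation(directly_follows(log))
print(sorted(instance_graph(log[0], causal)))
# [(0, 1), (0, 2), (1, 3), (2, 3)]  -- a precedes both b and c; both precede d
```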
Verification of Logs - Revealing Faulty Processes of a Medical Laboratory
Abstract. If there is a suspicion of Lyme disease, a blood sample of a patient is sent to a medical laboratory. The laboratory performs a number of different blood examinations testing for antibodies against the Lyme disease bacteria. The total number of different examinations depends on the intermediate results of the blood count. The cost of each examination is paid by the health insurance company of the patient. To control and restrict the number of performed examinations, the health insurance companies provide a charges regulation document. If a health insurance company disagrees with the charges of a laboratory, it is the job of the public prosecution service to validate the charges according to the regulation document. In this paper we present a case study showing a systematic approach to reveal faulty processes of a medical laboratory. First, files produced by the information system of the respective laboratory are analysed and consolidated in a database. An excerpt from this database is translated into an event log describing a sequential language of events performed by the information system. With the help of the regulation document, this language can be split into two sets: the set of valid words and the set of faulty words. In a next step, we build a coloured Petri net model corresponding to the set of valid words, in the sense that only the valid words are executable in the Petri net model. In a last step, we translate the coloured Petri net into a PL/SQL program. This program can automatically reveal all faulty processes stored in the database.
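A miniature analogue of the valid/faulty split (not the paper's coloured Petri net or PL/SQL implementation): simplified, invented regulation rules are checked directly against each trace of billed examinations, partitioning the log language into valid and faulty words.

```python
# Miniature analogue (not the paper's CPN / PL/SQL implementation) of splitting
# the log language into valid and faulty words: each trace of billed
# examinations is checked against simplified, invented regulation rules, e.g. a
# confirmatory test may only be billed after a positive screening result.
def is_valid(trace):
    billed = set()
    for event in trace:
        if event == "confirmatory_test" and "screening_positive" not in billed:
            return False                      # confirmatory test without positive screening
        if event in billed and event != "screening_test":
            return False                      # same examination billed twice
        billed.add(event)
    return True

def split_log(log):
    valid = [t for t in log if is_valid(t)]
    faulty = [t for t in log if not is_valid(t)]
    return valid, faulty

log = [
    ["screening_test", "screening_positive", "confirmatory_test"],
    ["screening_test", "confirmatory_test"],                                     # faulty: no positive result
    ["screening_test", "screening_positive", "confirmatory_test", "confirmatory_test"],  # faulty: billed twice
]
valid, faulty = split_log(log)
print(len(valid), "valid,", len(faulty), "faulty")   # 1 valid, 2 faulty
```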
- …