The Internet-of-Things Meets Business Process Management: Mutual Benefits and Challenges
The Internet of Things (IoT) refers to a network of connected devices
collecting and exchanging data over the Internet. These things can be
artificial or natural, and interact as autonomous agents forming a complex
system. In turn, Business Process Management (BPM) was established to analyze,
discover, design, implement, execute, monitor and evolve collaborative business
processes within and across organizations. While the IoT and BPM have been
regarded as separate topics in research and practice, we believe that the
management of IoT applications will strongly benefit from BPM concepts,
methods and technologies on the one hand; on the other hand, the IoT poses
challenges that will require enhancements and extensions of the current
state-of-the-art in the BPM field. In this paper, we question to what extent
these two paradigms can be combined, and we discuss the emerging challenges.
Improving data preparation for the application of process mining
Immersed in what is already known as the fourth industrial revolution, automation and data exchange are taking on a particularly relevant role in complex environments, such as industrial manufacturing or logistics. This digitisation and transition to the Industry 4.0 paradigm is leading experts to analyse business processes from new perspectives. Where management and business intelligence used to dominate separately, process mining emerges as a link, building a bridge between the two disciplines to unite and improve them. This new perspective on process analysis helps to improve strategic decision making and competitive capabilities. Process mining brings together the data and process perspectives in a single discipline that covers the entire spectrum of process management. Through process mining, and based on observations of their actual operations, organisations can understand the state of those operations, detect deviations, and improve their performance based on what they observe. In this way, process mining has become an ally, occupying a large part of current academic and industrial research.
However, although this discipline is receiving more and more attention, it presents severe application problems when it is implemented in real environments. The variety of input data in terms of form, content, semantics, and levels of abstraction makes the execution of process mining tasks in industry an iterative, tedious, and manual process, requiring multidisciplinary experts with extensive knowledge of the domain, process management, and data processing. Currently, although there are numerous academic proposals, there are no industrial solutions capable of automating these tasks. For this reason, in this thesis by compendium we address the problem of improving business processes in complex environments thanks to the study of the state-of-the-art and a set of proposals that improve relevant aspects in the life cycle of processes, from the creation of logs, log preparation, process quality assessment, and improvement of business processes.
Firstly, a systematic literature review was carried out for this thesis in order to gain in-depth knowledge of the state-of-the-art in this field, as well as of the different challenges the discipline faces. This analysis allowed us to detect a number of challenges that have not been addressed, or have received insufficient attention, three of which were selected as the objectives of this thesis. The first challenge concerns assessing the quality of the input data, known as event logs: whether a technique for improving the event log should be applied must be decided based on the quality level of the initial data. This thesis therefore presents a methodology and a set of metrics that support the expert in selecting which technique to apply to the data, according to the quality estimated at each moment; this selection problem is itself a challenge identified in our analysis of the literature. Likewise, a set of metrics for evaluating the quality of the resulting process models is also proposed, with the aim of assessing whether improving the quality of the input data has a direct impact on the final results.
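The metric-driven assessment of event-log quality described above can be illustrated with a small sketch. The indicators below (missing-value ratios and timestamp-ordering violations per case) are hypothetical examples of log-quality metrics, not the thesis's actual proposal:

```python
from collections import defaultdict

def log_quality_metrics(events):
    """Compute simple quality indicators for an event log.

    Each event is a dict with (possibly missing) keys:
    'case_id', 'activity', 'timestamp'.
    """
    total = len(events)
    missing_ts = sum(1 for e in events if not e.get("timestamp"))
    missing_act = sum(1 for e in events if not e.get("activity"))

    # Group events by case to check timestamp ordering within each trace.
    cases = defaultdict(list)
    for e in events:
        if e.get("case_id") is not None:
            cases[e["case_id"]].append(e)
    unordered = sum(
        1
        for trace in cases.values()
        if any(
            a.get("timestamp") and b.get("timestamp")
            and a["timestamp"] > b["timestamp"]
            for a, b in zip(trace, trace[1:])
        )
    )
    return {
        "missing_timestamp_ratio": missing_ts / total if total else 0.0,
        "missing_activity_ratio": missing_act / total if total else 0.0,
        "cases_with_order_violations": unordered,
    }
```

An expert could then compare such indicators against thresholds to decide which log-repair technique, if any, to apply before mining.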
The second challenge identified is the need to improve the input data used in the analysis of business processes. As in any data-driven discipline, the quality of the results strongly depends on the quality of the input data, so the second challenge to be addressed is improving the preparation of event logs. The contribution in this area is the application of natural language processing techniques to relabel activities from their textual descriptions, as well as the application of clustering techniques that help simplify the results, generating models that are more understandable from a human point of view.
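As a flavour of the clustering idea, the toy sketch below greedily groups activity labels by token overlap (Jaccard similarity), so near-duplicate labels collapse into one cluster. The greedy scheme and the threshold are illustrative assumptions, not the clustering technique actually used in the thesis:

```python
def token_jaccard(a, b):
    """Jaccard similarity between the token sets of two labels."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def cluster_labels(labels, threshold=0.5):
    """Greedily group labels whose token overlap with any existing
    cluster member reaches the threshold; otherwise open a new cluster."""
    clusters = []
    for label in labels:
        for cluster in clusters:
            if any(token_jaccard(label, member) >= threshold
                   for member in cluster):
                cluster.append(label)
                break
        else:
            clusters.append([label])
    return clusters
```

Each resulting cluster could then be mapped to a single canonical activity name, reducing the number of distinct labels the discovered model must show.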
Finally, the third challenge detected is related to process optimisation, so we contribute an approach for optimising the resources associated with business processes which, by including decision-making in the creation of flexible processes, enables significant cost reductions. Furthermore, all the proposals made in this thesis are designed and validated in collaboration with experts from different fields of industry, and have been evaluated through real case studies in public and private projects in collaboration with the aeronautical industry and the logistics sector.
Responsible Composition and Optimization of Integration Processes under Correctness Preserving Guarantees
Enterprise Application Integration deals with the problem of connecting
heterogeneous applications, and is the centerpiece of current on-premise, cloud
and device integration scenarios. For integration scenarios, structurally
correct composition of patterns into processes and improvements of integration
processes are crucial. In order to achieve this, we formalize compositions of
integration patterns based on their characteristics, and describe optimization
strategies that help to reduce the model complexity, and improve the process
execution efficiency using design time techniques. Using the formalism of timed
DB-nets - a refinement of Petri nets - we model integration logic features such
as control- and data flow, transactional data storage, compensation and
exception handling, and time aspects that are present in reoccurring solutions
as separate integration patterns. We then propose a realization of optimization
strategies using graph rewriting, and prove that the optimizations we consider
preserve both structural and functional correctness. We evaluate the
improvements on a real-world catalog of pattern compositions, containing over
900 integration processes, and illustrate the correctness properties in case
studies based on two of these processes.
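To give a flavour of what a correctness-preserving, complexity-reducing rewrite can look like, the sketch below fuses runs of adjacent message-filter patterns in a linear composition into a single filter with the conjoined predicate: fewer nodes, same set of messages passing through. This toy rule and list representation are illustrative only; the paper itself formalizes pattern compositions as timed DB-nets and realizes its optimization strategies via graph rewriting:

```python
def fuse_adjacent_filters(pipeline):
    """Rewrite a linear pattern composition, given as a list of
    (kind, function) pairs: fuse each run of adjacent 'filter'
    patterns into one filter whose predicate is the conjunction
    of the originals. The rewrite reduces the node count while
    letting exactly the same messages through."""
    result = []
    for kind, fn in pipeline:
        if kind == "filter" and result and result[-1][0] == "filter":
            prev = result[-1][1]
            # Conjoining the predicates preserves functional correctness.
            result[-1] = ("filter",
                          lambda m, p=prev, q=fn: p(m) and q(m))
        else:
            result.append((kind, fn))
    return result
```

A proof obligation in this style would show that, for every message, the rewritten composition produces the same output as the original, which is what the paper establishes for its optimization strategies.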
Integrating Process-Oriented and Event-Based Systems (Dagstuhl Seminar 16341)
This report documents the programme and the outcomes of Dagstuhl Seminar 16341
on "Integrating Process-Oriented and Event-Based Systems", which took place
August 21-26, 2016, at Schloss Dagstuhl - Leibniz Center for Informatics. The
seminar brought together researchers and practitioners from the communities
that have been established for research on process-oriented information systems
on the one hand, and event-based systems on the other hand. By exploring the
use of processes in event handling (from the distribution of event processing
to the assessment of event data quality), the use of events in processes (from
rich event semantics in processes to support for flexible BPM), and the role of
events in process choreographies, the seminar identified the diverse
connections between the scientific fields. This report summarises the outcomes
of the seminar by reviewing the state-of-the-art and outlining research
challenges at the intersection of process-oriented and event-based systems.