    Fluent Logic Workflow Analyser: A Tool for the Verification of Workflow Properties

    In this paper we present the design and implementation, as well as a use case, of a tool for workflow analysis. The tool provides an assistant for the specification of properties of a workflow model. The specification language for property description is Fluent Linear Time Temporal Logic. Fluents provide adequate flexibility for capturing properties of workflows. Both the model and the properties are encoded, in an automated way, as Labelled Transition Systems, and the analysis is reduced to model checking. (In Proceedings LAFM 2013, arXiv:1401.056)
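
    To make the fluent idea concrete, here is a minimal sketch, assuming nothing about the tool's actual implementation: a fluent is initiated and terminated by designated actions, and its truth values along a finite trace can then be checked against a simple invariant. The trace and action names are hypothetical.

```python
# Sketch of fluent evaluation over a finite workflow trace (illustrative only,
# not the tool's implementation). A fluent becomes true after any of its
# initiating actions and false after any of its terminating actions; it is
# assumed to start out false.

def fluent_values(trace, initiating, terminating, initially=False):
    """Return the truth value of the fluent after each event in the trace."""
    value, values = initially, []
    for event in trace:
        if event in initiating:
            value = True
        elif event in terminating:
            value = False
        values.append(value)
    return values

# Hypothetical workflow trace and a fluent "OrderOpen".
trace = ["create_order", "approve", "ship", "close_order"]
order_open = fluent_values(trace, {"create_order"}, {"close_order"})

# An FLTL-flavoured check: whenever "ship" occurs, OrderOpen must hold.
print(all(order_open[i] for i, e in enumerate(trace) if e == "ship"))  # True
```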

    Compliance to data protection and purpose control using process mining technique

    The business processes of an organisation are executed within certain boundaries. Some of these restrictions arise from the environment of the organisation, such as regulatory and supervisory constraints. One of the regulations imposed on organisations is the European General Data Protection Regulation (GDPR). The most important aspect of the GDPR rules is how organisations handle the personal data of their customers. In this research, we focus on this aspect of the GDPR. Our goal is to develop a solution that enables organisations to deal with the challenges of becoming compliant with the GDPR. We plan to use and improve process mining techniques to tackle problems such as discovering the data flow and control flow of business processes that interact with customers' personal data. Our approach consists of four phases: (1) discover a process model based on purpose, (2) translate regulatory rules into technical rules, (3) develop a privacy policy model based on the GDPR, and (4) perform conformance analysis.
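
    As a rough illustration of phases (1) and (4), and not the authors' own tooling, the sketch below uses the open-source pm4py library to discover a model from the events recorded for one declared purpose and then replay the full log against it; the log file, the "purpose" attribute, and its values are assumptions.

```python
# Sketch of phases (1) and (4) with the pm4py library; not the authors' tool.
# The log file, the "purpose" event attribute, and its values are assumptions.
import pm4py

# Event log of activities that touch personal data (hypothetical path).
log = pm4py.read_xes("customer_data_handling.xes")

# Phase (1): keep only events recorded for one declared purpose and
# discover a purpose-specific process model from them.
marketing_log = pm4py.filter_event_attribute_values(
    log, "purpose", ["marketing"], level="event", retain=True)
net, im, fm = pm4py.discover_petri_net_inductive(marketing_log)

# Phase (4): replay the full log on the purpose-specific model; traces that
# do not fit hint at personal data being processed outside that purpose.
diagnostics = pm4py.conformance_diagnostics_token_based_replay(log, net, im, fm)
deviating = [d for d in diagnostics if not d["trace_is_fit"]]
print(f"{len(deviating)} of {len(diagnostics)} traces deviate from the declared purpose")
```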

    Predictive Monitoring of Business Processes

    Modern information systems that support complex business processes generally maintain significant amounts of process execution data, particularly records of events corresponding to the execution of activities (event logs). In this paper, we present an approach to analyze such event logs in order to predictively monitor business goals during business process execution. At any point during the execution of a process, the user can define business goals in the form of linear temporal logic rules. When an activity is being executed, the framework identifies input data values that are more (or less) likely to lead to the achievement of each business goal. Unlike reactive compliance monitoring approaches that detect violations only after they have occurred, our predictive monitoring approach provides early advice so that users can steer ongoing process executions towards the achievement of business goals. In other words, violations are predicted (and potentially prevented) rather than merely detected. The approach has been implemented in the ProM process mining toolset and validated on a real-life log pertaining to the treatment of cancer patients in a large hospital.
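
    The framework itself is implemented in ProM; the self-contained toy below only conveys the core idea of early advice: estimate from completed historical cases how likely a goal is to be achieved for each candidate input value at the current decision point. All activity names, values, and outcomes are invented.

```python
# Toy sketch of the idea behind predictive monitoring (not the ProM plug-in):
# estimate, from completed historical cases, how likely a business goal is to
# be achieved for each candidate data value at the current decision point.

# (value chosen at activity "triage", goal "eventually reach recovery" achieved?)
history = [
    ("protocol_A", True), ("protocol_A", True), ("protocol_A", False),
    ("protocol_B", False), ("protocol_B", False), ("protocol_B", True),
]

def goal_probability(history, candidate_value):
    """Empirical probability that the goal is eventually achieved for a value."""
    outcomes = [achieved for value, achieved in history if value == candidate_value]
    return sum(outcomes) / len(outcomes) if outcomes else None

# Early advice for a running case: protocol_A (~0.67) looks more promising
# than protocol_B (~0.33) with respect to the goal.
for value in ("protocol_A", "protocol_B"):
    print(value, goal_probability(history, value))
```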

    Web-Based Modelling and Collaborative Simulation of Declarative Processes.

    As a provider of Electronic Case Management solutions to knowledge-intensive businesses and organizations, the Danish company Exformatics has in recent years identified a need for flexible process support in the tools that we provide to our customers. We have addressed this need by adapting DCR Graphs, a formal declarative workflow notation developed at the IT University of Copenhagen. Through close collaboration with academia we first integrated execution support for the notation into our existing tools, by leveraging a cloud-based process engine implementing the DCR formalism. Over the last two years we have taken this adoption of DCR Graphs to the next level and decided to treat the notation as a product of its own by developing a stand-alone web-based collaborative portal for the modelling and simulation of declarative workflows. The purpose of the portal is to facilitate end-user discussions on how knowledge workers really work, by enabling collaborative simulation of processes. In earlier work we reported on the integration of DCR Graphs as a workflow execution formalism in the existing Exformatics ECM products. In this paper we report on the advances we have made over the last two years: we describe the new declarative process modelling portal, discuss its features, describe the process of its development, report on the findings of an initial evaluation of the tool's usability, resulting from a tutorial on declarative modelling with DCR Graphs that we organized at last year's BPM conference, and present our plans for the future.
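
    To give a flavour of the notation the portal is built around, here is a minimal sketch of DCR Graph run-time semantics restricted to condition and response relations over always-included events; the real formalism (and the portal's engine) is considerably richer, and the events below are made up.

```python
# Minimal sketch of DCR Graph run-time semantics, restricted to condition and
# response relations over always-included events (the full notation also has
# include, exclude, and milestone relations). Events below are hypothetical.

class DCRGraph:
    def __init__(self, events, conditions, responses):
        self.events = set(events)
        self.conditions = conditions  # event -> set of events that must precede it
        self.responses = responses    # event -> set of events it makes pending
        self.executed, self.pending = set(), set()

    def enabled(self, event):
        return self.conditions.get(event, set()) <= self.executed

    def execute(self, event):
        if not self.enabled(event):
            raise ValueError(f"{event} is not enabled")
        self.executed.add(event)
        self.pending.discard(event)
        self.pending |= self.responses.get(event, set())

    def accepting(self):
        # A run is accepting when no response obligation is left pending.
        return not self.pending

# "review" requires "receive"; "receive" obliges an eventual "archive".
g = DCRGraph({"receive", "review", "archive"},
             conditions={"review": {"receive"}},
             responses={"receive": {"archive"}})
g.execute("receive")
g.execute("review")
print(g.accepting())  # False: "archive" is still pending
g.execute("archive")
print(g.accepting())  # True
```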

    Declarative Modeling–An Academic Dream or the Future for BPM?

    Declarative modeling has attracted much attention in recent years, resulting in the development of several academic declarative modeling techniques and tools. The absence of empirical evaluations of their use and usefulness, however, raises the question of whether practitioners are attracted to using those techniques. In this paper, we present a study on what practitioners think of declarative modeling. We show that the practitioners we involved in this study are receptive to the idea of a hybrid approach combining imperative and declarative techniques, rather than making a full shift from the imperative to the declarative paradigm. Moreover, we report on requirements, use cases, limitations, and tool support for such a hybrid approach. Based on the insight gained, we propose a research agenda for the development of this novel modeling approach.

    Conformance Checking Based on Multi-Perspective Declarative Process Models

    Process mining is a family of techniques that aim at analyzing business process execution data recorded in event logs. Conformance checking is a branch of this discipline embracing approaches for verifying whether the behavior of a process, as recorded in a log, is in line with some expected behavior provided in the form of a process model. The majority of these approaches require the input process model to be procedural (e.g., a Petri net). However, in turbulent environments, characterized by high variability, the process behavior is less stable and predictable. In these environments, procedural process models are less suitable for describing a business process. Declarative specifications, working under an open-world assumption, allow the modeler to express several possible execution paths as a compact set of constraints. Any process execution that does not contradict these constraints is allowed. One of the open challenges in the context of conformance checking with declarative models is the capability of supporting multi-perspective specifications. In this paper, we close this gap by providing a framework for conformance checking based on MP-Declare, a multi-perspective version of the declarative process modeling language Declare. The approach has been implemented in the process mining tool ProM and has been evaluated in three real-life case studies.
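
    As a toy illustration of what the multi-perspective element adds over pure control flow, and not the ProM implementation described in the paper, the sketch below checks one MP-Declare-style response constraint on a single trace: a high-value order must eventually be approved by a manager. Activities, attributes, and the threshold are invented.

```python
# Toy checker for a single MP-Declare-style response constraint (not the ProM
# implementation): every activation whose payload satisfies the activation
# condition must be followed by a target event satisfying the correlation
# condition.

def check_response(trace, activation, target, activation_cond, correlation_cond):
    """Return indices of activations that are never properly fulfilled."""
    violations = []
    for i, event in enumerate(trace):
        if event["activity"] == activation and activation_cond(event):
            fulfilled = any(e["activity"] == target and correlation_cond(event, e)
                            for e in trace[i + 1:])
            if not fulfilled:
                violations.append(i)
    return violations

trace = [
    {"activity": "submit order", "amount": 12000},
    {"activity": "approve", "role": "clerk"},
    {"activity": "submit order", "amount": 300},
]
print(check_response(
    trace, activation="submit order", target="approve",
    activation_cond=lambda a: a["amount"] > 10000,              # activation condition
    correlation_cond=lambda a, t: t.get("role") == "manager"))  # correlation condition
# -> [0]: the high-amount order was never approved by a manager
```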

    Enabling Flexibility in Process-Aware Information Systems: Challenges, Methods, Technologies

    In today’s dynamic business world, the success of a company increasingly depends on its ability to react to changes in its environment in a quick and flexible way. Companies have therefore identified process agility as a competitive advantage to address business trends like increasing product and service variability or faster time to market, and to ensure business IT alignment. Along this trend, a new generation of information systems has emerged—so-called process-aware information systems (PAIS), like workflow management systems, case handling tools, and service orchestration engines. With this book, Reichert and Weber address these flexibility needs and provide an overview of PAIS with a strong focus on methods and technologies fostering flexibility for all phases of the process lifecycle (i.e., modeling, configuration, execution, and evolution). Their presentation is divided into six parts. Part I starts with an introduction to fundamental PAIS concepts and establishes the context of process flexibility in the light of practical scenarios. Part II focuses on flexibility support for pre-specified processes, the currently predominant paradigm in the field of business process management (BPM). Part III details flexibility support for loosely specified processes, which only partially specify the process model at build-time, while decisions regarding the exact specification of certain model parts are deferred to run-time. Part IV deals with user- and data-driven processes, which aim at a tight integration of processes and data, and hence enable increased flexibility compared to traditional PAIS. Part V introduces existing technologies and systems for the realization of a flexible PAIS. Finally, Part VI summarizes the main ideas of this book and gives an outlook on advanced flexibility issues. The attached PDF file gives a preview of Chapter 3 of the book, which explains the book's overall structure.

    Context-aware Process Management for the Software Engineering Domain

    Historically, software development projects have been challenged by problems concerning budgets, deadlines, and the quality of the produced software. Such problems have various causes, like the high number of unplanned activities and the operational dynamics present in this domain. Most activities are knowledge-intensive and require the collaboration of various actors. Additionally, the produced software is intangible and therefore difficult to measure. Thus, software producers are often insufficiently aware of the state of their source code, while suitable software quality measures are often applied too late in the project lifecycle, if at all. Software development processes are used by the majority of software companies to ensure the quality and reproducibility of their development endeavors. Typically, these processes are abstractly defined utilizing process models. However, they still need to be interpreted by individuals and executed manually, resulting in governance and compliance issues. The environment is sufficiently dynamic that unforeseen situations can occur due to various events, leading to potential aberrations and process governance issues. Furthermore, as process models are implemented manually without automation support, they impose additional work on the people executing them. Their advantages often remain hidden, as aligning the planned process with reality is cumbersome. In response to these problems, this thesis contributes the Context-aware Process Management (CPM) framework, which enables holistic and automated support for software engineering projects and their processes. In particular, it provides concepts for extending process management technology to support software engineering process models in their entirety. Furthermore, CPM contributes an approach to better integrate the enactment of the process models with the real-world process by introducing a set of contextual extensions. Various events occurring in the course of the projects can be utilized to improve process support, and activities outside the realm of the process models can be covered. That way, the continuously growing divide between plan and reality that often occurs in software engineering projects can be avoided. Finally, the CPM framework comprises facilities to better connect the software engineering process with other important aspects and areas of software engineering projects, including automated process-oriented support for software quality management and software engineering knowledge management. The CPM framework has been validated by a prototypical implementation, various sophisticated scenarios, and its practical application at two software companies.