11 research outputs found

    Conformance Checking Based on Multi-Perspective Declarative Process Models

    Full text link
    Process mining is a family of techniques that aim at analyzing business process execution data recorded in event logs. Conformance checking is a branch of this discipline embracing approaches for verifying whether the behavior of a process, as recorded in a log, is in line with some expected behavior provided in the form of a process model. The majority of these approaches require the input process model to be procedural (e.g., a Petri net). However, in turbulent environments, characterized by high variability, the process behavior is less stable and predictable. In these environments, procedural process models are less suitable for describing a business process. Declarative specifications, working under an open world assumption, allow the modeler to express several possible execution paths as a compact set of constraints. Any process execution that does not contradict these constraints is allowed. One of the open challenges in the context of conformance checking with declarative models is the capability of supporting multi-perspective specifications. In this paper, we close this gap by providing a framework for conformance checking based on MP-Declare, a multi-perspective version of the declarative process modeling language Declare. The approach has been implemented in the process mining tool ProM and evaluated in three real-life case studies.
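    The abstract does not spell out how a multi-perspective constraint is evaluated. As a loose illustration only, and not the ProM implementation described in the paper, the following Python sketch checks a single MP-Declare-style response constraint with a hypothetical data condition on the activation; the trace encoding, activity names and payload fields are assumptions made here for brevity.

        # Hedged sketch: one "response" constraint with a data condition on the activation.
        def check_response(trace, activation, target, activation_condition):
            """Return (fulfilled, violated) activation counts for response(activation, target)."""
            fulfilled, violated = 0, 0
            for i, (activity, payload) in enumerate(trace):
                if activity == activation and activation_condition(payload):
                    # The constraint is activated; the target must occur later in the trace.
                    if any(act == target for act, _ in trace[i + 1:]):
                        fulfilled += 1
                    else:
                        violated += 1
            return fulfilled, violated

        trace = [
            ("submit", {"amount": 1500}),
            ("review", {"amount": 1500}),
            ("approve", {"amount": 1500}),
            ("submit", {"amount": 2000}),   # activated but never approved: a violation
        ]
        print(check_response(trace, "submit", "approve", lambda p: p["amount"] > 1000))  # (1, 1)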

    Enhancing BPMN Conformance Checking with OR Gateways and Data Objects

    Get PDF
    The Business Process Model and Notation (BPMN) is an evolving standard for capturing business processes. A process model describes how the business process is expected to be executed. When a log of actual process executions is available, this raises the question "Are the model and the log conformant?". Conformance checking, also referred to as conformance analysis, aims at the detection of inconsistencies between a process model and its corresponding execution log. The BPMN conformance checker is part of a process mining tool developed by an Italian company called SIAV; however, the tool lacks formal semantics in some respects. In particular, the previous conformance checking approach focuses on the control flow of a process while abstracting from data dependencies, and process models containing OR gateways cannot be used. The OR-join has ambiguous semantics. Several formal semantics for this construct have been proposed for similar languages such as EPCs and YAWL; however, executing and verifying models under these semantics is computationally expensive. Therefore, in this thesis, we implemented OR-join enablement in time linear in the size of the workflow graph. Data dependencies are also not considered by the conformance checker developed by SIAV, which may lead to misleading conformance diagnostics. For example, a data attribute may provide strong evidence that the wrong activity was executed. For these reasons, the conformance checker should not only describe the process behavior from the control-flow point of view, but also from other perspectives such as data and time. In the second part of the thesis, we enhanced the existing conformance checker with data attributes.
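    The abstract only states that OR-join enablement was implemented in linear time, without giving the algorithm. The sketch below is a rough Python illustration of one commonly used reading of OR-join semantics, not the implementation built for SIAV: the join may fire when at least one of its incoming edges carries a token and no other token can still reach an unmarked incoming edge without passing through the join. The graph encoding and names are assumptions.

        from collections import deque

        def or_join_enabled(succ, marking, join):
            """succ: node -> list of successor nodes; marking: set of edges (u, v) holding a token.
            Enabled iff some incoming edge of the join is marked and no other token can still
            reach an unmarked incoming edge without passing through the join (linear-time search)."""
            incoming_marked = {(u, v) for (u, v) in marking if v == join}
            if not incoming_marked:
                return False
            # Forward search from every other token, never expanding beyond the join itself.
            frontier = deque(v for (_, v) in marking - incoming_marked)
            reachable = set(frontier)
            while frontier:
                node = frontier.popleft()
                if node == join:
                    continue
                for nxt in succ.get(node, []):
                    if nxt not in reachable:
                        reachable.add(nxt)
                        frontier.append(nxt)
            # Any unmarked incoming edge that could still receive a token blocks the join.
            predecessors = [u for u, targets in succ.items() if join in targets]
            return all((u, join) in marking or u not in reachable for u in predecessors)

        # A split into "a" and "b", both feeding the OR-join "j".
        succ = {"s": ["a", "b"], "a": ["j"], "b": ["j"], "j": []}
        print(or_join_enabled(succ, {("a", "j"), ("s", "b")}, "j"))  # False: "b" may still deliver a token
        print(or_join_enabled(succ, {("a", "j")}, "j"))              # True: nothing else can reach the join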

    A Temporal Logic-Based Measurement Framework for Process Mining

    Get PDF
    The assessment of behavioral rules with respect to a given dataset is key in several research areas, including declarative process mining, association rule mining, and specification mining. The assessment is required to check how well a set of discovered rules describes the input data, as well as to determine to what extent data complies with predefined rules. In declarative process mining, in particular, some measures have been taken from association rule mining and adapted to support the assessment of temporal rules on event logs. Among them, support and confidence are the most frequently used, yet they are reportedly unable to provide sufficiently rich feedback to users and often cause spurious rules to be discovered from logs. In addition, these measures are designed to work on a predefined set of rules, thus lacking generality and extensibility. In this paper, we address this research gap by developing a general measurement framework for temporal rules based on Linear-time Temporal Logic with Past on Finite Traces (LTLpf). The framework is independent from the rule-specification language of choice and allows users to define new measures. We show that our framework can seamlessly adapt well-known measures from the association rule mining field to declarative process mining. Also, we test our software prototype implementing the framework on synthetic and real-world data, and investigate the properties characterizing those measures in the context of process analysis.
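    As a purely illustrative sketch of the kind of measures discussed, and not the framework proposed in the paper, the Python code below computes one possible trace-based reading of support and confidence for a Declare-style response rule over a toy log; the counting scheme and the log encoding are assumptions made here.

        def trace_satisfies_response(trace, a, b):
            """True iff every occurrence of a is eventually followed by b in the trace."""
            return all(b in trace[i + 1:] for i, act in enumerate(trace) if act == a)

        def support_confidence(log, a, b):
            """Assumed, association-rule-like reading:
            support    = fraction of all traces that activate the rule and satisfy it,
            confidence = fraction of activating traces that satisfy it."""
            activated = [t for t in log if a in t]
            satisfied = [t for t in activated if trace_satisfies_response(t, a, b)]
            support = len(satisfied) / len(log) if log else 0.0
            confidence = len(satisfied) / len(activated) if activated else 0.0
            return support, confidence

        log = [
            ["submit", "review", "approve"],
            ["submit", "reject"],
            ["review", "approve"],   # never activated, so it only lowers support
        ]
        print(support_confidence(log, "submit", "approve"))  # (0.333..., 0.5)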

    Runtime Monitoring of Data-Aware Business Rules with Integer Linear Programming

    Get PDF
    Runtime Compliance Monitoring is a vital building block of the Business Process Management lifecycle, enabling timely detection of non-compliance as well as the provision of responsive and proactive countermeasures. In particular, it is linked to operational decision support, which aims at extending the application of process mining techniques to online, running process instances, so that deviations can be detected and it becomes possible to recommend what to do next and to predict what will happen in the future execution of the instance. In this thesis, we focus on Runtime Compliance Monitoring of data-aware business rules. In particular, we use Integer Linear Programming (ILP) for the early detection of violations that arise from the interplay of two or more constraints. An operational support provider has been implemented as part of the process mining framework ProM, and the approach has been validated using synthetic and real-life logs.
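    The abstract gives no detail of the ILP encoding. Purely as an assumed illustration of the underlying idea, and not the ProM-based implementation from the thesis, the Python sketch below uses the PuLP library to test whether two hypothetical data-aware rules constraining the same future payment amount can still be jointly satisfied; infeasibility flags a violation before either rule is individually breached.

        import pulp

        # Two hypothetical data-aware rules on a still-to-happen "pay" event:
        #   rule 1: the payment amount must be at most 1000
        #   rule 2: the payment amount must be at least 1500
        # Each rule is satisfiable on its own; their interplay is not, so the
        # violation can be reported before the "pay" event ever occurs.
        model = pulp.LpProblem("early_violation_check", pulp.LpMinimize)
        amount = pulp.LpVariable("amount", lowBound=0, cat="Integer")
        model += amount <= 1000        # constraint contributed by rule 1
        model += amount >= 1500        # constraint contributed by rule 2
        model += 0 * amount            # dummy objective: only feasibility matters

        model.solve(pulp.PULP_CBC_CMD(msg=False))
        if pulp.LpStatus[model.status] != "Optimal":
            print("Conflict: no future value of 'amount' can satisfy both rules")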

    Conformance checking and diagnosis for declarative business process models in data-aware scenarios

    Get PDF
    A business process (BP) consists of a set of activities which are performed in coordination in an organizational and technical environment and which jointly realize a business goal. In such a context, BP management (BPM) can be seen as supporting BPs using methods, techniques, and software in order to design, enact, control, and analyze operational processes involving humans, organizations, applications, and other sources of information. Since the accurate management of BPs is receiving increasing attention, conformance checking, i.e., verifying whether the observed behavior matches a modelled behavior, is becoming more and more critical. Moreover, declarative languages are more frequently used to provide increased flexibility. However, whereas solid conformance checking techniques exist for imperative models, little work has been conducted for declarative models. Furthermore, usually only the control-flow perspective is considered, although other perspectives (e.g., data) are crucial. In addition, most approaches exclusively check the conformance without providing any related diagnostics. To enhance the accurate management of flexible BPs, this work presents a constraint-based approach for conformance checking over declarative BP models (including both control-flow and data perspectives). In addition, two constraint-based proposals for providing related diagnosis are detailed. To demonstrate both the effectiveness and the efficiency of the proposed approaches, the analysis of different performance measures related to a widely diversified set of test models of varying complexity has been performed. Ministerio de Ciencia e Innovación TIN2009-1371
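    The approach itself is constraint-based and is not reproduced here; as a small, assumption-laden Python sketch of the general idea of data-aware conformance diagnostics, the code below evaluates two hand-written constraints (one control-flow, one data) against a single trace and reports which events violate which constraint. The rule set, activity names and payload fields are illustrative only.

        # Hedged sketch: data-aware conformance diagnostics over one trace.
        trace = [
            {"activity": "register", "amount": 900},
            {"activity": "approve",  "amount": 900},
            {"activity": "ship",     "amount": 900},
        ]

        constraints = {
            # control-flow rule: "approve" may only occur after "register"
            "precedence(register, approve)":
                lambda i, e, t: e["activity"] != "approve"
                                or any(p["activity"] == "register" for p in t[:i]),
            # data rule: shipping is only allowed for amounts below 500
            "ship only if amount < 500":
                lambda i, e, t: e["activity"] != "ship" or e["amount"] < 500,
        }

        diagnostics = [
            (name, i, event["activity"])
            for name, check in constraints.items()
            for i, event in enumerate(trace)
            if not check(i, event, trace)
        ]
        for name, position, activity in diagnostics:
            print(f"violated '{name}' at position {position} by activity '{activity}'")
        # -> violated 'ship only if amount < 500' at position 2 by activity 'ship'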

    Towards a decision-aware declarative process modeling language for knowledge-intensive processes

    Get PDF
    Modeling loosely framed and knowledge-intensive business processes with the currently available process modeling languages is very challenging. Some languages lack the flexibility to model this type of process, while others are missing one or more perspectives needed to add the necessary level of detail to the models. In this paper we have composed a list of requirements that a modeling language should fulfil in order to adequately support the modeling of this type of process. Based on these requirements, a metamodel for a new modeling language was developed that satisfies them all. The new language, called DeciClare, incorporates parts of several existing modeling languages, integrating them with new solutions to requirements that had not yet been met. DeciClare is a declarative modeling language at its core and can therefore inherently deal with the flexibility required to model loosely framed processes. The complementary resource and data perspectives add the capability to reason about, respectively, resources and data values. The latter makes it possible to encapsulate the knowledge that governs the process flow by offering support for decision modeling. The abstract syntax of DeciClare has been implemented in the form of an Ecore model. Based on this implementation, the language-domain appropriateness of the language was validated by domain experts using the arm fracture case as an application scenario.

    Systematization of process conformance analysis in healthcare : Sariah Ester Torno Mourão

    Get PDF
    Advisors: Prof. Dr. Ricardo Mendes Junior, Prof. Dr. José Eduardo Pécora Junior, Prof. Dr. Adriana de Paula Lacerda Santos. Co-advisor: Prof. Dr. Eduardo Alves Portela Santos. Master's dissertation, Universidade Federal do Paraná, Setor de Tecnologia, Graduate Program in Production Engineering. Defense: Curitiba, 24/02/2017. Includes references: f. 155-164. Abstract: Medical treatment processes are universally performed according to clinical guidelines. However, there is a gap between these guidelines and actual clinical practice, that is, there are differences between the activities performed and the activities recommended in the procedures. Therefore, a challenge for the health management sector is to address this gap. There is thus a need for methods that can measure the adherence of the actual process behavior to the expected behavior, identify where deviations occur most often, and produce results that can be easily understood by physicians in order to highlight the most common causes of the identified deviations. This is one of the objectives of process mining. Process mining provides a true picture of what is happening, spelling out diverse perspectives on process activities, resources, and information. This area of study is concerned with the discovery, monitoring and improvement of operational processes through the extraction of knowledge from records generated by information systems. The main objective of this research is to systematize the conformance analysis of healthcare processes, using as a quantitative empirical study the treatment processes of patients with ischemic stroke eligible for intravenous, intra-arterial and mechanical thrombolysis at the São José Municipal Hospital in Joinville, Santa Catarina. According to the Global Burden of Disease 2013 - Mortality and Causes of Death study (MURRAY et al., 2015), stroke is one of the most important chronic diseases in terms of reach, being the third leading cause of death in Brazil and the leading cause of disability in the world. The proposed systematization seeks to assist researchers and health institutions in applying process discovery and conformance analysis techniques in the healthcare area, so that such techniques can contribute to improving the flow of activities in health facilities and, consequently, have a positive effect on health in Brazil.
It comprises nine steps involving: knowledge of the Hospital Information System (SIH) and its database; preparation of the database for the application of process mining techniques; study of the care protocols, standard operating procedures, work instructions and other documents produced by the health institution for the selected process; study of the clinical guidelines, regulations, norms, laws and other documents related to the chosen process; transcription of the selected documents into Business Process Model and Notation; correlation of the care protocols, standard operating procedures, work instructions and other documents with the clinical guidelines, regulations, standards, laws and other documents; and, finally, quantitative analyses using process mining techniques, such as discovery of the real process model and conformance analyses comparing the process models with the event logs. Keywords: Process Mining. Discovery. Conformance Analysis. Systematization. Healthcare. Stroke.

    Techniques for a posteriori analysis of declarative processes

    No full text
    The increasing availability of event data recorded by information systems, electronic devices, web services and sensor networks provides detailed information about the actual processes in systems and organizations. Process mining techniques can use such event data to discover processes and check the conformance of process models. For conformance checking, we need to analyze whether the observed behavior matches the modeled behavior. In such settings, it is often desirable to specify the expected behavior in terms of a declarative process model rather than a detailed procedural model. However, declarative models do not have an explicit notion of state, thus making it more difficult to pinpoint deviations and to explain and quantify discrepancies. This paper focuses on providing high-quality and understandable diagnostics. The notion of activation plays a key role in determining the effect of individual events on a given constraint. Using this notion, we are able to show cause-and-effect relations and measure the healthiness of the process.
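    "Activation" is only named, not defined, in this summary. As a hedged Python illustration using the common Declare reading of a response constraint, the sketch below treats every occurrence of the activating activity as an activation that is either fulfilled (the target occurs later in the trace) or violated, and scores "healthiness" as the fraction of fulfilled activations across a log; the scoring is an assumption made here, not necessarily the measure defined in the paper.

        def response_activations(trace, activation, target):
            """Split the activations of response(activation, target) in one trace
            into fulfilled and violated positions."""
            fulfilled, violated = [], []
            for i, act in enumerate(trace):
                if act == activation:
                    (fulfilled if target in trace[i + 1:] else violated).append(i)
            return fulfilled, violated

        def healthiness(log, activation, target):
            """Fraction of fulfilled activations over the whole log (1.0 if never activated)."""
            ful = vio = 0
            for trace in log:
                f, v = response_activations(trace, activation, target)
                ful, vio = ful + len(f), vio + len(v)
            return ful / (ful + vio) if ful + vio else 1.0

        log = [["a", "b", "a", "c"],   # first activation fulfilled, second violated
               ["a", "b"]]             # fulfilled
        print(healthiness(log, "a", "b"))  # 0.666...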