5 research outputs found

    A Cross-Organizational Process Mining Framework for Obtaining Insights from Software Products: Accurate Comparison Challenges

    Software vendors offer various software products to large numbers of enterprises to support their organization, in particular Enterprise Resource Planning (ERP) software. Each of these enterprises uses the same product for similar goals, albeit with different processes and configurations. Software vendors therefore want to obtain insights into how the enterprises use the software product, what the differences in usage between enterprises are, and the reasons behind these differences. Cross-organizational process mining is a possible solution to address these needs, as it aims at comparing enterprises based on their usage. In this paper, we present a novel Cross-Organizational Process Mining Framework which takes as input, besides event logs, semantics (the meaning of terms in an enterprise) and organizational context (the characteristics of an enterprise). The framework provides reasoning capabilities to determine what to compare and how. In addition, the framework enables the creation of a catalog of metrics by deducing diagnostics from the usage. Using this catalog, the framework can monitor the (positive) effects of changes on processes. An enterprise operating in a similar context might also benefit from the same changes; to accommodate such improvement suggestions, the framework creates an improvement catalog of observed changes. We then present a set of challenges that must be met to obtain the required inputs from current products and thereby show the feasibility of the framework. Finally, we provide preliminary results showing that these challenges can be met and illustrate an example application of the framework in cooperation with an ERP software vendor.
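    The paper itself is not accompanied by code in this listing. As a rough illustration only, the sketch below uses a hypothetical data model (the field names case_id, activity, and timestamp and the context attribute are illustrative, not the paper's) to group per-enterprise event logs by an organizational-context attribute and compare a simple usage metric across enterprises, in the spirit of the comparison step described above.

        from collections import defaultdict
        from datetime import datetime
        from statistics import mean

        # Hypothetical event logs: one dict per event, grouped per enterprise.
        logs = {
            "enterprise_A": [
                {"case_id": 1, "activity": "Create Order", "timestamp": datetime(2023, 1, 2, 9, 0)},
                {"case_id": 1, "activity": "Approve Order", "timestamp": datetime(2023, 1, 3, 14, 0)},
                {"case_id": 2, "activity": "Create Order", "timestamp": datetime(2023, 1, 4, 10, 0)},
                {"case_id": 2, "activity": "Approve Order", "timestamp": datetime(2023, 1, 4, 16, 0)},
            ],
            "enterprise_B": [
                {"case_id": 7, "activity": "Create Order", "timestamp": datetime(2023, 1, 2, 8, 0)},
                {"case_id": 7, "activity": "Approve Order", "timestamp": datetime(2023, 1, 9, 8, 0)},
            ],
        }

        # Hypothetical organizational context: which enterprises are comparable.
        context = {"enterprise_A": "retail", "enterprise_B": "retail"}

        def mean_case_duration_hours(events):
            """Average time from first to last event per case, in hours."""
            cases = defaultdict(list)
            for e in events:
                cases[e["case_id"]].append(e["timestamp"])
            durations = [(max(ts) - min(ts)).total_seconds() / 3600 for ts in cases.values()]
            return mean(durations)

        # Compare the metric only between enterprises sharing the same context.
        by_context = defaultdict(dict)
        for org, events in logs.items():
            by_context[context[org]][org] = mean_case_duration_hours(events)

        for ctx, metrics in by_context.items():
            print(ctx, metrics)  # e.g. retail {'enterprise_A': 17.5, 'enterprise_B': 168.0}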

    Process Mining as a Service

    Software and hardware applications record ever more information in event logs; the amount of recorded data more than doubles every two years. Process mining is a relatively young discipline that sits between machine learning and data mining on the one hand and process modeling and analysis on the other. The goal of process mining is to describe and analyze real processes by extracting knowledge from the event logs readily available in today's systems. This thesis aims to connect business opportunities (i.e., data-rich organizations; demand for BPM services; limitations in the traditional delivery of BPM services) with the technical possibilities of process mining. The goal is to propose a product that addresses the needs and demands of stakeholders (i.e., customers and consultants) better than the selected company's existing solution.

    Prosessilouhintamallin luominen normaalimuutosprosessin tueksi

    The business environment is changing, and companies try to keep up by developing their business processes. Information systems produce large amounts of data that remain largely unutilised. Process mining enables this data to be used in business process development. This research originated from the target company's need to increase transparency and improve knowledge management in the normal change process of IT service management's change management. The research follows the design science research framework. The objective of the research was to investigate the requirements and opportunities of process mining in supporting the normal change process. An artefact, in this case a process mining model, was designed, created, and evaluated. The objective of the process mining model was to increase the transparency of the process and develop knowledge management. An essential part of designing the process mining model was identifying the required data in the source system and creating an event log based on that data; the event log was a prerequisite for performing process mining. Data from the source system's development environment was used because access rights to the production environment could not be obtained within the research timeframe. The development environment data was of low quality, which caused challenges during the evaluation of the artefact. The process mining model was evaluated by interviewing the target company's key personnel in the change management process. As a result of the research, a process mining model was successfully created and the target company's overall understanding of and competence in process mining were deepened. The created process mining model increased the transparency of the process by presenting the events that occur during it. Cases can be examined both as a group and individually, and more information is available to support decision making. Knowledge management therefore improves, and the target company can identify problem areas more precisely and allocate resources to them efficiently. The research also increased the target company's understanding of the opportunities offered by the technology. Wider adoption of process mining was considered feasible in the target company; the perceived challenges were the need for additional resources and the magnitude of the change, since implementing process mining takes time and requires commitment from the whole company. By following the research process, the target company can examine other processes as well, and the research is also useful for other companies interested in how this particular use case was implemented. The next step for the target company is to use production environment data to develop the created process mining model further and analyse the process in depth, and subsequently to extend the adoption of process mining to other processes.
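    The key design step described above, extracting the required fields from the source system and building an event log, can be illustrated with a minimal sketch. The record structure and field names below are hypothetical stand-ins for change-management tickets, not the target company's actual schema or tooling.

        import csv
        from datetime import datetime

        # Hypothetical export of change-management tickets from the source system.
        # Each status transition of a ticket becomes one event in the log.
        tickets = [
            {"ticket_id": "CHG-1001", "status": "Registered",  "changed_at": "2023-05-02 08:15"},
            {"ticket_id": "CHG-1001", "status": "Approved",    "changed_at": "2023-05-03 11:40"},
            {"ticket_id": "CHG-1001", "status": "Implemented", "changed_at": "2023-05-05 16:05"},
            {"ticket_id": "CHG-1002", "status": "Registered",  "changed_at": "2023-05-02 09:30"},
            {"ticket_id": "CHG-1002", "status": "Rejected",    "changed_at": "2023-05-04 10:00"},
        ]

        # An event log minimally needs a case id, an activity name, and a timestamp.
        events = [
            {
                "case_id": t["ticket_id"],
                "activity": t["status"],
                "timestamp": datetime.strptime(t["changed_at"], "%Y-%m-%d %H:%M"),
            }
            for t in tickets
        ]
        events.sort(key=lambda e: (e["case_id"], e["timestamp"]))

        # Write a CSV that common process mining tools can import as an event log.
        with open("normal_change_event_log.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["case_id", "activity", "timestamp"])
            writer.writeheader()
            for e in events:
                writer.writerow({**e, "timestamp": e["timestamp"].isoformat()})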

    Conformance checking and diagnosis in process mining

    In recent decades, the capability of information systems to generate and record overwhelming amounts of event data has grown exponentially in several domains, in particular in industrial scenarios. Devices connected to the internet (the Internet of Things), social interaction, mobile computing, and cloud computing provide new sources of event data, and this trend will continue in the coming decades. The omnipresence of large amounts of event data stored in logs is an important enabler for process mining, a novel discipline for addressing challenges related to business process management, process modeling, and business intelligence. Process mining techniques can be used to discover, analyze, and improve real processes by extracting models from observed behavior. The capability of these models to represent reality determines the quality of the results obtained from them, conditioning their usefulness. Conformance checking, the aim of this thesis, analyzes modeled and observed behavior to determine whether a model is a faithful representation of the behavior observed in the log. Most efforts in conformance checking have focused on measuring and ensuring that models capture all the behavior in the log, i.e., fitness. Other properties, such as ensuring a precise model (one that does not include unnecessary behavior), have been disregarded. The first part of the thesis focuses on analyzing and measuring the precision dimension of conformance, where models that describe reality precisely are preferred to overly general models. The thesis includes a novel technique based on detecting escaping arcs, i.e., points where the modeled behavior deviates from the behavior reflected in the log. The detected escaping arcs are used to determine, in terms of a metric, the precision between log and model, and to locate possible actuation points for achieving a more precise model. The thesis also presents a confidence interval on the provided precision metric and a multi-factor measure to assess the severity of the detected imprecisions. Checking conformance can be time consuming for real-life scenarios, and understanding the reasons behind conformance mismatches can be an effort-demanding task. The second part of the thesis shifts the focus from the precision dimension to the fitness dimension and proposes the use of decomposition techniques to aid in checking and diagnosing fitness. The proposed approach is based on decomposing the model into single-entry single-exit (SESE) components. The resulting fragments represent subprocesses within the main process with a simple interface to the rest of the model. Fitness checking per component provides well-localized conformance information, aiding in the diagnosis of the causes behind the problems. Moreover, the relations between components can be exploited to improve the diagnostic capabilities of the analysis, identifying areas with a high density of mismatches or providing a hierarchy for zoom-in/zoom-out analysis. Finally, the thesis proposes two main applications of the decomposed approach. First, the proposed theory is extended to incorporate data information for fitness checking in a decomposed manner. Second, a real-time, event-based framework is presented for monitoring fitness.
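    As a rough illustration of the escaping-arcs idea described above, the sketch below uses a simplified, hypothetical formulation (not the thesis' implementation): the model is represented as a map from observed prefixes to the activities it allows, allowed-but-never-observed continuations are counted as escaping arcs, and a precision value is derived from them.

        from collections import Counter, defaultdict

        # Toy event log: each trace is a sequence of activities.
        log = [
            ("a", "b", "d"),
            ("a", "b", "d"),
            ("a", "c", "d"),
        ]

        # Toy "model" behaviour: activities the model allows after a given prefix.
        # A real technique would derive this from the model's reachable states.
        model_allows = {
            (): {"a"},
            ("a",): {"b", "c", "e"},   # "e" is modelled but never observed here
            ("a", "b"): {"d"},
            ("a", "c"): {"d", "e"},    # another modelled-only continuation
            ("a", "b", "d"): set(),
            ("a", "c", "d"): set(),
        }

        # Count, for every prefix reached in the log, which activities actually follow it.
        observed_after = defaultdict(Counter)
        prefix_weight = Counter()
        for trace in log:
            for i in range(len(trace) + 1):
                prefix = trace[:i]
                prefix_weight[prefix] += 1
                if i < len(trace):
                    observed_after[prefix][trace[i]] += 1

        # Escaping arcs: allowed by the model at a reached prefix but never observed there.
        allowed_total = escaping_total = 0
        for prefix, weight in prefix_weight.items():
            allowed = model_allows.get(prefix, set())
            escaping = allowed - set(observed_after[prefix])
            allowed_total += weight * len(allowed)
            escaping_total += weight * len(escaping)

        precision = 1 - escaping_total / allowed_total if allowed_total else 1.0
        print(f"escaping-arcs precision = {precision:.3f}")  # 0.750 for this toy example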