17 research outputs found

    Cloud-Based Data Analytics on Human Factor Measurement to Improve Safer Transport

    Get PDF
    Improving transport safety involves individual and collective behavioural aspects and their interaction. A system that can monitor and evaluate human cognitive and physical capacities based on human factor measurement is often beneficial for improving safety in driving conditions. However, analysing and evaluating human factor measurements, i.e., demographics, behaviour and physiology, in real time is challenging. This paper presents a methodology for cloud-based data analysis, categorization and metrics correlation in real time, developed through the H2020 project SimuSafe. An initial implementation of this methodology demonstrates a step-by-step approach that can handle huge amounts of data, with their variation and variety, in the cloud.

    Ontological representation and governance of business semantics in compliant service networks

    Get PDF
    The Internet enables new ways of service innovation and trading, as well as of analysing the resulting value networks, at an unprecedented level of scale and dynamics. Yet most related economic activities remain largely brittle and manual. Service-oriented business implementations focus on operational aspects at the cost of value-creation aspects such as quality and regulatory compliance. Indeed, they enforce how to carry out a certain business in a prefixed, non-adaptive manner rather than capturing the semantics of a business domain in a way that would enable service systems to adapt their role in changing value propositions. In this paper we set out requirements for SDL-compliant business service semantics and propose a method for their ontological representation and governance. We demonstrate an implementation of our approach in the context of service-oriented Information Governance.

    Decomposed process mining with DivideAndConquer

    No full text
    Many known process mining techniques scale badly in the number of activities in an event log. Examples of such techniques include the ILP Miner and the standard replay, which also uses ILP techniques. To alleviate the problems these techniques face, we can decompose a large problem (with many activities) into a number of small problems (with few activities). The expectation is that the run times in such a decomposed setting will be shorter than the run time of the original setting. This paper presents the DivideAndConquer tool, which allows the user to decompose a large problem into small problems, to run the desired discovery or replay technique on each of these decomposed problems, and to merge the results into a single result, which can then be shown to the user.
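    The decompose–discover–merge idea above can be illustrated with a small sketch. This is not the DivideAndConquer/ProM API (which is Java-based); it is a hypothetical Python toy in which "discovery" just collects directly-follows pairs, the activity partitions are chosen by hand, and the partial results are merged by set union:

    ```python
    # Hypothetical sketch of decomposed process discovery, NOT the actual
    # DivideAndConquer tool: partition the activities, project the log onto
    # each partition, run a toy discovery per sublog, and merge the results.

    def project(log, activities):
        """Keep only the events whose activity is in the given partition."""
        return [[a for a in trace if a in activities] for trace in log]

    def discover_dfg(log):
        """Toy 'discovery': collect the directly-follows pairs of the log."""
        return {(t[i], t[i + 1]) for t in log for i in range(len(t) - 1)}

    def decomposed_discovery(log, partitions):
        """Solve each small projected problem and merge the partial results."""
        merged = set()
        for part in partitions:
            merged |= discover_dfg(project(log, part))
        return merged

    log = [["a", "b", "c", "d"], ["a", "c", "b", "d"]]
    partitions = [{"a", "b", "c"}, {"b", "c", "d"}]
    print(sorted(decomposed_discovery(log, partitions)))
    ```

    Because the overlapping partitions together cover every pair of neighbouring activities, merging the small results here reproduces the result of discovery on the full log, while each sub-problem only involves three activities.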

    Ontology of Dynamic Entities

    No full text

    Process and deviation exploration with Inductive visual Miner

    No full text
    Process mining aims to extract information from recorded process data that can be used to gain insights into the process. This requires applying a discovery algorithm and setting its parameters, after which the discovered process model should be evaluated. Both steps may need to be repeated several times until a satisfying model is found; we refer to this as process exploration. Existing commercial tools usually do not provide models with executable semantics, thereby preventing accurate model evaluation, while most academic tools lack exploration features and, given the repetitive nature of process exploration, are tedious to use. In this paper, we describe a novel process exploration tool: the Inductive visual Miner. It aims to bridge this gap between commercial and academic tools by combining the executable semantics of academic tools with the exploration support of commercial tools. It also adds animation and deviation visualisation capabilities. Keywords: process mining, process exploration, deviation analysis.

    Supporting process mining workflows with RapidProM

    No full text
    Process mining is gaining more and more attention in both industry and academia. As such, the number of process mining products is steadily increasing. However, none of these products allows for composing and executing analysis workflows consisting of multiple process mining algorithms. As a result, the analyst needs to perform repetitive process mining tasks manually, and scientific process mining experiments are extremely labor-intensive. To this end, we have connected RapidMiner 5, which allows for the definition and execution of analysis workflows, with the process mining framework ProM 6. As such, any discovery, conformance, or extension algorithm of ProM can be used within a RapidMiner analysis process, thus supporting process mining workflows.
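    The core idea of an analysis workflow, as described above, is that individual mining steps become composable operators that can be re-executed on new logs without manual repetition. The actual RapidProM wiring of RapidMiner and ProM is Java-based; the following is only a hypothetical Python sketch of that composition pattern, with toy stand-ins for the import, discovery and conformance operators:

    ```python
    # Hypothetical sketch of a process mining workflow, NOT RapidProM itself:
    # each analysis step is a function, and a workflow is their composition.

    def import_log(raw_traces):
        """Toy log import: one comma-separated string per trace."""
        return [trace.split(",") for trace in raw_traces]

    def discover(log):
        """Toy discovery operator: directly-follows pairs as the 'model'."""
        return {(t[i], t[i + 1]) for t in log for i in range(len(t) - 1)}

    def check_conformance(model, log):
        """Toy conformance operator: fraction of observed pairs in the model."""
        observed = {(t[i], t[i + 1]) for t in log for i in range(len(t) - 1)}
        return len(observed & model) / len(observed)

    def workflow(raw_traces):
        """A repeatable pipeline: import -> discover -> check conformance."""
        log = import_log(raw_traces)
        return check_conformance(discover(log), log)

    print(workflow(["a,b,c", "a,c"]))  # 1.0: a log conforms to its own model
    ```

    The point of the pattern is that `workflow` can be rerun unchanged on every new log, which is what makes large-scale scientific experiments feasible.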

    The FeaturePrediction package in ProM: correlating business process characteristics

    No full text
    In process mining, one is often not only interested in learning process models but also in answering questions such as "What do the cases that are late have in common?", "What characterizes the workers that skip this check activity?" and "Do people work faster if they have more work?". Such questions can be answered by combining process mining with classification (e.g., decision tree analysis). Several authors have proposed ad-hoc solutions for specific questions, e.g., there is work on predicting the remaining processing time and on recommending activities to minimize particular risks. This paper reports on a tool, implemented as a plug-in for ProM, that unifies these ideas and provides a general framework for deriving and correlating process characteristics. To demonstrate the maturity of the tool, we show the steps needed to answer one correlation question related to a health-care process. The answer to a second question is shown in the screencast accompanying this paper.
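    A question like "What do the late cases have in common?" boils down to deriving a case characteristic (the label) and finding which other characteristic best separates the classes. The following is a hypothetical sketch of that idea, not the ProM FeaturePrediction plug-in; the case attributes and the one-level decision stump are invented for illustration:

    ```python
    # Hypothetical sketch of correlating case characteristics, NOT the
    # FeaturePrediction plug-in: find the attribute whose values best
    # separate late from on-time cases (a one-level decision stump).

    from collections import Counter

    cases = [
        {"resource": "alice", "priority": "high", "late": False},
        {"resource": "alice", "priority": "low",  "late": False},
        {"resource": "bob",   "priority": "high", "late": True},
        {"resource": "bob",   "priority": "low",  "late": True},
        {"resource": "carol", "priority": "high", "late": False},
    ]

    def purity_gain(cases, attr):
        """How many fewer misclassifications a split on attr yields."""
        def errors(group):
            labels = Counter(c["late"] for c in group)
            return len(group) - max(labels.values())  # majority-vote errors
        values = {c[attr] for c in cases}
        split = sum(errors([c for c in cases if c[attr] == v]) for v in values)
        return errors(cases) - split

    best = max(["resource", "priority"], key=lambda a: purity_gain(cases, a))
    print(best)  # prints "resource": lateness correlates with the resource
    ```

    A real analysis would use a full decision tree learner over many derived characteristics, but the principle is the same: the label is a derived process characteristic, and the tree exposes which other characteristics correlate with it.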

    Discovering, analyzing and enhancing BPMN models using ProM

    No full text
    Process mining techniques relate observed behavior to modeled behavior, e.g., through the automatic discovery of a process model based on an event log. Process mining is not limited to process discovery and also includes conformance checking and model enhancement. Conformance checking techniques are used to diagnose deviations of the observed behavior, as recorded in the event log, from some process model. Model enhancement makes it possible to extend process models with additional perspectives, conformance and performance information. In recent years, BPMN (Business Process Model and Notation) 2.0 has become a de facto standard for modeling business processes in industry. This paper presents the BPMN support currently available in ProM, the best-known and most widely used open-source process mining framework. ProM’s functionalities for discovering, analyzing and enhancing BPMN models are discussed. Support of the BPMN 2.0 standard will help ProM users to bridge the gap between formal models (such as Petri nets, causal nets and others) and the process models used by practitioners.

    Data streams in ProM 6: a single-node architecture

    No full text
    Process mining is an active field of research that primarily builds upon data mining and process model-driven analysis. Within the field, static data is typically used; the use of dynamic and/or volatile data (i.e., real-time streaming data) is very limited. Current process mining techniques are in general not able to cope with the challenges posed by real-time data, so new approaches that enable us to apply process mining to such data form an interesting new field of study. The ProM framework, which supports a variety of researchers and domain experts in the field, has therefore been extended with support for data streams. This paper gives an overview of the newly created extension, which lays a foundation for integrating streaming environments with ProM. Additionally, a case study is presented in which a real-life online data stream is incorporated in a basic ProM-based analysis.
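    The key difference from static process mining, as described above, is that events arrive one at a time and the analysis state must be updated incrementally rather than recomputed from a complete log. This is a hypothetical single-node sketch of that pattern, not the actual ProM 6 stream extension; the event shape and the directly-follows state are assumptions for illustration:

    ```python
    # Hypothetical streaming sketch, NOT the ProM 6 data-stream extension:
    # consume (case_id, activity) events one at a time and incrementally
    # maintain a directly-follows graph, so analysis can run while the
    # stream is still live.

    from collections import defaultdict

    class StreamingDFG:
        def __init__(self):
            self.last_activity = {}      # case id -> most recent activity
            self.dfg = defaultdict(int)  # (a, b) -> observed frequency

        def observe(self, case_id, activity):
            prev = self.last_activity.get(case_id)
            if prev is not None:
                self.dfg[(prev, activity)] += 1
            self.last_activity[case_id] = activity

    stream = [("c1", "a"), ("c2", "a"), ("c1", "b"), ("c2", "c"), ("c1", "c")]
    miner = StreamingDFG()
    for case_id, activity in stream:
        miner.observe(case_id, activity)
    print(dict(miner.dfg))
    ```

    Because only per-case last activities and pair counts are kept, memory stays bounded by the number of open cases and distinct activity pairs rather than by the length of the stream; a production version would additionally age out stale cases.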