56 research outputs found

    Generation of Logical Formulas for Verifying Business Process Execution Logs Using Decision Tree Learning

    Analyzing the execution histories of business processes recorded by information systems is known as process mining, and it is an important means of identifying problems in business processes as actually performed and of driving their improvement. LTL checker is a tool for describing and verifying properties that should hold in a business process using a formal language based on linear temporal logic (LTL), and it is known as a powerful means of business process analysis. However, because many business analysts are not familiar with mathematical notations such as LTL, it is difficult for them to state precisely the property that truly needs to be verified; if a formula is written incorrectly, the originally intended verification naturally cannot be performed. This study therefore proposes a method in which a decision tree, a kind of supervised machine learning, is trained on features extracted from business process execution logs with attention to the execution order of events, and logical formulas describing the properties to be verified are generated automatically. With this method, even those unfamiliar with mathematical techniques can describe the properties that should be verified. To demonstrate the validity of the method, it was applied to a telephone repair process and its effectiveness was confirmed.
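The core idea can be sketched as a decision stump over event-ordering features, a deliberately simplified stand-in for the thesis's decision-tree learner; the traces, activity names, and the weak-until rendering below are illustrative assumptions, not taken from the thesis:

```python
from itertools import permutations

def precedes(trace, a, b):
    """True if every occurrence of activity b is preceded by an earlier a."""
    seen_a = False
    for e in trace:
        if e == a:
            seen_a = True
        elif e == b and not seen_a:
            return False
    return True

def extract_features(traces, activities):
    """One boolean ordering feature per ordered activity pair."""
    return [{(a, b): precedes(t, a, b) for a, b in permutations(activities, 2)}
            for t in traces]

def candidate_formula(traces, labels, activities):
    """Find a single feature that separates compliant from violating traces
    (a depth-1 decision tree) and render it as an LTL-style formula."""
    feats = extract_features(traces, activities)
    for key in feats[0]:
        vals = [f[key] for f in feats]
        if all(v == l for v, l in zip(vals, labels)):
            a, b = key
            return f"!{b} W {a}"  # "b does not occur before a" (weak until)
    return None
```

On a toy log where the violating trace skips registration, the stump recovers the ordering constraint between `register` and `repair`.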

    A Study on the Construction and Verification of Business Processes Based on Goal-Oriented Requirements Analysis

    Information systems are used in many companies and government agencies to support their operations. In this situation, developing information systems that are truly useful in actual work requires that information-system development and business-process design not be carried out independently: the business process should be designed to achieve the organization's goals, and the information system should then be built to support the execution of that process efficiently. These design and construction activities can be performed effectively by using goal models, which describe requirements systematically and logically, and business process models, which describe the flow of a business process. However, because the environment surrounding the organization that was assumed at design time changes, for reasons such as legal amendments and market shifts, it is not enough to build an information system or business process once; it must be continually verified as still appropriate for the current environment and improved where it is not. Moreover, in such a complex and changing environment, defining the requirements for information systems and business processes is difficult. Addressing these problems requires techniques for verifying whether information-system execution logs satisfy desirable properties, so that environmental changes can be detected, and techniques for efficiently constructing models of organizational goals and business processes; research on both exists, but they remain difficult. In existing work, log analysis is generally performed by writing, in temporal logic, the properties that should or should not hold, but writing temporal-logic formulas accurately is difficult for those who lack knowledge of mathematical logic or of the domain. As for model construction, methods for building multiple mutually consistent models that describe the various aspects surrounding an organization are insufficient. The approach proposed in this study aims to solve these problems through two contributions: (1) a method for deriving business process models from goal models built with the goal-oriented requirements analysis method KAOS, and (2) a decision-tree-based method for supporting the verification of business process execution logs. Together, these make it possible to reflect requirements accurately in business processes and to identify problems in the business processes as executed. The proposed methods were evaluated in case studies on, among others, an ambulance dispatch system in London and a telephone repair process, and their effectiveness was confirmed. University of Electro-Communications, 201

    What's next?: Operational support for business process execution

    In the last decade, flexibility has become increasingly important in the area of business process management. Information systems that support the execution of a process are required to work in a dynamic environment that imposes changing demands on its execution. In academia and industry, a variety of paradigms and implementations have been developed to support flexibility. While these approaches address industry's demand for flexibility, they also confront the user with many choices between different alternatives. As a consequence, methods to support users in selecting the best alternative during execution have become essential. In this thesis we introduce a formal framework for providing support to users based on historical evidence available in the execution log of the process. The thesis focuses on support by means of (1) recommendations, which provide the user an ordered list of execution alternatives based on estimated utilities, and (2) predictions, which provide the user general statistics for each execution alternative. Typically, estimations are not an average over all observations but are based on observations for "similar" situations. The main question is what similarity means in the context of business process execution. We introduce abstractions on execution traces to capture similarity between traces in the log. A trace abstraction considers some characteristics of a trace rather than the exact trace; traces that have identical abstraction values are said to be similar. The challenge is to determine those abstractions (characteristics) that are good predictors for the parameter to be estimated in the recommendation or prediction. We analyse the dependency between the values of an abstraction and the mean of the parameter to be estimated by means of regression analysis, obtaining a set of abstractions that explain the parameter to be estimated.
Dependencies do not only play a role in providing predictions and recommendations to instances at run-time; they are also essential for simulating the effect of changes in the environment on the processes, both locally and globally. We use stochastic simulation models to simulate the effect of changes in the environment, in particular changed probability distributions caused by recommendations. The novelty of these models is that they include dependencies between abstraction values and simulation parameters, which are estimated from log data. We demonstrate that these models give better approximations of reality than traditional models. A framework for offering operational support has been implemented in the context of the process mining framework ProM.
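As a rough illustration of the recommendation mechanism described above (not the thesis's actual implementation), the sketch below uses a set abstraction over trace prefixes and estimates the mean outcome per (abstraction value, next activity) pair from a historical log; the toy log and all names are hypothetical:

```python
from collections import defaultdict

def set_abstraction(prefix):
    """Set abstraction: which activities occurred, ignoring order and frequency."""
    return frozenset(prefix)

def build_estimator(log, abstraction=set_abstraction):
    """log: list of (trace, outcome) pairs. For every prefix of every trace,
    record the outcome under (abstraction of prefix, next activity), then
    return the mean outcome per key."""
    stats = defaultdict(list)
    for trace, outcome in log:
        for i in range(len(trace)):
            stats[(abstraction(trace[:i]), trace[i])].append(outcome)
    return {k: sum(v) / len(v) for k, v in stats.items()}

def recommend(estimator, prefix, candidates, abstraction=set_abstraction):
    """Order candidate next steps by estimated utility over similar prefixes."""
    scored = [(estimator.get((abstraction(prefix), c), float("-inf")), c)
              for c in candidates]
    return [c for _, c in sorted(scored, reverse=True)]
```

Here "similar" means "same abstraction value": a prefix `["a"]` is grouped with every historical prefix in which exactly activity `a` has occurred, regardless of order or repetition.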

    Business Process Quality Management

    During the past 25 years, research in the field of business process management and the practical adoption of corresponding methods and tools have made substantial progress. In particular, this development was driven by the insight that well-managed business processes enable organizations to better serve their stakeholders, save costs and, ultimately, realize competitive advantage. It is therefore not surprising that improving business processes ranks high on the list of priorities of organizations. In practice, this challenge is currently being addressed through approaches such as benchmarking, industry-specific best practice reference models or process reengineering heuristics. However, no systematic and generic proposition towards managing business process quality has achieved broad acceptance yet. To address this gap, this thesis contributes to the field of business process quality management with the results outlined in the following. First, it defines a concise notion of business process quality based on organizational targets, and applies it to a sample real-world case. This definition is not specific to any particular application field, and thus constitutes a vital first step towards systematic and generic business process quality management. On that basis, an approach is developed to model business objectives in the sense of the requirements that shall be fulfilled by the results of a business process. In turn, this approach enables appraising whether a business process achieves its business objective as one of the core criteria relevant to business process quality. Further, this thesis proposes extensions to common business process meta-models which enable quality-aware business process modeling, and demonstrates how fundamental quality characteristics can be derived from corresponding models. At this stage, the results achieved have enabled an advanced understanding of business process quality.
By means of these insights, a model of business process quality attributes with corresponding quality criteria is developed. This model complements and exceeds preceding approaches since, for the first time, it systematically derives relevant quality attributes from a business process management perspective instead of adopting them from related fields. It enables appraising business process quality independently of a particular field of application, and deriving recommendations to improve the processes assessed. To enable practical adoption of the concepts developed, the integration of procedures and functionality relevant to quality into business process management lifecycles and system landscapes is discussed next. To establish the contribution of this thesis beyond the previous state of the art, the proposed quality model is then compared to existing business process reengineering practices as well as propositions in the area of business process quality. Further, quality attributes are employed to improve a substantial real-world business process. This experience report demonstrates how quality management practices can be applied even if quality-aware system landscapes are not in place yet. It thus contributes to bridging the gap between the research results proposed in this thesis and the conditions present in practice today. Finally, remaining limitations with regard to the research objectives pursued are discussed, and challenges for future research are outlined. Addressing the latter will enable further leveraging the potentials of business process quality management.

    The application of process mining to care pathway analysis in the NHS

    Background: Prostate cancer is the most common cancer in men in the UK and the sixth-fastest increasing cancer in males. Within England survival rates are improving; however, they are comparatively poorer than in other countries. Currently, information available on outcomes of care is scant and there is an urgent need for techniques to improve healthcare systems and processes. Aims: To analyse the prostate cancer pathway by applying concepts of process mining and visualisation, and to compare performance metrics against the standard pathway laid out by national guidelines. Methods: A systematic review was conducted to see how process mining has been used in healthcare. Appropriate datasets for prostate cancer were identified within Imperial College Healthcare NHS Trust London. A process model was constructed by linking and transforming cohort data from six distinct database sources. The cohort dataset was filtered to include patients who had a PSA test from 2010-2015, and validated by comparing the medical patient records against a case-note audit. Process mining techniques were applied to the data to analyse the performance and conformance of the prostate cancer pathway against national guideline metrics. These techniques were evaluated with stakeholders to ascertain their impact on user experience. Results: The case-note audit revealed a 90% match against patients found in medical records. Application of process mining techniques showed massive heterogeneity compared to the homogeneous pathway laid out by national guidelines, and gave insight into bottlenecks and deviations in the pathway. Evaluation with stakeholders showed that the visualisation and technology were well accepted, of high quality, and recommended for use in healthcare decision making. Conclusion: Process mining is a promising technique for giving insight into complex and flexible healthcare processes.
It can map the patient journey at a local level and audit it against explicit standards of good clinical practice, which will enable us to intervene at the individual and system level to improve care.
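Two of the basic building blocks mentioned above, directly-follows statistics for process discovery and ordered conformance against a guideline pathway, can be sketched in a few lines; the activity names are invented placeholders, not taken from the Imperial College dataset:

```python
from collections import Counter

def directly_follows(traces):
    """Count directly-follows relations between consecutive events, the core
    statistic behind many process-discovery algorithms."""
    dfg = Counter()
    for t in traces:
        for a, b in zip(t, t[1:]):
            dfg[(a, b)] += 1
    return dfg

def conforms(trace, guideline):
    """True if the guideline pathway appears in order (possibly with extra
    intervening events) within a patient trace."""
    it = iter(trace)
    return all(step in it for step in guideline)  # consumes the iterator in order
```

A discovered directly-follows graph can then be laid over the guideline pathway to expose deviations: edges with high counts that do not appear in the guideline are candidate bottlenecks or detours.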

    Combining SOA and BPM Technologies for Cross-System Process Automation

    This paper summarizes the results of an industry case study that introduced a cross-system business process automation solution based on a combination of SOA and BPM standard technologies (i.e., BPMN, BPEL, WSDL). Besides discussing major weaknesses of the existing custom-built solution and comparing them against experiences with the developed prototype, the paper presents a course of action for transforming the current solution into the proposed solution. This includes a general approach, consisting of four distinct steps, as well as specific action items that are to be performed for every step. The discussion also covers language and tool support and challenges arising from the transformation.

    Bringing social reality to multiagent and service architectures: practical reductions for monitoring of deontic-logic and constitutive norms

    As distributed systems grow in complexity, the interactions among individuals (agents, services) of such systems become increasingly more complex and therefore more difficult to constrain and monitor. We propose to view such systems as socio-technical systems, in which organisational and institutional concepts, such as norms, can be applied to improve not only control on the components but also their autonomy by the definition of soft rather than hard constraints. Norms can be described as rules that guide the behavior of individual agents pertaining to groups that abide by them, either through explicit or implicit support. The study of norms, and regulatory systems in general, in their many forms (e.g. social norms, conventions, laws, regulations) has been of interest since the beginning of philosophy, but has seen much evolution during the 20th century due to progress in the philosophy of language, especially concerning speech acts and deontic logic. Although there is a myriad of definitions and related terminologies about the concept of norm, and as such there are many perspectives on how to analyse its impact, a common denominator is that norms constrain the behaviour of groups of agents in such a way that each individual agent can build, with a fair degree of confidence, expectations on how each of its counterparts will behave in the situations that the norms are meant to cover. For example, on a road each driver expects everybody else to drive on only one side of the road (right or left, depending on the country). Therefore, normative contexts, usually wrapped in the form of institutions, are effective mechanisms to ensure the stability of a complex system such as an organisation, a society, or even an electronic system. The latter has been an object of interest in the field of Artificial Intelligence, where it has been seen as a paradigm of coordination among electronic agents, either in multi-agent systems or in service-oriented architectures.
In order to apply norms to electronic systems, research has come up with abstractions of normative systems. In some cases these abstractions are based on regimented systems with flexible definitions of the notion of norm, in order to include meanings of the concept with a coarse-grained level of logical formality, such as conventions. Other approaches propose the use of deontic logic for describing, from a more theoretical perspective, norm-governed interaction environments. In both cases, the purpose is to enable the monitoring and enforcement of norms on systems that include (although are not limited to) electronic agents. In the present dissertation we will focus on the latter type, taking care to preserve the deontic aspect of norms. Monitoring in norm-governed systems requires making agents aware of: 1) what their normative context is, i.e. which obligations, permissions and prohibitions are applicable to each of them and how they are updated and triggered; and 2) what their current normative status is, i.e. which norms are active, and in which instances they are being fulfilled or violated; in other words, what their social (institutional) reality is. The current challenge lies in designing systems that allow computational components to infer both the normative context and social reality in real time, based on a theoretical formalism that makes such inferences sound and correct from a philosophical perspective. In the scope of multi-agent systems, many approaches have been proposed and implemented to date that fulfil these requirements. However, the literature still lacks a proposal suited to the current state of the art in service-oriented architectures, more focused nowadays on automatically scalable, polyglot amalgams of lightweight services with extremely simple communication and coordination mechanisms, a trend that is being called "microservices".
This dissertation tackles this issue, by 1) studying what properties we can infer from distributed systems that allow us to treat them as part of a socio-technical system, and 2) analysing which mechanisms we can provide to distributed systems so that they can properly act as socio-technical systems. The main product of the thesis is therefore a collection of computational elements required for formally grounded and real-time efficient understanding and monitoring of normative contexts, more specifically: 1. An ontology of events to properly model the inputs from the external world and convert them into brute facts or institutional events; 2. A lightweight language for norms, suitable for its use in distributed systems; 3. An especially tailored formalism for the detection of social reality, based on and reducible to deontic logic with support for constitutive norms; 4. A reduction of such formalism to production rule systems; and 5. One or more implementations of this reduction, proven to work efficiently in several scenarios. This document presents the related work, the rationale and the design/implementation of each one of these elements. By combining them, we are able to present novel, relevant work that enables the application of normative reasoning mechanisms in real-world systems in the form of a practical reasoner. Of special relevance is the fact that the work presented in this dissertation simplifies, while preserving formal soundness, theoretically complex forms of reasoning. Nonetheless, the use of production systems as the implementation-level materialisation of normative monitoring allows our work to be applied in any language and/or platform available, either in the form of rule engines, ECA rules or even if-then-else patterns. The work presented has been tested and successfully used in a wide range of domains and actual applications.
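A minimal illustration of the reduction of norm monitoring to production rules: the class names, the ECA-style triggering, and the obligation life-cycle below are simplifications invented for this sketch, not the dissertation's formalism:

```python
from dataclasses import dataclass

@dataclass
class Obligation:
    holder: str
    action: str    # event whose occurrence discharges the obligation
    deadline: str  # event whose earlier occurrence counts as a violation
    status: str = "active"

class NormMonitor:
    """Tiny production-rule-style monitor: each rule is (trigger_event,
    obligation factory). Observing events updates institutional state."""

    def __init__(self, rules):
        self.rules = rules
        self.obligations = []

    def observe(self, event):
        # Constitutive step: a brute event may count as creating an obligation.
        for trigger, make in self.rules:
            if event == trigger:
                self.obligations.append(make())
        # Regulative step: discharge or violate active obligations.
        for ob in self.obligations:
            if ob.status != "active":
                continue
            if event == ob.action:
                ob.status = "fulfilled"
            elif event == ob.deadline:
                ob.status = "violated"
```

The same condition/action shape maps directly onto rule engines or ECA rules, which is the portability point made above.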
The thesis also describes how our mechanisms have been applied to practical use cases based on their integration into distributed eldercare management and to commercial games.

    Automated Deduction – CADE 28

    This open access book constitutes the proceedings of the 28th International Conference on Automated Deduction, CADE 28, held virtually in July 2021. The 29 full papers and 7 system descriptions presented together with 2 invited papers were carefully reviewed and selected from 76 submissions. CADE is the major forum for the presentation of research in all aspects of automated deduction, including foundations, applications, implementations, and practical experience. The papers are organized in the following topics: logical foundations; theory and principles; implementation and application; ATP and AI; and system descriptions.