
    Decomposing conformance checking on Petri nets with data

    Process mining techniques relate observed behavior to modeled behavior, e.g., by automatically discovering a Petri net from an event log. Process mining is not limited to process discovery; it also includes conformance checking. Conformance checking techniques are used to evaluate the quality of discovered process models and to diagnose deviations from some normative model (e.g., to check compliance). Existing conformance checking approaches typically focus on the control flow and are thus unable to diagnose deviations concerning data. This paper proposes a technique to check the conformance of data-aware process models. We use so-called "data Petri nets" to model data variables, guards, and read/write actions. Additional perspectives, such as resource allocation and time constraints, can be encoded in terms of variables. The data-aware conformance checking problem may be very time consuming and sometimes even intractable when there are many transitions and data variables. Therefore, we propose a technique to decompose large data-aware conformance checking problems into smaller ones that can be solved more efficiently. We provide a general correctness result showing that decomposition does not influence the outcome of conformance checking. Moreover, two decomposition strategies are presented. The approach is supported through ProM plug-ins, and experimental results show that significant performance improvements are indeed possible.
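The idea of checking data guards alongside control flow can be illustrated with a minimal sketch. This is a hypothetical toy, not the paper's ProM implementation: a "model" here maps activity names to a guard over variables, and replay counts control-flow and data deviations separately.

```python
# Hypothetical sketch of data-aware conformance checking: replay a trace
# against transitions that carry guards and write actions over variables.

def check_data_guard(event, guard, state):
    """Evaluate a transition guard over the current variable state
    extended with the event's write actions."""
    env = {**state, **event.get("writes", {})}
    return guard(env)

def replay_trace(trace, model):
    """Replay a trace; count control-flow and data deviations separately."""
    state = {}
    deviations = {"control": 0, "data": 0}
    for event in trace:
        t = model.get(event["activity"])
        if t is None:                          # activity not in the model
            deviations["control"] += 1
            continue
        if not check_data_guard(event, t["guard"], state):
            deviations["data"] += 1            # guard violated by event data
        state.update(event.get("writes", {}))  # apply write actions
    return deviations

# Toy model: a credit-check activity whose guard requires amount <= limit.
model = {"check": {"guard": lambda env: env.get("amount", 0) <= env.get("limit", 0)}}
trace = [{"activity": "check", "writes": {"amount": 500, "limit": 1000}}]
print(replay_trace(trace, model))  # {'control': 0, 'data': 0}
```

Decomposition, in this picture, would split the set of transitions and variables into fragments and replay each fragment against the projected trace.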

    Proceedings of the First NASA Formal Methods Symposium

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000

    Conformance Checking of Large Process Model: An Approach based on Decomposition

    Conformance checking is the problem of pinpointing deviations between how processes are executed in reality and how they are expected to be performed according to norms, regulations, and protocols. The executions are recorded in event logs, while the expected behavior is encoded in a process model. The complexity of the problem is exponential in the size of the model, which means the problem does not scale when models become very large. To keep the problem tractable, one can decompose the model into parts for which conformance checking is carried out separately.
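The decomposition idea can be sketched in a few lines. The fragment names and the sequence-based "models" below are illustrative stand-ins (real fragments would be Petri net pieces, not plain sequences): each fragment has its own activity alphabet, the trace is projected onto that alphabet, and fitness is checked per fragment.

```python
# Hedged sketch of decomposed conformance checking: project the trace onto
# each fragment's alphabet and check the fragments independently.

def project(trace, alphabet):
    """Keep only the activities belonging to a fragment's alphabet."""
    return [a for a in trace if a in alphabet]

def fits_fragment(trace, fragment_model):
    """A fragment here is just an expected activity sequence (a toy
    stand-in for a model fragment); fitness means the projection matches."""
    return project(trace, set(fragment_model)) == fragment_model

def decomposed_fitness(trace, fragments):
    """Check each fragment separately; deviations are localized by name."""
    return {name: fits_fragment(trace, m) for name, m in fragments.items()}

fragments = {
    "billing":  ["invoice", "pay"],
    "shipping": ["pack", "ship"],
}
trace = ["invoice", "pack", "ship", "pay"]
print(decomposed_fitness(trace, fragments))  # {'billing': True, 'shipping': True}
```

Each fragment check is over a much smaller model, which is where the performance gain of decomposition comes from.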

    HIERARCHICAL HYBRID-MODEL BASED DESIGN, VERIFICATION, SIMULATION, AND SYNTHESIS OF MISSION CONTROL FOR AUTONOMOUS UNDERWATER VEHICLES

    The objective of the modeling, verification, and synthesis of hierarchical hybrid mission control for underwater vehicles is to (i) propose a hierarchical architecture for mission control of an autonomous system, (ii) develop extended hybrid state machine models for the mission control, (iii) use these models to verify logical correctness, (iv) check the feasibility of simulation software to model the mission executed by an autonomous underwater vehicle (AUV), (v) perform synthesis of high-level mission coordinators for coordinating lower-level mission controllers in accordance with the given mission, and (vi) suggest further design changes for improvement. The dissertation describes a hierarchical architecture in which mission-level controllers based on hybrid systems theory have been, and are being, developed using a hybrid systems design tool that allows graphical design, iterative redesign, and code generation for rapid deployment onto the target platform. The goal is to support current and future autonomous underwater vehicle (AUV) programs to meet evolving requirements and capabilities. While the tool facilitates rapid redesign and deployment, it is crucial to include safety and performance verification in each step of the (re)design process. To this end, the modeling of the hierarchical hybrid mission controller is formalized to facilitate the use of available tools and newly developed methods for formal verification of safety and performance specifications. A hierarchical hybrid architecture for mission control of autonomous systems, with application to AUVs, is proposed, and a theoretical framework for the models that make up the architecture is outlined. An underwater vehicle, like any other autonomous system, is a hybrid system: the dynamics of the vehicle as well as its vehicle-level control are continuous, whereas the mission-level control is discrete, making the overall system a hybrid system, i.e., one possessing both continuous and discrete states.
The hybrid state machine models of the mission controller modules are derived from their implementation in TEJA, a software tool for representing hybrid systems with support for automatic code generation. Their logical correctness properties have been verified using UPPAAL, a tool for the verification of timed automata, a special kind of hybrid system. A Teja-to-Uppaal converter, called dem2xml, has been created at the Applied Research Lab; it converts a hybrid (timed) autonomous system description in Teja into an Uppaal system description. The verification work involved developing abstract models of the lower-level vehicle controllers with which the mission controller modules interact, and follows a hierarchical approach: assuming the correctness of the level-zero (vehicle) controllers, we establish the correctness of the level-one mission controller modules, then the correctness of the level-two modules, and so on. The goal of verification is to show that any valid meaning for a mission formalized in our research verifies the safe and correct execution of actions. Simulation of the sequence of actions executed for each operation gives a better view of the combined working of the mission coordinators and the low-level controllers, so we next examined the feasibility of simulating the operations executed during a mission. A Perl program has been developed to convert the UPPAAL files in .xml format to OpenGL graphic files, which animate the steps involved in executing a sequence of operations performed by an AUV. The highest-level coordinators send mission orders to be executed by the lower-level controllers, so a more general design of the highest-level controllers would help incorporate the execution of a variety of missions across a wide field of applications. Initially, we consider manually synthesized mission coordinator modules; later we design automated synthesis of coordinators.
This method synthesizes mission coordinators that coordinate the lower-level controllers to execute the ordered missions, and it can be used for any autonomous system.
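The kind of safety check described above can be illustrated on a discrete abstraction. The states and transitions below are assumptions for illustration, not the actual TEJA/UPPAAL models: a mission controller is abstracted to a transition system, and breadth-first reachability stands in for model checking a safety property.

```python
# Illustrative sketch (assumed mission states, not the dissertation's
# models): verify a safety property by checking that no "bad" state is
# reachable in the discrete abstraction of a mission controller.
from collections import deque

transitions = {
    "idle":    ["descend"],
    "descend": ["survey", "abort"],
    "survey":  ["ascend", "abort"],
    "abort":   ["ascend"],
    "ascend":  ["idle"],
}

def reachable(start, bad, trans):
    """Return True if any state in `bad` is reachable from `start`."""
    seen, frontier = {start}, deque([start])
    while frontier:
        s = frontier.popleft()
        if s in bad:
            return True
        for nxt in trans.get(s, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

# Safety: the vehicle never reaches a (hypothetical) "stranded" state.
print(reachable("idle", {"stranded"}, transitions))  # False
```

Tools such as UPPAAL perform this kind of exploration over timed automata, where states also carry clock constraints.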

    Logic programming for deliberative robotic task planning

    Over the last decade, the use of robots in production and daily life has increased. With increasingly complex tasks and interaction in different environments, including with humans, robots require a higher level of autonomy for efficient deliberation. Task planning is a key element of deliberation: it combines elementary operations into a structured plan that satisfies a prescribed goal, given specifications of the robot and the environment. In this manuscript, we present a survey of recent advances in applying logic programming to the problem of task planning. Logic programming offers several advantages over other approaches, including greater expressivity and interpretability, which may aid in the development of safe and reliable robots. We analyze different planners and their suitability for specific robotic applications, based on expressivity of domain representation, computational efficiency, and software implementation. In this way, we support the robotic designer in choosing the best tool for their application.
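The task-planning problem itself can be made concrete with a minimal STRIPS-style sketch. This illustrates the problem the survey addresses, not any surveyed system, and the action names are invented for the example: actions have preconditions and add/delete effects over a set of facts, and breadth-first search finds a plan reaching the goal.

```python
# Minimal STRIPS-style planner sketch: breadth-first search over fact sets.
# The "pick"/"place" domain is a made-up illustration.
from collections import deque

actions = {
    "pick":  {"pre": {"at_table"}, "add": {"holding"},  "del": {"at_table"}},
    "place": {"pre": {"holding"},  "add": {"at_shelf"}, "del": {"holding"}},
}

def plan(init, goal, actions):
    """Return a shortest action sequence turning `init` into a state
    containing `goal`, or None if no plan exists."""
    frontier = deque([(frozenset(init), [])])
    seen = {frozenset(init)}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:                      # all goal facts hold
            return steps
        for name, a in actions.items():
            if a["pre"] <= state:              # action is applicable
                nxt = (state - a["del"]) | a["add"]
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None

print(plan({"at_table"}, {"at_shelf"}, actions))  # ['pick', 'place']
```

Logic-programming planners (e.g., those based on answer set programming) express the same domain declaratively as rules, with the solver doing the search.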

    Conformance checking and diagnosis in process mining

    In the last decades, the capability of information systems to generate and record overwhelming amounts of event data has experienced exponential growth in several domains, and in particular in industrial scenarios. Devices connected to the internet (internet of things), social interaction, mobile computing, and cloud computing provide new sources of event data, and this trend will continue in the next decades. The omnipresence of large amounts of event data stored in logs is an important enabler for process mining, a novel discipline for addressing challenges related to business process management, process modeling, and business intelligence. Process mining techniques can be used to discover, analyze, and improve real processes by extracting models from observed behavior. The capability of these models to represent the reality determines the quality of the results obtained from them, which conditions their usefulness. Conformance checking is the aim of this thesis: modeled and observed behavior are analyzed to determine whether a model defines a faithful representation of the behavior observed in the log. Most efforts in conformance checking have focused on measuring and ensuring that models capture all the behavior in the log, i.e., fitness. Other properties, such as ensuring a precise model (one not including unnecessary behavior), have been disregarded. The first part of the thesis focuses on analyzing and measuring the precision dimension of conformance, where models describing the reality precisely are preferred to overly general models. The thesis includes a novel technique based on detecting escaping arcs, i.e., points where the modeled behavior deviates from the one reflected in the log. The detected escaping arcs are used to determine, in terms of a metric, the precision between log and model, and to locate possible actuation points in order to achieve a more precise model.
The thesis also presents a confidence interval on the provided precision metric, and a multi-factor measure to assess the severity of the detected imprecisions. Checking conformance can be time consuming for real-life scenarios, and understanding the reasons behind conformance mismatches can be an effort-demanding task. The second part of the thesis shifts the focus from the precision dimension to the fitness dimension and proposes the use of decomposition techniques to aid in checking and diagnosing fitness. The proposed approach is based on decomposing the model into single-entry single-exit (SESE) components. The resulting fragments represent subprocesses within the main process with a simple interface to the rest of the model. Fitness checking per component provides well-localized conformance information, aiding in the diagnosis of the causes behind the problems. Moreover, the relations between components can be exploited to improve the diagnostic capabilities of the analysis, identifying areas with a high density of mismatches, or providing a hierarchy for zoom-in/zoom-out analysis. Finally, the thesis proposes two main applications of the decomposed approach. First, the proposed theory is extended to incorporate data information into fitness checking in a decomposed manner. Second, a real-time, event-based framework is presented for monitoring fitness.
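The escaping-arc idea behind the precision metric can be sketched in miniature. This is a toy version of the metric's intuition, not the thesis's exact definition: for each trace prefix observed in the log, compare the activities the model allows next with the activities the log actually continues with; model-allowed continuations never observed are "escaping arcs".

```python
# Toy escaping-arc precision sketch (illustrative, not the thesis metric):
# precision = 1 - (escaping continuations / all modeled continuations).

def precision(log, allowed_after):
    """log: list of traces (activity lists);
    allowed_after: prefix tuple -> set of activities the model allows next."""
    observed = {}
    for trace in log:
        for i, act in enumerate(trace):
            observed.setdefault(tuple(trace[:i]), set()).add(act)
    escaping = total_allowed = 0
    for prefix, obs in observed.items():
        allowed = allowed_after.get(prefix, obs)
        escaping += len(allowed - obs)     # modeled but never observed
        total_allowed += len(allowed)
    return 1 - escaping / total_allowed if total_allowed else 1.0

log = [["a", "b"], ["a", "c"]]
allowed = {(): {"a"}, ("a",): {"b", "c", "d"}}   # model also allows "d"
print(precision(log, allowed))  # 0.75
```

Here the single escaping arc ("d" after "a") out of four modeled continuations yields a precision of 0.75; a perfectly precise model would allow exactly the observed continuations and score 1.0.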

    Proceedings of The Multi-Agent Logics, Languages, and Organisations Federated Workshops (MALLOW 2010)

    http://ceur-ws.org/Vol-627/allproceedings.pdf
    MALLOW-2010 is the third edition of a series initiated in 2007 in Durham and pursued in 2009 in Turin. The objective, as initially stated, is to "provide a venue where: the cost of participation was minimum; participants were able to attend various workshops, so fostering collaboration and cross-fertilization; there was a friendly atmosphere and plenty of time for networking, by maximizing the time participants spent together".

    Across frequency processes involved in auditory detection of coloration


    The perceptual flow of phonetic feature processing
