
    Decision-enabled dynamic process management for networked enterprises

    Enterprises in today's networked economy face numerous information management challenges, from both a process management perspective and a decision support perspective. While there have been significant advances in business process management and in the decision sciences, several research issues remain open. In this paper, we highlight the following key challenges. First, current process modeling and management techniques do not provide seamless integration of decision models and tools into existing business processes, which is critical for achieving organizational objectives. Second, given the dynamic nature of business processes in networked enterprises, process management approaches are required that enable organizations to react to process changes in an agile manner. Third, current state-of-the-art decision model management techniques are not particularly amenable to the distributed settings of networked enterprises, which limits the sharing and reuse of models in different contexts, including their utility in managing business processes. We present a framework for decision-enabled dynamic process management that addresses these challenges. The framework builds on computational formalisms, including the structured modeling paradigm for representing decision models and hierarchical task networks from artificial intelligence (AI) planning for process modeling. Within the framework, we propose interleaved process planning (modeling), execution, and monitoring for dynamic process management throughout the process lifecycle, together with a service-oriented architecture that draws on advances from the semantic Web to support model management within business processes.
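The hierarchical-task-network side of such a framework can be illustrated with a minimal decomposition sketch; the task and method names below are hypothetical, and a real HTN planner would also track preconditions and world state.

```python
# Minimal HTN-style decomposition sketch (hypothetical task names).
# A compound task is refined via methods into subtasks until only
# primitive tasks remain, yielding an executable process plan.

METHODS = {
    "fulfill_order": [["check_stock", "reserve_items", "ship_order"]],
    "ship_order": [["pack", "dispatch"]],
}

def decompose(task):
    """Recursively expand a task into a sequence of primitive tasks."""
    if task not in METHODS:          # primitive task: execute as-is
        return [task]
    subtasks = METHODS[task][0]      # pick the first applicable method
    plan = []
    for sub in subtasks:
        plan.extend(decompose(sub))
    return plan

plan = decompose("fulfill_order")
# plan == ["check_stock", "reserve_items", "pack", "dispatch"]
```

Interleaving planning with execution, as the paper proposes, would amount to re-running such a decomposition whenever monitoring detects that the current plan no longer fits the actual process state.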

    Cybersecurity, Artificial Intelligence, and Risk Management: Understanding Their Implementation in Military Systems Acquisitions

    Excerpt from the Proceedings of the Nineteenth Annual Acquisition Research Symposium. This research has the explicit goal of proposing a reusable, extensible, adaptable, and comprehensive advanced analytical modeling process to help the U.S. Navy quantify, model, value, and optimize a set of nascent Artificial Intelligence and Machine Learning (AI/ML) applications in the aerospace, automotive, and transportation industries; to develop a framework with a hierarchy of functions by technology category; and to develop a unique-to-Navy-ship construct that, based on weighted criteria, scores the return on investment of developing naval AI/ML applications that enhance warfighting capabilities. The current research proposes to create a business case for making strategic decisions under uncertainty. Specifically, we will look at a portfolio of nascent artificial intelligence and machine learning applications, both at PEO-SHIPS and extensible to the Navy Fleet. This portfolio-of-options approach to business case justification will provide tools that allow decision-makers to choose the optimal flexible options to implement and allocate among different types of artificial intelligence and machine learning applications, subject to budget constraints, across multiple types of ships. The impact of innovative technology on productivity has applicability beyond the Department of Defense (DoD). Private industry can greatly benefit from the concepts and methodologies developed in this research, applying them to the hiring and talent management of scientists, programmers, engineers, analysts, and senior executives to increase innovation productivity. Approved for public release; distribution is unlimited.
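Selecting a portfolio of options under a budget constraint can be sketched as a small combinatorial search; the application names, costs, and weighted ROI scores below are purely illustrative, not figures from the study.

```python
from itertools import combinations

# Hypothetical AI/ML application options: (name, cost in $M, weighted ROI score).
options = [
    ("predictive_maintenance", 4, 9),
    ("sensor_fusion", 3, 7),
    ("autonomous_navigation", 6, 11),
    ("supply_forecasting", 2, 4),
]
BUDGET = 9

# Exhaustively score every feasible subset; at this scale brute force
# suffices, while a real study would use an optimization solver.
best_score, best_set = 0, ()
for r in range(len(options) + 1):
    for combo in combinations(options, r):
        cost = sum(c for _, c, _ in combo)
        score = sum(s for _, _, s in combo)
        if cost <= BUDGET and score > best_score:
            best_score, best_set = score, tuple(n for n, _, _ in combo)
```

Here the search selects the subset with the highest total score whose total cost fits the budget, the same knapsack-style trade-off the portfolio approach formalizes.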

    Artifact Lifecycle Discovery

    Artifact-centric modeling is a promising approach for modeling business processes based on so-called business artifacts - key entities that drive a company's operations and whose lifecycles define the overall business process. While artifact-centric modeling shows significant advantages, the overwhelming majority of existing process mining methods cannot be applied directly, as they are tailored to discovering monolithic process models. This paper addresses the problem by proposing a chain of methods that can be applied to discover artifact lifecycle models in the Guard-Stage-Milestone notation. We decompose the problem in such a way that a wide range of existing (non-artifact-centric) process discovery and analysis methods can be reused in a flexible manner. The methods presented in this paper are implemented as plug-ins for ProM, a generic open-source framework and architecture for implementing process mining tools.
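The per-artifact decomposition can be sketched as follows: events are grouped by artifact instance and replayed to collect the observed state transitions. This is a simplified stand-in for the discovery chain (the log and state names are hypothetical, and GSM guards and milestones are not modeled).

```python
from collections import defaultdict

# Hypothetical event log: (artifact_id, state), ordered by time per artifact.
log = [
    ("order-1", "created"), ("order-2", "created"),
    ("order-1", "paid"), ("order-1", "shipped"),
    ("order-2", "cancelled"),
]

# Group events into one trace per artifact instance.
traces = defaultdict(list)
for aid, state in log:
    traces[aid].append(state)

# Replay each trace and record every observed state transition,
# yielding the edges of a lifecycle model for this artifact type.
transitions = set()
for trace in traces.values():
    transitions.update(zip(trace, trace[1:]))
```

The resulting transition set is exactly the kind of per-artifact behavior that conventional (non-artifact-centric) discovery algorithms can then be applied to.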

    Planning and Scheduling of Business Processes in Run-Time: A Repair Planning Example

    Over the last decade, the efficient and flexible management of business processes has become one of the most critical success factors. Furthermore, there is growing interest in applying Artificial Intelligence Planning and Scheduling techniques to automate the production and execution of organizational models. However, from our point of view, several connections between the two disciplines remain to be exploited. The current work presents a proposal for modelling and enacting business processes that involves selecting and ordering the activities to be executed (planning) as well as allocating resources (scheduling), considering the optimization of several functions and the achievement of certain objectives. The main novelty is that all decisions (even the selection of activities) are taken at run-time, considering the actual parameters of the execution, so the business process is managed in an efficient and flexible way. As an example, a complex and representative problem, the repair planning problem, is managed through the proposed approach. Ministerio de Ciencia e Innovación TIN2009-13714. Junta de Andalucía P08-TIC-0409.
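Run-time resource allocation can be sketched with a greedy rule that assigns each selected activity to the resource that becomes free earliest; the repair activities, durations, and technicians below are hypothetical, and a real scheduler would also respect activity orderings and optimize several objectives.

```python
# Hypothetical repair activities as (name, duration) known only at run-time.
activities = [("diagnose", 2), ("replace_part", 4), ("test", 1), ("paint", 3)]
resources = {"tech_A": 0, "tech_B": 0}   # next-free time per resource

# Greedily dispatch each activity to the earliest-available resource.
schedule = []
for name, duration in activities:
    res = min(resources, key=resources.get)
    start = resources[res]
    resources[res] = start + duration
    schedule.append((name, res, start))

makespan = max(resources.values())
```

Because the assignment is recomputed from the actual durations observed at execution time, the schedule adapts when the process deviates from what was expected at design time.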

    A System for Deduction-based Formal Verification of Workflow-oriented Software Models

    The work concerns the formal verification of workflow-oriented software models using a deductive approach, considering the formal correctness of a model's behaviour. Manually building logical specifications, i.e. sets of temporal logic formulas, is a significant obstacle for an inexperienced user applying the deductive approach. A system, and its architecture, for the deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages over traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, the standard and dominant notation for modeling business processes. The main idea of the approach is to treat patterns, defined in terms of temporal logic, as (logical) primitives that enable the transformation of models into the temporal logic formulas constituting a logical specification. Automating the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach supports, and hopefully deepens our understanding of, the deduction-based formal verification of workflow-oriented models. Comment: International Journal of Applied Mathematics and Computer Science
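The semantic tableaux method can be sketched for plain propositional logic (the paper works with temporal logic, which requires additional expansion rules): a formula set is satisfiable iff some fully expanded branch remains free of a contradictory literal pair.

```python
# Minimal propositional semantic-tableaux sketch. Formulas are tuples:
# ("atom", p), ("not", f), ("and", f, g), ("or", f, g). A branch closes
# when it contains both an atom and its negation; a formula set is
# satisfiable iff at least one branch stays open.

def satisfiable(branch):
    # Expand the first non-literal formula found on the branch.
    for i, f in enumerate(branch):
        rest = branch[:i] + branch[i + 1:]
        if f[0] == "and":                       # alpha rule: same branch
            return satisfiable(rest + [f[1], f[2]])
        if f[0] == "or":                        # beta rule: split branch
            return satisfiable(rest + [f[1]]) or satisfiable(rest + [f[2]])
        if f[0] == "not" and f[1][0] == "not":  # double negation
            return satisfiable(rest + [f[1][1]])
        if f[0] == "not" and f[1][0] == "and":  # De Morgan
            return satisfiable(rest + [("or", ("not", f[1][1]), ("not", f[1][2]))])
        if f[0] == "not" and f[1][0] == "or":
            return satisfiable(rest + [("and", ("not", f[1][1]), ("not", f[1][2]))])
    # Only literals remain: the branch is open iff no contradictory pair.
    atoms = {f[1] for f in branch if f[0] == "atom"}
    negs = {f[1][1] for f in branch if f[0] == "not"}
    return not (atoms & negs)

p = ("atom", "p")
assert satisfiable([("or", p, ("not", p))])       # tautology: open branch
assert not satisfiable([("and", p, ("not", p))])  # contradiction: all close
```

Verification then amounts to checking that the negation of the desired property, together with the generated specification, has no open branch.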

    Business Process Configuration According to Data Dependency Specification

    Configuration techniques have been used in several fields, such as the design of business process models. These models sometimes depend on data dependencies, where it is easier to describe what has to be done than how. Configuration models enable the use of a declarative representation of business processes, deciding the most appropriate workflow in each case. Unfortunately, data dependencies among the activities, and how they can affect the correct execution of the process, have been overlooked in the declarative specifications and configurable systems found in the literature. In order to find the process configuration that best optimizes execution time according to data dependencies, we propose the use of the Constraint Programming paradigm to obtain an adaptable imperative model as a function of the data dependencies of the declaratively described activities. Ministerio de Ciencia y Tecnología TIN2015-63502-C3-2-R. Fondo Europeo de Desarrollo Regional.
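The effect of data dependencies on execution time can be sketched with an earliest-finish computation over a hypothetical dependency graph; a full constraint-programming model would additionally explore alternative configurations and resource limits.

```python
# Hypothetical activities with durations; an activity may start only after
# the activities that produce its input data have finished.
durations = {"A": 2, "B": 3, "C": 1, "D": 2}
deps = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# One pass in dependency (topological) order gives each activity's
# earliest finish time; the makespan is the longest dependency chain.
finish = {}
for act in ("A", "B", "C", "D"):
    start = max((finish[d] for d in deps[act]), default=0)
    finish[act] = start + durations[act]

makespan = max(finish.values())
```

Note that B and C can run in parallel once A's data is available, so the makespan follows the critical path A-B-D rather than the sum of all durations; this is precisely the information a configuration-time solver exploits.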

    Tasks, cognitive agents, and KB-DSS in workflow and process management

    The purpose of this paper is to propose a nonparametric interest rate term structure model and investigate its implications for term structure dynamics and the prices of interest rate derivative securities. The nonparametric spot interest rate process is estimated from observed short-term interest rates following a robust estimation procedure, and the market price of interest rate risk is estimated as implied by historical term structure data. That is, instead of imposing a priori restrictions on the model, the data are allowed to speak for themselves, while the model retains a parsimonious structure and computational tractability. The model is implemented using historical Canadian interest rate term structure data. The parametric models with closed-form solutions for bond and bond option prices, namely the Vasicek (1977) and CIR (1985) models, are also estimated for comparison purposes. The empirical results not only provide strong evidence that the traditional spot interest rate models and market prices of interest rate risk are severely misspecified, but also suggest that different model specifications have a significant impact on term structure dynamics and the prices of interest rate derivative securities.
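As a point of reference for the parametric benchmarks, a Vasicek short-rate path can be simulated by Euler discretization of dr = a(b - r)dt + sigma dW; the parameter values below are illustrative, not estimates from the paper's data.

```python
import random

def simulate_vasicek(r0, a, b, sigma, dt, steps, seed=0):
    """Euler discretization of the mean-reverting Vasicek short-rate model."""
    rng = random.Random(seed)
    r = r0
    path = [r]
    for _ in range(steps):
        dW = rng.gauss(0.0, dt ** 0.5)           # Brownian increment
        r += a * (b - r) * dt + sigma * dW       # mean reversion + noise
        path.append(r)
    return path

# One year of daily steps with illustrative parameters.
path = simulate_vasicek(r0=0.03, a=0.5, b=0.05, sigma=0.01, dt=1 / 252, steps=252)
```

With sigma set to zero the recursion reduces to pure mean reversion toward b, which is a convenient sanity check on the discretization.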

    A planning approach to the automated synthesis of template-based process models

    The design-time specification of flexible processes can be time-consuming and error-prone, due to the high number of tasks involved and their context-dependent nature. Such processes frequently suffer from potential interference among their constituents, since resources are usually shared by the process participants and it is difficult to foresee all the potential task interactions in advance. Concurrent tasks may not be independent of each other (e.g., they could operate on the same data at the same time), resulting in incorrect outcomes. To tackle these issues, we propose an approach for the automated synthesis of a library of template-based process models that achieve goals in dynamic and partially specified environments. The approach is based on a declarative problem definition and partial-order planning algorithms for template generation. The resulting templates guarantee sound concurrency in the execution of their activities and are reusable in a variety of partially specified contextual environments. As a running example, a disaster response scenario is given. The approach is backed by a formal model and has been tested in experiments.
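The interference notion behind sound concurrency can be sketched as a read/write conflict check over hypothetical disaster-response tasks; a partial-order planner would turn each conflicting pair into an ordering constraint rather than leave the tasks unordered.

```python
# Hypothetical tasks with the data items they read and write. Two tasks
# may run concurrently only if neither writes data the other reads or writes.
tasks = {
    "assess_damage": {"reads": {"map"}, "writes": {"report"}},
    "evacuate":      {"reads": {"map"}, "writes": {"zone_status"}},
    "update_map":    {"reads": {"report"}, "writes": {"map"}},
}

def interferes(a, b):
    """True if running a and b concurrently could produce incorrect outcomes."""
    wa, wb = tasks[a]["writes"], tasks[b]["writes"]
    ra, rb = tasks[a]["reads"], tasks[b]["reads"]
    return bool(wa & (rb | wb) or wb & ra)

# Every interfering pair needs an ordering constraint in the template.
orderings = {tuple(sorted((a, b)))
             for a in tasks for b in tasks
             if a < b and interferes(a, b)}
```

Here assess_damage and evacuate only share a read of the map and may safely overlap, while update_map conflicts with both and must be ordered relative to them.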