
    BPELStats

    Calculating business process metrics and correlating them with quality attributes is an important part of empirical research in the area of business process modeling. BPEL is a standardized modeling language for executable business processes for which many metrics have been proposed. However, no publication offers a reference implementation, which leads to two main problems: (a) a barrier to using the metrics and (b) reproducibility problems when the metrics are reimplemented for each empirical study. Across several of our research projects we developed BPELStats, which implements many BPEL metrics and is readily available to all researchers and other interested parties.
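
    As a flavor of what such a reference implementation computes, the sketch below counts one of the simplest BPEL metrics, the number of basic activities (NOA), over a WS-BPEL 2.0 process definition. It is a minimal illustration, not BPELStats itself; the activity set is a representative subset of the WS-BPEL 2.0 basic activities, and the file name is a placeholder.

```python
# Minimal sketch of one metric that BPELStats-style tools compute:
# NOA (Number of Activities) over a BPEL 2.0 process definition.
# The activity set is a representative subset of WS-BPEL 2.0 basic
# activities; "process.bpel" is a placeholder file name.
import xml.etree.ElementTree as ET

BPEL_NS = "http://docs.oasis-open.org/wsbpel/2.0/process/executable"
BASIC_ACTIVITIES = {"invoke", "receive", "reply", "assign", "throw",
                    "wait", "empty", "exit", "rethrow", "validate"}

def number_of_activities(path: str) -> int:
    """Count basic activities anywhere in the process tree."""
    root = ET.parse(path).getroot()
    return sum(1 for el in root.iter()
               if el.tag.startswith("{" + BPEL_NS + "}")
               and el.tag.split("}")[1] in BASIC_ACTIVITIES)

print(number_of_activities("process.bpel"))
```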

    Approaches to Compute Workflow Complexity

    During the last 20 years, complexity has been an interesting topic investigated in many fields of science, such as biology, neurology, software engineering, chemistry, psychology, and economics. A survey of the various approaches to understanding complexity has sometimes led to a measurable quantity with a rigorous but narrow definition, and at other times to merely an ad hoc label. In this paper we investigate the complexity concept to avoid a vague use of the term 'complexity' in workflow designs. We present several complexity metrics that have been used for a number of years in adjacent fields of science and explain how they can be adapted and used to evaluate the complexity of workflows.
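
    One widely cited metric in this line of work is control-flow complexity (CFC), which adapts McCabe's cyclomatic complexity to workflow split nodes. The sketch below shows the usual formulation, assuming split nodes have already been extracted as (kind, fan-out) pairs; it illustrates the metric, not the paper's implementation.

```python
# Sketch of the control-flow complexity (CFC) metric for workflows,
# adapted from McCabe's cyclomatic complexity: each XOR-split adds its
# fan-out, each OR-split adds 2^fanout - 1 (possible branch subsets),
# and each AND-split adds 1. The node representation is a simplifying
# assumption for illustration.
def cfc(splits):
    """splits: list of (kind, fan_out) tuples, kind in {'xor','or','and'}."""
    total = 0
    for kind, fan_out in splits:
        if kind == "xor":
            total += fan_out
        elif kind == "or":
            total += 2 ** fan_out - 1
        elif kind == "and":
            total += 1
    return total

# A workflow with one 3-way XOR-split and one 2-way OR-split: 3 + 3 = 6.
print(cfc([("xor", 3), ("or", 2)]))
```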

    Towards data-aware resource analysis for service orchestrations

    Compile-time program analysis techniques can be applied to Web service orchestrations to prove or check various properties. In particular, service orchestrations can be subjected to resource analysis, in which safe approximations of upper and lower resource usage bounds are deduced. A uniform analysis can be simultaneously performed for different generalized resources that can be directly correlated with cost- and performance-related quality attributes, such as invocations of partners, network traffic, number of activities, iterations, and data accesses. The resulting safe upper and lower bounds do not depend on probabilistic assumptions, and are expressed as functions of the size or length of data components from an initiating message, using a fine-grained structured data model that corresponds to the XML style of information structuring. The analysis is performed by transforming a BPEL-like representation of an orchestration into an equivalent program in another programming language for which the appropriate analysis tools already exist.
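
    The following toy example illustrates the shape of the analysis output under stated assumptions (it is not the analyzer itself): for an orchestration that invokes one partner per item of the incoming message, with at most one retry per item, the deduced bounds on partner invocations are linear functions of the message length n.

```python
# Illustrative sketch, not the paper's analyzer: for an orchestration
# that invokes one partner per item of the input message, with at most
# one retry per item, the safe bounds on partner invocations are
# lower(n) = n and upper(n) = 2*n, as functions of message length n.
def invocation_bounds(n: int) -> tuple[int, int]:
    return n, 2 * n

# A concrete run can then be checked against the deduced bounds.
def within_bounds(observed: int, n: int) -> bool:
    lo, hi = invocation_bounds(n)
    return lo <= observed <= hi

print(invocation_bounds(10))   # (10, 20)
print(within_bounds(13, 10))   # True: 13 invocations for 10 items
```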

    Automated Planning of Process Models: The Construction of Simple Merges

    Business processes evolve dynamically with changing business demands. Because of these fast changes, traditional process improvement techniques have to be adapted and extended, since they often require a high degree of manual work. To reduce this degree of manual work, the automated planning of process models is proposed. In this context, we present a novel approach for the automated construction of the control flow structure simple merge (XOR join). This constitutes a necessary step towards the automated planning of entire process models. Here we build upon a planning domain, which gives us a general and formal basis to apply our approach independently of a specific process modeling language. To analyze the feasibility of our method, we mathematically evaluate the approach in terms of key properties such as termination and completeness. Moreover, we implement the approach in a process planning software and apply it to several real-world processes.
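
    A minimal sketch of the core operation follows, under an assumed graph encoding (a dict from node to successor lists, which is not the paper's formal planning domain): the exclusive branch endings identified by the planner are redirected into one newly created XOR join with a single outgoing edge.

```python
# Minimal sketch of constructing a simple merge (XOR join): the
# exclusive branch endings are redirected into one new join node with a
# single outgoing edge. The dict-based graph encoding is an assumption
# for illustration, not the paper's planning domain.
def insert_xor_join(graph, branch_ends, successor, join_id="xor_join_1"):
    graph[join_id] = [successor]
    for node in branch_ends:
        # Each exclusive branch now ends in the shared join node.
        graph[node] = [join_id]
    return graph

g = {"xor_split": ["a", "b"], "a": ["end"], "b": ["end"], "end": []}
print(insert_xor_join(g, ["a", "b"], "end"))
# {'xor_split': ['a', 'b'], 'a': ['xor_join_1'], 'b': ['xor_join_1'],
#  'end': [], 'xor_join_1': ['end']}
```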

    Towards data-aware cost-driven adaptation for service orchestrations

    Several activities in service oriented computing, such as automatic composition, monitoring, and adaptation, can benefit from knowing properties of a given service composition before executing it. Among these properties we focus on those related to execution cost and resource usage, in a wide sense, as they can be linked to QoS characteristics. In order to attain more accuracy, we formulate execution cost and resource usage as functions on input data (or appropriate abstractions thereof) and show how these functions can be used to make better, more informed decisions when performing composition, adaptation, and proactive monitoring. We present an approach to, on the one hand, synthesize these functions automatically from the definitions of the different orchestrations taking part in a system and, on the other hand, use them effectively to reduce the overall costs of non-trivial service-based systems that are sensitive to data and subject to failure. We validate our approach by means of simulations of scenarios requiring runtime selection of services and adaptation due to service failure. A number of rebinding strategies, including the use of cost functions, are compared.
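
    The sketch below illustrates how such data-dependent cost functions can drive rebinding, under illustrative assumptions (the candidate names, cost coefficients, and failure probabilities are made up): each candidate carries a synthesized cost function of the input size n, and the binder picks the candidate with the lowest expected cost for the actual n.

```python
# Sketch of cost-driven service rebinding under stated assumptions:
# each candidate exposes a cost function of input size n (as synthesized
# by the analysis) and an estimated failure probability; on (re)binding
# we pick the candidate with the lowest expected cost for the actual n.
# Names and numbers are illustrative, not from the paper.
candidates = {
    "fast_svc":  {"cost": lambda n: 5 + 2 * n, "p_fail": 0.10},
    "cheap_svc": {"cost": lambda n: 1 + 3 * n, "p_fail": 0.02},
}

def expected_cost(svc, n, retry_penalty=1.0):
    c = svc["cost"](n)
    # One modelled retry on failure: c plus p_fail * (penalty + c).
    return c + svc["p_fail"] * (retry_penalty + c)

def rebind(n):
    return min(candidates, key=lambda name: expected_cost(candidates[name], n))

print(rebind(2))    # small inputs: cheap_svc wins
print(rebind(50))   # large inputs: fast_svc's smaller slope wins
```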

    SOA based web service adaptation in enterprise application integration

    Enterprise Application Integration (EAI) is a permanent need, since companies employ a variety of information systems. Numerous standard systems must be aligned with new business processes. Some participating systems are more than 10 years old, while others were developed only 1-2 years ago. This implies a wide technological variance that makes integration a genuinely challenging problem. The widespread adoption of Service Oriented Architecture (SOA) seems to be one of the most promising approaches to EAI. Although it is already supported by solid technology and tools, deploying executable processes and predicting and optimizing their non-functional performance are still open issues. In this paper we propose a technological solution for the adaptation of standard enterprise services into SOA integration scenarios, providing support for applying data transformations to bridge data incompatibilities. To evaluate our approach, three other possible solutions are designed and implemented. A detailed analytical and experimental comparison of the approaches is also presented.
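
    As a minimal illustration of the adaptation idea (the field names and the legacy call are hypothetical, not from the paper), a thin adapter can wrap a legacy enterprise service and apply a declarative field mapping to bridge data incompatibilities before the result enters the SOA integration scenario.

```python
# Sketch of the adaptation idea: a thin adapter exposes a legacy
# enterprise service inside an SOA integration scenario and applies a
# declarative field mapping to bridge data incompatibilities. Field
# names and the legacy call are hypothetical placeholders.
FIELD_MAP = {          # legacy field  -> canonical SOA field
    "CUST_NO":   "customerId",
    "ORD_DT":    "orderDate",
    "AMT_TOTAL": "totalAmount",
}

def transform(legacy_record: dict) -> dict:
    """Rename fields; unmapped fields are dropped as incompatible."""
    return {new: legacy_record[old]
            for old, new in FIELD_MAP.items() if old in legacy_record}

def adapted_get_order(legacy_service, order_id):
    # Delegate to the legacy system, then normalize its reply.
    return transform(legacy_service.get_order(order_id))

print(transform({"CUST_NO": 42, "ORD_DT": "2011-06-01", "AMT_TOTAL": 99.5}))
```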

    Fuzzy Rule Based Approach for Quality Analysis of Web Service Composition Using Complexity Metrics

    Since human needs change rapidly, present-day software tends to be complex, so complexity analysis of software is one of the challenging areas of research. In the literature, a good number of articles are available on traditional software complexity analysis, but the complexity analysis of service-oriented-architecture-based software has not been studied extensively to date. The web service is the basic building block of SOA. Web services are composed through the Business Process Execution Language (BPEL), but a large number of web service compositions make the software more complex, so it is necessary to analyze the complexity of BPEL processes. Long-running, complex composed services are governed by business activities, which reduces their reliability, performability, and other quality attributes. Business process complexity metrics are therefore considered for the analysis of composed web services. In this work, different complexity metrics are proposed, and fuzzy logic is used for the quality analysis of web service composition. The model relates business complexity metrics such as activity complexity, structural complexity, and control flow complexity to high-level quality attributes such as functionality, usability, maintainability, reliability, and performability using a fuzzy rule-based approach.
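
    A hand-rolled sketch of the fuzzy-rule idea follows; the paper's actual membership functions and rule base are not reproduced, and the shapes and output levels below are illustrative. A complexity metric value is fuzzified into 'low' and 'high' memberships, two rules map these to maintainability levels, and a weighted average defuzzifies the result.

```python
# Hand-rolled sketch of a fuzzy rule-based quality mapping (membership
# shapes and output levels are illustrative, not the paper's): a
# complexity metric value is fuzzified into low/high memberships, and
# rules map those to a maintainability score via a weighted average.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def maintainability(cfc):
    low  = tri(cfc, -1, 0, 15)    # rule: low complexity -> high quality
    high = tri(cfc, 5, 30, 31)    # rule: high complexity -> low quality
    # Defuzzify by weighted average of the rule outputs (0.9 and 0.2).
    total = low + high
    return (0.9 * low + 0.2 * high) / total if total else None

print(round(maintainability(6), 2))   # mostly "low complexity": 0.86
print(round(maintainability(25), 2))  # mostly "high complexity": 0.2
```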

    Blockchain Technology Application Maturity Assessment Model for Digital Government Public Service Projects

    With the deepening application of blockchain technology, exaggerating its empowering effects has become common. In recent years, the rational assessment of the maturity of blockchain technology applications in digital projects across different fields has attracted attention and has been identified as key to improving the implementation of such projects. Although some studies have produced substantial results on technology maturity and its derivative applications, which can be used to predict the overall trend of a technology or to guide its implementation, few studies have evaluated the maturity of blockchain technology in combination with specific application scenarios. Our study combines application scenarios with the technical characteristics of blockchain technology and proposes an evaluation system for blockchain technology application maturity consisting of five primary indicators, namely key application requirements, data security, process complexity, application ecological completeness, and technical performance requirements, together with their corresponding secondary indicators. In addition, we take digital government public service projects as the application scenario and use the analytic hierarchy process (AHP), the entropy method, and expert scoring to determine the weight of each index in the assessment system, thereby constructing a blockchain technology application maturity assessment model. Moreover, we apply the model to ten typical digital government public service projects to conduct a comprehensive assessment and analysis. By comparing the indicator scores of the different projects, we analyze the project characteristics that influence blockchain technology application maturity and provide suggestions for applying “blockchain + digital government public services”.
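
    For the entropy part of the weighting, a standard formulation can be sketched as follows (the score matrix is illustrative, not from the study): indicators whose scores vary more across projects carry more information and therefore receive larger weights.

```python
# Sketch of the standard entropy-weight step used alongside AHP and
# expert scoring: indicators whose scores vary more across projects
# carry more information and receive larger weights. The score matrix
# below is illustrative, not data from the study.
import numpy as np

scores = np.array([   # rows: projects, cols: assessment indicators
    [0.8, 0.6, 0.9],
    [0.7, 0.9, 0.4],
    [0.9, 0.5, 0.6],
])

def entropy_weights(x):
    p = x / x.sum(axis=0)                     # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    e = -plogp.sum(axis=0) / np.log(len(x))   # entropy per indicator
    d = 1 - e                                 # divergence degree
    return d / d.sum()                        # normalized weights

print(entropy_weights(scores).round(3))
```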