A Taxonomy of Workflow Management Systems for Grid Computing
With the advent of Grid and application technologies, scientists and
engineers are building more and more complex applications to manage and process
large data sets, and execute scientific experiments on distributed resources.
Such application scenarios require means for composing and executing complex
workflows. Therefore, many efforts have been made towards the development of
workflow management systems for Grid computing. In this paper, we propose a
taxonomy that characterizes and classifies various approaches for building and
executing workflows on Grids. We also survey several representative Grid
workflow systems developed by various projects world-wide to demonstrate the
comprehensiveness of the taxonomy. The taxonomy not only highlights the design
and engineering similarities and differences of state-of-the-art in Grid
workflow systems, but also identifies the areas that need further research.
Comment: 29 pages, 15 figures
Elastic Business Process Management: State of the Art and Open Challenges for BPM in the Cloud
With the advent of cloud computing, organizations are nowadays able to react
rapidly to changing demands for computational resources. Not only individual
applications can be hosted on virtual cloud infrastructures, but also complete
business processes. This allows the realization of so-called elastic processes,
i.e., processes which are carried out using elastic cloud resources. Despite
the manifold benefits of elastic processes, there is still a lack of solutions
supporting them.
In this paper, we identify the state of the art of elastic Business Process
Management with a focus on infrastructural challenges. We conceptualize an
architecture for an elastic Business Process Management System and discuss
existing work on scheduling, resource allocation, monitoring, decentralized
coordination, and state management for elastic processes. Furthermore, we
present two representative elastic Business Process Management Systems which
are intended to counter these challenges. Based on our findings, we identify
open issues and outline possible research directions for the realization of
elastic processes and elastic Business Process Management.Comment: Please cite as: S. Schulte, C. Janiesch, S. Venugopal, I. Weber, and
P. Hoenisch (2015). Elastic Business Process Management: State of the Art and
Open Challenges for BPM in the Cloud. Future Generation Computer Systems,
Volume NN, Number N, NN-NN., http://dx.doi.org/10.1016/j.future.2014.09.00
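The elasticity idea above rests on a simple control decision: monitor the load on the pool of process engines and acquire or release cloud resources accordingly. The sketch below illustrates that decision in Python; the utilization metric, the threshold values, and all names are illustrative assumptions, not details taken from the paper or the surveyed systems.

```python
def scaling_decision(engine_capacities, queued_tasks,
                     scale_out_threshold=0.8, scale_in_threshold=0.3):
    """Decide whether to scale the engine pool 'out', 'in', or 'hold'.

    engine_capacities: concurrent-task capacity of each running engine.
    queued_tasks: total process tasks currently waiting for execution.
    Thresholds are illustrative, not values from the surveyed systems.
    """
    total = sum(engine_capacities)
    utilization = queued_tasks / total if total else 1.0
    if utilization > scale_out_threshold:
        return "out"    # lease an additional cloud-hosted engine
    if utilization < scale_in_threshold and len(engine_capacities) > 1:
        return "in"     # release an engine to save cost
    return "hold"
```

A real elastic BPMS would combine such a rule with the scheduling, monitoring, and state-management components the paper discusses, since releasing an engine mid-process requires migrating or checkpointing its running instances.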
A lightweight, flow-based toolkit for parallel and distributed bioinformatics pipelines
Abstract
Background: Bioinformatic analyses typically proceed as chains of data-processing tasks. A pipeline, or 'workflow', is a well-defined protocol, with a specific structure defined by the topology of data-flow interdependencies, and a particular functionality arising from the data transformations applied at each step. In computer science, the dataflow programming (DFP) paradigm defines software systems constructed in this manner, as networks of message-passing components. Bioinformatic workflows can thus be naturally mapped onto DFP concepts.
Results: To enable the flexible creation and execution of bioinformatics dataflows, we have written a modular framework for parallel pipelines in Python ('PaPy'). A PaPy workflow is created from re-usable components connected by data-pipes into a directed acyclic graph, which together define nested higher-order map functions. The successive functional transformations of input data are evaluated on flexibly pooled compute resources, either local or remote. Input items are processed in batches of adjustable size, allowing one to tune the trade-off between parallelism and lazy evaluation (memory consumption). An add-on module ('NuBio') facilitates the creation of bioinformatics workflows by providing domain-specific data containers (e.g., for biomolecular sequences, alignments, structures) and functionality (e.g., to parse/write standard file formats).
Conclusions: PaPy offers a modular framework for the creation and deployment of parallel and distributed data-processing workflows. Pipelines derive their functionality from user-written, data-coupled components, so PaPy can also be viewed as a lightweight toolkit for extensible, flow-based bioinformatics data processing. The simplicity and flexibility of distributed PaPy pipelines may help users bridge the gap between traditional desktop/workstation and grid computing.
PaPy is freely distributed as open-source Python code at http://muralab.org/PaPy, and includes extensive documentation and annotated usage examples.
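The core dataflow idea described above, successive map stages applied to batches of items, with batch size tuning the parallelism/laziness trade-off, can be sketched in a few lines of plain Python. This is a minimal illustration of the concept only; it is not PaPy's actual API, and the function names are invented for this sketch.

```python
from itertools import islice

def batched(iterable, size):
    """Yield successive batches; batch size tunes parallelism vs. laziness."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

def pipeline(stages, items, batch_size=2):
    """Apply each stage (a plain function) to every item, batch by batch.

    A toy stand-in for a dataflow graph of map stages; real frameworks
    would evaluate the stages on pooled local or remote workers.
    """
    for batch in batched(items, batch_size):
        for stage in stages:
            batch = [stage(x) for x in batch]
        yield from batch

# A linear two-stage toy 'workflow': uppercase, then append a marker.
result = list(pipeline([str.upper, lambda s: s + "!"], ["a", "b", "c"]))
# result == ["A!", "B!", "C!"]
```

Because the pipeline is a generator, downstream stages pull data lazily, which is the memory-consumption benefit the abstract alludes to.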
Enabling Flexibility in Process-Aware Information Systems: Challenges, Methods, Technologies
In today’s dynamic business world, the success of a company increasingly depends on its ability to react to changes in its environment in a quick and flexible way. Companies have therefore identified process agility as a competitive advantage to address business trends like increasing product and service variability or faster time to market, and to ensure business IT alignment. Along this trend, a new generation of information systems has emerged—so-called process-aware information systems (PAIS), like workflow management systems, case handling tools, and service orchestration engines.
With this book, Reichert and Weber address these flexibility needs and provide an overview of PAIS with a strong focus on methods and technologies fostering flexibility for all phases of the process lifecycle (i.e., modeling, configuration, execution and evolution). Their presentation is divided into six parts. Part I starts with an introduction of fundamental PAIS concepts and establishes the context of process flexibility in the light of practical scenarios. Part II focuses on flexibility support for pre-specified processes, the currently predominant paradigm in the field of business process management (BPM). Part III details flexibility support for loosely specified processes, which only partially specify the process model at build-time, while decisions regarding the exact specification of certain model parts are deferred to the run-time. Part IV deals with user- and data-driven processes, which aim at a tight integration of processes and data, and hence enable an increased flexibility compared to traditional PAIS. Part V introduces existing technologies and systems for the realization of a flexible PAIS. Finally, Part VI summarizes the main ideas of this book and gives an outlook on advanced flexibility issues.
The attached PDF file gives a preview of Chapter 3 of the book, which explains the book's overall structure.
IT supported business process negotiation, reconciliation and execution for cross-organisational e-business collaboration
In modern enterprises, workflow technology is commonly used for business process
automation. Established business processes represent successful business practice and
become a crucial part of corporate assets. In the Internet era, electronic business is chosen
by more and more organisations as a preferred way of conducting business practice. In
response to the increasing demands for cross-organisational business automation, especially
those raised by the B2B electronic commerce community, the concept of collaboration
between automated business processes, i.e. workflow collaboration, is emerging. Without it,
automation would be confined within individual organisations and cross-organisational
collaboration would still have to be carried out manually.
However, much of the previous research work overlooks the acquisition of compatible
workflows at build time and simply assumes that compatibility is achieved through
face-to-face negotiation, followed by a design-from-scratch approach that creates
collaborative workflows based on the agreement resulting from the negotiation. This
resource-intensive and error-prone approach can hardly keep up with the pace of today's
marketplace, with its increasing transaction volume and complexity.
This thesis identifies the requirements for cross-organisational workflow collaboration
(COWCO) through an integrated approach, proposes a comprehensive supporting
framework, explains the key enabling techniques of the framework, and implements and
evaluates them in the form of a prototype system – COWCO-Guru. With the support of such
a framework, cross-organisational workflow collaboration can be managed and conducted
with reduced human effort, which will further facilitate cross-organisational e-business,
especially B2B e-commerce practices.
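A build-time compatibility check of the kind the thesis argues for can be illustrated with a toy model in which each party's cross-organisational behaviour is a sequence of send/receive steps: the workflows interlock if every send on one side meets a matching receive on the other. This is purely an illustration of the idea; it is not the algorithm implemented in COWCO-Guru, and the representation is an assumption of this sketch.

```python
def compatible(proc_a, proc_b):
    """Check that two inter-organisational workflows can interlock.

    Each process is a list of ("send" | "receive", message) steps giving
    its cross-organisational interactions in order. Illustrative only.
    """
    if len(proc_a) != len(proc_b):
        return False
    for (act_a, msg_a), (act_b, msg_b) in zip(proc_a, proc_b):
        # A send on one side must meet a matching receive on the other,
        # and both sides must refer to the same message.
        if msg_a != msg_b or {act_a, act_b} != {"send", "receive"}:
            return False
    return True

buyer = [("send", "order"), ("receive", "invoice")]
seller = [("receive", "order"), ("send", "invoice")]
# compatible(buyer, seller) is True; pairing buyer with itself fails,
# since two sends of the same message cannot interlock.
```

Automating such checks at build time, rather than negotiating compatibility face to face, is precisely the manual effort the thesis aims to reduce.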
Flexible runtime support of business processes under rolling planning horizons
This work has been motivated by the needs we discovered when analyzing real-world processes from the
healthcare domain that have revealed high flexibility demands and complex temporal constraints. When trying
to model these processes with existing languages, we learned that none of the latter was able to fully address
these needs. This motivated us to design TConDec-R, a declarative process modeling language enabling the
specification of complex temporal constraints. Enacting business processes based on declarative process models,
however, introduces a high complexity due to the required optimization of objective functions, the handling of
various temporal constraints, the concurrent execution of multiple process instances, the management of
cross-instance constraints, and complex resource allocations. Consequently, advanced user support through optimized
schedules is required when executing the instances of such models. In previous work, we suggested a method for
generating an optimized enactment plan for a given set of process instances created from a TConDec-R model.
However, this approach was not applicable to scenarios with uncertain demands in which the enactment of
newly created process instances starts continuously over time, as in the considered healthcare scenarios. Here,
the process instances to be planned within a specific timeframe cannot be considered in isolation from the ones
planned for future timeframes. To be able to support such scenarios, this article significantly extends our previous
work by generating optimized enactment plans under a rolling planning horizon. We evaluate the approach
by applying it to a particularly challenging healthcare process scenario, i.e., the diagnostic procedures required
for treating patients with ovarian carcinoma in a Woman Hospital. The application of the approach to this sophisticated
scenario allows avoiding constraint violations and effectively managing shared resources, which
contributes to reducing the length of patient stays in the hospital.
Funding: Ministerio de Economía y Competitividad TIN2016-76956-C3-2-R; Ministerio de Ciencia e Innovación PID2019-105455 GB-C3
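The rolling-horizon idea, planning only the instances that fall inside the current time window and then rolling the window forward as new instances arrive, can be sketched as a greedy loop. The sketch is a deliberately simplified illustration under invented assumptions (one time slot per instance, no temporal or resource constraints); the article's planner optimizes far richer constraint sets.

```python
def rolling_plan(instances, horizon, step):
    """Greedy sketch of scheduling under a rolling planning horizon.

    instances: maps instance id -> release time. At each iteration we plan
    only the not-yet-scheduled instances released within the current window
    [t, t + horizon), then roll the window forward by `step`.
    """
    t, schedule = 0, {}
    latest = max(instances.values(), default=0)
    while t <= latest:
        window = [i for i, r in instances.items()
                  if t <= r < t + horizon and i not in schedule]
        for offset, inst in enumerate(sorted(window, key=instances.get)):
            schedule[inst] = t + offset   # toy rule: one slot per instance
        t += step
    return schedule
```

The key property the abstract highlights is visible even in this toy: instances planned in one window share a timeline with those planned in later windows, so they cannot be scheduled in isolation from future demand.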
Enabling quantitative data analysis through e-infrastructures
This paper discusses how quantitative data analysis in the social sciences can engage with and exploit an e-Infrastructure. We highlight how a number of activities which are central to quantitative data analysis, referred to as 'data management', can benefit from e-infrastructure support. We conclude by discussing how these issues are relevant to the DAMES (Data Management through e-Social Science) research Node, an ongoing project that aims to develop e-Infrastructural resources for quantitative data analysis in the social sciences.