
    A Taxonomy of Workflow Management Systems for Grid Computing

    With the advent of Grid and application technologies, scientists and engineers are building increasingly complex applications to manage and process large data sets and to execute scientific experiments on distributed resources. Such application scenarios require means for composing and executing complex workflows. Many efforts have therefore been made towards the development of workflow management systems for Grid computing. In this paper, we propose a taxonomy that characterizes and classifies the various approaches to building and executing workflows on Grids. We also survey several representative Grid workflow systems developed by projects world-wide to demonstrate the comprehensiveness of the taxonomy. The taxonomy not only highlights the design and engineering similarities and differences among state-of-the-art Grid workflow systems, but also identifies areas that need further research. Comment: 29 pages, 15 figures
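
    A taxonomy of this kind can be encoded as a small classification scheme over the surveyed systems. The sketch below is a minimal Python illustration; the dimensions and values shown (workflow structure, scheduling architecture, fault-tolerance level) and the classified system are assumed examples of the kind of axes such a taxonomy might contain, not the paper's actual categories.

```python
from dataclasses import dataclass
from enum import Enum

# Assumed example dimensions; the paper's actual taxonomy axes may differ.
class Structure(Enum):
    DAG = "directed acyclic graph"
    NON_DAG = "non-DAG (loops, conditionals)"

class Scheduling(Enum):
    CENTRALIZED = "centralized"
    HIERARCHICAL = "hierarchical"
    DECENTRALIZED = "decentralized"

class FaultTolerance(Enum):
    TASK_LEVEL = "task-level (retry, checkpoint)"
    WORKFLOW_LEVEL = "workflow-level (alternate paths)"

@dataclass
class WorkflowSystem:
    """One surveyed system placed in the taxonomy."""
    name: str
    structure: Structure
    scheduling: Scheduling
    fault_tolerance: FaultTolerance

# Hypothetical classification of a fictitious system.
example = WorkflowSystem("ExampleWMS", Structure.DAG,
                         Scheduling.CENTRALIZED, FaultTolerance.TASK_LEVEL)
print(f"{example.name}: {example.structure.value}, "
      f"{example.scheduling.value} scheduling")
```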

    Data-Flow Modeling: A Survey of Issues and Approaches

    This paper presents a survey of previous research on modeling the data-flow perspective of business processes. When it comes to modeling and analyzing business process models, current research focuses on control-flow modeling (i.e., the activities of the process), and very little attention is paid to the data-flow perspective. But data is essential in a process: in order to execute a workflow, the tasks need data, and without data, or without data available on time, the control flow cannot be executed. For some time, various researchers have tried to investigate the data-flow perspective of process models or to combine control and data flow in one model. This paper surveys those approaches. We conclude that there is no model offering a clear data-flow perspective that focuses on how data changes during process execution. The literature offers some related approaches, ranging from data modeling using elements from the relational database domain, through process model verification, to elements related to Web Services.
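
    The observation that control flow stalls when required data is unavailable can be illustrated with a small sketch. The names below (Task, runnable, the data item labels) are hypothetical and only illustrate the survey's point, assuming each task declares the data items it reads and writes.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A workflow task with the data items it needs before it can run."""
    name: str
    reads: set[str] = field(default_factory=set)
    writes: set[str] = field(default_factory=set)

def runnable(task: Task, available: set[str]) -> bool:
    # Control flow alone is not enough: every input must already exist.
    return task.reads <= available

available_data = {"order"}                      # data produced so far
check_credit = Task("check credit", reads={"order", "customer record"},
                    writes={"credit decision"})

if runnable(check_credit, available_data):
    available_data |= check_credit.writes
else:
    missing = check_credit.reads - available_data
    print(f"'{check_credit.name}' blocked, missing data: {missing}")
```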

    INTEGRATING BUSINESS PROCESS CONCEPTS INTO UML ACTIVITY MODEL

    The Unified Modeling Language (UML) activity model is widely accepted for modeling processes from a purely technical viewpoint, because it is a rich, high-level design language supported by a standards body, the OMG, which involves most of the top worldwide IT players. In modeling information systems, however, business processes are a main component and call for a suitable modeling language such as Business Process Model and Notation (BPMN), which originated in the organizational domain. Learning a new language, though, comes with the cost of a learning curve. Since the UML activity model shares the core principles of process modeling with BPMN, this paper proposes a sustainable change to the UML activity model that introduces business concepts, so technical modelers can speak a business-oriented language with UML activities. This synergistic relationship not only doubles the benefit of UML activities and reduces the learning curve, but also highlights differences that add value for editor providers. A lightweight extension, or profile, has been designed and evaluated using a real case study.
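
    A lightweight UML profile of this kind essentially maps business-process concepts onto stereotyped activity-model elements. The sketch below is an assumed illustration of that idea, not the profile defined in the paper; the concept names and stereotypes are hypothetical.

```python
# Hypothetical stereotype table: BPMN-style business concepts applied to
# UML activity model elements, in the spirit of a lightweight profile.
PROFILE = {
    # business concept      -> (UML activity base element, stereotype)
    "user task":            ("Action", "<<userTask>>"),
    "business rule task":   ("Action", "<<businessRuleTask>>"),
    "exclusive gateway":    ("DecisionNode", "<<exclusiveGateway>>"),
    "message event":        ("AcceptEventAction", "<<messageEvent>>"),
    "lane":                 ("ActivityPartition", "<<lane>>"),
}

def annotate(concept: str) -> str:
    """Return the stereotyped UML element that represents a business concept."""
    base, stereotype = PROFILE[concept]
    return f"{stereotype} {base}"

print(annotate("exclusive gateway"))   # <<exclusiveGateway>> DecisionNode
```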

    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    This thesis considers an application of a temporal theory to describe and model the patient journey in a hospital accident and emergency (A&E) department. The aim is to introduce a generic yet dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare issues. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD), and business process modelling notation (BPMN), are intuitive but imprecise: they cannot fully capture the complexity of the types of activities and the full extent of the temporal constraints to a degree where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain. Additionally, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g., the finish-start barrier. It is imperative to be able to specify temporal constraints between the start and/or end of processes, e.g., that the beginning of a process A precedes the start (or end) of a process B, yet these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help uncover the complexities of a system and assist in modelling it in a consistent way, which is not possible with the existing modelling techniques. This thesis addresses the above issues by proposing a framework that provides a knowledge base for modelling patient flows accurately, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, the exhaustive temporal constraints derived from the components of the proposed axiomatic system serve as a knowledge base. The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while the corresponding instance is considered as its application. This approach assists in identifying the core components of the system and their precise operation, representing a real-life domain deemed suitable for the process modelling issues specified in this thesis. Thus, I have evaluated the modelling standards for their most-used terminologies and constructs to identify their key components. This also assists in the generalisation of the critical terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, a resolution theorem proof is used to show the structural features of the theory (model) and to establish that it is sound and complete. After establishing that the theory is sound and complete, the next step is to provide an instantiation of the theory. This is achieved by mapping the core components of the theory to their corresponding instances.
    Additionally, a formal graphical tool termed the point graph (PG) is used to visualise the cases of the proposed axiomatic system. PG facilitates modelling and scheduling patient flows and enables the analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*) based on the semantics presented by the axiomatic system. A real-life case from the trauma patient pathway of the King's College Hospital accident and emergency (A&E) department is considered to validate the framework. It is divided into three patient flows depicting the journey of a patient with significant trauma who arrives at A&E, undergoes a procedure, and is subsequently discharged. The department's staff relied upon UML AD and BPMN to model these patient flows, and an evaluation of their representation is presented to show the shortfalls of the modelling standards for modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
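
    The kind of point-based precedence constraint described above ("the beginning of process A precedes the start or end of process B") can be sketched as a graph over start/end points, with derived constraints obtained by chaining precedence edges. This is a minimal illustration under a simple precedence relation, not the thesis's PITL axiomatisation or its PG* transformation; the process names are invented.

```python
from collections import defaultdict

# Each process contributes two points: its start and its end.
# An edge p -> q records the assertion "point p precedes point q".
precedes = defaultdict(set)

def add_process(name: str) -> None:
    precedes[(name, "start")].add((name, "end"))   # a process starts before it ends

def add_constraint(p: tuple, q: tuple) -> None:
    precedes[p].add(q)

def reachable(p: tuple, q: tuple) -> bool:
    """Is q reachable from p via one or more precedence edges?"""
    stack, seen = list(precedes[p]), set()
    while stack:
        r = stack.pop()
        if r == q:
            return True
        if r not in seen:
            seen.add(r)
            stack.extend(precedes[r])
    return False

for proc in ("triage", "treatment"):
    add_process(proc)
add_constraint(("triage", "end"), ("treatment", "start"))   # triage finishes before treatment begins

# Derived constraint: the start of triage precedes the start of treatment.
print(reachable(("triage", "start"), ("treatment", "start")))   # True
# A precedence cycle (a point reachable from itself) would signal an inconsistency.
print(any(reachable(p, p) for p in list(precedes)))             # False here
```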

    SysML-Based Domain-Specific Executable Workflows

    The Systems Modeling Language (SysML) is a general-purpose graphical modeling language for specifying, analyzing, designing, and verifying complex systems. This thesis presents a tool called the SysFlow Workflow Engine (SWE) that is being developed to execute a domain workflow defined using SysML's Activity Diagram. The thesis also describes extensions added to the SysML semantics to make them SWE-executable. SWE focuses on grid computing, cyberinfrastructure, and related domains; however, support for other domains can easily be added. SWE aims to provide a common interface to grid, cyberinfrastructure, and other domain-specific software by abstracting away their complexity and idiosyncrasies. To create a workflow, users can use SysML modelers such as Topcased, which allow them to create and validate SysML models. Before submitting a workflow to SWE for execution, users have to ensure that their workflow is not only a valid SysML model but also a valid SWE-executable model. SWE receives a SysML workflow in XML Metadata Interchange (XMI) format and, after performing certain validation checks, parses and executes the workflow.
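
    The execution path described above (receive XMI, validate, parse, execute) can be sketched roughly as follows. This is not SWE's implementation; the file name, the XMI namespace URI, and the element/attribute handling are simplified assumptions about how an XMI-serialised activity might be read, and real exports from tools such as Topcased differ in detail.

```python
import xml.etree.ElementTree as ET

# Assumed XMI attribute key; real exports carry a namespaced xmi:type attribute
# whose namespace URI varies between XMI versions and tools.
XMI_TYPE = "{http://www.omg.org/XMI}type"

def load_workflow(path: str) -> ET.Element:
    """Parse an XMI file and return the model's root element."""
    return ET.parse(path).getroot()

def validate(root: ET.Element) -> list[str]:
    """Rough structural checks standing in for an engine's validation step."""
    errors = []
    activities = [e for e in root.iter() if e.get(XMI_TYPE, "").endswith(":Activity")]
    if not activities:
        errors.append("no Activity element found in the model")
    return errors

def run(root: ET.Element) -> None:
    """Visit action nodes in document order and dispatch them (placeholder only)."""
    for elem in root.iter():
        if elem.get(XMI_TYPE, "").endswith(":OpaqueAction"):
            print(f"executing action: {elem.get('name', '<unnamed>')}")

if __name__ == "__main__":
    model = load_workflow("workflow.xmi")   # hypothetical input file
    problems = validate(model)
    if problems:
        raise SystemExit("; ".join(problems))
    run(model)
```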

    Formal Object Interaction Language: Modeling and Verification of Sequential and Concurrent Object-Oriented Software

    As software systems become larger and more complex, developers require the ability to model abstract concepts while ensuring consistency across the entire project. The internet has changed the nature of software by increasing the demand for software deployment across multiple distributed platforms. Finally, increased dependence on technology requires assurance that designed software will perform its intended function. This thesis introduces the Formal Object Interaction Language (FOIL). FOIL is a new object-oriented modeling language specifically designed to address the cumulative shortcomings of existing modeling techniques. FOIL graphically displays software structure, sequential and concurrent behavior, process, and interaction in a simple unified notation, and it has an algebraic representation based on a derivative of the π-calculus. The thesis documents the technique by which FOIL software models can be mathematically verified to anticipate deadlocks, ensure consistency, and determine object state reachability. Scalability is offered through the concept of behavioral inheritance, and FOIL's inherent support for modeling concurrent behavior and all known workflow patterns is demonstrated. The concepts of process achievability, process complete achievability, and process determinism are introduced, along with an algorithm for simulating the execution of a FOIL object model using a FOIL process model. Finally, a technique for using a FOIL process model as a constraint on FOIL object system execution is offered as a method to ensure that object-oriented systems modeled in FOIL will complete their process-based activities. FOIL's capabilities are compared and contrasted with an extensive array of current software modeling techniques. FOIL is ideally suited for data-aware, behavior-based systems such as interactive or process management software.
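
    One of the verification questions mentioned above, object state reachability, amounts to graph search over a transition system. The sketch below is a generic illustration of that check, assuming a plain labelled transition relation rather than FOIL's π-calculus-based semantics; the states and actions are invented.

```python
from collections import deque

# Transitions of a toy object: state -> {action: next_state}.
# This model is invented for illustration; FOIL's semantics are richer.
transitions = {
    "idle":    {"open": "editing"},
    "editing": {"save": "saved", "discard": "idle"},
    "saved":   {"close": "idle"},
}

def reachable_states(start: str) -> set[str]:
    """Breadth-first search over the transition relation."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        for nxt in transitions.get(state, {}).values():
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print("saved" in reachable_states("idle"))      # True: the state can be reached
print("corrupt" in reachable_states("idle"))    # False: never entered
```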

    Quantitative Analysis of Apache Storm Applications: The NewsAsset Case Study

    The development of information systems today faces the era of Big Data. Large volumes of information need to be processed in real time, for example for Facebook or Twitter analysis. This paper addresses the redesign of NewsAsset, a commercial product that helps journalists by providing services that analyze millions of media items from social networks in real time. Technologies like Apache Storm can help enormously in this context. We have quantitatively analyzed the new design of NewsAsset to assess whether the introduction of Apache Storm can meet the demanding performance requirements of this media product. Our assessment approach, guided by the Unified Modeling Language (UML), takes advantage of the software designs already used for development to carry out the performance analysis. In addition, we converted UML into a domain-specific modeling language (DSML) for Apache Storm, thus creating a profile for Storm. We then transformed this DSML into a language appropriate for performance evaluation, specifically stochastic Petri nets. The assessment ended with a successful software design that met the scalability requirements of NewsAsset.
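
    The quantities such an analysis estimates (utilization, response time per tuple) can be illustrated with a toy discrete-event sketch of a spout feeding a single bolt. This is not the paper's UML-to-stochastic-Petri-net transformation; the arrival and service rates below are assumed numbers chosen only to show the kind of output a performance model produces.

```python
import random

# Toy model of one spout -> one bolt, invented for illustration only.
random.seed(0)
ARRIVAL_RATE = 800.0     # tuples/second emitted by the spout (assumed)
SERVICE_RATE = 1000.0    # tuples/second one bolt executor can process (assumed)

clock, next_free, waits = 0.0, 0.0, []
for _ in range(100_000):
    clock += random.expovariate(ARRIVAL_RATE)   # next tuple arrives
    start = max(clock, next_free)               # wait if the bolt is still busy
    service = random.expovariate(SERVICE_RATE)
    next_free = start + service
    waits.append(next_free - clock)             # time in system for this tuple

print(f"mean response time per tuple: {1000 * sum(waits) / len(waits):.2f} ms")
print(f"bolt utilization (approx.):   {ARRIVAL_RATE / SERVICE_RATE:.0%}")
```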