Verification of Hierarchical Artifact Systems
Data-driven workflows, of which IBM's Business Artifacts are a prime
exponent, have been successfully deployed in practice, adopted in industrial
standards, and have spawned a rich body of research in academia, focused
primarily on static analysis. The present work represents a significant advance
on the problem of artifact verification, by considering a much richer and more
realistic model than in previous work, incorporating core elements of IBM's
successful Guard-Stage-Milestone model. In particular, the model features task
hierarchy, concurrency, and richer artifact data. It also allows database key
and foreign key dependencies, as well as arithmetic constraints. The results
show decidability of verification and establish its complexity, making use of
novel techniques including a hierarchy of Vector Addition Systems and a variant
of quantifier elimination tailored to our context. Comment: Full version of the accepted PODS paper.
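As background for the techniques named above, a plain Vector Addition System can be sketched in a few lines: counters that transitions change by a fixed integer vector, where a firing is legal only if no counter becomes negative. This is generic VAS semantics for illustration only, not the paper's hierarchical construction, and all names below are ours.

```python
def fire(state, delta):
    """Apply a transition vector to a state; return None if any counter
    would go negative (the firing is then disabled)."""
    nxt = tuple(s + d for s, d in zip(state, delta))
    return nxt if all(c >= 0 for c in nxt) else None

def reachable(start, transitions, max_depth=10):
    """Enumerate states reachable within max_depth firings. This bounded
    search is only a toy; unbounded VAS reachability is what makes
    verification questions like the paper's genuinely hard."""
    seen, frontier = {start}, {start}
    for _ in range(max_depth):
        step = set()
        for state in frontier:
            for delta in transitions:
                nxt = fire(state, delta)
                if nxt is not None and nxt not in seen:
                    seen.add(nxt)
                    step.add(nxt)
        frontier = step
    return seen

# Two counters; the single transition moves one token from counter 0 to 1.
states = reachable((2, 0), [(-1, 1)])
```

The nonnegativity guard in `fire` is the defining feature of a VAS: from (2, 0) only (1, 1) and (0, 2) are reachable, after which the transition is disabled.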
The long and winding road: Routine creation and replication in multi-site organizations
Prior research on organizational routines in the ‘capabilities’ literature has either studied how new routines are created during an exploratory process of variation and selection or how existing routines are replicated during a phase of exploitation. Few studies have analyzed the life cycle of new routine creation and replication as an integrated process. In an in-depth case study of England’s Highways Agency, this paper shows that the creation and replication of a new routine across multiple sites involves four sequential steps: envisioning, experimenting, entrenching and enacting. We contribute to the capabilities research in two ways: first, by showing how different organizational levels, capabilities and logics (cognitive and behavioural) shape the development of new routines; and second, by identifying how distinct evolutionary cycles of variation and selective retention occur during each step in the process. In contrast with prior research on replication as an exact copy of a template or existing routine, our study focuses on the replication of an entirely new routine (based on novel principles) that is adapted to fit local operational conditions during its large-scale replication across multiple sites. We draw upon insights from adjacent ‘practice research’ and suggest how capabilities and practice studies may complement each other in future research on the evolution of routines.
Evolution of a supply chain management game for the trading agent competition
TAC SCM is a supply chain management game for the Trading Agent Competition (TAC). The purpose of TAC is to spur high-quality research into realistic trading agent problems. We discuss TAC and TAC SCM: game and competition design, scientific impact, and lessons learnt.
iObserve: Integrated Observation and Modeling Techniques to Support Adaptation and Evolution of Software Systems
The goal of iObserve is to develop methods and tools to support evolution and adaptation of long-lived software systems. Future long-living software systems will be engineered using third-party software services and infrastructures. Key challenges for such systems will be caused by dynamic changes of deployment options on cloud platforms. Third-party services and infrastructures are neither owned nor controlled by the users and developers of service-based systems. System users and developers are thus only able to observe third-party services and infrastructures via their interfaces, but are not able to look into the software and infrastructure that provides those services. In this technical report, we summarize the results of four activities to realize complete tooling around Kieker, Palladio, and MAMBA, supporting performance and cost prediction, and the evaluation of data privacy in the context of geo-locations. Furthermore, the report illustrates our efforts to extend Palladio.
Higher-Order Process Modeling: Product-Lining, Variability Modeling and Beyond
We present a graphical and dynamic framework for binding and execution of
(business) process models. It is tailored to integrate 1) ad hoc processes
modeled graphically, 2) third party services discovered in the (Inter)net, and
3) (dynamically) synthesized process chains that solve situation-specific
tasks, with the synthesis taking place not only at design time, but also at
runtime. Key to our approach is the introduction of type-safe stacked
second-order execution contexts that allow for higher-order process modeling.
Tamed by our underlying strict service-oriented notion of abstraction, this
approach is tailored also to be used by application experts with little
technical knowledge: users can select, modify, construct and then pass
(component) processes during process execution as if they were data. We
illustrate the impact and essence of our framework along a concrete, realistic
(business) process modeling scenario: the development of Springer's
browser-based Online Conference Service (OCS). The most advanced feature of our
new framework allows one to combine online synthesis with the integration of
the synthesized process into the running application. This ability leads to a
particularly flexible way of implementing self-adaption, and to a particularly
concise and powerful way of achieving variability not only at design time, but
also at runtime. Comment: In Proceedings Festschrift for Dave Schmidt, arXiv:1309.455
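The central idea of the abstract, processes passed around during execution as if they were data, can be sketched in ordinary Python. Every name below (`seq`, `register_paper`, `notify_reviewers`, `with_synthesized`) is invented for this sketch; the actual framework is graphical, not code-based.

```python
from typing import Callable

# A process maps a case's data to updated data.
Process = Callable[[dict], dict]

def seq(*steps: Process) -> Process:
    """Compose processes sequentially, like chaining stages in a model."""
    def run(case: dict) -> dict:
        for step in steps:
            case = step(case)
        return case
    return run

def register_paper(case: dict) -> dict:
    # Stands in for an ad hoc process modeled graphically.
    return {**case, "registered": True}

def notify_reviewers(case: dict) -> dict:
    # Stands in for a discovered third-party service.
    return {**case, "notified": case.get("registered", False)}

def with_synthesized(synth: Process) -> Process:
    """A second-order context: receives another process as data and
    splices it into the chain -- synthesis at runtime, in miniature."""
    return seq(register_paper, synth, notify_reviewers)

# The synthesized fragment is supplied while the application is "running".
result = with_synthesized(lambda c: {**c, "synthesized": True})({})
```

The point of the sketch is that `with_synthesized` treats a whole process as an ordinary value, which is the "higher-order" aspect the abstract describes; the type alias is a stand-in for the framework's type-safe execution contexts.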
Reusing artifact-centric business process models: a behavioral consistent specialization approach
Process reuse is one of the important research areas that address efficiency issues in business process modeling. Similar to software reuse, business processes should be able to be componentized and specialized in order to enable flexible process expansion and customization. Current activity/control-flow centric workflow modeling approaches, limited by their procedural nature, have difficulty supporting highly flexible process reuse. In comparison, the emerging artifact-centric workflow modeling approach fits these reuse requirements well. Beyond the classic class-level reuse in existing object-oriented approaches, process reuse faces the challenge of handling synchronization dependencies among artifact lifecycles as parts of a business process. In this article, we propose a theoretical framework for business process specialization that comprises an artifact-centric business process model, a set of methods to design and construct a specialized business process model from a base model, and a set of behavioral consistency criteria to help check the consistency between the two process models. © 2020, Springer-Verlag GmbH Austria, part of Springer Nature
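One plausible flavor of behavioral consistency between a base artifact lifecycle and its specialization is that every behavior of the base remains possible in the specialized lifecycle, i.e., the specialization simulates the base. The sketch below illustrates that general idea on labeled transition systems; it is an assumption-laden illustration, not the article's actual consistency criteria, and all names are ours.

```python
def simulates(spec, base, spec_init, base_init):
    """Check that `spec` can mimic every transition of `base`, starting
    from the given initial states. Each lifecycle is a labeled transition
    system encoded as: state -> {label: next_state}."""
    pending = [(base_init, spec_init)]
    related = set()  # (base state, spec state) pairs already verified
    while pending:
        b, s = pending.pop()
        if (b, s) in related:
            continue
        related.add((b, s))
        for label, b_next in base.get(b, {}).items():
            if label not in spec.get(s, {}):
                return False  # the specialization cannot follow this move
            pending.append((b_next, spec[s][label]))
    return True

base = {"created": {"approve": "active"}, "active": {"close": "done"}}
# A specialization that adds an optional step but keeps all base behavior.
spec = {"created": {"approve": "active", "revise": "created"},
        "active": {"close": "done"}}
```

Under this check, `spec` is a consistent specialization of `base` because it preserves every base transition while adding a new `revise` loop, which mirrors the intuition that specialization may extend a lifecycle but not break it.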
An Approach for Modeling and Coordinating Process Interactions
In any enterprise, different entities collaborate to achieve common business objectives. The processes used to reach these objectives are related and therefore depend on each other. Their proper coordination within a process-aware information system requires coping with heterogeneous process granularity, unclear process relations, and increased process model complexity due to the integration of coordination constraints into process models. This paper presents the concept of coordination processes, which constitute a means to handle the interactions between a multitude of interdependent processes running asynchronously to each other. In particular, coordination processes leverage the clear identification of process relations, a defined granularity for processes, and the abstraction from details of the individual processes in order to provide a robust framework, enabling proper coordination support for interdependent processes.
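The abstract's core move, pulling coordination constraints out of the individual process models and into a separate coordinating component, can be sketched minimally as follows. The class, method names, and order/shipping example are all invented for this illustration and are not from the paper.

```python
class CoordinationProcess:
    """A hypothetical coordinator: it tracks the states of interdependent
    processes and vets their transitions against cross-process constraints,
    so the individual process models stay free of coordination logic."""

    def __init__(self):
        self.states = {}       # process id -> current state
        self.constraints = []  # callables: candidate state map -> bool

    def add_constraint(self, check):
        self.constraints.append(check)

    def request_transition(self, pid, new_state):
        """A process asks to move; allow it only if every constraint
        still holds in the resulting global state."""
        trial = {**self.states, pid: new_state}
        if all(check(trial) for check in self.constraints):
            self.states = trial
            return True
        return False

coord = CoordinationProcess()
# Cross-process constraint: shipping may not start before order approval.
coord.add_constraint(
    lambda s: s.get("shipping") != "started" or s.get("order") == "approved")

blocked = coord.request_transition("shipping", "started")  # premature
coord.request_transition("order", "approved")
allowed = coord.request_transition("shipping", "started")  # now permitted
```

Neither the order process nor the shipping process knows about the other; the dependency lives entirely in the coordinator, which is the separation of concerns the abstract argues for.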
USING HYBRID SCRUM TO MEET WATERFALL PROCESS DELIVERABLES
System Development Life Cycles (SDLCs) in organizations are often based upon traditional software development models such as the waterfall model. These processes are complex, heavy in documentation deliverables, and more rigid and less flexible than other methods used in modern software development. Consider, by contrast, agile methods for software development. In essence, agile methods recommend lightweight documentation and a simplified process. The focus shifts to completed software as the "measure of success" for delivery of product in software projects, rather than accurate and comprehensive documentation and the accomplishment of static milestones in a work breakdown structure. This thesis implements, explores, and recommends a hybrid agile approach to Scrum in order to satisfy the rigid, document-laden deliverables of a waterfall-based SDLC process. This hybrid Scrum strikes a balance of having enough documentation and process, but not too much, to meet SDLC deliverables, while at the same time focusing on the timely product delivery and customer interactions that come from an agile approach to software development.