Elastic Business Process Management: State of the Art and Open Challenges for BPM in the Cloud
With the advent of cloud computing, organizations are nowadays able to react
rapidly to changing demands for computational resources. Not only can individual
applications be hosted on virtual cloud infrastructures, but also complete
business processes. This enables the realization of so-called elastic processes,
i.e., processes which are carried out using elastic cloud resources. Despite
the manifold benefits of elastic processes, there is still a lack of solutions
supporting them.
In this paper, we identify the state of the art of elastic Business Process
Management with a focus on infrastructural challenges. We conceptualize an
architecture for an elastic Business Process Management System and discuss
existing work on scheduling, resource allocation, monitoring, decentralized
coordination, and state management for elastic processes. Furthermore, we
present two representative elastic Business Process Management Systems which
are intended to counter these challenges. Based on our findings, we identify
open issues and outline possible research directions for the realization of
elastic processes and elastic Business Process Management.
Comment: Please cite as: S. Schulte, C. Janiesch, S. Venugopal, I. Weber, and
P. Hoenisch (2015). Elastic Business Process Management: State of the Art and
Open Challenges for BPM in the Cloud. Future Generation Computer Systems,
Volume NN, Number N, NN-NN, http://dx.doi.org/10.1016/j.future.2014.09.00
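The scheduling and resource-allocation challenges discussed above can be illustrated with a toy elasticity controller. The class name, per-VM capacity, and scaling bounds below are hypothetical and not taken from the paper; this is a minimal sketch of a threshold-style scaling decision, assuming the number of queued process steps as the load signal.

```python
# Minimal sketch of an elasticity controller for elastic processes.
# All names and numbers are illustrative assumptions, not from the paper.

class ElasticityController:
    """Decides how many VM instances to run based on queued process steps."""

    def __init__(self, steps_per_vm=10, min_vms=1, max_vms=20):
        self.steps_per_vm = steps_per_vm  # assumed capacity of one VM
        self.min_vms = min_vms            # never scale below this
        self.max_vms = max_vms            # cost ceiling

    def target_vms(self, queued_steps):
        # Ceiling division: enough VMs to drain the queue, within bounds.
        needed = -(-queued_steps // self.steps_per_vm) if queued_steps else 0
        return max(self.min_vms, min(self.max_vms, needed))

ctrl = ElasticityController()
print(ctrl.target_vms(0))    # 1  (never below min_vms)
print(ctrl.target_vms(35))   # 4
print(ctrl.target_vms(500))  # 20 (capped at max_vms)
```

A real elastic BPMS would combine such a decision rule with monitoring data and lease/release calls to a cloud API; the sketch only shows the scaling decision itself.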
Real-time sensor data development for smart truck drivetrains
Heavy articulated transport vehicles have a poor reputation, associated with dramatic road accidents and frequent fatalities for those in automobiles. The result of this work is a formal data flow structure to enhance real-time decision-making in complex mechanical systems, increasing performance capability and responsiveness to human commands. This structure recognizes the multiple layers of highly non-linear mechanical components (actuators, wheel tires & ground surfaces, controllers, power supplies, human/machine interfaces, etc.) that must operate in unison (i.e., with reduced conflicts) in real time (within milliseconds) to enhance operator (driver) control and maximize human choice. This work discusses why dependable sensor data is vital in complex systems that rely on a suite of sensors for both control and condition-monitoring purposes, and also discusses real-time energy distribution analysis in high-momentum mechanical systems. The focus is on class 7 & 8 tractor trucks outfitted with an array of low-cost redundant sensors, leveraging advances in intelligent robotic systems. This work details many topics, including: the most relevant sensor types and their technologies; designing, implementing, and maintaining a multi-sensor system using feasible industry standards; sensor signal integrity and data flow processing for decision making; asynchronous data flow methods for operating decision-making schemes in real time; and multiple applications that enhance tractor truck systems with multi-sensor systems for real-time decision making.
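One way an array of low-cost redundant sensors can support the dependable data the abstract calls for is simple median fusion with an integrity check. The function below is an illustrative sketch, not the work's actual method; the tolerance value and readings are invented.

```python
import statistics

def fuse_redundant(readings, tolerance):
    """Fuse redundant sensor readings into one value plus a validity flag.

    The fused value is the median (robust to a single faulty sensor); the
    flag is False if any sensor deviates from the median by more than
    `tolerance`. A simple integrity check, assumed for illustration only.
    """
    m = statistics.median(readings)
    ok = all(abs(r - m) <= tolerance for r in readings)
    return m, ok

print(fuse_redundant([99.8, 100.1, 100.0], 0.5))  # (100.0, True)
print(fuse_redundant([99.8, 100.1, 250.0], 0.5))  # (100.1, False) - outlier flagged
```

The median masks a single failed sensor while the flag still exposes the disagreement for condition monitoring, which is why triple redundancy is a common pattern in safety-relevant sensing.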
Knowledge visualizations: a tool to achieve optimized operational decision making and data integration
The overabundance of data created by modern information systems (IS) has led to a breakdown in cognitive decision-making. Without authoritative source data, commanders' decision-making processes are hindered as they attempt to paint an accurate shared operational picture (SOP). Further impeding the decision-making process is the lack of proper interface interaction to provide a visualization that aids in the extraction of the most relevant and accurate data. Utilizing a decision support system (DSS) to present visualizations based on OLAP-cube-integrated data allows decision-makers to rapidly glean information and build their situation awareness (SA). This yields a competitive advantage to the organization, whether in garrison or in combat. Additionally, OLAP cube data integration enables analysis to be performed on an organization's data flows. This analysis is used to identify the critical path of data throughout the organization. Linking a decision-maker to the authoritative data along this critical path eliminates the many decision layers in a hierarchical command structure that can introduce latency or error into the decision-making process. Furthermore, the organization gains an integrated SOP from which to rapidly build SA and make effective and efficient decisions.
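The OLAP cube data integration described above rests on rolling fact data up along chosen dimensions. The sketch below shows that core aggregation in plain Python; the dimension names and fact rows are invented for illustration and do not come from the thesis.

```python
from collections import defaultdict

# Hypothetical fact rows: (unit, region, day, value). Illustrative data only.
facts = [
    ("1stBn", "north", "d1", 4),
    ("1stBn", "south", "d1", 7),
    ("2ndBn", "north", "d2", 5),
]

def rollup(rows, dims):
    """Aggregate fact rows along the selected dimension indices.

    This is the core OLAP-cube roll-up operation: group by a subset of
    dimensions and sum the measure (the last element of each row).
    """
    cube = defaultdict(int)
    for row in rows:
        key = tuple(row[i] for i in dims)
        cube[key] += row[-1]
    return dict(cube)

print(rollup(facts, [1]))     # by region: {('north',): 9, ('south',): 7}
print(rollup(facts, [0, 1]))  # by unit and region (finer slice of the cube)
```

A DSS visualization layer would then render such pre-aggregated slices, letting a decision-maker drill down from the coarse roll-up to the detailed facts.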
Real-world Machine Learning Systems: A survey from a Data-Oriented Architecture Perspective
Machine Learning models are being deployed as parts of real-world systems
with the upsurge of interest in artificial intelligence. The design,
implementation, and maintenance of such systems are challenged by real-world
environments that produce large amounts of heterogeneous data and by users
requiring increasingly fast responses with efficient resource consumption.
These requirements push prevalent software architectures to the limit when
deploying ML-based systems. Data-oriented Architecture (DOA) is an emerging
concept that equips systems better for integrating ML models. DOA extends
current architectures to create data-driven, loosely coupled, decentralised,
open systems. Even though papers on deployed ML-based systems do not mention
DOA, their authors made design decisions that implicitly follow DOA. The
reasons why, how, and the extent to which DOA is adopted in these systems are
unclear. Such implicit design decisions limit practitioners' knowledge of how
to apply DOA when designing ML-based systems in the real world. This paper
answers these questions by surveying real-world deployments of ML-based
systems. The survey reports the design decisions of these systems and the
requirements those decisions satisfy. Based on
the survey findings, we also formulate practical advice to facilitate the
deployment of ML-based systems. Finally, we outline open challenges to
deploying DOA-based systems that integrate ML models.
Comment: Under review
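The data-driven, loosely coupled style that DOA promotes can be sketched as a minimal publish/subscribe data bus in which an ML model is just another consumer of a data stream. The class, topic names, and threshold below are illustrative assumptions, not an API from any of the surveyed systems.

```python
from collections import defaultdict

class DataBus:
    """Toy data-oriented bus: producers publish to topics, consumers
    subscribe to topics, and neither knows the other (loose coupling)."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Deliver the data item to every subscriber of the topic.
        for handler in self.subscribers[topic]:
            handler(payload)

bus = DataBus()
predictions = []
# An "ML model" (here a trivial threshold rule standing in for one) is
# wired in as just another consumer of the data stream.
bus.subscribe("sensor/temp", lambda x: predictions.append("hot" if x > 30 else "ok"))
bus.publish("sensor/temp", 35)
bus.publish("sensor/temp", 21)
print(predictions)  # ['hot', 'ok']
```

Because the model only depends on the topic's data, it can be retrained, replaced, or duplicated without touching the producers, which is the decoupling benefit DOA aims for.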
Modelling for data management & exchange in Concurrent Engineering - A case study of civil aircraft assembly line
This research aims to improve the dataflow performance of the Concurrent
Engineering (CE) practice in the detail design stage of the aircraft Assembly
Line (AL) in the C919 aircraft project. As the final integrator of the aircraft,
Shanghai Aircraft Manufacturing Company Ltd. (SAMC) is responsible for
developing the AL with global suppliers. Although CE has been implemented in
AL projects to shorten lead time, reduce development cost and improve design
quality, the lack of experience and insufficient infrastructure may lead to many
challenges in cooperation with distributed suppliers, especially regarding data
management/exchange and workflow control. In this research, the particular CE
environment and activities in SAMC AL projects were investigated. By
assessing CE performance and benchmarking, improvement
opportunities are identified, and an activity-oriented workflow and dataflow
model is then established by decomposing the work process to detailed levels. Based
on this model, a Product Data Management (PDM) based support platform is
proposed to facilitate data management/exchange in dynamic workflow to
improve work efficiency and interoperability. This solution is mocked up on
Siemens Teamcenter 8.1 PLM (Product Lifecycle Management) software, and its
feasibility is verified. The mock-up is evaluated by SAMC experts and suppliers.
The feedback shows the experts' acceptance of the model and the urgency of
improving data-flow and workflow design before PLM implementation.
The results of this research are useful for enterprises in similar environments
that are transitioning from pre-PLM to PLM implementation and want to
strengthen CE in new product development.
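The activity-oriented workflow and dataflow model described above can be sketched as activities linked by the data items they consume and produce: an activity may start once its inputs exist. The activity and data names below are invented for illustration and do not come from the SAMC project.

```python
# Toy activity-oriented workflow/dataflow model. Identifiers are hypothetical.
activities = {
    "design_jig":      {"consumes": ["part_model"],   "produces": ["jig_model"]},
    "release_jig":     {"consumes": ["jig_model"],    "produces": ["released_jig"]},
    "supplier_review": {"consumes": ["released_jig"], "produces": ["review_report"]},
}

def ready(activity, available):
    """An activity can start once all the data it consumes is available."""
    return all(d in available for d in activities[activity]["consumes"])

available = {"part_model"}  # initially available data
order = []
while len(order) < len(activities):
    for name in activities:
        if name not in order and ready(name, available):
            order.append(name)
            available |= set(activities[name]["produces"])

print(order)  # an execution order that respects the dataflow dependencies
```

In a PDM platform the same consumes/produces links drive both workflow control (when a task may start) and data exchange (which items must be shared with a supplier), which is why the abstract treats the two together.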
A Review and Characterization of Progressive Visual Analytics
Progressive Visual Analytics (PVA) has gained increasing attention over the past years.
It brings the user into the loop during otherwise long-running and non-transparent computations
by producing intermediate partial results. These partial results can be shown to the user
for early and continuous interaction with the emerging end result even while it is still being
computed. Yet as clear-cut as this fundamental idea seems, the existing body of literature puts forth
various interpretations and instantiations that have created a research domain of competing terms,
differing definitions, as well as long lists of practical requirements and design guidelines spread across
different scientific communities. This makes it more and more difficult to get a succinct understanding
of PVA’s principal concepts, let alone an overview of this increasingly diverging field. The review and
discussion of PVA presented in this paper address these issues and provide (1) a literature collection
on this topic, (2) a conceptual characterization of PVA, as well as (3) a consolidated set of practical
recommendations for implementing and using PVA-based visual analytics solutions.
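The fundamental PVA idea of producing intermediate partial results maps naturally onto a generator: the computation yields a usable partial answer every few items so a visualization can update while the work continues. The running-mean example and chunk size below are illustrative assumptions, not drawn from the surveyed literature.

```python
def progressive_mean(stream, chunk=2):
    """Progressively compute a mean, yielding a partial result after every
    `chunk` items so the caller can render it before the stream is done."""
    total, n = 0.0, 0
    for i, x in enumerate(stream, 1):
        total += x
        n = i
        if i % chunk == 0:
            yield total / n  # intermediate partial result
    if n % chunk:
        yield total / n      # final result if the last chunk was incomplete

# Each yielded value could drive one update of a progressive visualization.
print(list(progressive_mean([2, 4, 6, 8, 10])))  # [3.0, 5.0, 6.0]
```

The sequence of partial means converges toward the true mean, which is exactly the "emerging end result" a PVA interface lets the user inspect and steer early.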
Dynamic data flow testing
Data flow testing is a form of testing that identifies data flow relations as test objectives. It has recently attracted new interest in the context of testing object-oriented systems, since data flow information is well suited to capture relations among object states and can thus provide useful information for testing method interactions. Unfortunately, classic data flow testing, which is based on static analysis of the source code, fails to identify many important data flow relations due to the dynamic nature of object-oriented systems.

This thesis presents Dynamic Data Flow Testing, a technique that rethinks data flow testing to suit the testing of modern object-oriented software. Dynamic Data Flow Testing stems from empirical evidence that we collect on the limits of classic data flow testing techniques. We investigate these limits by means of Dynamic Data Flow Analysis, a dynamic implementation of data flow analysis that computes sound data flow information on program traces. We compare data flow information collected with static analysis of the code against information observed dynamically on execution traces, and empirically observe that the information computed by classic analysis of the source code misses a significant portion that corresponds to relevant behaviors which should be tested.

In view of these results, we propose Dynamic Data Flow Testing. The technique promotes synergies between dynamic analysis, static reasoning, and test case generation to automatically extend a test suite with test cases that exercise the complex state-based interactions between objects. Dynamic Data Flow Testing computes precise data flow information for the program with Dynamic Data Flow Analysis, processes the dynamic information to infer new test objectives, and uses these objectives to generate new test cases.

The test cases generated by Dynamic Data Flow Testing exercise relevant behaviors that are otherwise missed by both the original test suite and test suites that satisfy classic data flow criteria.
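A minimal sketch of the kind of runtime information Dynamic Data Flow Analysis relies on: recording attribute definitions (writes) and uses (reads) as the program executes, which yields concrete def-use pairs of object state. The class below is an illustrative toy, not the thesis's implementation.

```python
class TracedObject:
    """Toy runtime tracer: records attribute definitions (writes) and which
    definitions are actually used (reads), i.e., dynamic def-use pairs."""

    def __init__(self):
        object.__setattr__(self, "_defs", {})     # attr -> definition counter
        object.__setattr__(self, "_vals", {})     # attr -> current value
        object.__setattr__(self, "pairs", set())  # observed (attr, def id) uses

    def __setattr__(self, name, value):
        # Every write is a new definition of the attribute.
        self._defs[name] = self._defs.get(name, 0) + 1
        self._vals[name] = value

    def __getattr__(self, name):
        # Called only for traced attributes (not stored in __dict__).
        if name not in self._vals:
            raise AttributeError(name)
        self.pairs.add((name, self._defs[name]))  # record the def-use pair
        return self._vals[name]

obj = TracedObject()
obj.x = 1        # definition 1 of x
_ = obj.x        # use: exercises the pair (x, definition 1)
obj.x = 2        # definition 2 of x, never read afterwards
print(sorted(obj.pairs))  # [('x', 1)]
```

A trace like this shows that definition 2 of `x` is never used by the current test, so a data-flow-driven generator would aim to create a test that reads `x` after that write, which is the gap-filling role the thesis assigns to test case generation.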