Negotiation and Decision Making to Develop a Public-Private-Partnership: A Case-Based Approach
Decision making in practice varies from theoretical models and processes. Unpredictable and ill-structured operating conditions require dynamic resolution approaches underpinned by effective negotiation and decision making strategies to support collaborative work and partnerships. This short paper evaluates the negotiation strategies and decision making approaches adopted to reach agreement for a unique Public-Private-Partnership. It examines how decision criteria were formulated and decision rules generated through negotiation, and how uncertainties were addressed by adopting a multi-criteria and evidential reasoning approach. Findings are presented to help improve business performance in future PPPs by enabling effective decisions based on experience gained through past process executions.
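The multi-criteria idea can be illustrated with a minimal weighted-sum scoring sketch. The criteria, weights, and bid ratings below are invented for illustration; the paper's actual evidential reasoning approach aggregates belief degrees under uncertainty rather than crisp scores.

```python
# Minimal weighted-sum multi-criteria scoring sketch. Criteria names,
# weights, and ratings are hypothetical; evidential reasoning (as used
# in the paper) would aggregate belief distributions instead.

def score_option(ratings, weights):
    """Aggregate per-criterion ratings (0-1) into a single score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * ratings[c] for c in weights)

# Two hypothetical PPP bids rated on three illustrative criteria.
weights = {"cost": 0.5, "risk_transfer": 0.3, "delivery_time": 0.2}
bid_a = {"cost": 0.8, "risk_transfer": 0.6, "delivery_time": 0.4}
bid_b = {"cost": 0.6, "risk_transfer": 0.9, "delivery_time": 0.7}

bids = {"A": bid_a, "B": bid_b}
best = max(bids, key=lambda b: score_option(bids[b], weights))
```

Changing the weights encodes a different negotiated priority among the decision criteria, which is where the negotiation process feeds into the decision rule.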
Predictive Monitoring of Business Processes
Modern information systems that support complex business processes generally
maintain significant amounts of process execution data, particularly records of
events corresponding to the execution of activities (event logs). In this
paper, we present an approach to analyze such event logs in order to
predictively monitor business goals during business process execution. At any
point during an execution of a process, the user can define business goals in
the form of linear temporal logic rules. When an activity is being executed,
the framework identifies input data values that are more (or less) likely to
lead to the achievement of each business goal. Unlike reactive compliance
monitoring approaches that detect violations only after they have occurred, our
predictive monitoring approach provides early advice so that users can steer
ongoing process executions towards the achievement of business goals. In other
words, violations are predicted (and potentially prevented) rather than merely
detected. The approach has been implemented in the ProM process mining toolset
and validated on a real-life log pertaining to the treatment of cancer patients
in a large hospital.
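The core idea can be sketched as follows: from completed traces labelled with goal fulfilment, estimate for each observed prefix the probability that the goal is eventually met, and warn while a case is still running if that estimate is low. The frequency-based estimator and the trace data below are invented stand-ins; the actual approach checks user-defined linear temporal logic rules within ProM.

```python
from collections import defaultdict

# Historical traces (activity sequences) labelled with whether a business
# goal was eventually fulfilled. Activities and labels are invented.
history = [
    (("register", "triage", "treat"), True),
    (("register", "triage", "discharge"), False),
    (("register", "treat"), True),
    (("register", "discharge"), False),
]

# Estimate P(goal fulfilled | observed prefix) by prefix frequency.
counts = defaultdict(lambda: [0, 0])  # prefix -> [fulfilled, total]
for trace, fulfilled in history:
    for i in range(1, len(trace) + 1):
        prefix = trace[:i]
        counts[prefix][0] += int(fulfilled)
        counts[prefix][1] += 1

def goal_probability(prefix):
    fulfilled, total = counts.get(tuple(prefix), [0, 0])
    return fulfilled / total if total else 0.5  # unseen prefix: no evidence

def early_warning(prefix, threshold=0.3):
    """Advise intervention while the case is still running."""
    return goal_probability(prefix) < threshold
```

The predictive (rather than reactive) character lies in `early_warning` firing on a prefix, before the violating suffix has actually happened.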
Alarm-Based Prescriptive Process Monitoring
Predictive process monitoring is concerned with the analysis of events
produced during the execution of a process in order to predict the future state
of ongoing cases thereof. Existing techniques in this field are able to
predict, at each step of a case, the likelihood that the case will end up in an
undesired outcome. These techniques, however, do not take into account what
process workers may do with the generated predictions in order to decrease the
likelihood of undesired outcomes. This paper proposes a framework for
prescriptive process monitoring, which extends predictive process monitoring
approaches with the concepts of alarms, interventions, compensations, and
mitigation effects. The framework incorporates a parameterized cost model to
assess the cost-benefit tradeoffs of applying prescriptive process monitoring
in a given setting. The paper also outlines an approach to optimize the
generation of alarms given a dataset and a set of cost model parameters. The
proposed approach is empirically evaluated using a range of real-life event
logs.
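The cost-benefit trade-off can be illustrated with a toy version of such a parameterized model: raise an alarm exactly when the expected cost of intervening is lower than the expected cost of doing nothing. Parameter names and values below are illustrative, not the paper's.

```python
def should_alarm(p_undesired, c_intervention, c_outcome, mitigation_eff):
    """Alarm iff intervening has lower expected cost than waiting.

    Expected cost if we wait:      p * c_outcome
    Expected cost if we intervene: c_intervention + p * c_outcome * (1 - eff)
    where eff is the mitigation effect of the intervention.
    """
    cost_wait = p_undesired * c_outcome
    cost_act = c_intervention + p_undesired * c_outcome * (1.0 - mitigation_eff)
    return cost_act < cost_wait

def optimal_threshold(c_intervention, c_outcome, mitigation_eff):
    """Alarm exactly when the predicted probability exceeds this value."""
    return c_intervention / (c_outcome * mitigation_eff)
```

Optimizing alarm generation over a dataset then amounts to choosing the probability threshold that minimizes total cost under the given parameters, which this closed form captures for the simple two-cost case.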
Service Learning Across the Curriculum: A Collaboration to Promote Smoking Cessation
This paper focuses on how pedagogy, service, and scholarship can be combined across the advertising curriculum through service learning, which invigorates collaboration among faculty members, student teams, and advertising professionals. The authors demonstrate how service learning projects integrate curricula using a community-based client, ultimately leading to scholarship and professional outcomes. Specifically, this study analyzes the launch of a service learning-based smoking cessation campaign on a Midwest college campus
Incremental Predictive Process Monitoring: How to Deal with the Variability of Real Environments
A characteristic of existing predictive process monitoring techniques is to
first construct a predictive model based on past process executions, and then
use it to predict the future of new ongoing cases, without the possibility of
updating it with new cases when they complete their execution. This can make
predictive process monitoring too rigid to deal with the variability of
processes working in real environments that continuously evolve and/or exhibit
new variant behaviors over time. As a solution to this problem, we propose the
use of algorithms that allow the incremental construction of the predictive
model. These incremental learning algorithms update the model whenever new
cases become available so that the predictive model evolves over time to fit
the current circumstances. The algorithms have been implemented using different
case encoding strategies and evaluated on a number of real and synthetic
datasets. The results provide first evidence of the potential of incremental
learning strategies for predictive process monitoring in real environments, and
of the impact of different case encoding strategies in this setting.
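The incremental idea can be sketched in a few lines: instead of training once on past executions, the model exposes an update step that is invoked whenever a case completes, so predictions track the latest behaviour. The frequency-based model, the one-activity encoding, and the data below are invented stand-ins for the paper's incremental learning algorithms and case encodings.

```python
from collections import defaultdict

class IncrementalOutcomePredictor:
    """Toy predictor: P(positive outcome | last observed activity)."""

    def __init__(self):
        self.counts = defaultdict(lambda: [0, 0])  # activity -> [pos, total]

    def update(self, trace, positive):
        # Called whenever a running case completes its execution.
        entry = self.counts[trace[-1]]
        entry[0] += int(positive)
        entry[1] += 1

    def predict(self, prefix):
        pos, total = self.counts[prefix[-1]]
        return pos / total if total else 0.5  # unseen behaviour: no evidence

model = IncrementalOutcomePredictor()
model.update(("a", "b"), True)
model.update(("a", "c"), False)
# A new variant appears later; the model adapts without full retraining.
model.update(("a", "c"), True)
```

A non-incremental pipeline would have to rebuild its model from scratch to absorb the third case, which is exactly the rigidity the abstract describes.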
Assessing the impact of algorithmic trading on markets: a simulation approach
Innovative automated execution strategies such as Algorithmic Trading gain significant market share on electronic market venues worldwide, although their impact on market outcomes has not yet been investigated in depth. In order to assess the impact of such concepts, e.g. effects on price formation or price volatility, a simulation environment is presented that provides stylized implementations of algorithmic trading behavior and allows for modeling latency. As simulations can reproduce exactly the same basic situation, the impact of algorithmic trading models can be assessed by comparing simulation runs that include and exclude a trader employing an algorithmic trading model. By this means, the impact of Algorithmic Trading on different characteristics of market outcomes can be assessed. The results indicate that large volumes executed by the algorithmic trader have an increasing impact on market prices; on the other hand, lower latency appears to lower market volatility.
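The comparative-run methodology can be illustrated with a deliberately stylized random-walk market: run the same seeded simulation twice, once with and once without an extra trader executing a large volume, and attribute the difference in outcomes to that trader. All dynamics below (linear price impact, Gaussian noise) are invented toy stand-ins for the paper's order-book simulation.

```python
import random

def simulate(seed, algo_volume=0.0, steps=200, impact=0.01):
    """Stylized price path; the algorithmic trader adds buy pressure."""
    rng = random.Random(seed)  # same seed -> identical noise across runs
    price, path = 100.0, []
    for _ in range(steps):
        noise = rng.gauss(0.0, 0.1)
        price += noise + impact * algo_volume  # toy linear price impact
        path.append(price)
    return path

# Reproduce exactly the same basic situation, with and without the trader.
baseline = simulate(seed=42)
with_algo = simulate(seed=42, algo_volume=5.0)

# Identical noise sequence, so the difference isolates the trader's impact.
price_impact = with_algo[-1] - baseline[-1]
```

Because both runs consume the identical random sequence, the comparison is a controlled experiment in the sense the abstract describes, something live market data cannot offer.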
Acceleration-as-a-Service: Exploiting Virtualised GPUs for a Financial Application
'How can GPU acceleration be obtained as a service in a cluster?' This
question has become increasingly significant due to the inefficiency of
installing GPUs on all nodes of a cluster. The research reported in this paper
is motivated to address the above question by employing rCUDA (remote CUDA), a
framework that facilitates Acceleration-as-a-Service (AaaS), such that the
nodes of a cluster can request the acceleration of a set of remote GPUs on
demand. The rCUDA framework exploits virtualisation and ensures that multiple
nodes can share the same GPU. In this paper we test the feasibility of the
rCUDA framework on a real-world application employed in the financial risk
industry that can benefit from AaaS in the production setting. The results
confirm the feasibility of rCUDA and highlight that rCUDA achieves similar
performance compared to CUDA, provides consistent results, and more
importantly, allows for a single application to benefit from all the GPUs
available in the cluster without losing efficiency.
Comment: 11th IEEE International Conference on eScience (IEEE eScience), Munich, Germany, 201
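On the client side, rCUDA is configured through environment variables that redirect CUDA calls to remote GPU servers. The sketch below assumes the variable names documented in the rCUDA user's guide (`RCUDA_DEVICE_COUNT`, `RCUDA_DEVICE_n`); the host name, library path, and application name are invented.

```shell
# Hypothetical client-side setup: two remote GPUs served from one host.
# Variable names follow the rCUDA user's guide; host and paths are invented.
export RCUDA_DEVICE_COUNT=2
export RCUDA_DEVICE_0=gpu-server.example.org:0   # server:GPU-index
export RCUDA_DEVICE_1=gpu-server.example.org:1

# Point the dynamic linker at rCUDA's CUDA-compatible library, then run
# the unmodified CUDA application as usual.
export LD_LIBRARY_PATH=/opt/rCUDA/lib:$LD_LIBRARY_PATH
./risk_application
```

Because the interception happens at the library level, the application sees the remote GPUs as local devices, which is what lets every node share the cluster's GPUs on demand.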