Can involving clients in simulation studies help them solve their future problems? A transfer of learning experiment
It is often stated that involving the client in operational research studies increases conceptual learning about a system, which can then be applied repeatedly to other, similar systems. Our study provides a novel measurement approach for behavioural OR studies that aim to analyse the impact of modelling on long-term problem solving and decision making. In particular, our approach is the first to operationalise the measurement of transfer of learning from modelling using the concepts of close and far transfer, and overconfidence. We investigate learning in discrete-event simulation (DES) projects through an experimental study. Participants were trained to manage queuing problems by varying the degree to which they were involved in building and using a DES model of a hospital emergency department. They were then asked to transfer learning to a set of analogous problems. Findings demonstrate that transfer of learning from a simulation study is difficult, but possible. However, this learning is only accessible when sufficient time is provided for clients to process the structural behaviour of the model. Overconfidence is also an issue when clients who were involved in model building attempt to transfer their learning without the aid of a new model. Behavioural OR studies that aim to understand learning from modelling can ultimately improve our modelling interactions with clients, helping to ensure that the benefits persist over the longer term and enabling modelling efforts to become more sustainable.
Integrating heterogeneous distributed COTS discrete-event simulation packages: An emerging standards-based approach
This paper reports on the progress made toward the emergence of standards to support the integration of heterogeneous discrete-event simulations (DESs) created in specialist support tools called commercial-off-the-shelf (COTS) discrete-event simulation packages (CSPs). The general standard for heterogeneous integration in this area, developed from research in distributed simulation, is the IEEE 1516 standard, the High Level Architecture (HLA). However, the specific needs of heterogeneous CSP integration require that the HLA be augmented by additional complementary standards. These are the suite of CSP interoperability (CSPI) standards being developed under the Simulation Interoperability Standards Organization (SISO-http://www.sisostds.org) by the CSPI Product Development Group (CSPI-PDG). The suite consists of several interoperability reference models (IRMs) that outline different integration needs of CSPI, interoperability frameworks (IFs) that define the HLA-based solution to each IRM, appropriate data exchange representations that specify the data exchanged in an IF, and benchmarks termed CSP emulators (CSPEs). This paper contributes to the development of the Type I IF, which is intended to represent the HLA-based solution to the problem outlined by the Type I IRM (asynchronous entity passing), by developing the entity transfer specification (ETS) data exchange representation. The use of the ETS in an illustrative case study implemented using a prototype CSPE is shown. This case study also allows us to highlight the importance of event granularity and lookahead in the performance and development of the Type I IF, and to discuss possible methods to automate the capture of appropriate values of lookahead.
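The role of lookahead mentioned in this abstract can be sketched in a few lines. In conservative distributed simulation (as standardized by the HLA's time management services), each federate promises never to send an event earlier than its current time plus its lookahead; a receiving federate may then safely process any local event below the resulting bound. The function below is an illustrative sketch of that bound only, not SISO/HLA code; all names are assumptions.

```python
def lbts(neighbor_times, lookaheads):
    """Lower Bound on Time Stamp (illustrative sketch): the earliest
    simulation time at which any upstream federate could still deliver
    an event. Local events with timestamps below this bound are safe
    to process without risking a causality violation."""
    return min(t + la for t, la in zip(neighbor_times, lookaheads))

# Two upstream CSPs at simulation times 10 and 12 with lookaheads 5 and 2
# can deliver nothing earlier than min(10+5, 12+2) = 14.
safe_until = lbts([10, 12], [5, 2])
```

A larger lookahead widens the safe window and so reduces synchronization overhead, which is why the abstract singles it out as critical to Type I IF performance.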
Distributed simulation with COTS simulation packages: A case study in health care supply chain simulation
The UK National Blood Service (NBS) is a publicly funded body that is responsible for distributing blood and associated products. A discrete-event simulation of the NBS supply chain in the Southampton area has been built using the commercial off-the-shelf simulation package (CSP) Simul8. This models the relationship in the health care supply chain between the NBS Processing, Testing and Issuing (PTI) facility and its associated hospitals. However, as the number of hospitals increases, simulation run time becomes inconveniently large. To try to solve this problem with distributed simulation, researchers have used techniques informed by SISO's CSPI PDG to create a version of Simul8 compatible with the High Level Architecture (HLA). The NBS supply chain model was subsequently divided into several sub-models, each running in its own copy of Simul8. Experimentation shows that this distributed version performs better than its standalone, conventional counterpart as the number of hospitals increases.
Simulation for business processes and information systems design
Business Process (BP) literature promotes the value of business processes as essential gearwheels that help organizations reach their goals. Similarly, many process design approaches claim that Information Technology (IT) is a major enabler of business processes, a view also shared by the Information Systems (IS) community. Despite this, BP and IS approaches do not provide clear guidance on how to assess the benefits that a given IS design may bring to the BP prior to IS implementation. Nor is there a clear indication of which modeling techniques could be used to assess such a relationship. This paper uses the insights gained during a UK-funded research project, ASSESS-IT, which aimed to depict the dynamic relationships between BP and IT, to propose an alternative framework for developing BP simulation models that capture the dynamic behavior of the relationships between BP and IS.
Sensitivity analysis and optimization of system dynamics models: Regression analysis and statistical design of experiments
This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for coal transportation, taken from Wolstenholme's book "System Enquiry: A System Dynamics Approach" (1990). The regression analysis uses the least squares algorithm. DOE uses classic designs, namely fractional factorials and central composite designs. Compared with intuitive approaches, DOE is more efficient: DOE gives more accurate estimators of input effects. Moreover, DOE is more effective: interactions are estimable too. The System Dynamics model is also optimized. A heuristic is derived, inspired by Response Surface Methodology (RSM) but accounting for constraints. Some remaining pertinent problems are briefly discussed, namely DOE for cases with many factors, and DOE for random System Dynamics models. Conclusions are presented for the case study, and general principles are derived. Finally, 23 references are given for further study.
Keywords: Regression Analysis; Experimental Design; System Dynamics Models; Statistics
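The DOE-plus-regression idea in this abstract can be illustrated with a minimal sketch: run the model at every point of a 2^k full factorial design in coded (-1/+1) factor levels and fit a first-order metamodel by least squares. For an orthogonal +-1 design, the least-squares effect estimates reduce to simple signed averages. The toy response below stands in for a simulation model such as the coal-transportation example; it is an assumption for illustration, not the tutorial's actual model.

```python
from itertools import product

def factorial_effects(sim, k):
    """Run a 2^k full factorial design on `sim`, a function of k coded
    factors in {-1, +1}, and estimate the overall mean and main effects.
    With an orthogonal +-1 design, the least-squares estimator of each
    main effect is the signed average sum(x_j * y) / 2^k."""
    design = list(product((-1, 1), repeat=k))
    y = [sim(*x) for x in design]
    n = len(design)
    mean = sum(y) / n
    effects = [sum(x[j] * yi for x, yi in zip(design, y)) / n
               for j in range(k)]
    return mean, effects

# Hypothetical stand-in for a simulation response with two factors:
toy = lambda x1, x2: 10 + 3 * x1 - 2 * x2 + 0.5 * x1 * x2
mean, effects = factorial_effects(toy, 2)
# Recovers mean 10 and main effects [3, -2]; the 0.5 interaction would
# need the cross-product column x1*x2 added to the regression.
```

The same signed-average estimator underlies the fractional-factorial designs the tutorial recommends; fractional designs simply use a carefully chosen subset of the 2^k runs.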
Design of experiments for non-manufacturing processes : benefits, challenges and some examples
Design of Experiments (DoE) is a powerful technique for process optimization that has been widely deployed in almost all types of manufacturing processes and is used extensively in product and process design and development. There have not been as many efforts to apply powerful quality improvement techniques such as DoE to non-manufacturing processes. Factor levels often involve changing the way people work, and so have to be handled carefully. It is even more important to get everyone working as a team. This paper explores the benefits and challenges of applying DoE in non-manufacturing contexts. The viewpoints regarding the benefits and challenges of DoE in the non-manufacturing arena are gathered from a number of leading academics and practitioners in the field. The paper also attempts to dispel the misconception that DoE is applicable only to manufacturing industries; rather, it is equally applicable to non-manufacturing processes within manufacturing companies. The last part of the paper illustrates some case examples showing the power of the technique in non-manufacturing environments.
Screening Experiments for Simulation: A Review
This article reviews so-called screening in simulation; i.e., it examines the search for the really important factors in experiments with simulation models that have very many factors (or inputs). The article focuses on a most efficient and effective screening method, namely Sequential Bifurcation. It ends with a discussion of possible topics for future research, and forty references for further study.
Keywords: Screening; Metamodel; Response Surface; Design
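Sequential Bifurcation's efficiency comes from testing factors in groups: a group whose aggregated effect is negligible is discarded after a single comparison, and only promising groups are split further. The sketch below is a minimal illustration under the method's standard assumptions (a first-order metamodel with known, non-negative effect signs); the interface and toy model are assumptions for illustration, not the article's formulation.

```python
def sequential_bifurcation(sim, k, threshold=1e-9):
    """Minimal sketch of Sequential Bifurcation. `sim(high)` runs the
    model with the factors in set `high` at their high level and all
    others low. Assumes a first-order metamodel and non-negative effects,
    so a group's aggregated effect bounds its members' individual effects.
    Returns the indices of factors whose effect exceeds `threshold`."""
    def group_effect(lo, hi):
        # Aggregated effect of factors lo..hi-1: response with factors
        # 0..hi-1 high minus response with factors 0..lo-1 high.
        return sim(set(range(hi))) - sim(set(range(lo)))

    important, stack = [], [(0, k)]
    while stack:
        lo, hi = stack.pop()
        if group_effect(lo, hi) <= threshold:
            continue  # whole group unimportant: discard in one comparison
        if hi - lo == 1:
            important.append(lo)  # singleton group: factor is important
        else:
            mid = (lo + hi) // 2  # bifurcate and examine both halves
            stack += [(lo, mid), (mid, hi)]
    return sorted(important)

# Toy model with 8 factors of which only factors 2 and 6 matter:
toy = lambda high: 5.0 + (4.0 if 2 in high else 0.0) + (1.5 if 6 in high else 0.0)
found = sequential_bifurcation(toy, 8)  # -> [2, 6]
```

With only a handful of important factors, the number of runs grows roughly logarithmically in k rather than linearly, which is the efficiency the review highlights.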
A feedback based solution to emulate hidden terminals in wireless networks
Mobile wireless emulation allows the testing of real applications and transport protocols over a wired network mimicking the behavior of a mobile wireless network (node mobility, radio signal propagation and specific communication protocols). Two-stage IP-level network emulation consists in using a dedicated offline simulation stage to compute an IP-level emulation scenario, which is played subsequently in the emulation stage. While this type of emulation allows the use of accurate computation models together with a large number of nodes, it currently cannot deal with dynamic changes in the real traffic. This lack of reactivity makes it impossible to emulate specific wireless behaviors, such as hidden terminals, in a realistic way. In this paper we address the need to take the real traffic into account during the emulation stage, and we introduce a feedback mechanism. During the simulation, several emulation scenarios are computed, each corresponding to alternative traffic conditions related to, for example, the occurrence or absence of hidden terminals. During the emulation stage, the traffic is observed and the currently played emulation scenario can be changed according to specific network conditions. We propose a solution based on multiple scenario generation, traffic observers and a feedback mechanism to add traffic-based dynamic behavior to a two-stage emulation platform. The solution is illustrated with a simple experiment based on hidden terminals.
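The feedback loop described above (precompute several scenarios offline, observe live traffic, switch the scenario being played) can be sketched as follows. This is a hypothetical illustration of the architecture only; the class, the condition labels, and the numeric link parameters are all assumptions, not the authors' platform.

```python
class FeedbackEmulator:
    """Illustrative sketch of a two-stage emulator with traffic feedback:
    a traffic observer classifies live conditions, and the player switches
    to the precomputed scenario matching the current condition."""

    def __init__(self, scenarios):
        # scenarios: condition label -> precomputed IP-level scenario,
        # here simplified to per-link parameters (loss rate, delay).
        self.scenarios = scenarios
        self.current = "baseline"

    def observe(self, concurrent_flows):
        # Toy observer: two senders that share a receiver but cannot
        # hear each other approximate a hidden-terminal situation.
        return "hidden_terminal" if concurrent_flows >= 2 else "baseline"

    def tick(self, concurrent_flows):
        condition = self.observe(concurrent_flows)
        if condition != self.current and condition in self.scenarios:
            self.current = condition  # feedback: swap the played scenario
        return self.scenarios[self.current]

emu = FeedbackEmulator({
    "baseline":        {"loss": 0.01, "delay_ms": 5},   # assumed values
    "hidden_terminal": {"loss": 0.25, "delay_ms": 12},  # assumed values
})
```

The key design point is that the expensive simulation still runs entirely offline; only the cheap observe-and-switch logic runs in real time during emulation.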