6 research outputs found

    Quantifying input uncertainty in an assemble-to-order system simulation with correlated input variables of mixed types

    We consider an assemble-to-order production system where the product demands and the time since the last customer arrival are not independent. The simulation of this system requires a multivariate input model that generates random input vectors with correlated discrete and continuous components. In this paper, we capture the dependence between input variables in an undirected graphical model and decouple the statistical estimation of the univariate input distributions and the underlying dependence measure into separate problems. The estimation errors due to finiteness of the real-world data introduce the so-called input uncertainty in the simulation output. We propose a method that accounts for input uncertainty by sampling the univariate empirical distribution functions via bootstrapping and by maintaining a posterior distribution of the precision matrix that corresponds to the dependence structure of the graphical model. The method improves the coverages of the confidence intervals for the expected profit per period. © 2014 IEEE
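The bootstrap step described above can be sketched as follows. This is a minimal illustration, not the paper's method: the toy `simulate_profit` function, the Poisson demand data, and the stock level of 6 are all hypothetical stand-ins; the point is only that resampling the empirical input distribution and re-running the simulation yields a spread of output estimates that reflects input uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical real-world sample of one input (e.g. demand per period).
data = rng.poisson(lam=5.0, size=50)

def simulate_profit(demand_sample, n_reps=200):
    """Toy stand-in for the simulation: draw demands from the
    (bootstrapped) empirical distribution and return mean profit."""
    demands = rng.choice(demand_sample, size=n_reps, replace=True)
    profit = 10.0 * np.minimum(demands, 6) - 2.0 * 6  # sell up to a stock of 6
    return profit.mean()

# Bootstrap the empirical input distribution to propagate input
# uncertainty through to the output estimate.
B = 500
estimates = np.array([
    simulate_profit(rng.choice(data, size=data.size, replace=True))
    for _ in range(B)
])
lo, hi = np.percentile(estimates, [2.5, 97.5])
print(f"95% CI for expected profit per period: [{lo:.2f}, {hi:.2f}]")
```

The interval is wider than the classical simulation-noise-only interval because it also absorbs the variability of the estimated input distribution.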

    Uncertainty and Error in Combat Modeling, Simulation, and Analysis

    Due to the infrequent and competitive nature of combat, several challenges present themselves when developing a predictive simulation. First, there is limited data with which to validate such analysis tools. Second, there are many aspects of combat modeling that are highly uncertain and not knowable. This research develops a comprehensive set of techniques for the treatment of uncertainty and error in combat modeling and simulation analysis. First, Evidence Theory is demonstrated as a framework for representing epistemic uncertainty in combat modeling output. Next, a novel method for sensitivity analysis of uncertainty in Evidence Theory is developed. This sensitivity analysis method generates marginal cumulative plausibility functions (CPFs) and cumulative belief functions (CBFs) and prioritizes the contribution of each factor by the Wasserstein distance (also known as the Kantorovich or Earth Mover's distance) between the CBF and CPF. Using this method, a rank ordering of the simulation input factors can be produced with respect to uncertainty. Lastly, a procedure for prioritizing the impact of modeling choices on simulation output uncertainty in settings where multiple models are employed is developed. This analysis provides insight into the overall sensitivities of the system with respect to multiple modeling choices.
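The CBF/CPF gap measure can be illustrated with a small sketch. This is an assumption-laden toy, not the thesis's procedure: the grid, the particular belief and plausibility curves, and the `wasserstein_gap` helper are all made up. For two cumulative functions on a common support, the 1-Wasserstein distance reduces to the integral of the absolute difference between them, which is what the helper computes by the trapezoidal rule.

```python
import numpy as np

# Hypothetical discretised output grid with a cumulative belief function
# (CBF) and cumulative plausibility function (CPF); CPF >= CBF pointwise.
x = np.linspace(0.0, 1.0, 101)
cbf = np.clip(x - 0.15, 0.0, 1.0)  # belief lags plausibility
cpf = np.clip(x + 0.15, 0.0, 1.0)

def wasserstein_gap(x, cbf, cpf):
    """1-Wasserstein distance between two cumulative functions on a grid:
    the integral of |CPF - CBF|, here via the trapezoidal rule."""
    d = np.abs(cpf - cbf)
    return float(0.5 * np.sum((d[1:] + d[:-1]) * np.diff(x)))

gap = wasserstein_gap(x, cbf, cpf)
print(f"uncertainty score (CBF-CPF Wasserstein gap): {gap:.4f}")
```

Computing this gap per input factor, with all other factors held fixed, gives the rank ordering of factors by epistemic uncertainty that the abstract describes.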

    Input Uncertainty and Data Collection Problems in Stochastic Simulation

    Stochastic simulation is an important tool within the field of operational research. It allows for the behaviour of random real-world systems to be analysed, evaluated, and optimised. It is critical to understand the uncertainty and error in outcomes from simulation experiments, to ensure that decisions are made with appropriate levels of confidence. Frequently, input models that actuate stochastic simulations are estimated using samples of real-world data. This introduces a source of uncertainty into the simulation model which propagates through to output measures, causing an error known as input uncertainty. Input uncertainty depends on the samples of data that are collected and used to estimate the input models for the simulation. In this thesis, we consider problems relating to input uncertainty and data collection in the field of stochastic simulation. Firstly, we propose an algorithm that guides the data collection procedure for a simulation experiment in a manner that minimises input uncertainty. Reducing the uncertainty around the simulation response allows for improved insights to be inferred from simulation results. Secondly, we outline an approach for comparing data collection strategies in terms of the input uncertainty passed to outputs in simulations of viral loads. This represents a different type of data collection problem to the ones usually studied in simulation experiments. Thirdly, we adapt two techniques for quantifying input uncertainty to consider a quantile of the simulation outputs, rather than the mean. Quantiles are regularly used to provide alternative information regarding the simulation outputs relative to the mean, therefore it is equally important to understand the uncertainty of such measures. Finally, we begin to investigate how input uncertainty impacts predictive models fit to simulation data. 
This relates to the field of simulation analytics, a novel and emergent area of research where the problem of input uncertainty has not previously been examined.
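The third contribution, quantifying input uncertainty around a quantile of the simulation output rather than the mean, can be sketched with a simple bootstrap. Everything here is a hypothetical stand-in for the thesis's techniques: the exponential service-time data, the `simulate_quantile` toy, and the choice of the 95th percentile are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data for a single input model (e.g. observed service times).
data = rng.exponential(scale=2.0, size=100)

def simulate_quantile(input_sample, q=0.95, n_reps=500):
    """Toy simulation whose output measure is the q-quantile of the
    response, driven by the (bootstrapped) empirical input distribution."""
    waits = rng.choice(input_sample, size=n_reps, replace=True)
    return np.quantile(waits, q)

# Bootstrap the input data; the spread of the resulting quantile
# estimates reflects input uncertainty about the 0.95 output quantile.
B = 300
qs = np.array([
    simulate_quantile(rng.choice(data, size=data.size, replace=True))
    for _ in range(B)
])
print(f"input-uncertainty std of the 0.95 output quantile: {qs.std():.3f}")
```

The same resampling loop that quantifies input uncertainty for a mean applies to a quantile; what changes is the output statistic extracted from each simulation run.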

    Quantifying and reducing input modelling error in simulation

    This thesis presents new methodology in the field of quantifying and reducing input modelling error in computer simulation. Input modelling error is the uncertainty in the output of a simulation that propagates from the errors in the input models used to drive it. When the input models are estimated from observations of the real-world system, input modelling error will always arise, as only a finite number of observations can ever be collected. Input modelling error can be broken down into two components: variance, known in the literature as input uncertainty; and bias. In this thesis new methodology is contributed for the quantification of both of these sources of error. To date, research into input modelling error has focused on quantifying the input uncertainty (IU) variance. In this thesis current IU quantification techniques for simulation models with time-homogeneous inputs are extended to simulation models with nonstationary input processes. Unlike the IU variance, the bias caused by input modelling has, until now, been virtually ignored. This thesis provides the first method for quantifying bias caused by input modelling. Also presented is a bias detection test for identifying, with controlled power, a bias due to input modelling of a size that would be concerning to a practitioner. The final contribution of this thesis is a spline-based arrival process model. By utilising a highly flexible spline representation, the error in the input model is reduced; it is believed that this will also reduce the input modelling error that passes to the simulation output. The methods described in this thesis are not available in the current literature and can be used in a wide range of simulation contexts for quantifying input modelling error and modelling input processes.
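The bias component described above can be made concrete with a standard bootstrap bias estimator, which stands in here for the thesis's own (different) method. The `performance` function is a hypothetical nonlinear output: plugging an input distribution estimated from finite data into a nonlinear performance measure yields a biased estimate of the true performance, and the bootstrap gap below estimates that bias.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical observations of an input (e.g. interarrival times).
data = rng.exponential(scale=1.0, size=40)

def performance(sample):
    """Toy nonlinear output measure: the squared mean. Its plug-in
    estimator is biased upward by roughly var(sample)/n."""
    return sample.mean() ** 2

# Generic bootstrap bias estimator: the average output over bootstrap
# resamples minus the output at the original data.
B = 1000
boot = np.array([
    performance(rng.choice(data, size=data.size, replace=True))
    for _ in range(B)
])
bias_estimate = boot.mean() - performance(data)
print(f"estimated input-modelling bias: {bias_estimate:.4f}")
```

For a linear output measure the bias would vanish; it is the interaction of estimation error with a nonlinear simulation response that creates the bias the thesis targets.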

    Optimisation of data collection strategies for model-based evaluation and decision-making

    PhD Thesis. Probabilistic and stochastic models are routinely used in performance, dependability and, more recently, security evaluation. Models allow predictions to be made about the performance of the current system or alternative configurations. Determining appropriate values for model parameters is a long-standing problem in the practical use of such models. With the increasing emphasis on human aspects and business considerations, data collection to estimate parameter values often gets prohibitively expensive, since it may involve questionnaires, costly audits or additional monitoring and processing. Existing work in this area often simply recommends when more data is needed rather than how much, or allocates additional samples without consideration of the wider data collection problem. This thesis aims to facilitate the design of optimal data collection strategies for such models, looking especially at applications in security decision-making. The main idea is to model the uncertainty of potential data collection strategies, and determine its influence on output accuracy by using and solving the model. This thesis provides a discussion of the factors affecting the data collection problem and then defines it formally as an optimisation problem. A number of methods for modelling data collection uncertainty are presented and these methods provide the basis for solvable algorithms. An implementation of the algorithms in MATLAB will be explained and then demonstrated using a business workflow model, and other smaller examples. These methods will be presented, tested, and evaluated with a number of efficiency improvements based upon importance sampling and design of experiments techniques.
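The allocation question the abstract raises, how much data to collect for each parameter rather than merely whether to collect more, can be sketched with a classic square-root allocation heuristic. This is not the thesis's optimisation; the variance contributions, per-sample costs, and budget below are all invented for illustration.

```python
import numpy as np

# Hypothetical per-parameter contributions to output variance (e.g. from
# a pilot sensitivity study) and per-sample data collection costs.
var_contrib = np.array([4.0, 1.0, 0.25])  # each shrinks roughly as 1/n_i
cost = np.array([1.0, 2.0, 1.0])

# Square-root allocation heuristic: collect samples in proportion to
# sqrt(contribution / cost), so high-impact, cheap parameters get more.
budget = 100
weights = np.sqrt(var_contrib / cost)
alloc = np.floor(budget * weights / weights.sum()).astype(int)
print("samples per parameter:", alloc)
```

Even this crude rule captures the abstract's point: an informed strategy concentrates scarce collection effort where it most reduces output uncertainty per unit cost.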