
    Dynamic Bayesian Networks as a Probabilistic Metamodel for Combat Simulations

    Simulation metamodeling estimates a simulation model's results by representing the space of its responses, and is particularly useful when the simulation model itself is not suited to real-time or near-real-time use. Most metamodeling methods provide expected-value responses, while some situations require probabilistic responses. This research establishes the viability of Dynamic Bayesian Networks (DBNs) for simulation metamodeling in those situations needing probabilistic responses. A bootstrapping method is introduced to reduce the simulation data requirements of a DBN, and experimental design is shown to benefit a DBN used to represent a multi-dimensional response space. An improved interpolation method is developed and shown to benefit DBN metamodeling applications. These contributions are employed in a military modeling case study to fully demonstrate the viability of DBN metamodeling for defense analysis applications.
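
    To make the idea concrete, the following is a minimal sketch of a DBN-style probabilistic metamodel: a first-order transition model over a discretized simulation state, estimated from logged runs and queried for a full distribution rather than an expected value. The toy data and smoothing choice are illustrative assumptions; the paper's bootstrapping and interpolation methods are not reproduced here.

```python
# Minimal DBN-style probabilistic metamodel: a first-order transition
# model over a discretized simulation output. All names and the toy data
# below are hypothetical stand-ins for logged combat-simulation runs.
import numpy as np

rng = np.random.default_rng(0)
n_states = 5                      # discretized bins of a simulation output
n_runs, n_steps = 200, 30         # replications logged from the simulator

# Stand-in for logged simulation trajectories (shape: runs x steps).
trajectories = rng.integers(0, n_states, size=(n_runs, n_steps))

# Estimate the transition matrix P[i, j] = Pr(next = j | current = i),
# with additive (Laplace) smoothing in place of the paper's bootstrap.
counts = np.ones((n_states, n_states))
for run in trajectories:
    for t in range(n_steps - 1):
        counts[run[t], run[t + 1]] += 1
P = counts / counts.sum(axis=1, keepdims=True)

# Metamodel query: the full distribution over states k steps ahead,
# rather than a single expected-value response.
belief = np.zeros(n_states)
belief[2] = 1.0                   # known initial condition
for _ in range(10):
    belief = belief @ P
print("state distribution after 10 steps:", np.round(belief, 3))
```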

    Evolutionary model type selection for global surrogate modeling

    Due to the scale and computational complexity of currently used simulation codes, global surrogate models (metamodels) have become indispensable tools for exploring and understanding the design space. Thanks to their compact formulation they are cheap to evaluate and thus readily facilitate visualization, design space exploration, rapid prototyping, and sensitivity analysis. They can also be used as accurate building blocks in design packages or larger simulation environments. Consequently, there is great interest in techniques that facilitate the construction of such approximation models while minimizing the computational cost and maximizing model accuracy. Many surrogate model types exist (Support Vector Machines, Kriging, Neural Networks, etc.), but no type is optimal in all circumstances, nor is there any hard theory available to help make this choice. In this paper we present an automatic approach to the model type selection problem. We describe an adaptive global surrogate modeling environment with adaptive sampling, driven by speciated evolution. Different model types are evolved cooperatively using a Genetic Algorithm (heterogeneous evolution) and compete to approximate the iteratively selected data. In this way the optimal model type and complexity for a given data set or simulation code can be determined dynamically. Its utility and performance are demonstrated on a number of problems where it outperforms traditional sequential execution of each model type.
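
    A much-simplified stand-in for this idea is sketched below: several surrogate types compete on cross-validated error over an iteratively grown data set. The paper's speciated GA and adaptive sampling are reduced here to a plain tournament with random sampling, and all models, settings, and the toy simulator are illustrative assumptions.

```python
# Heterogeneous model type competition, heavily simplified: candidate
# surrogate types compete on cross-validated RMSE as the design grows.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(1)
f = lambda x: np.sin(3 * x[:, 0]) + 0.1 * x[:, 0] ** 2   # toy "simulator"

candidates = {
    "SVR": SVR(C=10.0),
    "Kriging": GaussianProcessRegressor(),
    "ANN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000),
}

X = rng.uniform(-2, 2, size=(15, 1))                      # initial design
for generation in range(4):
    y = f(X)
    scores = {
        name: -cross_val_score(m, X, y, cv=5,
                               scoring="neg_root_mean_squared_error").mean()
        for name, m in candidates.items()
    }
    best = min(scores, key=scores.get)
    print(f"gen {generation}: best type = {best}, RMSE = {scores[best]:.3f}")
    X = np.vstack([X, rng.uniform(-2, 2, size=(10, 1))])  # naive sampling step
```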

    Learning and Generalizing Polynomials in Simulation Metamodeling

    The ability to learn polynomials and generalize out-of-distribution is essential for simulation metamodels in many disciplines of engineering where the time step updates are described by polynomials. While feedforward neural networks can fit any function, they cannot generalize out-of-distribution for higher-order polynomials. This paper therefore collects and proposes multiplicative neural network (MNN) architectures that are used as recursive building blocks for approximating higher-order polynomials. Our experiments show that MNNs generalize better than baseline models, and their validation performance is a reliable indicator of their performance in out-of-distribution tests. In addition to the MNN architectures, a simulation metamodeling approach is proposed for simulations with polynomial time step updates. For such simulations, a time interval can be simulated in fewer steps by increasing the step size, which entails approximating higher-order polynomials. While the approach is compatible with any simulation whose time step updates are polynomial, it is demonstrated on an epidemiology simulation model, which also illustrates the inductive bias of MNNs for learning and generalizing higher-order polynomials.
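
    A minimal multiplicative block in this spirit is sketched below: the elementwise product of two linear maps produces degree-2 terms in the input, so stacking k such blocks reaches degree 2^k. This is a generic sketch of the idea, not the paper's exact architecture.

```python
# Multiplicative building block: the product of two affine maps squares
# the attainable polynomial degree; recursion stacks the degree up.
import torch
import torch.nn as nn

class MultiplicativeBlock(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.a = nn.Linear(dim, dim)
        self.b = nn.Linear(dim, dim)

    def forward(self, x):
        # Elementwise product of two affine maps of the input.
        return self.a(x) * self.b(x)

model = nn.Sequential(
    MultiplicativeBlock(8),
    MultiplicativeBlock(8),   # up to degree-4 terms in the input
    nn.Linear(8, 1),
)

x = torch.randn(4, 8)
print(model(x).shape)  # torch.Size([4, 1])
```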

    Emulating dynamic non-linear simulators using Gaussian processes

    The dynamic emulation of non-linear deterministic computer codes whose output is a (possibly multivariate) time series is examined. Such computer models simulate the evolution of some real-world phenomenon over time, for example models of the climate or of the functioning of the human brain. The models we are interested in are highly non-linear and exhibit tipping points, bifurcations, and chaotic behaviour. However, each simulation run can be too time-consuming to perform analyses that require many runs, such as quantifying the variation in model output with respect to changes in the inputs. Therefore, Gaussian process emulators are used to approximate the output of the code. To do this, the flow map of the system under study is emulated over a short time period and then applied iteratively to predict the whole time series. A number of ways are proposed to account for the uncertainty of the emulator inputs after the fixed initial conditions, and for the correlation between them through the time series. The methodology is illustrated with two examples: the highly non-linear dynamical systems described by the Lorenz and Van der Pol equations. In both cases the predictive performance is relatively high, and the measure of uncertainty provided by the method reflects the extent of predictability in each system.
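
    The core mechanism can be sketched as follows: learn the short-time flow map x_t -> x_{t+dt} of the Van der Pol oscillator with a GP, then iterate the emulator to predict a whole trajectory. The uncertainty propagation schemes from the paper are omitted; only the posterior mean is rolled forward, and the design and kernel choices are illustrative.

```python
# Flow-map emulation: fit a GP to one short-time step of the dynamics,
# then iterate the fitted map to emulate a full trajectory.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def van_der_pol_step(x, dt=0.05, mu=1.0):
    # One explicit-Euler step of the Van der Pol equations.
    dx = np.array([x[1], mu * (1 - x[0] ** 2) * x[1] - x[0]])
    return x + dt * dx

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(300, 2))             # design over the state space
Y = np.array([van_der_pol_step(x) for x in X])    # short-time flow map

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(X, Y)

# Iterate the emulated flow map to predict a trajectory.
state = np.array([1.0, 0.0])
trajectory = [state]
for _ in range(100):
    state = gp.predict(state.reshape(1, -1))[0]
    trajectory.append(state)
print("emulated state after 100 steps:", np.array(trajectory)[-1])
```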

    Multiobjective global surrogate modeling, dealing with the 5-percent problem

    When dealing with computationally expensive simulation codes or process measurement data, surrogate modeling methods are firmly established as facilitators for design space exploration, sensitivity analysis, visualization, prototyping, and optimization. Typically, the model parameter (hyperparameter) optimization problem at the heart of global surrogate modeling is formulated in a single-objective way: models are generated according to a single objective, accuracy. However, this requires an engineer to determine a single accuracy target and measure up front, which is hard to do if the behavior of the response is unknown. Likewise, the different outputs of a multi-output system are typically modeled separately by independent models. Here too, a multiobjective approach would benefit the domain expert by giving information about output correlation and by enabling automatic model type selection for each output dynamically. With this paper the authors attempt to increase awareness of the subtleties involved and discuss a number of solutions and applications. In particular, we present a multiobjective framework for global surrogate model generation that helps tackle both problems and is applicable in both the static and the sequential design (adaptive sampling) case.
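
    The multiobjective view can be illustrated with a small sketch: instead of collapsing accuracy into one score, keep the Pareto front of surrogate configurations over two error measures (for instance, one per output of a multi-output system). The candidate scores below are illustrative placeholders, not results from the paper.

```python
# Pareto filtering over surrogate configurations scored on two
# objectives, e.g. the approximation error on each of two outputs.
import numpy as np

# (error on output 1, error on output 2) for several surrogate configs.
errors = np.array([
    [0.10, 0.40],
    [0.20, 0.20],
    [0.40, 0.10],
    [0.30, 0.30],   # dominated by [0.20, 0.20]
    [0.15, 0.35],
])

def pareto_front(points):
    # A point stays if no other point is at least as good on both
    # objectives and strictly better on at least one.
    keep = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

print("non-dominated configs:", pareto_front(errors))
```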

    Statistical metamodeling of dynamic network loading

    Dynamic traffic assignment (DTA) models rely on a network performance module known as dynamic network loading (DNL), which expresses flow propagation, flow conservation, and travel delay at a network level. The DNL defines the so-called network delay operator, which maps a set of path departure rates to a set of path travel times (or costs). It is widely known that the delay operator is not available in closed form and has undesirable properties that severely complicate DTA analysis and computation, such as discontinuity, non-differentiability, non-monotonicity, and computational inefficiency. This paper proposes a fresh take on this important and difficult issue by providing a class of surrogate DNL models based on a statistical learning method known as Kriging. We present a metamodeling framework that systematically approximates DNL models and is flexible in that it allows the modeler to make trade-offs among model granularity, complexity, and accuracy. It is shown that such surrogate DNL models yield highly accurate approximations (with errors below 8%) and superior computational efficiency (9 to 455 times faster than conventional DNL procedures such as those based on the link transmission model). Moreover, these approximate DNL models admit closed-form and analytical delay operators that are Lipschitz continuous and infinitely differentiable, with closed-form Jacobians. We provide in-depth discussions of the implications of these properties for DTA research and model applications.
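
    A hedged sketch of the basic construction: a Kriging (Gaussian process) surrogate fitted from path departure rates to path travel times. The expensive DNL run is replaced below by a toy stand-in; with a smooth kernel such as the RBF, the fitted mean is infinitely differentiable in closed form, which mirrors the analytical properties the paper exploits.

```python
# Kriging surrogate of a network delay operator: departure rates in,
# travel times out. toy_dnl() is a placeholder for an expensive DNL run.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

def toy_dnl(rates):
    # Congested travel time growing nonlinearly with the departure
    # rates on two paths; stands in for a link transmission model run.
    return 10 + rates[:, 0] ** 1.5 + 0.5 * rates[:, 0] * rates[:, 1]

rates = rng.uniform(0, 5, size=(100, 2))       # path departure rates
times = toy_dnl(rates)                         # path travel times

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=2.0),
                                     normalize_y=True).fit(rates, times)

query = np.array([[2.0, 3.0]])
mean, std = surrogate.predict(query, return_std=True)
print(f"predicted travel time: {mean[0]:.2f} +/- {std[0]:.2f}")
```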

    Introduction to metamodeling for reducing computational burden of advanced analyses with health economic models: a structured overview of metamodeling methods in a 6-step application process

    Metamodels can be used to reduce the computational burden associated with computationally demanding analyses of simulation models, though applications within health economics are still scarce. Besides a lack of awareness of their potential within health economics, the absence of guidance on the conceivably complex and time-consuming process of developing and validating metamodels may contribute to their limited uptake. To address these issues, this paper introduces metamodeling to the wider health economic audience and presents a process for applying metamodeling in this context, including suitable methods and directions for their selection and use. General (i.e., non-health-economic-specific) metamodeling literature, clinical prediction modeling literature, and a previously published literature review were used to consolidate the process and to identify candidate metamodeling methods. Methods were considered applicable to health economics if they can account for mixed (i.e., continuous and discrete) input parameters and continuous outcomes. Six steps were identified as relevant for applying metamodeling methods within health economics: 1) identification of a suitable metamodeling technique, 2) simulation of datasets according to a design of experiments, 3) fitting of the metamodel, 4) assessment of metamodel performance, 5) conducting the required analysis using the metamodel, and 6) verification of the results. Different methods are discussed to support each step, including their characteristics, directions for use, key references, and relevant R and Python packages. To address challenges regarding the selection of metamodeling methods, a first guide was developed for using metamodels to reduce the computational burden of analyses of health economic models. This guidance may increase applications of metamodeling in health economics, enabling increased use of state-of-the-art analyses, such as value of information analysis, with computationally burdensome simulation models.
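
    Steps 2-4 of this process are easy to sketch in Python: simulate a dataset from a design of experiments, fit a metamodel, and assess its performance before using it for further analyses. The health economic model below is a toy placeholder, and the Gaussian process is just one of the candidate techniques a step-1 selection might yield.

```python
# Steps 2-4 of the application process on a toy placeholder model.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

def health_economic_model(x):
    # Placeholder for a computationally demanding simulation model.
    return 1000 * x[:, 0] + 50 * np.sin(5 * x[:, 1]) + x[:, 0] * x[:, 1]

# Step 2: design of experiments (Latin hypercube over two inputs).
sampler = qmc.LatinHypercube(d=2, seed=4)
X = qmc.scale(sampler.random(n=200), l_bounds=[0, 0], u_bounds=[1, 1])
y = health_economic_model(X)

# Step 3: fit the metamodel (technique chosen in step 1).
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=4)
meta = GaussianProcessRegressor(normalize_y=True).fit(X_tr, y_tr)

# Step 4: assess metamodel performance on held-out runs.
print(f"holdout R^2: {r2_score(y_te, meta.predict(X_te)):.3f}")
```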

    Fuzzy local linear approximation-based sequential design

    When approximating complex high-fidelity black-box simulators with surrogate models, the experimental design is often created sequentially. LOLA-Voronoi, a powerful state-of-the-art method for sequential design, combines an exploitation and an exploration algorithm and adapts the sampling distribution to place extra samples in non-linear regions. The LOLA algorithm estimates gradients to identify interesting regions, but it has poor computational complexity, which results in long computation times when simulators are high-dimensional. In this paper, a new gradient estimation approach for the LOLA algorithm is proposed, based on fuzzy logic. Experiments show that the new method is considerably faster and results in experimental designs of comparable quality.
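
    The gradient-based exploitation idea at the heart of LOLA can be sketched as follows: fit a local linear model around each sample by least squares over its neighbours, and use the fit residual as a nonlinearity score to decide where new samples should go. The fuzzy-logic speedup proposed in the paper is not reproduced; this is the plain baseline it accelerates.

```python
# Local linear gradient estimation for exploitation-driven sampling:
# large local-fit residuals flag non-linear regions worth refining.
import numpy as np

rng = np.random.default_rng(5)
f = lambda X: np.sin(4 * X[:, 0]) * X[:, 1]          # toy black-box simulator
X = rng.uniform(0, 1, size=(60, 2))
y = f(X)

def nonlinearity_score(i, k=6):
    # Fit y ~ y_i + g.(x - x_i) on the k nearest neighbours of sample i.
    d = np.linalg.norm(X - X[i], axis=1)
    nbrs = np.argsort(d)[1:k + 1]
    A = X[nbrs] - X[i]
    b = y[nbrs] - y[i]
    g, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.sum((A @ g - b) ** 2)                  # local-fit residual

scores = np.array([nonlinearity_score(i) for i in range(len(X))])
print("most non-linear region around sample:", X[np.argmax(scores)])
```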

    Estimating performance indexes of a baggage handling system using metamodels

    In this study, we develop deterministic metamodels to quickly and precisely predict the future performance of a technically complex system: a stochastic, discrete-event simulation model of a large baggage handling system. The highly detailed simulation model is used to conduct experiments and log data, which are then used to train artificial neural network metamodels. The results show that the metamodels predict various performance measures related to the travel time of bags within the system well. In contrast to the simulation models, which are computationally expensive and require considerable expertise to develop, run, and maintain, the artificial neural network metamodels can serve as real-time decision-aiding tools that are fast, precise, simple to use, and reliable.
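
    The training setup can be sketched in a few lines: a neural network regressor fitted on logged simulation data so that it can predict bag travel times in real time. The input features and the data-generating function below are hypothetical stand-ins for the logged baggage handling experiments.

```python
# ANN metamodel trained on (hypothetical) logged simulation data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n = 2000
# Hypothetical logged features: arrival rate, conveyor speed, screening level.
X = rng.uniform([50, 0.5, 1], [400, 2.5, 3], size=(n, 3))
travel_time = 120 + 0.4 * X[:, 0] / X[:, 1] + 30 * X[:, 2] + rng.normal(0, 5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, travel_time, random_state=6)
ann = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=6),
).fit(X_tr, y_tr)
print(f"test score (R^2): {ann.score(X_te, y_te):.3f}")
```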