    Bayesian optimisation approach to quantify the effect of input parameter uncertainty on predictions of numerical physics simulations

    Understanding how input parameter uncertainty in the numerical simulation of physical models leads to uncertainty in the simulation output is a challenging task. Common methods for quantifying output uncertainty, such as performing a grid or random search over the model input space, are computationally intractable for a large number of input parameters, i.e. for a high-dimensional input space. It is therefore generally unclear whether a numerical simulation can reproduce a particular outcome (e.g. a set of experimental results) with a plausible set of model input parameters. Here, we present a method for efficiently searching the input space using Bayesian optimisation to minimise the difference between the simulation output and a set of experimental results. Our method allows explicit evaluation of the probability that the simulation can reproduce the measured experimental results in the region of input space defined by the uncertainty in each input parameter. We apply this method to the simulation of charge-carrier dynamics in the perovskite semiconductor methylammonium lead iodide (MAPbI3), which has attracted attention as a light-harvesting material in solar cells. From our analysis we conclude that the formation of large polarons, quasiparticles created by the coupling of excess electrons or holes with ionic vibrations, cannot explain the experimentally observed temperature dependence of the electron mobility.
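
    A minimal sketch of the search loop described above, assuming scikit-optimize's gp_minimize as the Bayesian optimiser. The toy simulator and "experimental" mobility data are illustrative stand-ins, not the paper's charge-carrier dynamics model or its measured data.

    import numpy as np
    from skopt import gp_minimize  # Gaussian-process Bayesian optimisation

    T = np.array([100.0, 150.0, 200.0, 250.0, 300.0])    # temperatures (K)
    mobility_exp = 160.0 * (T / 300.0) ** -1.5           # toy "measured" mobility

    def simulate_mobility(coupling, prefactor):
        """Stand-in for an expensive physics simulation."""
        return prefactor * (T / 300.0) ** -coupling

    def mismatch(params):
        """Objective: squared difference between simulation and experiment."""
        coupling, prefactor = params
        return float(np.sum((simulate_mobility(coupling, prefactor) - mobility_exp) ** 2))

    # Search only within the plausible range of each input parameter.
    result = gp_minimize(mismatch,
                         dimensions=[(0.5, 3.0), (50.0, 300.0)],
                         n_calls=30, random_state=0)
    print("best-fit parameters:", result.x, "residual:", result.fun)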

    A global sensitivity analysis of DNDC model using a Bayesian based approach

    Non-peer reviewed. This study aimed to demonstrate the application of a Bayesian-based global sensitivity analysis (GSA) approach to the denitrification and decomposition (DNDC) model, using the Gaussian Emulation Machine for Sensitivity Analysis (GEM-SA) tool, in order to provide information on the relative effect of the parameters on major model outputs. For the GSA study, twenty-eight input parameters were selected and an eighty-six-year DNDC simulation was run on the basis of the Three Hills spring wheat system. Three multi-year model outputs of interest were chosen and their sensitivity to the inputs was tested: yield, annual change in soil organic carbon (dSOC), and N2O flux. We found that the effect of the input parameters on these three DNDC outputs varied not only with the simulated year but also with the specific output variable. Moreover, the influence of the inputs on the variance of the outputs varied with the form of the sensitivity indices, i.e. main effects (the individual contribution of each input to the variance of a model output) or total effects (which also account for all interactions among the inputs). Consequently, a multi-year SA is necessary for the nonlinear DNDC model, and further validation and calibration of a given output variable should focus on the parameters to which that variable is most sensitive.
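
    A hedged illustration of the two index forms compared above, main effects versus total effects, using SALib's Sobol estimator on a toy stand-in for one DNDC output. The study itself used the GEM-SA emulator tool, and the three input names below are hypothetical.

    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["SOC_init", "clay_frac", "N_input"],    # hypothetical inputs
        "bounds": [[0.01, 0.05], [0.1, 0.5], [50, 250]],
    }

    def toy_dndc_output(x):
        soc, clay, n = x
        return 1000 * soc + 2.0 * n + 500 * soc * clay    # interaction term

    X = saltelli.sample(problem, 1024)                    # N*(2D+2) model runs
    Y = np.apply_along_axis(toy_dndc_output, 1, X)
    Si = sobol.analyze(problem, Y)
    print("main effects  S1:", Si["S1"])                  # individual contributions
    print("total effects ST:", Si["ST"])                  # interactions included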

    Semiconductor manufacturing simulation design and analysis with limited data

    This paper discusses simulation design and analysis for Silicon Carbide (SiC) manufacturing operations management at the New York Power Electronics Manufacturing Consortium (PEMC) facility. Prior work has addressed the development of a manufacturing system simulation as decision support for solving the strategic equipment portfolio selection problem in the SiC fab design [1]. As we move into the phase of collecting data from the equipment purchased for the PEMC facility, we discuss how to redesign our manufacturing simulations and analyze their outputs to overcome the challenges that naturally arise in the presence of limited fab data. We conclude with insights on how an approach that reflects learning from data can enable our discrete-event stochastic simulation to accurately estimate the performance measures for SiC manufacturing at the PEMC facility.
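
    One way to read "learning from limited data", sketched under assumptions the paper does not spell out: bootstrap the small sample of observed processing times so that input-model uncertainty propagates into the simulated performance measure. The data and the single-tool queue are hypothetical, not the PEMC fab model.

    import numpy as np

    rng = np.random.default_rng(0)
    observed = np.array([4.2, 5.1, 3.8, 6.0, 4.7, 5.5])  # limited data: tool hours

    def mean_cycle_time(proc_times, mean_interarrival=6.5, n_lots=2000):
        """Single-tool queue simulated via the Lindley recursion."""
        inter = rng.exponential(mean_interarrival, n_lots)
        service = rng.choice(proc_times, n_lots)          # empirical service dist.
        wait_prev = serv_prev = 0.0
        cycle = np.empty(n_lots)
        for i, (a, s) in enumerate(zip(inter, service)):
            wait = max(0.0, wait_prev + serv_prev - a)    # queueing delay
            cycle[i] = wait + s
            wait_prev, serv_prev = wait, s
        return cycle.mean()

    # Bootstrap the input model so limited-data uncertainty reaches the output.
    estimates = [mean_cycle_time(rng.choice(observed, observed.size))
                 for _ in range(200)]
    lo, hi = np.percentile(estimates, [2.5, 97.5])
    print(f"mean cycle time ~ {np.mean(estimates):.2f} h (95% band: {lo:.2f}-{hi:.2f} h)")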

    Bayesian Updating, Model Class Selection and Robust Stochastic Predictions of Structural Response

    A fundamental issue when predicting structural response by using mathematical models is how to treat both modeling and excitation uncertainty. A general framework for this is presented which uses probability as a multi-valued conditional logic for quantitative plausible reasoning in the presence of uncertainty due to incomplete information. The fundamental probability models that represent the structure’s uncertain behavior are specified by the choice of a stochastic system model class: a set of input-output probability models for the structure and a prior probability distribution over this set that quantifies the relative plausibility of each model. A model class can be constructed from a parameterized deterministic structural model by stochastic embedding utilizing Jaynes’ Principle of Maximum Information Entropy. Robust predictive analyses use the entire model class, with the probabilistic predictions of each model being weighted by its prior probability or, if structural response data are available, by its posterior probability from Bayes’ Theorem for the model class. Additional robustness to modeling uncertainty comes from combining the robust predictions of each model class in a set of competing candidates, weighted by the prior or posterior probability of the model class, the latter being computed from Bayes’ Theorem. This higher-level application of Bayes’ Theorem automatically applies a quantitative Ockham’s razor that penalizes the data-fit of more complex model classes that extract more information from the data. Robust predictive analyses involve integrals over high-dimensional spaces that usually must be evaluated numerically. Published applications have used Laplace’s method of asymptotic approximation or Markov Chain Monte Carlo algorithms.
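
    A small numeric sketch of the model-class-level Bayes' Theorem described above: P(M_j | D) is proportional to p(D | M_j) P(M_j), and the robust prediction is the posterior-weighted mixture over model classes. The evidence and prediction values are invented for illustration; the lower evidence of M2 stands in for the Ockham penalty on a more complex class.

    import numpy as np

    log_evidence = np.array([-102.3, -104.9])        # log p(D | M_j), hypothetical
    prior = np.array([0.5, 0.5])                     # P(M_j) before seeing data

    log_post = log_evidence + np.log(prior)          # Bayes' Theorem (unnormalised)
    posterior = np.exp(log_post - log_post.max())    # stabilised exponentiation
    posterior /= posterior.sum()

    pred_per_class = np.array([3.1, 3.6])            # each class's mean prediction
    robust_pred = float(posterior @ pred_per_class)  # posterior-weighted mixture
    print("P(M_j | D):", posterior, "robust prediction:", robust_pred)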

    The role of learning on industrial simulation design and analysis

    The capability of modeling real-world system operations has turned simulation into an indispensable problem-solving methodology for business system design and analysis. Today, simulation supports decisions ranging from sourcing to operations to finance, starting at the strategic level and proceeding towards the tactical and operational levels of decision-making. In such a dynamic setting, the practice of simulation goes beyond being a static problem-solving exercise and requires integration with learning. This article discusses the role of learning in simulation design and analysis, motivated by the needs of industrial problems, and describes how selected tools of statistical learning can be utilized for this purpose.

    Extending the Global Sensitivity Analysis of the SimSphere model in the Context of its Future Exploitation by the Scientific Community

    In today’s changing climate, the development of robust, accurate and globally applicable models is imperative for a wider understanding of Earth’s terrestrial biosphere. Moreover, an understanding of the representation, sensitivity and coherence of such models is vital for the operationalisation of any physically based model. A Global Sensitivity Analysis (GSA) was conducted on the SimSphere land biosphere model, in which a meta-modelling method adopting Bayesian theory was implemented. Initially, the effects of assuming uniform probability distribution functions (PDFs) for the model inputs were examined when analysing the sensitivity of key quantities simulated by SimSphere at different output times. Topographic model input parameters (e.g., slope, aspect, and elevation) were derived within a Geographic Information System (GIS) before implementation within the model. The effect of the time of the simulation on the sensitivity of the previously examined outputs was also analysed. Results showed that the simulated outputs were significantly influenced by changes in the topographic input parameters, fractional vegetation cover, vegetation height and surface moisture availability, in agreement with previous studies. The time of the model output simulation had a significant influence on the absolute values of the output variance decomposition, but it did not appear to change the relative importance of each input parameter. Sensitivity Analysis (SA) results for the newly modelled outputs allowed identification of the most responsive model inputs and interactions. Our study presents an important step forward in SimSphere verification, given the increasing interest in its use both as an independent modelling tool and as an educational tool. Furthermore, this study is very timely given ongoing efforts towards the development of operational products based on the synergy of SimSphere with Earth Observation (EO) data. In this context, the results also provide additional support for the potential applicability of assimilating spatial analysis data derived from GIS and EO data into an accurate modelling framework.
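
    A hedged sketch of the Bayesian meta-modelling idea behind this kind of GSA: fit a Gaussian-process emulator to a small number of model runs, then perform the variance decomposition on the cheap emulator rather than on the model itself. The three inputs and the toy response are hypothetical, not SimSphere's.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(1)

    def toy_model(x):                                  # stand-in for SimSphere
        slope, fvc, moisture = x.T
        return 2.0 * fvc + moisture ** 2 + 0.3 * slope * fvc

    X_train = rng.uniform(0, 1, (60, 3))               # 60 "expensive" model runs
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
    gp.fit(X_train, toy_model(X_train))

    # First-order (main-effect) indices estimated entirely on the emulator.
    base = rng.uniform(0, 1, (5000, 3))
    var_total = gp.predict(base).var()
    for i, name in enumerate(["slope", "fvc", "moisture"]):
        cond_means = []
        for xi in np.linspace(0, 1, 40):               # sweep one input
            pts = base.copy()
            pts[:, i] = xi                             # freeze input i
            cond_means.append(gp.predict(pts).mean())  # estimate E[f | x_i]
        print(f"S_{name} ~ {np.var(cond_means) / var_total:.2f}")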

    Hierarchical probabilistic macromodeling for QCA circuits

    With the goal of building a hierarchical design methodology for quantum-dot cellular automata (QCA) circuits, we put forward a novel, theoretically sound method for abstracting the behavior of circuit components in QCA circuits, such as majority logic gates, lines, wire-taps, cross-overs, inverters, and corners, using macromodels. Recognizing that the basic operation of QCA is probabilistic in nature, we propose probabilistic macromodels for standard QCA circuit elements based on a conditional probability characterization, defined over the output states given the input states. Any circuit model is constructed by chaining together the individual logic element macromodels, forming a Bayesian network that defines a joint probability distribution over the whole circuit. We demonstrate three uses for these macromodel-based circuits. First, the probabilistic macromodels allow us to model the logical function of QCA circuits at an abstract level - the "circuit" level - above the current practice of the layout level, in a time- and space-efficient manner. We show that the circuit-level model is orders of magnitude faster and requires less space than layout-level models, making the design and testing of large QCA circuits efficient and relegating the costly full quantum-mechanical simulation of the temporal dynamics to a later stage in the design process. Second, the probabilistic macromodels abstract crucial device-level characteristics such as polarization and low-energy error state configurations at the circuit level. We demonstrate how this macromodel-based circuit-level representation can be used to infer the ground state probabilities, i.e., cell polarizations, a crucial QCA parameter. This allows us to study the thermal behavior of QCA circuits at a higher level of abstraction. Third, we demonstrate the use of these macromodels for error analysis. We show that low-energy state configurations of the macromodel circuit match those of the layout level, thus allowing us to isolate weak points in a circuit's design at the circuit level itself.
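
    A minimal sketch of the macromodel chaining described above, in plain numpy: each QCA element is reduced to a conditional probability table over output states given input states, and a circuit's output distribution follows by marginalising the intermediate nodes. The 2% error probability is an arbitrary illustrative number, not a device-derived value.

    import numpy as np
    from itertools import product

    ERR = 0.02                                         # hypothetical flip probability

    def majority_cpt():
        """P(out | a, b, c) for a majority gate with thermally induced errors."""
        cpt = np.zeros((2, 2, 2, 2))                   # indexed [a, b, c, out]
        for a, b, c in product((0, 1), repeat=3):
            m = int(a + b + c >= 2)                    # ideal majority vote
            cpt[a, b, c, m] = 1 - ERR
            cpt[a, b, c, 1 - m] = ERR
        return cpt

    cpt = majority_cpt()

    def chained_output_dist(a, b, c, d, e):
        """out2 = MAJ(MAJ(a, b, c), d, e): marginalise the intermediate node."""
        p_out2 = np.zeros(2)
        for m1 in (0, 1):                              # intermediate gate output
            for out in (0, 1):
                p_out2[out] += cpt[a, b, c, m1] * cpt[m1, d, e, out]
        return p_out2

    print("P(out2 | inputs = 1,0,1,1,0):", chained_output_dist(1, 0, 1, 1, 0))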