
    Uncertainty Analyses in the Finite-Difference Time-Domain Method

    Providing estimates of the uncertainty in results obtained by computational electromagnetic (CEM) simulations is essential when determining the acceptability of the results. The Monte Carlo method (MCM) has previously been used to quantify the uncertainty in CEM simulations. Other, computationally more efficient methods have been investigated more recently, such as the polynomial chaos method (PCM) and the method of moments (MoM). This paper introduces a novel implementation of the PCM and the MoM into the finite-difference time-domain method. The PCM and the MoM are found to be computationally more efficient than the MCM, but can provide poorer estimates of the uncertainty in resonant electromagnetic compatibility data.
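    As a point of reference for the comparison above, a minimal Monte Carlo sketch in Python (not the paper's FDTD code; run_simulation is a hypothetical stand-in for one solver run mapping an uncertain input to an output quantity):

        # Minimal Monte Carlo uncertainty sketch (not the paper's FDTD code).
        # `run_simulation` is a hypothetical stand-in for one CEM solver run
        # that maps an uncertain input (e.g. material permittivity) to an output.
        import numpy as np

        rng = np.random.default_rng(seed=0)

        def run_simulation(epsilon_r):
            # Placeholder for a full FDTD solve; here a toy resonant response.
            return 1.0 / np.sqrt(epsilon_r)

        n_samples = 1000
        # Sample the uncertain input, e.g. relative permittivity with small spread.
        eps_samples = rng.normal(loc=4.0, scale=0.2, size=n_samples)
        outputs = np.array([run_simulation(e) for e in eps_samples])

        # MCM uncertainty estimate: mean and standard deviation of the output.
        print(f"mean = {outputs.mean():.4f}, std = {outputs.std(ddof=1):.4f}")

    The PCM and MoM trade this brute-force sampling for far fewer model evaluations, which is the efficiency gain the abstract refers to.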

    Quantifying statistical uncertainty in the attribution of human influence on severe weather

    Event attribution in the context of climate change seeks to understand the role of anthropogenic greenhouse gas emissions in extreme weather events, either specific events or classes of events. A common approach to event attribution uses climate model output under factual (real-world) and counterfactual (world that might have been without anthropogenic greenhouse gas emissions) scenarios to estimate the probabilities of the event of interest under the two scenarios. Event attribution is then quantified by the ratio of the two probabilities. While this approach has been applied many times in the last 15 years, the statistical techniques used to estimate the risk ratio based on climate model ensembles have not drawn on the full set of methods available in the statistical literature and have in some cases used and interpreted the bootstrap method in non-standard ways. We present a precise frequentist statistical framework for quantifying the effect of sampling uncertainty on estimation of the risk ratio, propose the use of statistical methods that are new to event attribution, and evaluate a variety of methods using statistical simulations. We conclude that existing statistical methods not yet in use for event attribution have several advantages over the widely used bootstrap, including better statistical performance in repeated samples and robustness to small estimated probabilities. Software for using the methods is available through the climextRemes package available for R or Python. While we focus on frequentist statistical methods, Bayesian methods are likely to be particularly useful when considering sources of uncertainty beyond sampling uncertainty.
    Comment: 41 pages, 11 figures, 1 table
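    The risk ratio itself is simple to compute; a hedged Python sketch with invented ensemble counts, using a classical Wald-type interval on the log scale rather than climextRemes's own routines:

        # Hedged sketch of the basic risk-ratio calculation the abstract
        # describes; climextRemes offers more refined interval methods.
        # The ensemble counts below are invented for illustration.
        import numpy as np

        # Exceedances of an event threshold in each scenario's ensemble.
        x1, n1 = 40, 400   # factual: 40 of 400 members exceed the threshold
        x0, n0 = 10, 400   # counterfactual: 10 of 400 exceed it

        p1, p0 = x1 / n1, x0 / n0
        rr = p1 / p0  # risk ratio

        # Standard Wald-type 95% interval on the log scale (one of the
        # classical alternatives evaluated against bootstrap intervals).
        se_log = np.sqrt((1 - p1) / (n1 * p1) + (1 - p0) / (n0 * p0))
        lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log)
        print(f"RR = {rr:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")

    Note how this interval degrades when x0 is very small, which is exactly the small-probability regime where the paper finds method choice matters.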

    The role of learning on industrial simulation design and analysis

    The capability of modeling real-world system operations has turned simulation into an indispensable problem-solving methodology for business system design and analysis. Today, simulation supports decisions ranging from sourcing to operations to finance, starting at the strategic level and proceeding towards tactical and operational levels of decision-making. In such a dynamic setting, the practice of simulation goes beyond being a static problem-solving exercise and requires integration with learning. This article discusses the role of learning in simulation design and analysis motivated by the needs of industrial problems, and describes how selected tools of statistical learning can be utilized for this purpose.

    Quantifying dependencies for sensitivity analysis with multivariate input sample data

    We present a novel method for quantifying dependencies in multivariate datasets, based on estimating the Rényi entropy by minimum spanning trees (MSTs). The length of the MSTs can be used to order pairs of variables from strongly to weakly dependent, making it a useful tool for sensitivity analysis with dependent input variables. It is well-suited for cases where the input distribution is unknown and only a sample of the inputs is available. We introduce an estimator to quantify dependency based on the MST length, and investigate its properties with several numerical examples. To reduce the computational cost of constructing the exact MST for large datasets, we explore methods to compute approximations to the exact MST, and find the multilevel approach introduced recently by Zhong et al. (2015) to be the most accurate. We apply our proposed method to an artificial test case based on the Ishigami function, as well as to a real-world test case involving sediment transport in the North Sea. The results are consistent with prior knowledge and heuristic understanding, as well as with variance-based analysis using Sobol indices in the case where these indices can be computed.
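    A rough illustration of the MST-length idea in Python, assuming a plain Euclidean MST over paired samples; the paper's actual estimator and its normalization may differ:

        # Sketch of the MST-length idea, assuming a generic Euclidean MST;
        # the paper's exact estimator and normalization may differ.
        import numpy as np
        from scipy.sparse.csgraph import minimum_spanning_tree
        from scipy.spatial.distance import pdist, squareform

        def mst_length(x, y):
            # Build the complete graph on the 2-D points (x_i, y_i) and
            # return the total edge length of its minimum spanning tree.
            pts = np.column_stack([x, y])
            dist = squareform(pdist(pts))
            return minimum_spanning_tree(dist).sum()

        rng = np.random.default_rng(seed=0)
        x = rng.normal(size=500)
        y_dep = x + 0.1 * rng.normal(size=500)   # strongly dependent pair
        y_indep = rng.normal(size=500)           # independent pair

        # A shorter MST indicates a lower-entropy, more dependent point cloud.
        print("dependent:  ", mst_length(x, y_dep))
        print("independent:", mst_length(x, y_indep))

    Ranking all input pairs by this length gives the strong-to-weak dependency ordering described in the abstract.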

    Quantifying risk and uncertainty in macroeconomic forecasts

    This paper discusses methods to quantify risk and uncertainty in macroeconomic forecasts. Both parametric and non-parametric procedures are developed. The former are based on a class of asymmetrically weighted normal distributions, whereas the latter employ asymmetric bootstrap simulations. The two procedures are closely related. The bootstrap is applied to the Bundesbank's structural macroeconometric model for Germany. Forecast intervals that integrate judgement on risk and uncertainty are obtained.
    Keywords: macroeconomic forecasts, stochastic forecast intervals, risk, uncertainty, asymmetrically weighted normal distribution, asymmetric bootstrap
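    One way to picture the asymmetric bootstrap side, as a hedged Python sketch with invented residuals rather than output from the Bundesbank model:

        # Illustrative sketch of an asymmetric bootstrap forecast interval:
        # resampled forecast errors yield percentile bounds that need not be
        # symmetric about the point forecast. The residuals are invented;
        # the paper applies this within a structural macroeconometric model.
        import numpy as np

        rng = np.random.default_rng(seed=0)
        point_forecast = 2.0                          # e.g. GDP growth, percent
        past_errors = rng.gamma(2.0, 0.5, 200) - 1.0  # skewed historical errors

        n_boot = 10_000
        draws = rng.choice(past_errors, size=n_boot, replace=True)
        paths = point_forecast + draws

        lo, hi = np.percentile(paths, [5, 95])        # 90% forecast interval
        print(f"90% interval: ({lo:.2f}, {hi:.2f}) around {point_forecast:.2f}")

    With skewed errors the resulting interval is wider on one side of the point forecast, which is how judgement about asymmetric risks enters the reported intervals.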