15,011 research outputs found

    Bootstrap prediction mean squared errors of unobserved states based on the Kalman filter with estimated parameters

    Prediction intervals in State Space models can be obtained by assuming Gaussian innovations and using the prediction equations of the Kalman filter, where the true parameters are substituted by consistent estimates. This approach has two limitations. First, it does not incorporate the uncertainty due to parameter estimation. Second, the Gaussianity assumption of future innovations may be inaccurate. To overcome these drawbacks, Wall and Stoffer (2002) propose to obtain prediction intervals by using a bootstrap procedure that requires the backward representation of the model. Obtaining this representation increases the complexity of the procedure and limits its implementation to models for which it exists. The bootstrap procedure proposed by Wall and Stoffer (2002) is further complicated by the fact that the intervals are obtained for the prediction errors instead of for the observations. In this paper, we propose a bootstrap procedure for constructing prediction intervals in State Space models that does not need the backward representation of the model and is based on obtaining the intervals directly for the observations. Therefore, its application is much simpler, without losing the good behavior of bootstrap prediction intervals. We study its finite sample properties and compare them with those of the standard procedure and the Wall and Stoffer (2002) procedure for the Local Level Model. Finally, we illustrate the results by applying the new procedure to obtain prediction intervals for future values of a real time series.
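
    A rough sketch of the general idea for the Local Level Model is given below: re-estimate the parameters on bootstrap replicate series so that the intervals reflect parameter uncertainty, and build percentile intervals directly for future observations. This is not the paper's exact algorithm (a parametric bootstrap with Gaussian shocks is used here for brevity, whereas the paper resamples innovations), and the statsmodels parameter ordering is an assumption.

        import numpy as np
        from statsmodels.tsa.statespace.structural import UnobservedComponents

        def ll_bootstrap_intervals(y, h=12, n_boot=200, alpha=0.1, seed=0):
            # Percentile bootstrap intervals for h-step forecasts of a local level model.
            # Sketch only: parametric bootstrap, not the paper's innovation-resampling scheme.
            rng = np.random.default_rng(seed)
            y = np.asarray(y, dtype=float)
            n = len(y)
            fit = UnobservedComponents(y, level="local level").fit(disp=False)
            s_eps = np.sqrt(fit.params[0])            # sigma2.irregular (assumed ordering)
            s_eta = np.sqrt(fit.params[1])            # sigma2.level (assumed ordering)
            paths = np.empty((n_boot, h))
            for b in range(n_boot):
                # 1) simulate a bootstrap series from the fitted model
                mu_star = y[0] + np.cumsum(rng.normal(0.0, s_eta, n))
                y_star = mu_star + rng.normal(0.0, s_eps, n)
                # 2) re-estimate the parameters on the bootstrap series
                p_star = UnobservedComponents(y_star, level="local level").fit(disp=False).params
                # 3) filter the ORIGINAL series with the re-estimated parameters,
                #    then draw a future path to obtain one predictive replicate
                refit = UnobservedComponents(y, level="local level").smooth(p_star)
                level_T = refit.filtered_state[0, -1]
                paths[b] = (level_T
                            + np.cumsum(rng.normal(0.0, np.sqrt(p_star[1]), h))
                            + rng.normal(0.0, np.sqrt(p_star[0]), h))
            lo = np.percentile(paths, 100 * alpha / 2, axis=0)
            hi = np.percentile(paths, 100 * (1 - alpha / 2), axis=0)
            return lo, hi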

    Statistical Testing of Optimality Conditions in Multiresponse Simulation-Based Optimization (Replaced by Discussion Paper 2007-45)

    This paper derives a novel procedure for testing the Karush-Kuhn-Tucker (KKT) first-order optimality conditions in models with multiple random responses. Such models arise in simulation-based optimization with multivariate outputs. This paper focuses on expensive simulations, which have small sample sizes. The paper estimates the gradients (in the KKT conditions) through low-order polynomials, fitted locally. These polynomials are estimated using Ordinary Least Squares (OLS), which also enables estimation of the variability of the estimated gradients. Using these OLS results, the paper applies the bootstrap (resampling) method to test the KKT conditions. Furthermore, it applies the classic Student t test to check whether the simulation outputs are feasible, and whether any constraints are binding. The paper applies the new procedure to both a synthetic example and an inventory simulation; the empirical results are encouraging.
    Keywords: stopping rule; metaheuristics; RSM; design of experiments
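
    A minimal sketch of the kind of machinery involved is shown below: a local first-order polynomial is fitted by OLS to noisy objective outputs, its gradient is residual-bootstrapped, and the KKT stationarity condition is summarized by the distance from minus the gradient to the cone spanned by the binding-constraint gradients (computed with non-negative least squares). The decision rule and the treatment of feasibility and binding constraints in the paper are more elaborate; here g_grads (one row per binding constraint) and the data are hypothetical inputs.

        import numpy as np
        from scipy.optimize import nnls

        def local_fit(X, y):
            # OLS fit of a local first-order polynomial: intercept plus linear terms.
            A = np.column_stack([np.ones(len(X)), X])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            return beta, A, y - A @ beta

        def kkt_stationarity_bootstrap(X, f_out, g_grads, n_boot=1000, seed=1):
            # Bootstrap the distance of -grad(f) to the cone {sum_j lambda_j * grad(g_j), lambda_j >= 0}.
            # Under KKT stationarity this distance is zero up to sampling noise.
            rng = np.random.default_rng(seed)
            beta, A, resid = local_fit(X, f_out)
            fitted = A @ beta
            _, stat = nnls(g_grads.T, -beta[1:])      # observed distance statistic
            boot = np.empty(n_boot)
            for b in range(n_boot):
                y_star = fitted + rng.choice(resid, size=len(resid), replace=True)
                beta_star, *_ = np.linalg.lstsq(A, y_star, rcond=None)
                _, boot[b] = nnls(g_grads.T, -beta_star[1:])
            return stat, boot                          # compare stat against the bootstrap spread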

    Reexamining the linkages between inflation and output growth: A bivariate ARFIMA-FIGARCH approach

    In this paper, motivated by recent theoretical developments suggesting that inflation can exhibit long memory properties due to the output growth process, we propose a new class of bivariate processes to simultaneously investigate the dual long memory properties in the mean and the conditional variance of inflation and output growth series. We estimate the model using monthly UK data and document the presence of dual long memory properties in both series. Then, using the conditional variances generated from our bivariate model, we employ Granger causality tests to scrutinize the linkages between the means and the volatilities of inflation and output growth.
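
    Once the two conditional variance series are in hand, the causality step can be carried out with a standard Granger test; the sketch below uses statsmodels on placeholder volatility series (the simulated data are purely illustrative, not the UK series or the ARFIMA-FIGARCH variances used in the paper).

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        n = 300
        h_inf = np.ones(n)    # placeholder conditional variance of inflation
        h_out = np.ones(n)    # placeholder conditional variance of output growth
        for t in range(1, n):
            h_out[t] = 0.1 + 0.8 * h_out[t - 1] + 0.05 * rng.standard_normal() ** 2
            h_inf[t] = 0.1 + 0.5 * h_inf[t - 1] + 0.3 * h_out[t - 1] + 0.05 * rng.standard_normal() ** 2

        # Does output-growth volatility Granger-cause inflation volatility?
        # The test asks whether the SECOND column helps predict the FIRST.
        res = grangercausalitytests(np.column_stack([h_inf, h_out]), maxlag=6)
        # res[lag][0]['ssr_ftest'] holds (F statistic, p-value, df_denom, df_num) for each lag.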

    Overcoming the data crisis in biodiversity conservation

    How can we track population trends when monitoring data are sparse? Population declines can go undetected, despite ongoing threats. For example, only one of every 200 harvested species is monitored. This gap leads to uncertainty about the seriousness of declines and hampers effective conservation. Collecting more data is important, but we can also make better use of existing information. Prior knowledge of physiology, life history, and community ecology can be used to inform population models. Additionally, in multispecies models, information can be shared among taxa based on phylogenetic, spatial, or temporal proximity. By exploiting generalities across species that share evolutionary or ecological characteristics within Bayesian hierarchical models, we can fill crucial gaps in the assessment of species’ status with unparalleled quantitative rigor.
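
    As a toy illustration of borrowing strength across related species, the sketch below applies normal-normal partial pooling (an empirical-Bayes shortcut rather than the full Bayesian hierarchical models discussed above): noisy trend estimates for data-poor species are shrunk toward the group-level mean in proportion to their uncertainty. All numbers are simulated and purely illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        # Simulated decline-rate estimates for 20 related species; data-poor species have large SEs.
        theta = rng.normal(-0.03, 0.02, 20)          # true (unknown) per-species trends
        se = rng.uniform(0.005, 0.05, 20)            # survey standard errors
        y = rng.normal(theta, se)                    # observed trend estimates

        # Partial pooling: shrink each estimate toward the precision-weighted group mean,
        # with more shrinkage for the most uncertain (data-poor) species.
        mu_hat = np.average(y, weights=1.0 / se**2)
        tau2_hat = max(np.var(y) - np.mean(se**2), 1e-6)   # crude between-species variance
        shrunk = (tau2_hat * y + se**2 * mu_hat) / (tau2_hat + se**2)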

    Enabling High-Dimensional Hierarchical Uncertainty Quantification by ANOVA and Tensor-Train Decomposition

    Hierarchical uncertainty quantification can reduce the computational cost of stochastic circuit simulation by employing spectral methods at different levels. This paper presents an efficient framework to hierarchically simulate some challenging stochastic circuits/systems that include high-dimensional subsystems. Due to the high parameter dimensionality, it is challenging both to extract surrogate models at the low level of the design hierarchy and to handle them in the high-level simulation. In this paper, we develop an efficient ANOVA-based stochastic circuit/MEMS simulator to efficiently extract the surrogate models at the low level. In order to avoid the curse of dimensionality, we employ tensor-train decomposition at the high level to construct the basis functions and Gauss quadrature points. As a demonstration, we verify our algorithm on a stochastic oscillator with four MEMS capacitors and 184 random parameters. This challenging example is simulated efficiently by our simulator at a cost of only 10 minutes in MATLAB on a regular personal computer.
    Comment: 14 pages (IEEE double column), 11 figures; accepted by IEEE Trans. CAD of Integrated Circuits and Systems.
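
    To give a concrete picture of the tensor-train idea, the sketch below implements a plain TT-SVD in NumPy: a full tensor is factored into a chain of 3-way cores by sequential truncated SVDs. This is only the generic decomposition, not the authors' hierarchical framework, and it assumes the full tensor is small enough to hold in memory.

        import numpy as np

        def tt_svd(tensor, tol=1e-10):
            # Sequential-SVD tensor-train decomposition; returns a list of 3-way cores.
            dims = tensor.shape
            cores, rank = [], 1
            unfolding = tensor.reshape(rank * dims[0], -1)
            for k in range(len(dims) - 1):
                U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
                new_rank = max(1, int(np.sum(s > tol)))      # truncate small singular values
                U, s, Vt = U[:, :new_rank], s[:new_rank], Vt[:new_rank, :]
                cores.append(U.reshape(rank, dims[k], new_rank))
                unfolding = (s[:, None] * Vt).reshape(new_rank * dims[k + 1], -1)
                rank = new_rank
            cores.append(unfolding.reshape(rank, dims[-1], 1))
            return cores

        # Verify on a small random 4-way tensor (a stand-in for a quadrature/basis grid).
        T = np.random.default_rng(0).standard_normal((3, 4, 5, 6))
        cores = tt_svd(T)
        recon = cores[0]
        for core in cores[1:]:
            recon = np.tensordot(recon, core, axes=([recon.ndim - 1], [0]))
        print(np.allclose(recon.squeeze(), T))               # True: exact reconstruction at this tol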

    Fixed-width output analysis for Markov chain Monte Carlo

    Markov chain Monte Carlo is a method of producing a correlated sample in order to estimate features of a target distribution via ergodic averages. A fundamental question is: when should sampling stop? That is, when are the ergodic averages good estimates of the desired quantities? We consider a method that stops the simulation when the width of a confidence interval based on an ergodic average is less than a user-specified value. Hence, calculating a Monte Carlo standard error is a critical step in assessing the simulation output. We consider the regenerative simulation and batch means methods of estimating the variance of the asymptotic normal distribution. We give sufficient conditions for the strong consistency of both methods and investigate their finite sample properties in a variety of examples.
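
    A minimal sketch of the batch means estimator and the fixed-width stopping criterion, assuming a single chain stored as a NumPy array and the common batch-size choice of roughly the square root of the run length:

        import numpy as np
        from scipy import stats

        def batch_means_halfwidth(chain, conf=0.95):
            # Batch-means Monte Carlo standard error and confidence half-width
            # for the ergodic average of a scalar functional of the chain.
            chain = np.asarray(chain, dtype=float)
            n = len(chain)
            b = int(np.sqrt(n))                       # batch size (a common default choice)
            a = n // b                                # number of batches
            x = chain[: a * b].reshape(a, b)
            batch_means = x.mean(axis=1)
            mu_hat = x.mean()
            sigma2_hat = b * np.sum((batch_means - mu_hat) ** 2) / (a - 1)
            halfwidth = stats.t.ppf(0.5 + conf / 2, df=a - 1) * np.sqrt(sigma2_hat / n)
            return mu_hat, halfwidth

        # Fixed-width rule: keep extending the chain until the half-width is small enough,
        # e.g. mu_hat, hw = batch_means_halfwidth(chain); stop when hw < 0.01.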

    Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models

    This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as those pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
    Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
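
    As a small illustration of the regression-metamodel and steepest-ascent ingredients of RSM, the sketch below fits a first-order polynomial by OLS to outputs from a 2^(3-1) fractional factorial in coded units and returns the steepest-ascent direction. The design and the response values are hypothetical.

        import numpy as np

        def steepest_ascent_direction(X, y):
            # Fit a first-order metamodel y = b0 + b1*x1 + ... by OLS; return the
            # normalized gradient, i.e. the steepest-ascent direction in coded units.
            A = np.column_stack([np.ones(len(X)), X])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            grad = beta[1:]
            return grad / np.linalg.norm(grad)

        # 2^(3-1) fractional factorial with defining relation C = AB (coded +/-1 units).
        design = np.array([[-1, -1,  1],
                           [ 1, -1, -1],
                           [-1,  1, -1],
                           [ 1,  1,  1]])
        y = np.array([8.2, 9.5, 7.1, 10.3])          # hypothetical simulation outputs
        print(steepest_ascent_direction(design, y))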

    Experimental Design of a Prescribed Burn Instrumentation

    Observational data collected during experiments, such as the planned Fire and Smoke Model Evaluation Experiment (FASMEE), are critical for advancing coupled fire-atmosphere models like WRF-SFIRE and WRF-SFIRE-CHEM and transitioning them into operational use. Historical meteorological data, representing typical weather conditions for the anticipated burn locations and times, have been processed to initialize and run a set of simulations representing the planned experimental burns. Based on an analysis of these numerical simulations, this paper provides recommendations on the experimental setup, including the ignition procedures, the size and duration of the burns, and optimal sensor placement. New techniques are developed to initialize coupled fire-atmosphere simulations with weather conditions typical of the planned burn locations and time of the year. Analysis of variation and a sensitivity analysis of the simulation design to model parameters, carried out by repeated Latin Hypercube Sampling, are used to assess the locations of the sensors. The simulations provide the locations of the measurements that maximize the expected variation of the sensor outputs with the model parameters.
    Comment: 35 pages, 4 tables, 28 figures.
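
    The sampling step can be illustrated with SciPy's quasi-Monte Carlo module: a Latin Hypercube design over the model parameters is generated in the unit cube and scaled to physical ranges, and each design point would then drive one coupled fire-atmosphere run. The parameter names, bounds, and the run_wrf_sfire wrapper below are hypothetical.

        import numpy as np
        from scipy.stats import qmc

        # Illustrative parameter ranges (not taken from the paper).
        lower = np.array([0.05, 1.0, 0.2])    # e.g. fuel moisture, wind multiplier, ignition delay [h]
        upper = np.array([0.25, 3.0, 1.0])

        sampler = qmc.LatinHypercube(d=3, seed=0)
        params = qmc.scale(sampler.random(n=50), lower, upper)   # 50 parameter vectors

        # outputs = np.array([run_wrf_sfire(p) for p in params])   # hypothetical model wrapper
        # Candidate sensor locations can then be ranked by how strongly the simulated
        # signal at each location varies across the Latin Hypercube design.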