
    Validating Sample Average Approximation Solutions with Negatively Dependent Batches

    Sample-average approximation (SAA) is a practical means of finding approximate solutions to stochastic programming problems involving an extremely large (or infinite) number of scenarios. SAA can also be used to estimate a lower bound on the optimal objective value of the true problem which, when coupled with an upper bound, provides a confidence interval for the true optimal objective value and valuable information about the quality of the approximate solutions. Specifically, the lower bound can be estimated by solving multiple SAA problems (each obtained using a particular sampling method) and averaging the resulting objective values. State-of-the-art methods for lower-bound estimation generate the batches of scenarios for the SAA problems independently. In this paper, we describe sampling methods that produce negatively dependent batches, thus reducing the variance of the sample-averaged lower-bound estimator and increasing its usefulness in defining a confidence interval for the optimal objective value. We provide conditions under which the new sampling methods reduce the variance of the lower-bound estimator, and present computational results verifying that our scheme can reduce the variance significantly in comparison with the traditional Latin hypercube approach.
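    The batched lower-bound construction described in this abstract is easy to simulate on a small scale. Below is a minimal sketch, assuming a toy newsvendor problem with made-up cost, price, and demand-distribution values; it compares independently sampled batches with Latin hypercube batches, while the paper's negatively dependent batch schemes are not reproduced here.

```python
# Sketch: SAA lower-bound estimation by averaging per-batch optima,
# on an assumed toy newsvendor problem (unit cost c, price p, demand D).
import numpy as np
from scipy.stats import qmc, lognorm

rng = np.random.default_rng(0)
c, p = 1.0, 3.0                       # assumed cost and price

def saa_optimal_value(demands):
    """Solve the SAA newsvendor problem exactly via the empirical quantile."""
    q = np.quantile(demands, (p - c) / p)               # SAA-optimal order quantity
    return np.mean(c * q - p * np.minimum(q, demands))  # SAA optimal objective

def lower_bound(batches):
    """Average of per-batch SAA optima: an estimate of a lower bound on v*."""
    vals = np.array([saa_optimal_value(b) for b in batches])
    return vals.mean(), vals.std(ddof=1) / np.sqrt(len(vals))

B, N = 30, 200                        # number of batches, scenarios per batch
dist = lognorm(s=0.5, scale=10.0)     # assumed demand distribution

# Independent batches (the baseline the abstract refers to).
indep = [dist.ppf(rng.random(N)) for _ in range(B)]
# Latin hypercube batches: stratified within a batch, independent across batches.
lhs = [dist.ppf(qmc.LatinHypercube(d=1, seed=s).random(N).ravel()) for s in range(B)]

for name, batches in [("independent", indep), ("Latin hypercube", lhs)]:
    est, se = lower_bound(batches)
    print(f"{name:16s} lower-bound estimate {est:8.3f}  (std. error {se:.3f})")
```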

    Sliced rotated sphere packing designs

    Space-filling designs are popular choices for computer experiments. A sliced design is a design that can be partitioned into several subdesigns. We propose a new type of sliced space-filling design called sliced rotated sphere packing designs, whose full designs and subdesigns are both rotated sphere packing designs. They are constructed by rescaling, rotating, translating and extracting the points of a sliced lattice. We provide two fast algorithms to generate such designs. Furthermore, we propose a strategy for using sliced rotated sphere packing designs adaptively: initial runs are uniformly distributed in the design space, follow-up runs are added by incorporating information gained from the initial runs, and the combined design is space-filling for any local region. Examples are given to illustrate its potential application.
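    The rescale-rotate-translate-extract pipeline is easiest to picture in two dimensions. The following is a minimal sketch of that extraction step, assuming a hexagonal (A2) lattice as the starting point; the slicing of the lattice into cosets, which is the paper's actual contribution, is not reproduced.

```python
# Sketch: rescale, rotate, and translate a 2-D hexagonal lattice,
# then keep the points that fall inside the unit square.
import numpy as np

def rotated_lattice_points(n_target, seed=0):
    rng = np.random.default_rng(seed)
    G = np.array([[1.0, 0.0],
                  [0.5, np.sqrt(3) / 2]])         # generator of the A2 lattice
    # Scale so the point density matches n_target points per unit area;
    # the A2 cell area is sqrt(3)/2.
    scale = np.sqrt((np.sqrt(3) / 2) / n_target)
    theta = rng.uniform(0, 2 * np.pi)             # random rotation angle
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    shift = rng.uniform(0, 1, size=2)             # random translation
    k = int(4 / scale)                            # enumerate a generous patch
    ij = np.array([(i, j) for i in range(-k, k + 1) for j in range(-k, k + 1)])
    pts = ((ij @ G) * scale) @ R.T + shift
    return pts[np.all((pts >= 0) & (pts < 1), axis=1)]  # extract the unit square

design = rotated_lattice_points(50)
print(len(design), "points")   # close to 50 by construction
```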

    Design of Experiments for Screening

    The aim of this paper is to review methods of designing screening experiments, ranging from designs originally developed for physical experiments to those especially tailored to experiments on numerical models. The strengths and weaknesses of the various designs for screening variables in numerical models are discussed. First, classes of factorial designs for experiments to estimate main effects and interactions through a linear statistical model are described, specifically regular and nonregular fractional factorial designs, supersaturated designs and systematic fractional replicate designs. Generic issues of aliasing, bias and cancellation of factorial effects are discussed. Second, group screening experiments are considered, including factorial group screening and sequential bifurcation. Third, random sampling plans are discussed, including Latin hypercube sampling and sampling plans to estimate elementary effects. Fourth, a variety of modelling methods commonly employed with screening designs are briefly described. Finally, a novel study demonstrates six screening methods on two frequently used exemplars, and their performances are compared.
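    Of the sampling plans mentioned, the elementary-effects (Morris) method is the most self-contained to sketch. Below is a minimal illustration, assuming a toy three-factor function with one nearly inert factor; it is not drawn from the paper itself.

```python
# Sketch: one-at-a-time Morris trajectories on a grid, summarising each
# factor by the mean and standard deviation of its elementary effects.
import numpy as np

def f(x):                              # assumed toy model: x2 nearly inert
    return 2 * x[0] + x[1] ** 2 + 0.01 * x[2]

def morris_trajectory(d, levels, rng):
    delta = levels / (2 * (levels - 1))          # standard Morris step size
    base = rng.integers(0, levels - 1, d) / (levels - 1)
    traj = [base.copy()]
    for k in rng.permutation(d):                 # perturb one factor at a time
        nxt = traj[-1].copy()
        nxt[k] = nxt[k] + delta if nxt[k] + delta <= 1 else nxt[k] - delta
        traj.append(nxt)
    return np.array(traj)

def elementary_effects(d=3, levels=4, r=20, seed=0):
    rng = np.random.default_rng(seed)
    effects = np.zeros((r, d))
    for t in range(r):
        traj = morris_trajectory(d, levels, rng)
        for row, nxt in zip(traj[:-1], traj[1:]):
            k = int(np.argmax(row != nxt))       # the factor that moved
            effects[t, k] = (f(nxt) - f(row)) / (nxt[k] - row[k])
    return effects.mean(axis=0), effects.std(axis=0)

mu, sigma = elementary_effects()
print("mean effects:", mu)   # factor 2 should come out near zero
```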

    Exploratory ensemble designs for environmental models using k-extended Latin Hypercubes

    In this paper we present a novel, flexible, and multi-purpose class of designs for initial exploration of the parameter spaces of computer models, such as those used to study many features of the environment. The idea applies existing technology for extending a Latin Hypercube (LHC) in order to generate initial LHC designs that are composed of many smaller LHCs. The resulting design and its component parts are constructed so that each is approximately orthogonal and maximises a measure of coverage of the parameter space. Designs of this type are particularly useful when we want to simultaneously quantify parametric uncertainty and any uncertainty due to the initial conditions, boundary conditions, or forcing functions required to run the model. This makes the class of designs particularly suited to environmental models, such as climate models, that contain all of these features. The proposed designs are also well suited to initial exploratory ensembles whose goal is to guide the design of further ensembles aimed at, for example, calibrating the model. We introduce a new emulator diagnostic that exploits the structure of the advocated ensemble designs and allows for the assessment of structural weaknesses in the statistical modelling. We provide illustrations of the method through a simple example and describe a 400-member ensemble of the Nucleus for European Modelling of the Ocean (NEMO) ocean model designed using the method. We build an emulator for NEMO using the created design to illustrate the use of our emulator diagnostic test.
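    The structural idea of an LHC composed of many smaller LHCs can be illustrated with the standard sliced Latin hypercube construction. A minimal sketch, assuming k slices of m points each; it shows the structure the paper builds on, not the k-extension algorithm itself.

```python
# Sketch: a (k*m)-point Latin hypercube whose k slices of m points
# are each themselves Latin hypercubes on a coarser m-bin grid.
import numpy as np

def sliced_lhc(k, m, d, seed=0):
    rng = np.random.default_rng(seed)
    n = k * m
    design = np.empty((k, m, d))
    for col in range(d):
        # Within each coarse bin v, assign the k fine levels v*k..v*k+k-1
        # to the k slices without replacement.
        fine = np.array([rng.permutation(k) + v * k for v in range(m)])  # (m, k)
        for j in range(k):
            levels = rng.permutation(fine[:, j])     # one fine level per coarse bin
            design[j, :, col] = (levels + rng.random(m)) / n
    return design

slices = sliced_lhc(k=4, m=5, d=3)
full = slices.reshape(-1, 3)
# Each slice stratifies every dimension into 5 bins; the union stratifies
# every dimension into all 20 bins, so the full design is itself an LHC.
```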

    Rotated sphere packing designs

    We propose a new class of space-filling designs for computer experiments called rotated sphere packing designs. The approach starts from the asymptotically optimal positioning of identical balls that covers the unit cube. Properly scaled, rotated, translated and extracted, such designs are excellent under the maximin distance criterion, low in discrepancy, good in projective uniformity, and thus useful for both prediction and numerical integration. We provide a fast algorithm to construct such designs for any number of dimensions and points, with R codes available online. Theoretical and numerical results are also provided.
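    The two quality measures named here can be computed for any candidate design. A minimal sketch, using a random design as a placeholder for the paper's actual construction; the discrepancy call assumes SciPy's scipy.stats.qmc module.

```python
# Sketch: evaluate the maximin (separation) distance and the centered
# L2 discrepancy of a design in the unit square.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import qmc

rng = np.random.default_rng(0)
X = rng.random((50, 2))                 # placeholder design in [0,1]^2

maximin = pdist(X).min()                # smallest pairwise distance: larger is better
disc = qmc.discrepancy(X, method="CD")  # centered L2 discrepancy: smaller is better
print(f"maximin distance {maximin:.4f}, centered L2 discrepancy {disc:.5f}")
```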

    Designs for computer experiments and uncertainty quantification

    Computer experiments are widely used in the analysis of real systems where physical experiments are infeasible or unaffordable. Computer models are usually complex and computationally demanding, and consequently time-consuming to run. Therefore, surrogate models, also known as emulators, are fitted to approximate these computationally intensive computer models. Since emulators are easy to evaluate, they may replace computer models in the actual analysis of the systems. Experimental design for computer simulations and modeling of simulated outputs are two important aspects of building accurate emulators. This thesis consists of three chapters covering topics in the design of computer experiments and uncertainty quantification of complex computer models: the first chapter proposes a new type of space-filling design for computer experiments, the second develops an emulator-based approach for uncertainty quantification of machining processes using their computer simulations, and the third extends the experimental designs proposed in the first chapter to designs with both quantitative and qualitative factors.

    In the design of computer experiments, space-filling properties are important. Traditional maximin and minimax distance designs consider space-fillingness only in the full-dimensional space, which can result in poor projections onto lower-dimensional spaces; this is undesirable when only a few factors are active. On the other hand, restricting maximin distance designs to the class of Latin hypercubes can improve one-dimensional projections but cannot guarantee good space-filling properties in larger subspaces. In the first chapter, we propose designs, called maximum projection designs, that maximize space-filling properties on projections to all subsets of factors. Maximum projection designs have better space-filling properties in their projections than other widely used space-filling designs. They also provide certain advantages in Gaussian process modeling. More importantly, the design criterion can be computed at a cost no greater than that of a criterion which ignores projection properties.

    In the second chapter, we develop an uncertainty quantification methodology for machining processes with uncertain input factors. Understanding the uncertainty in a machining process from simulation outputs is important for careful decision making, but Monte Carlo-based methods cannot be used to evaluate the uncertainty when the simulations are computationally expensive. An alternative approach is to build an easy-to-evaluate emulator that approximates the computer model and to run the Monte Carlo simulations on the emulator. Although this approach is very promising, it becomes inefficient when the computer model is highly nonlinear and the region of interest is large. Most machining simulations are of this kind, because the output is affected by a large number of parameters, including the workpiece material properties, cutting tool parameters, and process parameters. Building an accurate emulator that works for different kinds of materials, tool designs, tool paths, and so on is not an easy task. We propose a new approach, called the in-situ emulator, to overcome this problem: the idea is to build an emulator in a local region defined by the user-specified input uncertainty distribution, using maximum projection designs and Gaussian process modeling techniques. On two solid end milling processes, we show that the in-situ emulator methodology is efficient and accurate in uncertainty quantification and has clear advantages over other conventional tools.

    Computer experiments with both quantitative and qualitative factors are prevalent. In the third chapter, we extend maximum projection designs so that they can accommodate qualitative factors as well. The proposed designs maintain an economical run size and are flexible in run size, in the numbers of quantitative and qualitative factors, and in factor levels. Their construction is not restricted to a special design class and does not impose any particular design configuration. A general construction algorithm, which utilizes orthogonal arrays, is developed. We show through several simulations that maximum projection designs with both quantitative and qualitative factors have attractive space-filling properties in all of their projections; their advantages are also illustrated on the optimization of a solid end milling process simulation. Finally, we propose a methodology for the sequential construction of maximum projection designs, which enables efficient analysis of systems within financial cost and time constraints; its performance is demonstrated using the optimization of a solid end milling process.
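    The published maximum projection (MaxPro) criterion is compact enough to sketch: psi(D) = { C(n,2)^(-1) * sum over pairs i<j of 1 / prod_l (x_il - x_jl)^2 }^(1/p), which blows up whenever two points come close in any single coordinate. The crude random search below is a stand-in for the proper optimization described in the thesis.

```python
# Sketch: compute the MaxPro criterion and pick the best of many random
# jittered Latin hypercube designs (illustration only, not the thesis method).
import numpy as np

def maxpro_criterion(X):
    """psi(D) = [ mean over pairs of 1/prod_l (x_il - x_jl)^2 ]^(1/p)."""
    n, p = X.shape
    total = 0.0
    for i in range(n - 1):
        d2 = (X[i] - X[i + 1:]) ** 2            # squared coordinate gaps
        total += (1.0 / d2.prod(axis=1)).sum()  # product over the p coordinates
    return (2 * total / (n * (n - 1))) ** (1 / p)

def random_lhd(n, p, rng):
    # Jittered Latin hypercube: one point per bin in every 1-D projection.
    return (np.argsort(rng.random((n, p)), axis=0) + rng.random((n, p))) / n

rng = np.random.default_rng(0)
best = min((random_lhd(20, 3, rng) for _ in range(500)), key=maxpro_criterion)
print("best MaxPro criterion:", maxpro_criterion(best))
```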

    Estimates of the coverage of parameter space by Latin Hypercube and Orthogonal Array-based sampling

    In this paper we use counting arguments to prove that the expected percentage coverage of a d-dimensional parameter space of size n when performing k trials is the same for Latin Hypercube sampling and Orthogonal Array-based Latin Hypercube sampling. We then extend these results to an experimental design setting by projecting onto a t < d dimensional subspace. These results are confirmed by simulations. The theory presented has both theoretical and practical significance in modelling and simulation science when sampling over high-dimensional spaces.
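    The expected coverage in question can be estimated by direct simulation. A minimal sketch, assuming (as an interpretation of "size n") that the parameter space is partitioned into an n-per-dimension grid of cells and that coverage means the fraction of cells hit by the projected points.

```python
# Sketch: Monte Carlo estimate of the expected fraction of grid cells
# covered when k points of an n-point Latin hypercube in d dimensions
# are projected onto t of the dimensions.
import numpy as np
from scipy.stats import qmc

def expected_coverage(n, d, t, k, reps=1000, seed=0):
    rng = np.random.default_rng(seed)
    covered = 0.0
    for _ in range(reps):
        X = qmc.LatinHypercube(d=d, seed=rng.integers(2**32)).random(n)
        cells = np.floor(X[:k, :t] * n).astype(int)       # k trials, t-dim projection
        covered += len({tuple(c) for c in cells}) / n**t  # distinct cells hit
    return covered / reps

print(expected_coverage(n=10, d=3, t=2, k=10))
```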