
    Validating Sample Average Approximation Solutions with Negatively Dependent Batches

    Sample-average approximations (SAA) are a practical means of finding approximate solutions of stochastic programming problems involving an extremely large (or infinite) number of scenarios. SAA can also be used to find estimates of a lower bound on the optimal objective value of the true problem which, when coupled with an upper bound, provides confidence intervals for the true optimal objective value and valuable information about the quality of the approximate solutions. Specifically, the lower bound can be estimated by solving multiple SAA problems (each obtained using a particular sampling method) and averaging the obtained objective values. State-of-the-art methods for lower-bound estimation generate batches of scenarios for the SAA problems independently. In this paper, we describe sampling methods that produce negatively dependent batches, thus reducing the variance of the sample-averaged lower bound estimator and increasing its usefulness in defining a confidence interval for the optimal objective value. We provide conditions under which the new sampling methods can reduce the variance of the lower bound estimator, and present computational results verifying that our scheme can reduce the variance significantly compared with the traditional Latin hypercube approach.
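    A minimal sketch of the batched lower-bounding scheme described above, using independent batches (the paper's baseline; the negatively dependent sampling itself is not reproduced here) and a toy newsvendor problem as a hypothetical stand-in for the stochastic program:

```python
import numpy as np

def solve_saa_newsvendor(demand, cost=1.0, price=2.0):
    """Solve one SAA instance of a toy newsvendor problem.
    The optimal order quantity is the (1 - cost/price)-quantile
    of the sampled demand, so no optimisation solver is needed."""
    q = np.quantile(demand, 1.0 - cost / price)
    profit = price * np.minimum(q, demand) - cost * q
    return -profit.mean()  # negate: minimisation convention

rng = np.random.default_rng(0)
M, N = 30, 500  # number of batches, scenarios per batch
vals = np.array([solve_saa_newsvendor(rng.lognormal(3.0, 0.5, N))
                 for _ in range(M)])

# Averaging the M optimal values estimates a lower bound (in expectation)
# on the true optimal value; its spread yields a confidence interval.
mean, se = vals.mean(), vals.std(ddof=1) / np.sqrt(M)
print(f"lower-bound estimate: {mean:.2f} +/- {1.96 * se:.2f} (95% CI)")
```

    Negatively dependent batches would replace the independent `rng.lognormal` draws, lowering the variance of the averaged estimator without changing its expectation.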

    Latin hypercube sampling with dependence and applications in finance

    In Monte Carlo simulation, Latin hypercube sampling (LHS) [McKay et al. (1979)] is a well-known variance reduction technique for vectors of independent random variables. The method presented here, Latin hypercube sampling with dependence (LHSD), extends LHS to vectors of dependent random variables. The resulting estimator is shown to be consistent and asymptotically unbiased. For the bivariate case and under some conditions on the joint distribution, a central limit theorem together with a closed formula for the limit variance is derived. It is shown that for a class of estimators satisfying a monotonicity condition, the LHSD limit variance is never greater than the corresponding Monte Carlo limit variance. In some valuation examples of financial payoffs, a variance reduction by factors of up to 200 is achieved compared to standard Monte Carlo simulation. LHSD is suited for problems with rare events and for high-dimensional problems, and it may be combined with Quasi-Monte Carlo methods. Keywords: Monte Carlo simulation, variance reduction, Latin hypercube sampling, stratified sampling.
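    For reference, a minimal sketch of the classical LHS of McKay et al. for independent uniforms (the dependent extension, LHSD, is not reproduced here); the additive payoff is a hypothetical monotone estimand chosen only to make the variance reduction visible:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Classical LHS: one point per stratum in each dimension,
    with an independent random permutation of strata per dimension."""
    jitter = rng.random((n, d))                   # position within stratum
    strata = np.array([rng.permutation(n) for _ in range(d)]).T
    return (strata + jitter) / n                  # points in [0, 1)^d

rng = np.random.default_rng(1)
f = lambda pts: pts.sum(axis=1)                   # monotone test payoff

mc  = np.array([f(rng.random((100, 5))).mean() for _ in range(2000)])
lhs = np.array([f(latin_hypercube(100, 5, rng)).mean() for _ in range(2000)])
print(f"MC estimator variance {mc.var():.2e} vs LHS {lhs.var():.2e}")
```

    Because the payoff is monotone in each coordinate, the LHS estimator's variance falls far below the plain Monte Carlo variance, consistent with the monotonicity condition mentioned above.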

    Experimental Design of a Prescribed Burn Instrumentation

    Observational data collected during experiments such as the planned Fire and Smoke Model Evaluation Experiment (FASMEE) are critical for advancing coupled fire-atmosphere models like WRF-SFIRE and WRF-SFIRE-CHEM and transitioning them into operational use. Historical meteorological data, representing typical weather conditions for the anticipated burn locations and times, have been processed to initialize and run a set of simulations representing the planned experimental burns. Based on an analysis of these numerical simulations, this paper provides recommendations on the experimental setup, including the ignition procedures, the size and duration of the burns, and optimal sensor placement. New techniques are developed to initialize coupled fire-atmosphere simulations with weather conditions typical of the planned burn locations and time of year. Analysis of variation and sensitivity analysis of the simulated outputs to the model parameters, carried out by repeated Latin hypercube sampling, are used to assess the sensor locations. The simulations provide the measurement locations that maximize the expected variation of the sensor outputs with the model parameters. Comment: 35 pages, 4 tables, 28 figures.
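    A minimal sketch of the sensor-placement idea: draw model parameters by Latin hypercube sampling, evaluate a surrogate model at candidate sensor positions, and retain the positions where the output varies most across parameter draws. The two-parameter `toy_model`, its bounds, and the 50 candidate positions are illustrative assumptions, not the WRF-SFIRE configuration:

```python
import numpy as np
from scipy.stats import qmc

def toy_model(params, xs):
    """Hypothetical surrogate for one fire-atmosphere run: heat signal
    at positions xs as a function of (fuel moisture, wind speed)."""
    moisture, wind = params
    return np.exp(-(xs - wind) ** 2) * (1.0 - moisture)

xs = np.linspace(0.0, 10.0, 50)               # candidate sensor positions
sampler = qmc.LatinHypercube(d=2, seed=42)
params = qmc.scale(sampler.random(200),       # 200 LHS parameter draws
                   [0.05, 2.0], [0.35, 8.0])  # assumed parameter bounds

outputs = np.array([toy_model(p, xs) for p in params])
variation = outputs.var(axis=0)               # spread across parameter draws
best = np.sort(xs[np.argsort(variation)[-5:]])
print("most informative sensor positions:", best)
```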

    A Study For Efficiently Solving Optimisation Problems With An Increasing Number Of Design Variables

    Coupling optimisation algorithms to Finite Element Methods (FEM) is a very promising way to achieve optimal metal forming processes. However, many optimisation algorithms exist, and it is not clear which of them to use. This paper investigates the sensitivity of a Sequential Approximate Optimisation (SAO) algorithm proposed in [1-4] to an increasing number of design variables and compares it with two other algorithms: an Evolutionary Strategy (ES) and an evolutionary version of the SAO (ESAO). In addition, it examines the influence of the different designs of experiments used with the SAO. It is concluded that the SAO is highly capable and efficient, and that combining it with an ES is not beneficial. Moreover, using the SAO with a fractional factorial design is more efficient than the full factorial design proposed in [1-4].
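    To make the factorial-design comparison concrete, the sketch below generates both two-level designs; halving the number of design points matters when each point is an expensive FEM simulation. The -1/+1 coding and the half-fraction construction are standard, not the specific designs of experiments used in [1-4]:

```python
import itertools
import numpy as np

def full_factorial(k):
    """All 2**k corner points of a two-level design, coded -1/+1."""
    return np.array(list(itertools.product([-1, 1], repeat=k)))

def fractional_factorial(k):
    """Half-fraction 2**(k-1) design: vary the first k-1 factors fully
    and alias the last factor with their product."""
    base = full_factorial(k - 1)
    last = base.prod(axis=1, keepdims=True)
    return np.hstack([base, last])

print(full_factorial(4).shape)        # (16, 4): 16 FEM runs
print(fractional_factorial(4).shape)  # (8, 4): half the runs
```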