
    Tailoring the Statistical Experimental Design Process for LVC Experiments

    The use of Live, Virtual, and Constructive (LVC) simulation environments is increasingly being examined for potential analytical use, particularly in test and evaluation. LVC simulation environments provide a mechanism for conducting joint mission testing and system-of-systems testing when scale and resource limitations prevent assembling the density and diversity of assets required for these complex and comprehensive tests. The statistical experimental design process is re-examined for potential application to LVC experiments, and several additional considerations are identified to augment the experimental design process for use with LVC. This augmented statistical experimental design process is demonstrated through a case study involving a series of tests on an experimental data link for strike aircraft, with LVC simulation providing the test environment. The goal of these tests is to assess the usefulness of the information presented to aircrew members via different data link capabilities. The statistical experimental design process is used to structure the experiment, leading to the discovery of faulty assumptions and planning mistakes that could have invalidated the results of the experiment. Lastly, an aggressive sequential experimentation strategy is presented for LVC experiments when test resources are limited. This strategy depends on a foldover algorithm that we developed for nearly orthogonal arrays to rescue LVC experiments when important factor effects are confounded.
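    The foldover algorithm for nearly orthogonal arrays is specific to that work; as background, the sketch below illustrates the classical full foldover for a regular two-level fraction, where appending the sign-reversed runs de-aliases main effects from two-factor interactions. The 2^(3-1) example design is purely illustrative and not taken from the abstract.

```python
import numpy as np

def full_foldover(design):
    """Append the sign-reversed runs of a two-level (+/-1 coded) design.

    Classical full foldover: in the combined design, main effects are no
    longer aliased with two-factor interactions (resolution III -> IV).
    """
    return np.vstack([design, -design])

# Example: a 2^(3-1) resolution III fraction with generator C = AB
base = np.array([[-1, -1,  1],
                 [ 1, -1, -1],
                 [-1,  1, -1],
                 [ 1,  1,  1]])
combined = full_foldover(base)   # 8 runs: the full 2^3 factorial
print(combined)
```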

    Recent Developments in Nonregular Fractional Factorial Designs

    Nonregular fractional factorial designs such as Plackett-Burman designs and other orthogonal arrays are widely used in various screening experiments for their run size economy and flexibility. The traditional analysis focuses on main effects only. Hamada and Wu (1992) went beyond the traditional approach and proposed an analysis strategy to demonstrate that some interactions could be entertained and estimated beyond a few significant main effects. Their groundbreaking work stimulated many of the recent developments in design criterion creation, construction, and analysis of nonregular designs. This paper reviews important developments in optimality criteria and comparison, including projection properties, generalized resolution, various generalized minimum aberration criteria, optimality results, construction methods, and analysis strategies for nonregular designs. Comment: Submitted to Statistics Surveys (http://www.i-journals.org/ss/) by the Institute of Mathematical Statistics (http://www.imstat.org).
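    As a concrete illustration of the run-size economy of these designs, the following sketch generates the 12-run Plackett-Burman design by cyclically shifting the published generator row and appending a row of minus ones, then checks that all eleven columns are mutually orthogonal. This is a minimal NumPy sketch for context, not material from the paper itself.

```python
import numpy as np

# First row of the 12-run Plackett-Burman design (Plackett & Burman, 1946)
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

# Rows 1-11 are cyclic shifts of the generator; row 12 is all -1
rows = [np.roll(gen, i) for i in range(11)]
rows.append(-np.ones(11, dtype=int))
pb12 = np.array(rows)

# Columns are balanced and mutually orthogonal: X'X = 12 * I
assert np.array_equal(pb12.T @ pb12, 12 * np.eye(11, dtype=int))
print(pb12)   # screens up to 11 factors in only 12 runs
```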

    Aberration in qualitative multilevel designs

    The Generalized Word Length Pattern (GWLP) is an important and widely used tool for comparing fractional factorial designs. We consider qualitative factors and code their levels using the roots of unity. We write the GWLP of a fraction $\mathcal{F}$ using the polynomial indicator function, whose coefficients encode many properties of the fraction. We show that the coefficient of a simple or interaction term can be written using the counts of its levels. This apparently simple remark has major consequences, including a convolution formula for the counts. We also show that the mean aberration of a term over the permutations of its levels provides a connection with the variance of the level counts. Moreover, using mean aberrations for symmetric $s^m$ designs with $s$ prime, we derive a new formula for computing the GWLP of $\mathcal{F}$. It is computationally easy, does not use complex numbers, and provides a clear way to interpret the GWLP. As case studies, we consider non-isomorphic orthogonal arrays that have the same GWLP. The different distributions of the mean aberrations suggest that they could be used as a further tool to discriminate between fractions. Comment: 16 pages, 1 figure.
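    For readers unfamiliar with the GWLP, the sketch below computes it in the simplest special case, a two-level +/-1 coded fraction, via the J-characteristics. The paper's setting is more general (qualitative levels coded by roots of unity), so this is only an illustration of the quantity being generalized.

```python
import numpy as np
from itertools import combinations

def gwlp_two_level(design):
    """Generalized word length pattern (A_1, ..., A_m) of a +/-1 coded design.

    For an N x m two-level design, A_j is the sum over all subsets S of j
    columns of (J_S / N)^2, where J_S is the absolute column-product sum
    over the runs (the J-characteristics of Tang and Deng).
    """
    n, m = design.shape
    a = []
    for j in range(1, m + 1):
        total = 0.0
        for cols in combinations(range(m), j):
            j_s = abs(design[:, cols].prod(axis=1).sum())
            total += (j_s / n) ** 2
        a.append(total)
    return a

# Example: the regular 2^(3-1) fraction with C = AB has GWLP (0, 0, 1)
frac = np.array([[-1, -1,  1],
                 [ 1, -1, -1],
                 [-1,  1, -1],
                 [ 1,  1,  1]])
print(gwlp_two_level(frac))  # [0.0, 0.0, 1.0]
```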

    A scenario for sequential experimentation

    Statistical Methods

    Split-plot designs: What, why, and how

    The past decade has seen rapid advances in the development of new methods for the design and analysis of split-plot experiments. Unfortunately, the value of these designs for industrial experimentation has not been fully appreciated. In this paper, we review recent developments and provide guidelines for the use of split-plot designs in industrial applications.
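    A common way to respect the split-plot error structure at the analysis stage is to treat whole plots as a random effect. The sketch below does this for a simulated two-factor example using statsmodels' MixedLM; the factor names, effect sizes, and layout are invented for illustration and are not taken from the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Hypothetical split-plot layout: 8 whole plots (hard-to-change temperature
# setting), each split into 4 subplots (easy-to-change additive level).
whole_plots = np.repeat(np.arange(8), 4)
temp = np.repeat(rng.choice([-1, 1], size=8), 4)      # whole-plot factor
additive = np.tile([-1, 1, -1, 1], 8)                 # subplot factor
wp_error = np.repeat(rng.normal(0, 1.0, size=8), 4)   # whole-plot error
y = 5 + 2 * temp + 1.5 * additive + wp_error + rng.normal(0, 0.5, size=32)

data = pd.DataFrame({"y": y, "temp": temp, "additive": additive,
                     "whole_plot": whole_plots})

# Whole plots enter as a random effect, so the whole-plot factor is judged
# against the larger whole-plot error rather than the subplot error.
model = smf.mixedlm("y ~ temp * additive", data, groups=data["whole_plot"])
print(model.fit().summary())
```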

    Generalized construction of trend resistant 2-level split-plot designs

    Common experimental practice suggests randomizing the order in which runs are performed. However, there are situations in which randomization might not produce the most desirable order, especially in the presence of known trends. Research has been done on systematically designing experiments to be robust against trends; however, few studies address the additional dimensions that arise in nested designs such as split-plot designs. Split-plot designs have been used for many years in agricultural applications and are often preferred when there are hard-to-change factors in industrial settings. There is currently no established methodology for producing split-plot designs that are robust to potential two-dimensional trends. The objective of this work is to develop a methodology for designing run orders for two-level split-plot ($2^w \times 2^s$) designs that are robust, or nearly robust, against a set of trends. Two methods are developed in this work. A fold-over method based on already established principles is extended for use in split-plot designs. The second method uses an integer linear programming approach to search for an optimal design that is resistant to specific trends. The two methods are compared and evaluated with a proposed set of metrics.
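    The two-dimensional split-plot constructions are specific to this work, but the underlying notion of trend resistance is easy to illustrate in one dimension: a factor column is linear-trend-free when its inner product with a centered time trend is zero. The sketch below scores run orders of a 2^3 factorial this way and shows that a fold-over style run order (a half fraction followed by its sign-reversed complement) makes every main effect linear-trend-free, whereas the standard order does not. This is an illustrative NumPy sketch, not the authors' algorithm.

```python
import numpy as np

def linear_trend_counts(design):
    """Inner product of each +/-1 column with a centered linear time trend,
    taking the row order of `design` as the run order; a count of 0 means a
    linear drift over the runs does not bias that column's effect estimate."""
    n = design.shape[0]
    t = np.arange(n) - (n - 1) / 2.0   # centered time trend
    return design.T @ t

# 2^3 factorial in standard (Yates) order: main effects are biased by a drift
std = np.array([[a, b, c] for c in (-1, 1) for b in (-1, 1) for a in (-1, 1)])
print(linear_trend_counts(std))            # [4. 8. 16.] -- all nonzero

# Fold-over run order: a half fraction (C = AB) followed by its sign-reversed
# complement; every main-effect column becomes linear-trend-free
half = np.array([[-1, -1, 1], [1, -1, -1], [-1, 1, -1], [1, 1, 1]])
foldover_order = np.vstack([half, -half])
print(linear_trend_counts(foldover_order))  # [0. 0. 0.]
```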

    Use of orthogonal arrays, quasi-Monte Carlo sampling and kriging response models for reservoir simulation with many varying factors

    Asset development teams may adjust simulation model parameters using experimental design to reveal which factors have the greatest impact on reservoir performance. Response surfaces and experimental design make sensitivity analysis less expensive and more accurate, helping to optimize recovery under geological and economic uncertainties. In this thesis, experimental designs including orthogonal arrays, factorial designs, Latin hypercubes, and Hammersley sequences are compared and analyzed. These methods are demonstrated on a gas well with a water coning problem to illustrate the efficiency of orthogonal arrays. Eleven geologic factors are varied while optimizing three engineering factors (fourteen factors in total). The objective is to optimize completion length, tubing head pressure, and tubing diameter for a partially penetrating well with uncertain reservoir properties. A nearly orthogonal array was specified with three levels for eight factors and four levels for the remaining six geologic and engineering factors. This design requires only 36 simulations, compared with the 26,873,856 runs of a full factorial design. Hyperkriging surfaces are an alternative model form for problems with large numbers of factors. Hyperkriging uses the maximum likelihood variogram model parameters to minimize prediction errors. Kriging is compared to conventional polynomial response models. The robustness of the response surfaces generated by kriging and polynomial regression is compared using jackknifing and bootstrapping. Sensitivity analysis and uncertainty analysis can be performed inexpensively and efficiently using response surfaces. The proposed design approach requires fewer simulations and provides accurate response models, efficient optimization, and flexible sensitivity and uncertainty assessment.
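    The hyperkriging and nearly orthogonal array machinery is specific to the thesis, but the general workflow of space-filling sampling followed by a kriging surrogate can be sketched with standard libraries. The sketch below assumes SciPy and scikit-learn are available; the three factor names and the stand-in response function are invented for illustration and do not come from the thesis.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Space-filling sample of a hypothetical 3-factor reservoir response
# (completion length, tubing head pressure, tubing diameter), scaled to [0, 1]
sampler = qmc.LatinHypercube(d=3, seed=0)
X = sampler.random(n=36)                 # 36 runs, mirroring the thesis budget

def simulator(x):
    # Stand-in for the reservoir simulator: any smooth response will do here
    return 100 * x[:, 0] - 30 * x[:, 1] ** 2 + 20 * x[:, 0] * x[:, 2]

y = simulator(X)

# Kriging (Gaussian process) response surface as a cheap surrogate model
gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[1.0] * 3),
    normalize_y=True)
gp.fit(X, y)

# Predict at new factor settings and report the model's own uncertainty,
# which supports inexpensive sensitivity and uncertainty assessment
X_new = sampler.random(n=5)
mean, sd = gp.predict(X_new, return_std=True)
print(np.c_[mean, sd])
```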