
    Very large fractional factorial and central composite designs

    The article of record as published may be located at http://dx.doi.org/10.1145/1113316.1113320
    We present a concise representation of fractional factorials and an algorithm to quickly generate resolution V designs. The description is based on properties of a complete, orthogonal, discrete-valued basis set called the Walsh functions. We tabulate two-level resolution V fractional factorial designs, as well as central composite designs allowing estimation of full second-order models, for experiments involving up to 120 factors. The simple algorithm provided can be used to characterize even larger designs, and a fast Walsh transform method quickly generates design matrices from our representation.
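
    A minimal sketch (not the authors' tabulated resolution V construction) of how Walsh functions relate to two-level design columns, assuming a Sylvester-ordered Hadamard matrix: each column of H, with H[i][j] = (-1)^popcount(i & j), is a Walsh function, the power-of-two columns serve as main-effect factors, and a fast Walsh-Hadamard transform recovers effect contrasts from the observed responses.

```python
import numpy as np

def hadamard_design(k):
    """Full 2^k two-level design: entry (i, j) = (-1)^popcount(i & j).
    Columns indexed by powers of two are the k main-effect factors;
    the remaining columns are their interactions (Walsh functions)."""
    n = 1 << k
    idx = range(n)
    return (-1) ** np.array([[bin(r & c).count("1") for c in idx] for r in idx])

def fwht(y):
    """Fast Walsh-Hadamard transform; O(n log n) effect contrasts."""
    y = np.asarray(y, dtype=float).copy()
    h = 1
    while h < len(y):
        for start in range(0, len(y), 2 * h):
            a = y[start:start + h].copy()
            b = y[start + h:start + 2 * h].copy()
            y[start:start + h] = a + b
            y[start + h:start + 2 * h] = a - b
        h *= 2
    return y

# Example: 2^3 full factorial; columns 1, 2, 4 carry factors A, B, C.
H = hadamard_design(3)
response = 2.0 * H[:, 1] - 1.5 * H[:, 2] + np.random.normal(0, 0.1, 8)
effects = fwht(response) / 8          # Walsh coefficient for each column
print(effects[[1, 2, 4]])             # approximately [2.0, -1.5, 0.0]
```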

    A robust design methodology suitable for application to one-off products

    Robust design is an activity of fundamental importance when designing large, complex, one-off engineering products. Work is described concerning the application of design-of-experiments theory and stochastic optimization methods to explore and optimize designs at the concept stage. The discussion begins with a description of state-of-the-art stochastic techniques and their application to robust design. The content then focuses on a generic methodology capable of manipulating design algorithms that can be used to describe a design concept. An example is presented, demonstrating the use of the system for the robust design of a catamaran with respect to seakeeping.
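
    A minimal illustrative sketch, not the paper's methodology: robust design can be framed as optimizing a performance statistic, for example the mean plus a multiple of the standard deviation, estimated by Monte Carlo sampling over uncertain parameters. The objective, bounds, and noise model below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def performance(x, noise):
    """Hypothetical concept-design performance model (lower is better);
    `noise` perturbs an uncertain environmental parameter."""
    return (x[0] - 2.0) ** 2 + 0.5 * (x[1] + 1.0) ** 2 + noise * x[0]

def robust_objective(x, n_samples=200, weight=2.0):
    """Mean + weight * std of performance under sampled uncertainty."""
    noise = rng.normal(0.0, 0.3, n_samples)
    vals = np.array([performance(x, e) for e in noise])
    return vals.mean() + weight * vals.std()

# Simple stochastic (random-search) optimization within box bounds.
best_x, best_f = None, np.inf
for _ in range(2000):
    x = rng.uniform(-5.0, 5.0, size=2)
    f = robust_objective(x)
    if f < best_f:
        best_x, best_f = x, f
print(best_x, best_f)
```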

    Modelling multi-tier enterprise applications behaviour with design of experiments technique

    Queueing network models are commonly used for performance modelling. However, through the application development stages, analytical models might not be able to continuously reflect performance, for example due to performance bugs or minor changes in the application code that cannot be readily reflected in the queueing model. To cope with this problem, a measurement-based approach adopting the Design of Experiments (DoE) technique is proposed. The applicability of the proposed method is demonstrated on a complex 3-tier e-commerce application that is difficult to model with queueing networks.
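
    A minimal sketch of the general idea; the factors, levels, and measurement stub below are hypothetical and not taken from the paper: run a two-level factorial over application configuration factors, measure response time, and fit a linear model with interactions.

```python
import itertools
import numpy as np

# Hypothetical two-level factors: thread-pool size, cache on/off, DB connections.
factor_names = ["threads", "cache", "db_conns"]
levels = [-1, +1]

def measure_response_time(run):
    """Stand-in for an actual measurement of the deployed application (ms)."""
    threads, cache, db = run
    return 120 - 15 * threads - 25 * cache - 8 * db \
        + 5 * threads * cache + np.random.normal(0, 2)

runs = list(itertools.product(levels, repeat=len(factor_names)))   # 2^3 design
y = np.array([measure_response_time(r) for r in runs])

# Model matrix: intercept, main effects, and two-factor interactions.
X = np.array([[1, t, c, d, t * c, t * d, c * d] for t, c, d in runs])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
terms = ["1", "threads", "cache", "db_conns",
         "threads:cache", "threads:db", "cache:db"]
for name, b in zip(terms, coef):
    print(f"{name:>14s}: {b:7.2f}")
```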

    A demonstration of the utility of fractional experimental design for finding optimal genetic algorithm parameter settings

    This paper demonstrates that the use of sparse experimental design in the development of the structure of genetic algorithms, and hence other computer programs, is a particularly effective and efficient strategy. Despite widespread knowledge of the existence of these systematic experimental plans, they have seen limited application in the investigation of advanced computer programs. This paper attempts to address this missed opportunity and encourage others to take advantage of the power of these plans. Using data generated from a full factorial experimental design of 27 runs, which was used to assess the optimum operating settings of the parameters of a special genetic algorithm (GA), we show that similar results could have been obtained using as few as nine runs. The GA was used to find minimum cost schedules for a complex component assembly operation with many sub-processes.
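
    A minimal sketch of the kind of fraction involved; the GA parameters and levels are hypothetical: a 3^3 full factorial needs 27 runs, while a regular one-third fraction with generator C = (A + B) mod 3 covers the same three three-level factors in 9 runs.

```python
import itertools

# Hypothetical three-level GA parameters (coded levels 0, 1, 2).
params = {"pop_size": [50, 100, 200],
          "crossover": [0.6, 0.8, 0.95],
          "mutation": [0.01, 0.05, 0.10]}

full = list(itertools.product(range(3), repeat=3))                    # 27 runs
fraction = [(a, b, (a + b) % 3) for a in range(3) for b in range(3)]  # 9 runs

def decode(run):
    """Map coded levels back to parameter values."""
    return {name: levels[i] for (name, levels), i in zip(params.items(), run)}

print(len(full), "full-factorial runs vs", len(fraction), "fractional runs")
for run in fraction:
    print(decode(run))
```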

    Regression Models and Experimental Designs: A Tutorial for Simulation Analysts

    This tutorial explains the basics of linear regression models, especially low-order polynomials, and the corresponding statistical designs, namely designs of resolution III, IV, V, and Central Composite Designs (CCDs). This tutorial assumes 'white noise', which means that the residuals of the fitted linear regression model are normally, independently, and identically distributed with zero mean. The tutorial gathers statistical results that are scattered throughout the literature on mathematical statistics, and presents these results in a form that is understandable to simulation analysts. Keywords: metamodels; fractional factorial designs; Plackett-Burman designs; factor interactions; validation; cross-validation
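
    A minimal sketch of fitting a full second-order metamodel from a central composite design; the simulation response below is a hypothetical stand-in, not an example from the tutorial.

```python
import itertools
import numpy as np

def ccd(k, alpha=1.414):
    """Central composite design in k coded factors:
    2^k factorial points, 2k axial (star) points, and one center point."""
    factorial = np.array(list(itertools.product([-1, 1], repeat=k)), dtype=float)
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    center = np.zeros((1, k))
    return np.vstack([factorial, axial, center])

def simulate(x):
    """Hypothetical simulation response with curvature and an interaction."""
    return 10 + 3 * x[0] - 2 * x[1] + 1.5 * x[0] * x[1] + 0.8 * x[0] ** 2

X = ccd(2)
y = np.array([simulate(x) for x in X])

# Second-order model: 1, x1, x2, x1*x2, x1^2, x2^2, fitted by least squares.
M = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2])
beta, *_ = np.linalg.lstsq(M, y, rcond=None)
print(np.round(beta, 3))   # recovers [10, 3, -2, 1.5, 0.8, 0]
```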

    A scenario for sequential experimentation

    Statistical Methods

    Data Driven Surrogate Based Optimization in the Problem Solving Environment WBCSim

    Large scale, multidisciplinary engineering designs are always difficult due to the complexity and dimensionality of these problems. Direct coupling between the analysis codes and the optimization routines can be prohibitively time consuming due to the complexity of the underlying simulation codes. One way of tackling this problem is by constructing computationally cheap(er) approximations of the expensive simulations that mimic the behavior of the simulation model as closely as possible. This paper presents a data driven, surrogate based optimization algorithm that uses a trust region based sequential approximate optimization (SAO) framework and a statistical sampling approach based on design of experiment (DOE) arrays. The algorithm is implemented using techniques from two packages, SURFPACK and SHEPPACK, which provide a collection of approximation algorithms to build the surrogates, and three different DOE techniques (full factorial (FF), Latin hypercube sampling (LHS), and central composite design (CCD)) are used to train the surrogates. The results are compared with the optimization results obtained by directly coupling an optimizer with the simulation code. The biggest concern in using the SAO framework based on statistical sampling is the generation of the required database. As the number of design variables grows, the computational cost of generating the required database grows rapidly. A data driven approach is proposed to tackle this situation, where the trick is to run the expensive simulation if and only if a nearby data point does not exist in the cumulatively growing database. Over time the database matures and is enriched as more and more optimizations are performed. Results show that the proposed methodology dramatically reduces the total number of calls to the expensive simulation runs during the optimization process.
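
    A minimal sketch of the run-only-if-no-nearby-point idea; the distance threshold, simulation stub, and random-search driver are hypothetical, not the paper's implementation: before calling the expensive simulation, check the cumulative database for a point within a tolerance and reuse it instead.

```python
import numpy as np

class CachedSimulation:
    """Wraps an expensive simulation with a cumulative point database;
    the simulation runs only if no stored point lies within `tol`."""

    def __init__(self, simulate, tol=0.05):
        self.simulate = simulate
        self.tol = tol
        self.points, self.values = [], []
        self.calls = 0

    def __call__(self, x):
        x = np.asarray(x, dtype=float)
        if self.points:
            dists = np.linalg.norm(np.array(self.points) - x, axis=1)
            nearest = int(np.argmin(dists))
            if dists[nearest] <= self.tol:
                return self.values[nearest]        # reuse nearby database entry
        y = self.simulate(x)                       # expensive call
        self.calls += 1
        self.points.append(x)
        self.values.append(y)
        return y

# Hypothetical expensive simulation and a naive random-search optimizer.
expensive = lambda x: float(np.sum((x - 0.3) ** 2))
f = CachedSimulation(expensive, tol=0.05)
rng = np.random.default_rng(1)
best = min(f(rng.uniform(0, 1, size=2)) for _ in range(500))
print("best objective:", best, "actual simulation calls:", f.calls)
```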