25 research outputs found

    A method for augmenting supersaturated designs

    Initial screening experiments often leave some questions unresolved, so follow-up runs are needed to clarify the initial results. In this paper, a technique is developed for adding experimental runs to an initial supersaturated design. The added runs are generated with respect to the Bayesian D_s-optimality criterion, and the procedure can incorporate model information from the initial design. After the initial experiment is analyzed with several methods, factors are classified into three groups, primary, secondary, and potential, according to how many times they have been identified as active. The focus is on the secondary factors: they have been identified several times, but not often enough for experimenters to be sure they are active, and the proposed Bayesian D_s-optimal augmented design minimizes the error variances of their parameter estimators. In addition, a blocking factor is included to model the mean shift between the two stages. Simulation results show that the method performs very well in certain settings.
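    The following sketch illustrates the general idea of criterion-based augmentation described above; it is not the paper's algorithm. It greedily adds candidate runs that maximize a Bayesian D_s-type criterion focused on a user-supplied set of secondary factors, using a simple ridge term in place of a full prior specification and omitting the blocking factor for brevity. All names (bayes_ds_logvalue, augment, tau, secondary_idx) are illustrative assumptions.

```python
import numpy as np

def bayes_ds_logvalue(X, secondary_idx, tau=1.0):
    """Log Bayesian D_s-type value: information on the 'secondary' columns of X,
    with a diffuse-prior ridge (1/tau**2) so the supersaturated matrix is invertible."""
    p = X.shape[1]
    M = X.T @ X + np.eye(p) / tau**2                      # posterior information matrix
    nuisance = [j for j in range(p) if j not in secondary_idx]
    _, logdet_M = np.linalg.slogdet(M)
    _, logdet_N = np.linalg.slogdet(M[np.ix_(nuisance, nuisance)])
    return logdet_M - logdet_N                            # log of |M| / |M_nuisance|

def augment(X_init, candidates, secondary_idx, n_add, tau=1.0):
    """Greedily add n_add rows from 'candidates' to X_init, one run at a time."""
    X = X_init.copy()
    picks = []
    for _ in range(n_add):
        scores = [bayes_ds_logvalue(np.vstack([X, row]), secondary_idx, tau)
                  for row in candidates]
        best = int(np.argmax(scores))
        X = np.vstack([X, candidates[best]])
        picks.append(best)
    return X, picks

# Toy usage: an 8-run, 12-factor supersaturated design augmented with 4 follow-up runs
rng = np.random.default_rng(0)
X_init = rng.choice([-1.0, 1.0], size=(8, 12))
candidates = rng.choice([-1.0, 1.0], size=(100, 12))
X_aug, picks = augment(X_init, candidates, secondary_idx=[2, 5, 7], n_add=4)
```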

    A Comparison Study of Second-Order Screening Designs and Their Extension

    Recent literature has proposed employing a single experimental design capable of performing both factor screening and response surface estimation when conducting sequential experiments is unrealistic due to time, budget, or other constraints. Military systems, particularly aerodynamic systems, are complex, and it is not unusual for them to exhibit nonlinear response behavior. Developmental testing may be tasked to characterize the nonlinear behavior of such systems while being restricted in how much testing can be accomplished. Second-order screening designs provide a means, within a single designed experiment, to focus test resources on the factors driving system performance. Sponsored by the Office of the Secretary of Defense (OSD) in support of the Science of Test initiative, this research characterizes and adds to the area of second-order screening designs, particularly as applied to defense testing. Existing design methods are empirically tested and examined for robustness. The leading design method, one that is very run efficient, is extended to overcome its limitations when screening for nonlinear effects. A case study and screening design guidance for defense testers are also provided.

    Considerations for Screening Designs and Follow-Up Experimentation

    The success of screening experiments hinges on the effect sparsity assumption, which states that only a few of the factorial effects of interest actually have an impact on the system being investigated. Developing a screening methodology that exploits this assumption requires careful consideration of the strengths and weaknesses of a proposed experimental design, in addition to the ability of an analysis procedure to properly detect the major influences on the response. For the most part, however, screening designs and their complementing analysis procedures have been proposed separately in the literature, without clear consideration of their ability to perform as a single screening methodology. As a contribution to this growing area of research, this dissertation investigates the pairing of non-replicated and partially replicated two-level screening designs with model selection procedures that allow for the incorporation of a model-independent error estimate. Using simulation, we focus attention on the ability to identify active effects from a first-order model with two-factor interactions and on the possible benefits of using partial replication as part of an overall screening methodology. We begin with a focus on single-criterion optimal designs and propose a new criterion to create partially replicated screening designs. We then extend the newly proposed criterion into a multi-criterion framework in which estimation of the assumed model and protection against model misspecification are both considered. This is an important extension of the work, since initial knowledge of the system under investigation is assumed to be poor in the cases presented. A methodology to reduce a set of competing design choices is also investigated, using visual inspection of plots meant to represent uncertainty in design criterion preferences. Because screening methods typically involve sequential experimentation, we conclude with simulation results that incorporate a single follow-up phase of experimentation, extending the newly proposed criterion to create optimal partially replicated follow-up designs. We compare methodologies that incorporate knowledge gathered from the initial screening phase into the follow-up phase in different ways.
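    As a minimal sketch of the model-independent error estimate that partial replication provides (my illustration, not the dissertation's criterion), the function below pools within-replicate variation over duplicated design rows; the names pure_error, X, and y are assumptions.

```python
import numpy as np

def pure_error(X, y):
    """Pure-error mean square and its degrees of freedom, pooled over
    groups of duplicated rows of the design matrix X."""
    ss, df = 0.0, 0
    _, groups = np.unique(X, axis=0, return_inverse=True)   # group identical runs
    for g in np.unique(groups):
        yg = y[groups == g]
        if len(yg) > 1:                                      # only replicated runs contribute
            ss += np.sum((yg - yg.mean()) ** 2)
            df += len(yg) - 1
    return (ss / df if df > 0 else float("nan")), df

# Toy usage: a 6-run design whose last two rows replicate the first two
X = np.array([[1, 1], [-1, 1], [1, -1], [-1, -1], [1, 1], [-1, 1]], float)
y = np.array([5.1, 3.9, 4.2, 2.8, 5.4, 3.6])
mse_pe, df_pe = pure_error(X, y)
```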

    Design and Analysis of Screening Experiments Assuming Effect Sparsity

    Many initial experiments for industrial and engineering applications employ screening designs to determine which of possibly many factors are significant. These screening designs are usually highly fractionated factorials or Plackett-Burman designs that focus on main effects and provide limited information about interactions. To help simplify the analysis of these experiments, it is customary to assume that only a few of the effects are actually important; this assumption is known as ‘effect sparsity’. This dissertation explores both design and analysis aspects of screening experiments under effect sparsity. In 1989, Russell Lenth proposed a method for analyzing unreplicated factorials that has become popular due to its simplicity and satisfactory power relative to alternative methods. We propose and illustrate the use of p-values, estimated by simulation, for Lenth t-statistics. This approach is recommended for its versatility: whereas tabulated critical values are restricted to the case of uncorrelated estimates, we illustrate the use of p-values for both orthogonal and nonorthogonal designs. For cases where there is limited replication, we suggest computing t-statistics and p-values using an estimator that combines the pure error mean square with a modified Lenth’s pseudo standard error. Supersaturated designs (SSDs) are designs that examine more factors than there are runs available. SSDs were introduced to handle situations in which a large number of factors are of interest but runs are expensive or time-consuming. We begin by assessing the null-model performance of SSDs when using all-subsets and forward selection regression; the propensity of model selection criteria to overfit is highlighted. We subsequently propose a strategy for analyzing SSDs that combines all-subsets regression and permutation tests, and we illustrate the methods with several examples. In contrast to the usual sequential nature of response surface methods (RSM), recent literature has proposed conducting both screening and response surface exploration using only one three-level design; this approach is named “one-step RSM”. We discuss and illustrate two shortcomings of current one-step RSM designs and analyses. Subsequently, we propose a new class of three-level designs and an analysis strategy unique to these designs that address these shortcomings and help the user be appropriately advised as to factor importance. We illustrate the designs and analysis with simulated and real data.
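    A minimal sketch of the simulation-based p-values for Lenth t-statistics described above, assuming orthogonal effect estimates and using the constants from Lenth (1989); the dissertation's hybrid pure-error estimator and permutation procedures are not reproduced here, and the function names are illustrative.

```python
import numpy as np

def lenth_pse(effects):
    """Lenth's (1989) pseudo standard error for a vector of effect estimates."""
    s0 = 1.5 * np.median(np.abs(effects))
    trimmed = np.abs(effects)[np.abs(effects) < 2.5 * s0]
    return 1.5 * np.median(trimmed)

def simulated_p_values(effects, n_sim=10_000, seed=None):
    """Individual p-values for Lenth t-statistics, with the null reference
    distribution estimated by simulating all-inactive effect vectors."""
    effects = np.asarray(effects, float)
    m = len(effects)
    t_obs = np.abs(effects / lenth_pse(effects))
    rng = np.random.default_rng(seed)
    null_t = np.empty((n_sim, m))
    for i in range(n_sim):
        e = rng.standard_normal(m)          # scale is irrelevant: the t-ratio is scale-free
        null_t[i] = np.abs(e / lenth_pse(e))
    pooled = null_t.ravel()                 # statistics are exchangeable under the null
    return np.array([(pooled >= t).mean() for t in t_obs])

# Toy usage: 15 effect estimates from an unreplicated 16-run design, 2 of them active
effects = np.r_[8.0, -6.5, np.random.default_rng(1).normal(0, 1, 13)]
print(np.round(simulated_p_values(effects, seed=2), 4))
```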

    Augmenting Definitive Screening Designs

    Design of experiments is used to study the relationship between one or more response variables and several factors whose levels are varied. Response surface methodology (RSM) employs design of experiments techniques to decide whether changes in design variables can enhance or optimize a process. Response surface designs are usually analyzed by fitting a second-order polynomial model. Some standard and classical response surface designs are 3^k Factorial Designs, Central Composite Designs (CCDs), and Box-Behnken Designs (BBDs). They can all be used to fit a second-order polynomial model efficiently and allow for some testing of the model's lack of fit. When performing multiple experiments is not feasible due to time, budget, or other constraints, recent literature suggests using a single experimental design capable of performing both factor screening and response surface exploration. Definitive Screening Designs (DSDs) are well-known three-level experimental designs. They are also called second-order screening designs, and they can estimate a second-order model in any subset of three factors. However, when the design has more than three active factors, only the linear main effects and perhaps the largest second-order term can be identified by a DSD. DSDs may also have trouble identifying active pure quadratic effects when two-factor interactions are present. In this dissertation, we propose several methods for augmenting definitive screening designs to improve estimability and efficiency. Improved sensitivity and specificity are also highlighted.
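    For reference, a minimal sketch of the standard DSD construction (conference-matrix fold-over plus a center run, following Jones and Nachtsheim) that such augmentation starts from; the augmentation methods themselves are the dissertation's contribution and are not reproduced here.

```python
import numpy as np

# A symmetric 6x6 conference matrix (Paley construction from GF(5)).
C = np.array([[ 0,  1,  1,  1,  1,  1],
              [ 1,  0,  1, -1, -1,  1],
              [ 1,  1,  0,  1, -1, -1],
              [ 1, -1,  1,  0,  1, -1],
              [ 1, -1, -1,  1,  0,  1],
              [ 1,  1, -1, -1,  1,  0]])
assert np.allclose(C @ C.T, 5 * np.eye(6))   # defining property of a conference matrix

# 13-run DSD for 6 three-level factors: fold-over pairs plus an overall center run.
dsd = np.vstack([C, -C, np.zeros((1, 6))])

# Linear main effects are orthogonal to all pure quadratic effects in a DSD:
X_lin = dsd
X_quad = dsd ** 2
print(np.allclose(X_lin.T @ X_quad, 0))      # True
```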

    A User's Guide to the Brave New World of Designing Simulation Experiments

    Many simulation practitioners can get more from their analyses by using the statistical theory on design of experiments (DOE) developed specifically for exploring computer models. In this paper, we discuss a toolkit of designs for simulationists with limited DOE expertise who want to select a design and an appropriate analysis for their computational experiments. Furthermore, we provide a research agenda listing problems in the design of simulation experiments (as opposed to real-world experiments) that require more investigation. We consider three types of practical problems: (1) developing a basic understanding of a particular simulation model or system; (2) finding robust decisions or policies; and (3) comparing the merits of various decisions or policies. Our discussion emphasizes aspects that are typical for simulation, such as sequential data collection. Because the same problem type may be addressed through different design types, we discuss quality attributes of designs. Furthermore, the selection of the design type depends on the metamodel (response surface) that the analysts tentatively assume; for example, more complicated metamodels require more simulation runs. For the validation of the metamodel estimated from a specific design, we present several procedures.

    State-of-the-Art Review: A user's guide to the brave new world of designing simulation experiments

    Many simulation practitioners can get more from their analyses by using the statistical theory on design of experiments (DOE) developed specifically for exploring computer models. We discuss a toolkit of designs for simulationists with limited DOE expertise who want to select a design and an appropriate analysis for their experiments. Furthermore, we provide a research agenda listing problems in the design of simulation experiments, as opposed to real-world experiments, that require more investigation. We consider three types of practical problems: (1) developing a basic understanding of a particular simulation model or system, (2) finding robust decisions or policies as opposed to so-called optimal solutions, and (3) comparing the merits of various decisions or policies. Our discussion emphasizes aspects that are typical for simulation, such as having many more factors than in real-world experiments and the sequential nature of the data collection. Because the same problem type may be addressed through different design types, we discuss quality attributes of designs, such as the ease of design construction, the flexibility for analysis, and efficiency considerations. Moreover, the selection of the design type depends on the metamodel (response surface) that the analysts tentatively assume; for example, complicated metamodels require more simulation runs. We present several procedures to validate the metamodel estimated from a specific design, and we summarize a case study illustrating several of our major themes. We conclude with a discussion of areas that merit more work to achieve the potential benefits, either via new research or via incorporation into standard simulation or statistical packages.
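    A minimal sketch of one theme shared by these two papers, under my own illustrative assumptions: generate a space-filling (Latin hypercube) design, fit a tentative second-order polynomial metamodel to the simulation output, and validate it by leave-one-out cross-validation. The toy response function and all names are assumptions, not taken from the papers.

```python
import numpy as np

def latin_hypercube(n, k, rng=None):
    """Simple Latin hypercube sample of n points in [0, 1]^k."""
    rng = np.random.default_rng(rng)
    cols = [(rng.permutation(n) + rng.random(n)) / n for _ in range(k)]
    return np.column_stack(cols)

def quad_features(X):
    """Model matrix for a full second-order polynomial metamodel."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

def loo_errors(X, y):
    """Leave-one-out prediction errors of the quadratic metamodel."""
    errs = []
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        F = quad_features(X[keep])
        beta, *_ = np.linalg.lstsq(F, y[keep], rcond=None)
        errs.append((quad_features(X[i:i + 1]) @ beta)[0] - y[i])
    return np.array(errs)

# Toy usage: a cheap stand-in for a simulation model, evaluated on a 40-run LHS
rng = np.random.default_rng(1)
X = latin_hypercube(40, 5, rng)
y = 2 + 3 * X[:, 0] - 4 * X[:, 1] * X[:, 2] + X[:, 3] ** 2 + 0.1 * rng.standard_normal(40)
print("leave-one-out RMSE:", np.sqrt(np.mean(loo_errors(X, y) ** 2)))
```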

    Geostatistical integration of geophysical, well bore and outcrop data for flow modeling of a deltaic reservoir analogue

    Significant world oil and gas reserves occur in deltaic reservoirs. Characterization of deltaic reservoirs requires understanding sedimentary and diagenetic heterogeneity at the submeter scale in three dimensions. However, deltaic facies architecture is complex and poorly understood. Moreover, precipitation of extensive calcite cement during diagenesis can modify the depositional permeability of sandstone reservoirs and affect fluid flow. Heterogeneity contributes to trapping a significant portion of mobile oil in deltaic reservoirs analogous to the Cretaceous Frontier Formation, Powder River Basin, Wyoming. This dissertation focuses on 3D characterization of an ancient deltaic lobe. The Turonian Wall Creek Member in central Wyoming has been selected for the present study, which integrates digitized outcrop image analysis, 2D and 3D interpreted ground-penetrating radar (GPR) surveys, outcrop gamma ray measurements, well logs, permeameter logs and transects, and other data for 3D reservoir characterization and flow modeling. Well log data are used to predict the geological facies using a beta-Bayes method and classical multivariate statistical methods, and the predictions are compared with the outcrop description. Geostatistical models are constructed for the size, orientation, and shape of the calcite concretions using interpreted GPR, well, and outcrop data. The spatial continuity of concretions is quantified using photomosaic-derived variogram analysis. Relationships among GPR attributes, well data, and outcrop data are investigated, including calcite concretion occurrence and permeability measurements from outcrop. A combination of truncated Gaussian simulation and Bayes rule predicts 3D concretion distributions. Comparisons between 2D flow simulations based on outcrop observations and an ensemble of geostatistical models indicate that the proposed approach can reproduce essential aspects of flow behavior in this system. Experimental design, analysis of variance, and flow simulations are used to examine the effects of geological variability on breakthrough time, sweep efficiency, and upscaled permeability. The proposed geostatistical and statistical methods can improve prediction of flow behavior even when conditioning data are sparse and radar data are noisy. The derived geostatistical models of stratigraphy, facies, and diagenesis are appropriate for analogous deltaic reservoirs. Furthermore, the results can guide data acquisition, improve performance prediction, and help to upscale models.
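    As an illustration of the variogram analysis mentioned above (a generic sketch, not the dissertation's workflow), the function below computes an empirical semivariogram for scattered 2D measurements; the lag binning, names, and synthetic data are assumptions.

```python
import numpy as np

def empirical_variogram(coords, values, lags, tol):
    """gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs whose separation distance
    falls within tol of each lag h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)            # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = [sq[np.abs(d - h) <= tol].mean() if np.any(np.abs(d - h) <= tol) else np.nan
             for h in lags]
    return np.array(gamma)

# Toy usage: synthetic outcrop-transect data with spatial structure plus noise
rng = np.random.default_rng(0)
coords = rng.random((200, 2)) * 100.0                 # metres
values = np.sin(coords[:, 0] / 15.0) + 0.3 * rng.standard_normal(200)
print(empirical_variogram(coords, values, lags=np.arange(5, 50, 5), tol=2.5))
```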

    Experiment Planning for Protein Structure Elucidation and Site-Directed Protein Recombination

    In order to investigate protein structure and improve protein function most effectively, it is necessary to carefully plan appropriate experiments. The combinatorial number of possible experiment plans demands effective criteria and efficient algorithms to choose the one that is, in some sense, optimal. This thesis addresses experiment planning challenges in two significant applications. The first part of this thesis develops an integrated computational-experimental approach for rapid discrimination of predicted protein structure models by quantifying their consistency with relatively cheap and easy experiments (cross-linking, and site-directed mutagenesis followed by stability measurement). In order to obtain the most information from noisy and sparse experimental data, rigorous Bayesian frameworks have been developed to analyze the information content of the experiments. Efficient algorithms have been developed to choose the most informative, least expensive, and most robust experiments. The effectiveness of this approach has been demonstrated using existing experimental data as well as simulations, and it has been applied to discriminate predicted structure models of the pTfa chaperone protein from bacteriophage lambda. The second part of this thesis seeks to choose optimal breakpoint locations for protein engineering by site-directed recombination. In order to increase the chance of obtaining folded and functional hybrids in protein recombination, it is necessary to retain the evolutionary relationships among amino acids that determine protein stability and functionality. A probabilistic hypergraph model has been developed to represent these relationships, with edge weights representing their statistical significance as derived from a sequence database and the protein family. The effectiveness of this model has been validated by showing its ability to distinguish functional hybrids from non-functional ones in existing experimental data. Choosing optimal breakpoint locations that minimize the total perturbation to these relationships has been proved NP-hard in general, but exact and approximate algorithms have been developed for a number of important cases.
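    A minimal sketch of the general Bayesian experiment-selection idea described in the first part (choose the experiment with the greatest expected reduction in uncertainty over candidate models); it is not the thesis's algorithm, and the model priors, outcome likelihoods, and names are illustrative assumptions.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def expected_info_gain(prior, likelihood):
    """prior: (n_models,); likelihood: (n_outcomes, n_models) with
    likelihood[o, m] = P(outcome o | model m) for one candidate experiment."""
    p_outcome = likelihood @ prior                    # marginal probability of each outcome
    gain = entropy(prior)
    for o, po in enumerate(p_outcome):
        if po > 0:
            posterior = likelihood[o] * prior / po    # Bayes rule
            gain -= po * entropy(posterior)
    return gain

def best_experiment(prior, experiments):
    """experiments: dict name -> likelihood matrix; returns the most informative one."""
    return max(experiments, key=lambda e: expected_info_gain(prior, experiments[e]))

# Toy usage: three candidate structure models, two possible cross-linking experiments
prior = np.array([0.5, 0.3, 0.2])
experiments = {
    "crosslink_A": np.array([[0.9, 0.2, 0.5],        # P(link observed | model)
                             [0.1, 0.8, 0.5]]),      # P(no link observed | model)
    "crosslink_B": np.array([[0.6, 0.5, 0.4],
                             [0.4, 0.5, 0.6]]),
}
print(best_experiment(prior, experiments))
```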