
    Recent Developments in Nonregular Fractional Factorial Designs

    Nonregular fractional factorial designs such as Plackett-Burman designs and other orthogonal arrays are widely used in screening experiments for their run size economy and flexibility. The traditional analysis focuses on main effects only. Hamada and Wu (1992) went beyond the traditional approach and proposed an analysis strategy to demonstrate that some interactions could be entertained and estimated beyond a few significant main effects. Their groundbreaking work stimulated many of the recent developments in the creation of design criteria and in the construction and analysis of nonregular designs. This paper reviews important developments in optimality criteria and their comparison, including projection properties, generalized resolution, various generalized minimum aberration criteria, optimality results, construction methods, and analysis strategies for nonregular designs. Comment: Submitted to Statistics Surveys (http://www.i-journals.org/ss/) by the Institute of Mathematical Statistics (http://www.imstat.org).
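    To make two of the reviewed criteria concrete, the sketch below (my own illustration, not code from the paper) builds the 12-run Plackett-Burman design from its standard cyclic generator and computes its three-factor J-characteristics, from which the Deng-Tang generalized resolution follows; the function names are arbitrary.

```python
from itertools import combinations
import numpy as np

def plackett_burman_12():
    """12-run Plackett-Burman design: cycle the standard generator, then add a row of -1s."""
    gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    rows = [np.roll(gen, k) for k in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.vstack(rows)                      # 12 runs x 11 two-level columns

def j_characteristics(D, k):
    """|sum over runs of the product of k design columns|, for every k-subset."""
    return {S: abs(D[:, list(S)].prod(axis=1).sum())
            for S in combinations(range(D.shape[1]), k)}

D = plackett_burman_12()
n = D.shape[0]
J3 = j_characteristics(D, 3)                    # all J2 values are 0 (strength-2 orthogonal array)
gen_res = 3 + 1 - max(J3.values()) / n          # generalized resolution (Deng & Tang)
print(max(J3.values()), round(gen_res, 2))      # expect 4 and 3.67
```

    For this design every three-column J-characteristic equals 4, giving generalized resolution 3.67, which quantifies the partial (rather than complete) aliasing that distinguishes nonregular designs.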

    Regularities in the Augmentation of Fractional Factorial Designs

    Two-level factorial experiments are widely used in experimental design because they are simple to construct and interpret while also being efficient. However, full factorial designs for many factors quickly become time consuming or expensive, so fractional factorial designs are often preferable: they provide information on the effects of interest in far fewer experimental runs. The disadvantage is that, with fewer runs, some information about the effects of interest is lost. Although there are methods for selecting fractional designs so that the number of runs is minimized while the amount of information provided is maximized, the design must sometimes be augmented with a follow-up experiment to resolve ambiguities. A fractional factorial design augmented with an optimal follow-up design allows many factors to be studied with only a small number of additional experimental runs, compared to the full factorial design, without a loss of information about the effects of interest. This thesis investigates regularities in the number of follow-up runs needed to estimate all aliased effects in the model of interest for 4-, 5-, 6-, and 7-factor resolution III and IV fractional factorial experiments. The research determined that for all of the resolution IV designs, four or fewer (typically three) follow-up runs estimate all of the aliased effects in the model of interest, whereas all of the resolution III designs required seven or eight follow-up runs. D-optimal follow-up experiments were significantly better with respect to run size economy than fold-over and semifoldover designs for (i) resolution IV designs and (ii) designs with larger run sizes.
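    As a sketch of the kind of augmentation compared above (a generic greedy D-optimal search under assumed settings, not the thesis procedure), the code below picks follow-up runs from the full 2^k candidate set so that a model containing the previously aliased two-factor interactions becomes estimable.

```python
from itertools import combinations, product
import numpy as np

def model_matrix(runs, k):
    """Intercept, main effects, and all two-factor interactions."""
    R = np.asarray(runs, dtype=float)
    cols = [np.ones(len(R))] + [R[:, j] for j in range(k)]
    cols += [R[:, i] * R[:, j] for i, j in combinations(range(k), 2)]
    return np.column_stack(cols)

def d_optimal_augment(initial_runs, k, n_add):
    """Greedily add runs from the full 2^k candidate set to maximize det(X'X)."""
    candidates = list(map(tuple, product([-1, 1], repeat=k)))
    design = [tuple(r) for r in initial_runs]
    for _ in range(n_add):
        def score(extra):
            X = model_matrix(design + [extra], k)
            M = X.T @ X + 1e-6 * np.eye(X.shape[1])   # small ridge keeps the criterion
            return np.linalg.slogdet(M)[1]            # finite while X'X is still singular
        design.append(max(candidates, key=score))
    return np.array(design)

# Example: augment the 8-run 2^(4-1) fraction with I = +ABCD by three follow-up runs.
base = [r for r in product([-1, 1], repeat=4) if np.prod(r) == 1]
augmented = d_optimal_augment(base, k=4, n_add=3)
print(augmented[len(base):])                          # the chosen follow-up runs
```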

    UTILIZING DESIGN STRUCTURE FOR IMPROVING DESIGN SELECTION AND ANALYSIS

    Recent work has shown that design structure plays a role in the simplicity or complexity of data analysis. To extend research in these areas, this dissertation utilizes design structure to improve design selection and analysis. Minimal dependent sets and block diagonal structure are both important concepts related to the orthogonality of the columns of a design. We are interested in ways to improve data analysis, especially active effect detection, by exploiting minimal dependent sets and the block diagonal structure of a design. We introduce a new classification criterion based on minimal dependent sets that enhances existing criteria for design selection. The block diagonal structure of certain nonregular designs is also discussed as a means of improving model selection. In addition, block diagonal structure and the concept of parallel flats are used to construct three-quarter nonregular designs. Because the literature shows that simulation studies are effective for shedding light on the success or failure of a proposed statistical method, simulation studies were used in this dissertation to evaluate the efficacy of the proposed methods. The simulation results show that minimal dependent sets can be used as a design selection criterion and that block diagonal structure can help produce an effective model selection procedure. In addition, we found a strategy for constructing three-quarter nonregular designs that depends on the orthogonality of the design columns. The results indicate that the structure of a design has an impact on data analysis and design selection. On this basis, it is recommended that analysts consider the structure of the design as a key factor in improving the analysis. Further research is needed to identify additional concepts related to design structure that could help to improve data analysis.
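    The notion of a minimal dependent set can be illustrated directly (this is my own sketch, not the dissertation's code): a subset of model-matrix columns that is linearly dependent while every proper subset is independent. For a regular fraction the minimal dependent sets are just the familiar alias pairs; for nonregular designs they typically involve more columns.

```python
from itertools import combinations, product
import numpy as np

def minimal_dependent_sets(X, labels, max_size=4):
    """Column subsets that are linearly dependent while every proper subset is independent."""
    found = []
    for k in range(2, max_size + 1):
        for S in combinations(range(X.shape[1]), k):
            if any(set(T) <= set(S) for T in found):
                continue                              # keep only minimal sets
            if np.linalg.matrix_rank(X[:, list(S)]) < k:
                found.append(S)
    return [tuple(labels[i] for i in S) for S in found]

# Example: regular 2^(4-1) fraction with D = ABC; columns are the four main
# effects plus all two-factor interactions.
runs = np.array([(a, b, c, a * b * c) for a, b, c in product([-1, 1], repeat=3)])
names = ["A", "B", "C", "D"]
cols, labels = list(runs.T), names[:]
for i, j in combinations(range(4), 2):
    cols.append(runs[:, i] * runs[:, j])
    labels.append(names[i] + names[j])
X = np.column_stack(cols)
print(minimal_dependent_sets(X, labels))   # [('AB', 'CD'), ('AC', 'BD'), ('AD', 'BC')]
```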

    A Comparison Study of Second-Order Screening Designs and Their Extension

    Recent literature has proposed employing a single experimental design capable of performing both factor screening and response surface estimation when conducting sequential experiments is unrealistic due to time, budget, or other constraints. Military systems, particularly aerodynamic systems, are complex, and it is not unusual for them to exhibit nonlinear response behavior. Developmental testing may be tasked to characterize the nonlinear behavior of such systems while being restricted in how much testing can be accomplished. Second-order screening designs provide a means, in a single designed experiment, to focus test resources on the factors driving system performance. Sponsored by the Office of the Secretary of Defense (OSD) in support of the Science of Test initiative, this research characterizes and adds to the area of second-order screening designs, particularly as applied to defense testing. Existing design methods are empirically tested and examined for robustness. The leading design method, one that is very run efficient, is extended to overcome limitations when screening for nonlinear effects. A case study and screening design guidance for defense testers are also provided.
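    One routine check in this setting, sketched below under assumed inputs (the design here is a random three-level placeholder, not one of the designs studied), is whether a candidate second-order screening design still supports the full quadratic model after projection onto the handful of factors declared active.

```python
from itertools import combinations
import numpy as np

def quadratic_model_matrix(D):
    """Intercept, linear, two-factor-interaction, and pure quadratic columns."""
    D = np.asarray(D, dtype=float)
    n, k = D.shape
    cols = [np.ones(n)] + [D[:, j] for j in range(k)]
    cols += [D[:, i] * D[:, j] for i, j in combinations(range(k), 2)]
    cols += [D[:, j] ** 2 for j in range(k)]
    return np.column_stack(cols)

def supports_second_order(D, active):
    """True when the full quadratic model in the projected factors is estimable."""
    X = quadratic_model_matrix(np.asarray(D)[:, active])
    return np.linalg.matrix_rank(X) == X.shape[1]

# Placeholder three-level design standing in for a real second-order screening design.
rng = np.random.default_rng(1)
D = rng.choice([-1, 0, 1], size=(13, 6))
print(supports_second_order(D, active=[0, 2, 4]))
```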

    Design and Analysis of Screening Experiments Assuming Effect Sparsity

    Many initial experiments for industrial and engineering applications employ screening designs to determine which of possibly many factors are significant. These screening designs are usually highly fractionated factorials or Plackett-Burman designs that focus on main effects and provide limited information about interactions. To help simplify the analysis of these experiments, it is customary to assume that only a few of the effects are actually important; this assumption is known as ‘effect sparsity’. This dissertation explores both design and analysis aspects of screening experiments assuming effect sparsity. In 1989, Russell Lenth proposed a method for analyzing unreplicated factorials that has become popular due to its simplicity and satisfactory power relative to alternative methods. We propose and illustrate the use of p-values, estimated by simulation, for Lenth t-statistics. This approach is recommended for its versatility: whereas tabulated critical values are restricted to the case of uncorrelated estimates, we illustrate the use of p-values for both orthogonal and nonorthogonal designs. For cases with limited replication, we suggest computing t-statistics and p-values using an estimator that combines the pure error mean square with a modified Lenth’s pseudo standard error. Supersaturated designs (SSDs) are designs that examine more factors than available runs. SSDs were introduced to handle situations in which a large number of factors are of interest but runs are expensive or time-consuming. We begin by assessing the null model performance of SSDs when using all-subsets and forward selection regression, highlighting the propensity of model selection criteria to overfit. We then propose a strategy for analyzing SSDs that combines all-subsets regression and permutation tests, and we illustrate the methods with several examples. In contrast to the usual sequential nature of response surface methods (RSM), recent literature has proposed conducting both screening and response surface exploration using only one three-level design, an approach named “one-step RSM”. We discuss and illustrate two shortcomings of current one-step RSM designs and analyses. We then propose a new class of three-level designs, and an analysis strategy unique to these designs, that addresses these shortcomings and helps the user be appropriately advised as to factor importance. We illustrate the designs and analysis with simulated and real data.
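    The simulation-based calibration mentioned above can be sketched as follows, using the standard definitions from Lenth (1989) rather than the dissertation's own code: compute the pseudo standard error, form t-statistics, and estimate individual p-values from simulated null effect vectors of the same length.

```python
import numpy as np

def lenth_pse(effects):
    """Lenth's pseudo standard error of a vector of effect estimates."""
    e = np.abs(np.asarray(effects, dtype=float))
    s0 = 1.5 * np.median(e)
    return 1.5 * np.median(e[e < 2.5 * s0])

def lenth_p_values(effects, n_sim=20_000, seed=0):
    """Individual p-values for Lenth t-statistics, calibrated by simulating
    null effect vectors of the same length (the statistic is scale-free)."""
    rng = np.random.default_rng(seed)
    effects = np.asarray(effects, dtype=float)
    t_obs = np.abs(effects) / lenth_pse(effects)
    null = rng.standard_normal((n_sim, len(effects)))
    t_null = np.abs(null[:, 0]) / np.apply_along_axis(lenth_pse, 1, null)
    return np.array([(t_null >= t).mean() for t in t_obs])

# Example: 15 effect estimates from a hypothetical unreplicated 2^4 experiment.
effects = [21.6, 1.2, -0.8, 9.9, 0.3, -1.4, 0.7, 0.2,
           -0.6, 1.1, -0.9, 0.4, 0.5, -0.3, 0.8]
print(np.round(lenth_p_values(effects), 4))   # small p-values flag the active effects
```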

    Analysis of No-Confounding Designs using the Dantzig Selector

    No-confounding (NC) designs in 16 runs for 6, 7, and 8 factors are non-regular fractional factorial designs that have been suggested as attractive alternatives to the regular minimum aberration resolution IV designs because they do not completely confound any two-factor interactions with each other. These designs allow potential estimation of main effects and a few two-factor interactions without the need for follow-up experimentation. Analysis methods for non-regular designs are an area of ongoing research, because standard variable selection techniques such as stepwise regression may not always be the best approach. The current work investigates the use of the Dantzig selector for analyzing no-confounding designs. Through a series of examples, it shows that this technique is very effective for identifying the set of active factors in no-confounding designs when there are three or four active main effects and up to two active two-factor interactions. To evaluate the performance of the Dantzig selector, a simulation study was conducted, and the results, based on the percentage of type II errors, are analyzed. Another alternative six-factor NC design, called the Alternate No-Confounding design in six factors, is also introduced in this study, and its performance is evaluated using the Dantzig selector as the analysis method. Lastly, a section is dedicated to comparing the performance of the NC-6 and Alternate NC-6 designs. (Masters thesis, Industrial Engineering.)
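    For reference, the Dantzig selector itself (Candes and Tao's formulation, not the thesis's implementation) is the linear program: minimize ||beta||_1 subject to ||X'(y - X beta)||_inf <= delta. The sketch below solves it with scipy; the 16-run design matrix and the tuning constant delta are stand-ins chosen for illustration, not an actual NC design.

```python
from itertools import combinations
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, delta):
    """min ||beta||_1  subject to  ||X'(y - X beta)||_inf <= delta, written as an LP."""
    p = X.shape[1]
    XtX, Xty = X.T @ X, X.T @ y
    c = np.ones(2 * p)                          # beta = u - v with u, v >= 0
    A_ub = np.vstack([np.hstack([ XtX, -XtX]),  #  X'X beta <= delta + X'y
                      np.hstack([-XtX,  XtX])]) # -X'X beta <= delta - X'y
    b_ub = np.concatenate([delta + Xty, delta - Xty])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * p), method="highs")
    return res.x[:p] - res.x[p:]

# Example with a stand-in 16-run, 6-factor +/-1 matrix and three truly active effects.
rng = np.random.default_rng(7)
D = rng.choice([-1, 1], size=(16, 6))
X = np.column_stack([D] + [D[:, i] * D[:, j] for i, j in combinations(range(6), 2)])
beta_true = np.zeros(X.shape[1])
beta_true[[0, 3, 6]] = [8.0, -5.0, 3.0]
y = X @ beta_true + rng.normal(scale=1.0, size=16)
print(np.round(dantzig_selector(X, y, delta=2 * np.sqrt(16)), 2))  # large entries = candidate active effects
```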

    Combining adaptive and designed statistical experimentation: process improvement, data classification, experimental optimization and model building

    Thesis (Sc. D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2009. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Includes bibliographical references. Research interest in the use of adaptive experimentation has returned recently. This historic technique adapts and learns from each experimental run but requires quick runs and large effects. The renewed interest aims to improve experimental response and is supported by fast, deterministic computer experiments and better post-experiment data analysis. The unifying concept of this thesis is to present and evaluate new ways of using adaptive experimentation combined with traditional statistical experiments. The first application uses an adaptive experiment as a preliminary step to a more traditional experimental design. This provides experimental redundancy as well as greater model robustness. The number of extra runs is minimal because some runs are common, and yet both methods provide estimates of the best setting. The second use of adaptive experimentation is in evolutionary operation. During regular system operation, small, nearly unnoticeable variable changes can be used to improve production dynamically. If these small changes follow an adaptive procedure, there is a high likelihood of improvement and of integration into the larger process development. Outside the experimentation framework, the adaptive procedure is shown to combine with other procedures and yield benefit; two examples used here are an unconstrained numerical optimization procedure and classification parameter selection. The final area of new application is to create models that combine an adaptive experiment with a traditional statistical experiment. Two distinct areas are examined: first, the use of the adaptive experiment to determine the covariance structure, and second, the direct incorporation of both data sets in an augmented model. Both of these applications are Bayesian, with a heavy reliance on numerical computation and simulation to determine the combined model. The two experiments investigated could be performed on the same physical or analytical model but are also extended to situations with different fidelity models. The potential for including non-analytical, even human, models is also discussed. The evaluative portion of this thesis begins with an analytic foundation that outlines the usefulness as well as the limitations of the procedure. This is followed by a demonstration using a simulated model; finally, specific examples are drawn from the literature and reworked using the method. The utility of the final result is to provide a foundation for integrating adaptive experimentation with traditional designed experiments. Giving industrial practitioners a solid background and a demonstrated foundation should help to codify this integration. The final procedures represent a minimal departure from current practice but offer significant modeling and analysis improvements. by Chad Ryan Foster. Sc.D.

    Advanced Statistical Tools for Six Sigma and other Industrial Applications

    Six Sigma is a methodological approach and philosophy for quality improvement in operations management; its main objectives are identifying and removing the causes of defects and minimizing variability in manufacturing and business processes. To do so, Six Sigma combines managerial and statistical tools with the creation of a dedicated organizational structure. Over this doctoral thesis and the three years of study and research, our purpose has been to advance the potential applications of the methodology and its tools, with specific attention to the issues and challenges that typically prevent the realization of the financial and operational gains that a company pursues in applying the Six Sigma approach. Small and medium sized enterprises (SMEs), for instance, very often run into such issues because of structural and infrastructural constraints. The overall application of the methodology in SMEs was the focus of the initial research effort and was studied with a case study approach. On this basis, most of our research then turned to the rigorous methodological advancement of specific statistical tools for Six Sigma and, in a broader sense, for other industrial applications. Specifically, the core contribution of this doctoral thesis lies in the development of managerial and statistical tools for the Six Sigma toolbox. Our work ranges from a decision making tool, which integrates a response latency measure with a well-known procedure for prioritizing alternatives; to experimental design tools covering both planning and analysis strategies for screening experiments; to, finally, an initial effort to explore and develop a research agenda based on issues related to conjoint analysis and discrete choice experiments.

    Tailoring the Statistical Experimental Design Process for LVC Experiments

    Live, Virtual, and Constructive (LVC) simulation environments are increasingly being examined for potential analytical use, particularly in test and evaluation. LVC simulation environments provide a mechanism for conducting joint mission testing and system-of-systems testing when scale and resource limitations prevent the accumulation of the density and diversity of assets required for these complex and comprehensive tests. The statistical experimental design process is re-examined for potential application to LVC experiments, and several additional considerations are identified to augment the experimental design process for use with LVC. This augmented statistical experimental design process is demonstrated by a case study involving a series of tests on an experimental data link for strike aircraft, using LVC simulation as the test environment. The goal of these tests is to assess the usefulness of information presented to aircrew members via different datalink capabilities. The statistical experimental design process is used to structure the experiment, leading to the discovery of faulty assumptions and planning mistakes that could potentially invalidate the results of the experiment. Lastly, an aggressive sequential experimentation strategy is presented for LVC experiments when test resources are limited. This strategy depends on a foldover algorithm that we developed for nearly orthogonal arrays to rescue LVC experiments when important factor effects are confounded.
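    The basic foldover idea behind such a rescue can be sketched as follows (this is the textbook mechanism, not the authors' algorithm for nearly orthogonal arrays): append the original runs with signs reversed on a chosen subset of columns, picking the subset that most reduces the worst pairwise correlation among the effects of interest in the combined design.

```python
from itertools import combinations, product
import numpy as np

def fold(D, cols):
    """Append the original runs with signs reversed on the chosen columns."""
    F = D.copy()
    F[:, cols] *= -1
    return np.vstack([D, F])

def best_foldover(D, max_fold=2):
    """Pick the fold set minimizing the worst |correlation| among main-effect
    and two-factor-interaction columns of the combined design."""
    def worst_corr(design):
        k = design.shape[1]
        cols = [design[:, j] for j in range(k)]
        cols += [design[:, i] * design[:, j] for i, j in combinations(range(k), 2)]
        C = np.corrcoef(np.column_stack(cols), rowvar=False)
        return np.max(np.abs(C - np.eye(len(C))))
    choices = [list(S) for r in range(1, max_fold + 1)
               for S in combinations(range(D.shape[1]), r)]
    return min(choices, key=lambda S: worst_corr(fold(D, S)))

# Sanity check on a regular 2^(4-1) fraction (D = ABC), whose 2fis are fully aliased:
runs = np.array([(a, b, c, a * b * c) for a, b, c in product([-1, 1], repeat=3)])
print(best_foldover(runs))   # prints [0]: folding any single column recovers the full 2^4 here
```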

    Computer Aided Product Design and Development for Peroxide Based Disinfectants

    Disinfectants are antimicrobial chemicals that are commonly used in health care facilities to prevent or reduce the spread of pathogenic microorganisms. These products are under national regulation for the claims they make and have to be tested for their microbial activity against different microorganisms. They also have to be tested for product stability, corrosion, and toxicity. These tests, especially the microbial efficacy tests, are very expensive and take a long time to perform (anywhere from two days to four months). Disinfectant formulations have to balance microbial activity, corrosivity, and safety: the more active ingredients in the formulation, the stronger the product, but the higher the corrosivity and toxicity. Therefore, it is desirable to use concentrations of ingredients as low as possible while still achieving acceptable antimicrobial activity. The final product also has to be chemically and physically stable for at least one year. Consequently, the product development process takes at least six months and sometimes up to two years, and its cost can reach hundreds of thousands of dollars. The objective of this project was to design a systematic way to take advantage of historical data, augment them with a small number of experimental trials, perform a regression analysis using the best available methods such as least squares or neural networks, invert the models, and finally use optimization techniques to develop new products in the shortest possible time. The formulation predicted by this approach will be much closer to the final formulation, resulting in significant reductions in the time and cost of the product development process. Furthermore, the model can be updated with newly generated data to improve its predictive capability. Lastly, the disinfectant formulation can be viewed as a case study for the broader problem of formulation product design, and the approach can be applied in similar cases where the formulation of a new product must satisfy several competing criteria, such as adhesives, pharmaceutical drugs, agricultural pesticides, and detergents.
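    The fit-invert-optimize workflow described above can be sketched in a few lines; everything below (ingredient names, data, thresholds, and the choice of least squares) is invented for illustration and is not the project's model.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Stand-in "historical data": peroxide and surfactant levels (% w/w) vs. measured responses.
X_hist = rng.uniform([0.5, 0.1], [8.0, 2.0], size=(40, 2))
log_kill = 0.8 * X_hist[:, 0] + 1.5 * X_hist[:, 1] + rng.normal(0, 0.3, 40)
corrosion = 0.4 * X_hist[:, 0] + 0.2 * X_hist[:, 1] + rng.normal(0, 0.1, 40)

def fit_linear(X, y):
    """Ordinary least squares with an intercept; returns a prediction function."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda x: coef[0] + coef[1:] @ np.asarray(x)

predict_kill = fit_linear(X_hist, log_kill)
predict_corr = fit_linear(X_hist, corrosion)

# "Invert" the fitted models: find the lowest total ingredient load that still meets
# a hypothetical efficacy target and corrosion ceiling.
res = minimize(
    fun=lambda x: x.sum(),
    x0=np.array([4.0, 1.0]),
    bounds=[(0.5, 8.0), (0.1, 2.0)],
    constraints=[{"type": "ineq", "fun": lambda x: predict_kill(x) - 5.0},
                 {"type": "ineq", "fun": lambda x: 2.0 - predict_corr(x)}],
    method="SLSQP",
)
print(np.round(res.x, 2))   # candidate low-concentration formulation to verify in the lab
```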