Distributed multilevel optimization for complex structures
Optimization problems concerning complex structures with many design variables may entail an unacceptable computational cost. This cost can be reduced considerably with a multilevel approach: a structure consisting of several components is optimized as a whole (globally) as well as on the component level. In this paper, an optimization method is discussed with applications in assessing the impact of new design considerations during the development of a structure. A strategy based on fully stressed design is applied to optimization problems in linear statics. A global model is used to calculate the interactions (e.g., loads) for each of the components. These components are then optimized using the prescribed interactions, followed by a new global calculation to update the interactions. Mixed discrete and continuous design variables as well as different design configurations are possible. An application of this strategy is presented in the form of the full optimization of a vertical tail plane center box of a generic large passenger aircraft. In linear dynamics, the parametrization of the component interactions is problematic due to its frequency dependence. Hence, a modified method is presented in which the speed of component mode synthesis is exploited to avoid this parametrization. This method is applied to a simple test case that originates from noise control.
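The global/component alternation described above can be sketched in miniature. The toy Python example below uses entirely hypothetical numbers and a statically determinate truss (so the "global" analysis returns the same member forces every cycle); it is not the paper's model, only an illustration of one fully-stressed-design pass: the global step supplies member loads, each component is resized so its stress hits the allowable, and the loop repeats until all stresses converge.

```python
import numpy as np

# Toy sketch of the global/component alternation (hypothetical numbers, not
# the paper's model). The truss is statically determinate, so the "global"
# analysis returns the same member forces every cycle.
member_forces = np.array([120e3, -80e3, 45e3])  # N, from the global analysis
sigma_allow = 250e6                             # Pa, allowable stress
areas = np.full(3, 1e-3)                        # m^2, initial component sizing

for cycle in range(10):
    forces = member_forces                      # "global" step: update loads
    stresses = forces / areas                   # component stress check
    areas = areas * np.abs(stresses) / sigma_allow  # fully stressed resizing
    if np.allclose(np.abs(member_forces / areas), sigma_allow):
        break                                   # every member fully stressed

print(areas)  # converged areas |F_i| / sigma_allow
```

In a statically indeterminate structure the forces would genuinely change with the sizing, which is why the repeated global recalculation in the paper's strategy matters.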
Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models
This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (as pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
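The group-splitting logic of sequential bifurcation can be illustrated with a small sketch. This is not the chapter's implementation: it assumes a deterministic, additive simulation model with known-sign (nonnegative) effects — the standard sequential bifurcation assumption — and simply prunes any group whose aggregate effect falls below a threshold, recursing on the halves of groups that pass.

```python
import numpy as np

K = 100
beta = np.zeros(K)
beta[[7, 31, 64]] = [5.0, 3.0, 4.0]   # only three truly important factors

def simulate(x):
    # stand-in for an expensive simulation: additive, known-sign effects
    return float(beta @ x)

def group_effect(group):
    x_hi = np.zeros(K)
    x_hi[group] = 1.0                 # whole group at its high level
    return simulate(x_hi) - simulate(np.zeros(K))

def bifurcate(group, threshold=1.0):
    if group_effect(group) <= threshold:
        return []                     # aggregate effect negligible: prune group
    if len(group) == 1:
        return list(group)            # isolated an important factor
    mid = len(group) // 2             # split and recurse on both halves
    return bifurcate(group[:mid], threshold) + bifurcate(group[mid:], threshold)

found = bifurcate(list(range(K)))
print(found)  # -> [7, 31, 64]
```

A real sequential bifurcation implementation reuses simulation outputs across steps and handles noisy responses with replication; the sketch only shows why far fewer runs than factors can suffice when importance is sparse.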
Augmenting Definitive Screening Designs
Design of experiments is used to study the relationship between one or more response variables and several factors whose levels are varied. Response surface methodology (RSM) employs design-of-experiments techniques to decide whether changes in design variables can enhance or optimize a process; the resulting designs are usually analyzed by fitting a second-order polynomial model. Some standard and classical response surface designs are Factorial Designs, Central Composite Designs (CCDs), and Box-Behnken Designs (BBDs). They can all be used to fit a second-order polynomial model efficiently and allow for some testing of the model's lack of fit. When performing multiple experiments is not feasible due to time, budget, or other constraints, recent literature suggests using a single experimental design capable of performing both factor screening and response surface exploration. Definitive Screening Designs (DSDs) are well-known three-level experimental designs. They are also named second-order screening designs, and they can estimate a second-order model in any subset of three factors. However, when the design has more than three active factors, only the linear main effects and perhaps the largest second-order term can be identified by a DSD. Also, DSDs may have trouble identifying active pure quadratic effects when two-factor interactions are present. In this dissertation, we propose several methods for augmenting definitive screening designs to improve estimability and efficiency. Improved sensitivity and specificity are also highlighted.
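For reference, a DSD is commonly built from a conference matrix C (zero diagonal, ±1 off-diagonal, mutually orthogonal rows): the runs are the rows of C, its fold-over −C, and one center run. The sketch below — the standard construction, not the augmentation methods this dissertation proposes — builds the 13-run design for six three-level factors and checks its defining properties.

```python
import numpy as np

# A conference matrix of order 6: zero diagonal, +/-1 elsewhere, C C^T = 5 I.
C = np.array([
    [ 0,  1,  1,  1,  1,  1],
    [ 1,  0,  1, -1, -1,  1],
    [ 1,  1,  0,  1, -1, -1],
    [ 1, -1,  1,  0,  1, -1],
    [ 1, -1, -1,  1,  0,  1],
    [ 1,  1, -1, -1,  1,  0],
])
assert np.array_equal(C @ C.T, 5 * np.eye(6))

# Fold-over pairs plus one center run give a 13-run definitive
# screening design for 6 three-level factors.
D = np.vstack([C, -C, np.zeros((1, 6))])

print(D.shape)                      # (13, 6)
print((D[:12] == 0).sum(axis=1))    # each non-center run has one factor at 0
print(D.T @ D)                      # 10*I: main-effect columns are orthogonal
```

The mirror-image pairs are what decouple main effects from all second-order terms, which is why a DSD can screen and (for up to three active factors) fit a full second-order model in one experiment.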
Speeding up neighborhood search in local Gaussian process prediction
Recent implementations of local approximate Gaussian process models have
pushed computational boundaries for non-linear, non-parametric prediction
problems, particularly when deployed as emulators for computer experiments.
Their flavor of spatially independent computation accommodates massive
parallelization, meaning that they can handle designs two or more orders of
magnitude larger than previously. However, accomplishing that feat can still
require massive supercomputing resources. Here we aim to ease that burden. We
study how predictive variance is reduced as local designs are built up for
prediction. We then observe how the exhaustive and discrete nature of an
important search subroutine involved in building such local designs may be
overly conservative. Rather, we suggest that searching the space radially,
i.e., continuously along rays emanating from the predictive location of
interest, is a far thriftier alternative. Our empirical work demonstrates that
ray-based search yields predictors with accuracy comparable to exhaustive
search, but in a fraction of the time, bringing a supercomputer implementation
back onto the desktop.
Comment: 24 pages, 5 figures, 4 tables
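The contrast between exhaustive and ray-based candidate search can be sketched under strong simplifications: a fixed squared-exponential kernel, greedy predictive-variance minimization at a single location, and ray points snapped to the nearest candidate. All names and settings below are illustrative assumptions, not the paper's laGP implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def k(A, B, ls=0.2):
    # squared-exponential correlation, fixed lengthscale (an assumption)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ls ** 2))

def pred_var(X, x_star):
    # GP predictive variance at x_star given local design X
    K = k(X, X) + 1e-8 * np.eye(len(X))
    ks = k(X, x_star[None, :])[:, 0]
    return 1.0 - ks @ np.linalg.solve(K, ks)

x_star = np.array([0.5, 0.5])                 # predictive location of interest
cand = rng.uniform(0, 1, size=(2000, 2))      # locations with simulator runs

def snap(p):
    # only existing run locations may enter the local design
    return cand[np.argmin(np.linalg.norm(cand - p, axis=1))]

def greedy_exhaustive(n=8):
    X = snap(x_star)[None, :]
    for _ in range(n - 1):                    # scan all 2000 candidates
        scores = [pred_var(np.vstack([X, c]), x_star) for c in cand]
        X = np.vstack([X, cand[int(np.argmin(scores))]])
    return X

def greedy_rays(n=8, n_rays=6, n_t=15):
    X = snap(x_star)[None, :]
    for _ in range(n - 1):                    # search only along a few rays
        best_c, best_v = None, np.inf
        for a in np.linspace(0.0, np.pi, n_rays, endpoint=False):
            d = np.array([np.cos(a), np.sin(a)])
            for t in np.linspace(-0.4, 0.4, n_t):
                c = snap(x_star + t * d)      # snap ray point to a candidate
                v = pred_var(np.vstack([X, c]), x_star)
                if v < best_v:
                    best_c, best_v = c, v
        X = np.vstack([X, best_c])
    return X

v_exh = pred_var(greedy_exhaustive(), x_star)
v_ray = pred_var(greedy_rays(), x_star)
print(v_exh, v_ray)  # ~90 criterion evaluations per step instead of 2000
```

Both searches drive the predictive variance down sharply; the ray-based version touches only about 90 candidates per step instead of 2000, which is the source of the speedup the abstract describes.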
Regression Models and Experimental Designs: A Tutorial for Simulation Analysts
This tutorial explains the basics of linear regression models, especially low-order polynomials, and the corresponding statistical designs, namely designs of resolution III, IV, V, and Central Composite Designs (CCDs). This tutorial assumes 'white noise', which means that the residuals of the fitted linear regression model are normally, independently, and identically distributed with zero mean. The tutorial gathers statistical results that are scattered throughout the literature on mathematical statistics, and presents these results in a form that is understandable to simulation analysts.
Keywords: metamodels; fractional factorial designs; Plackett-Burman designs; factor interactions; validation; cross-validation
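A minimal instance of the resolution III designs the tutorial covers: a 2^(3-1) fractional factorial with generator C = AB, whose main effect of C is therefore aliased with the A*B interaction, fitted with a first-order polynomial metamodel by least squares. The response values are made-up numbers for illustration.

```python
import numpy as np

# 2^(3-1) resolution III fractional factorial: generator C = AB,
# so the main effect of C is aliased with the A*B interaction.
A = np.array([-1, 1, -1, 1])
B = np.array([-1, -1, 1, 1])
C = A * B
X = np.column_stack([np.ones(4), A, B, C])

print(X.T @ X)                      # 4*I: all effect columns are orthogonal

# least-squares fit of the first-order metamodel y = b0 + bA*A + bB*B + bC*C
y = np.array([3.0, 7.0, 4.0, 9.0])  # hypothetical simulation outputs
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)                         # b0, bA, bB, bC = 5.75, 2.25, 0.75, 0.25
```

Because the design is orthogonal, each coefficient is simply the matching column's inner product with y divided by the number of runs — the kind of scattered result the tutorial collects for simulation analysts.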