
    Simulation-Optimization via Kriging and Bootstrapping: A Survey (Revision of CentER DP 2011-064)

    Abstract: This article surveys optimization of simulated systems. The simulation may be either deterministic or random. The survey reflects the author’s extensive experience with simulation-optimization through Kriging (or Gaussian process) metamodels. The analysis of these metamodels may use parametric bootstrapping for deterministic simulation or distribution-free bootstrapping (or resampling) for random simulation. The survey covers: (1) Simulation-optimization through "efficient global optimization" (EGO) using "expected improvement" (EI); this EI uses the Kriging predictor variance, which can be estimated through parametric bootstrapping accounting for estimation of the Kriging parameters. (2) Optimization with constraints for multiple random simulation outputs and deterministic inputs, through mathematical programming applied to Kriging metamodels validated through distribution-free bootstrapping. (3) Taguchian robust optimization for uncertain environments, using mathematical programming (applied to Kriging metamodels) and distribution-free bootstrapping to estimate the variability of the Kriging metamodels and the resulting robust solution. (4) Bootstrapping for improving convexity or preserving monotonicity of the Kriging metamodel.
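The EGO step described in item (1) can be sketched in miniature. Everything here is an illustrative assumption rather than the survey's implementation: a toy one-dimensional deterministic simulator, a simple-Kriging predictor with a fixed Gaussian kernel parameter `theta` and unit process variance, and the standard closed-form EI criterion.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting (adequate for tiny systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def kriging_predict(X, y, xstar, theta=10.0, nugget=1e-8):
    """Simple-Kriging mean and standard deviation at xstar (unit process variance)."""
    n = len(X)
    K = [[math.exp(-theta * (X[i] - X[j]) ** 2) + (nugget if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    k = [math.exp(-theta * (xstar - X[i]) ** 2) for i in range(n)]
    w = solve(K, y)   # K^{-1} y
    v = solve(K, k)   # K^{-1} k
    mu = sum(k[i] * w[i] for i in range(n))
    s2 = max(0.0, 1.0 + nugget - sum(k[i] * v[i] for i in range(n)))
    return mu, math.sqrt(s2)

def expected_improvement(mu, s, fmin):
    """EI for minimization: (fmin - mu) * Phi(z) + s * phi(z)."""
    if s < 1e-12:
        return 0.0
    z = (fmin - mu) / s
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2)))
    phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    return (fmin - mu) * Phi + s * phi

# One EGO iteration on a toy deterministic simulator.
f = lambda x: (x - 0.3) ** 2
X = [0.0, 0.5, 1.0]
y = [f(x) for x in X]
grid = [i / 200 for i in range(201)]
xnext = max(grid, key=lambda x: expected_improvement(*kriging_predict(X, y, x), min(y)))
print(round(xnext, 2))  # next design point: where EI balances low mean and high variance
```

In a full EGO loop this proposal would be simulated, appended to the design, and the Kriging model refitted; the survey's point (1) is that the predictor variance `s` feeding EI can itself be bootstrapped to account for estimated Kriging parameters, which this fixed-`theta` sketch deliberately omits.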

    Metamodel variability analysis combining bootstrapping and validation techniques

    Research on metamodel-based optimization has received considerable and growing interest in recent years, and has found successful applications in solving computationally expensive problems. The joint use of computer simulation experiments and metamodels introduces a source of uncertainty that we refer to as metamodel variability. To analyze and quantify this variability, we apply bootstrapping to residuals derived as prediction errors computed from cross-validation. The proposed method can be used with different types of metamodels, especially when only limited knowledge of the parameters’ distributions is available or when only a limited computational budget is allowed. Our preliminary experiments, based on the robust version of the EOQ model, show encouraging results.
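The idea of bootstrapping cross-validation residuals can be sketched as follows. The noisy linear "simulator", the linear metamodel, and the leave-one-out scheme are all assumptions made for illustration; the paper's method applies to arbitrary metamodel types.

```python
import random

def fit_line(X, y):
    """Least-squares fit of y = a + b * x (the stand-in metamodel)."""
    n = len(X)
    mx, my = sum(X) / n, sum(y) / n
    b = sum((X[i] - mx) * (y[i] - my) for i in range(n)) / sum((x - mx) ** 2 for x in X)
    return my - b * mx, b

def loo_residuals(X, y):
    """Leave-one-out cross-validation: refit without point i, record its prediction error."""
    res = []
    for i in range(len(X)):
        a, b = fit_line(X[:i] + X[i + 1:], y[:i] + y[i + 1:])
        res.append(y[i] - (a + b * X[i]))
    return res

random.seed(0)
X = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
y = [2 * x + 1 + random.gauss(0, 0.1) for x in X]   # noisy 'simulation' output

a, b = fit_line(X, y)
res = loo_residuals(X, y)

# Bootstrap the CV residuals onto the prediction to quantify metamodel variability.
xstar, B = 0.5, 1000
boot = sorted(a + b * xstar + random.choice(res) for _ in range(B))
lo, hi = boot[int(0.025 * B)], boot[int(0.975 * B)]
print(f"prediction {a + b * xstar:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```

The attraction of this scheme is exactly what the abstract claims: it needs no distributional assumptions about the residuals and no extra simulator runs beyond those already used to fit the metamodel.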

    Mechanical MNIST: A benchmark dataset for mechanical metamodels

    Metamodels, or models of models, map defined model inputs to defined model outputs. Typically, metamodels are constructed by generating a dataset through sampling a direct model and training a machine learning algorithm to predict a limited number of model outputs from varying model inputs. When metamodels are constructed to be computationally cheap, they are an invaluable tool for applications ranging from topology optimization, to uncertainty quantification, to multi-scale simulation. By nature, a given metamodel will be tailored to a specific dataset. However, the most pragmatic metamodel type and structure will often be general to larger classes of problems. At present, the most pragmatic metamodel selection for dealing with mechanical data has not been thoroughly explored. Drawing inspiration from the benchmark datasets available to the computer vision research community, we introduce a benchmark dataset (Mechanical MNIST) for constructing metamodels of heterogeneous material undergoing large deformation. We then show examples of how our benchmark dataset can be used, and establish baseline metamodel performance. Because our dataset is readily available, it will enable the direct quantitative comparison between different metamodeling approaches in a pragmatic manner. We anticipate that it will enable the broader community of researchers to develop improved metamodeling techniques for mechanical data that will surpass the baseline performance that we show here.
    Accepted manuscript
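The sample-then-train pipeline the abstract describes can be sketched end to end. The "direct model" below is a hypothetical smooth function standing in for an expensive mechanics simulation, and the k-nearest-neighbour surrogate is just one simple baseline choice, not the paper's recommended metamodel:

```python
import math
import random

def direct_model(x1, x2):
    """Hypothetical stand-in for an expensive direct simulation."""
    return math.sin(3 * x1) * math.cos(2 * x2)

# Step 1: generate a dataset by sampling the direct model.
random.seed(1)
train = [(random.random(), random.random()) for _ in range(200)]
data = [(p, direct_model(*p)) for p in train]

# Step 2: a cheap metamodel - here a k-nearest-neighbour average.
def metamodel(x1, x2, k=5):
    near = sorted(data, key=lambda d: (d[0][0] - x1) ** 2 + (d[0][1] - x2) ** 2)[:k]
    return sum(v for _, v in near) / k

# Step 3: establish baseline performance on held-out points.
test = [(random.random(), random.random()) for _ in range(100)]
mae = sum(abs(metamodel(*p) - direct_model(*p)) for p in test) / len(test)
print(f"baseline MAE: {mae:.3f}")
```

A benchmark dataset like Mechanical MNIST fixes steps 1 and 3 (the sampled inputs, outputs, and evaluation protocol) so that competing choices for step 2 can be compared on equal footing.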

    Sensitivity analysis of expensive black-box systems using metamodeling

    Simulations are becoming ever more common as a tool for designing complex products. Sensitivity analysis techniques can be applied to these simulations to gain insight, or to reduce the complexity of the problem at hand. However, these simulators are often expensive to evaluate, and sensitivity analysis typically requires a large number of evaluations. Metamodeling has been successfully applied in the past to reduce the number of required evaluations for design tasks such as optimization and design space exploration. In this paper, we propose a novel sensitivity analysis algorithm for variance and derivative based indices using sequential sampling and metamodeling. Several stopping criteria are proposed and investigated to keep the total number of evaluations minimal. The results show that both variance and derivative based techniques can be accurately computed with a minimal number of evaluations using fast metamodels and FLOLA-Voronoi or density sequential sampling algorithms.
    Comment: proceedings of the Winter Simulation Conference 201
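Why a metamodel helps here: variance-based indices are Monte Carlo estimates needing thousands of evaluations, which become affordable once they run on a cheap surrogate instead of the simulator. A minimal sketch, assuming a hypothetical already-fitted surrogate and the standard pick-freeze estimator for first-order Sobol indices (the paper's own algorithm adds sequential sampling and stopping criteria on top):

```python
import random

def cheap_metamodel(x):
    """Hypothetical fitted surrogate: strong dependence on x[0], weak on x[1]."""
    return 4.0 * x[0] + 0.5 * x[1]

random.seed(2)
N, d = 5000, 2
A = [[random.random() for _ in range(d)] for _ in range(N)]
B = [[random.random() for _ in range(d)] for _ in range(N)]

yA = [cheap_metamodel(x) for x in A]
mean = sum(yA) / N
var = sum((v - mean) ** 2 for v in yA) / N

def first_order(i):
    """Pick-freeze estimate of S_i: keep column i from A, take the rest from B."""
    yABi = [cheap_metamodel([A[n][j] if j == i else B[n][j] for j in range(d)])
            for n in range(N)]
    cov = sum(yA[n] * yABi[n] for n in range(N)) / N - mean * (sum(yABi) / N)
    return cov / var

S = [first_order(i) for i in range(d)]
print([round(s, 2) for s in S])  # x[0] dominates the output variance
```

The 3 * N metamodel calls here cost essentially nothing; running the same estimator directly on an expensive simulator would be 15,000 solver runs.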

    A metamodel based optimisation algorithm for metal forming processes

    Cost saving and product improvement have always been important goals in the metal forming industry. To achieve these goals, metal forming processes need to be optimised. During the last decades, simulation software based on the Finite Element Method (FEM) has significantly contributed to designing feasible processes more easily. More recently, the possibility of coupling FEM to mathematical optimisation algorithms is offering a very promising opportunity to design optimal metal forming processes instead of only feasible ones. However, which optimisation algorithm to use is still not clear. In this paper, an optimisation algorithm based on metamodelling techniques is proposed for optimising metal forming processes. The algorithm incorporates nonlinear FEM simulations which can be very time consuming to execute. As an illustration of its capabilities, the proposed algorithm is applied to optimise the internal pressure and axial feeding load paths of a hydroforming process. The product formed by the optimised process outperforms products produced by other, arbitrarily selected load paths. These results indicate the high potential of the proposed algorithm for optimising metal forming processes using time-consuming FEM simulations.
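The core pattern of metamodel-based optimisation is: run a few expensive simulations, fit a cheap analytic surrogate, and optimise the surrogate instead. A minimal sketch, where the quadratic `simulate` is a hypothetical stand-in for one time-consuming FEM run and the quadratic fit is one simple surrogate choice, not the paper's algorithm:

```python
def simulate(x):
    """Hypothetical stand-in for one expensive FEM run (x: a load-path parameter)."""
    return (x - 0.7) ** 2 + 0.1

# A few initial designs, each requiring one 'FEM' evaluation.
X = [0.0, 0.5, 1.0]
y = [simulate(x) for x in X]

def quad_coeffs(X, y):
    """Exact quadratic c0 + c1*x + c2*x^2 through three points (Newton form)."""
    (x0, x1, x2), (y0, y1, y2) = X, y
    c2 = ((y2 - y0) / (x2 - x0) - (y1 - y0) / (x1 - x0)) / (x2 - x1)
    c1 = (y1 - y0) / (x1 - x0) - c2 * (x0 + x1)
    c0 = y0 - c1 * x0 - c2 * x0 ** 2
    return c0, c1, c2

c0, c1, c2 = quad_coeffs(X, y)
xstar = -c1 / (2 * c2)   # minimiser of the surrogate, found analytically
print(round(xstar, 3), round(simulate(xstar), 3))   # → 0.7 0.1
```

Only four simulator runs occur in total (three for fitting, one to verify the proposal); a direct grid or gradient search on the FEM model would need far more. Real algorithms iterate this loop, refitting the metamodel around each proposed optimum.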

    Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy

    The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is done only in regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (> 2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
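The difference between deterministic and robust optimization can be shown on a toy response. Everything here is an illustrative assumption (the response function, the Gaussian noise model, and a mean-plus-two-sigma robustness criterion), not the paper's methodology, which works on metamodels of FE simulations:

```python
import random
import statistics

def process(x, z):
    """Hypothetical forming response: design parameter x, noise variable z."""
    return (x - 0.5) ** 2 + x * z

random.seed(3)
noise = [random.gauss(0.0, 0.2) for _ in range(500)]   # e.g. material scatter

def robust_objective(x):
    """Penalise both a poor mean response and a noise-sensitive one."""
    ys = [process(x, z) for z in noise]
    return statistics.mean(ys) + 2 * statistics.stdev(ys)

grid = [i / 100 for i in range(101)]
x_det = min(grid, key=lambda x: statistics.mean(process(x, z) for z in noise))
x_rob = min(grid, key=robust_objective)
print(x_det, x_rob)  # the robust optimum backs away from the noise-sensitive region
```

Here the deterministic optimum sits where the mean response is lowest, but sensitivity to `z` grows with `x`, so the robust optimum trades a slightly worse mean for a much smaller spread. That trade-off is what an explicit robustness criterion buys.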

    Coordination of Coupled Black Box Simulations in the Construction of Metamodels

    This paper introduces methods to coordinate black box simulations in the construction of metamodels for situations in which we have to deal with coupled black boxes. We define three coordination methods: parallel simulation, sequential simulation and sequential modeling. To compare these three methods we focus on five aspects: throughput time, flexibility, simulated product designs, coordination complexity and the use of prior information. Special attention is given to the throughput time aspect. For this aspect we derive mathematical formulas and we give relations between the throughput times of the three coordination methods. At the end of this paper we summarize the results and give recommendations on the choice of a suitable coordination method.
    Keywords: simulation; simulation models; coordination; black box; metamodels
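The throughput-time comparison can be illustrated with a deliberately simplified model (the per-run times, run counts, and the assumption that sequential coordination makes one box wait for the other each round are all hypothetical, not the paper's derived formulas):

```python
# Hypothetical per-run times (hours) for two coupled black-box simulators.
t1, t2 = 3.0, 5.0
n = 10   # simulation runs needed from each box to build the metamodels

parallel = n * max(t1, t2)       # boxes run simultaneously: slower box dictates pace
sequential = n * (t1 + t2)       # box 2 waits on box 1's output every round
print(parallel, sequential)      # → 50.0 80.0
```

Even this crude model shows why throughput time deserves the special attention the abstract gives it: the coordination method alone changes total time by the ratio (t1 + t2) / max(t1, t2), independent of the boxes themselves.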