
    Aerodynamic shape optimization of a low drag fairing for small livestock trailers

    Small livestock trailers are commonly used to transport animals from farms to market within the United Kingdom. Due to the bluff nature of these vehicles, there is great potential for reducing drag with a simple add-on fairing. This paper explores the feasibility of combining high-fidelity aerodynamic analysis, accurate metamodeling, and efficient optimization techniques to find an optimum fairing geometry that reduces drag without significantly impairing internal ventilation. Airflow simulations were carried out using Computational Fluid Dynamics (CFD) to assess the performance of each fairing based on three design variables. A Moving Least Squares (MLS) metamodel was built on a fifty-point Optimal Latin Hypercube (OLH) Design of Experiments (DoE), where each point represented a different geometry configuration. Traditional optimization techniques were applied to the metamodel until an optimum geometrical configuration was found. This optimum design was tested using CFD and matched the metamodel prediction closely. The drag reduction was measured at 14.4% for the trailer alone and 6.6% for the combined truck and trailer.
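As a miniature illustration of that workflow (sample a DoE, fit an MLS metamodel, optimize on the cheap metamodel, confirm the candidate optimum), here is a hedged one-dimensional sketch. The drag curve, the single design variable, and all numerical choices below are invented for illustration; the paper used three design variables and actual CFD runs.

```python
import numpy as np

def drag(x):
    """Hypothetical drag response vs. a single fairing parameter in [0, 1]
    (a toy stand-in for a CFD evaluation)."""
    return 0.45 - 0.3 * np.exp(-((x - 0.6) ** 2) / 0.05)

def mls_predict(x, X, y, h=0.15):
    """Moving Least Squares: a weighted quadratic fit centred on the query
    point; Gaussian weights shrink the influence of distant DoE points."""
    w = np.exp(-(((X - x) / h) ** 2))
    A = np.vstack([np.ones_like(X), X, X ** 2]).T
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    return np.array([1.0, x, x ** 2]) @ coef

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 50)            # stand-in for the 50-point OLH DoE
y = drag(X)                              # one "CFD run" per DoE point

# "Traditional optimization" on the metamodel: a dense grid search here
grid = np.linspace(0.0, 1.0, 501)
preds = np.array([mls_predict(g, X, y) for g in grid])
x_opt = grid[int(np.argmin(preds))]      # candidate optimum, to be re-checked
```

In the paper the analogous final step is a confirmation CFD run at `x_opt`, which validated the metamodel prediction.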

    Metamodel variability analysis combining bootstrapping and validation techniques

    Research on metamodel-based optimization has received growing interest in recent years and has found successful applications in solving computationally expensive problems. The joint use of computer simulation experiments and metamodels introduces a source of uncertainty that we refer to as metamodel variability. To analyze and quantify this variability, we apply bootstrapping to residuals derived as prediction errors computed from cross-validation. The proposed method can be used with different types of metamodels, especially when limited knowledge of the parameters' distribution is available or when only a limited computational budget is allowed. Our preliminary experiments, based on the robust version of the EOQ model, show encouraging results.
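A toy version of that procedure (compute leave-one-out cross-validation residuals, then bootstrap them to obtain a variability band for a metamodel prediction) might look as follows. The simulator, the polynomial metamodel, and all constants are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulator(x):
    """Stand-in for an expensive computer experiment (toy function)."""
    return np.exp(-x) + 0.3 * x

X = np.linspace(0.0, 2.0, 12)            # small design of experiments
y = simulator(X)

# Leave-one-out cross-validation residuals (the prediction errors)
loo_res = np.empty(len(X))
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    coef = np.polyfit(X[mask], y[mask], 3)   # cheap polynomial metamodel
    loo_res[i] = y[i] - np.polyval(coef, X[i])

# Bootstrap those residuals to quantify metamodel variability at a new point
coef_full = np.polyfit(X, y, 3)
x_new, B = 1.3, 2000
boot = np.polyval(coef_full, x_new) + rng.choice(loo_res, size=B, replace=True)
lo, hi = np.percentile(boot, [2.5, 97.5])    # 95% variability band
```

Because only residuals are resampled, no extra runs of the expensive simulator are needed, which matches the limited-budget setting the abstract describes.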

    Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy

    The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (> 2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
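The sequential improvement idea (fit a cheap metamodel, locate its predicted optimum, run the expensive model only there, and refit) can be sketched as follows. The objective function and the quartic surrogate are stand-ins chosen for illustration, not the paper's FE model or metamodel, and the robustness measure is omitted.

```python
import numpy as np

def expensive(x):
    """Toy stand-in for an FE evaluation of the (robust) objective."""
    return (x - 0.7) ** 2 + 0.1 * np.sin(8.0 * x)

X = list(np.linspace(0.0, 1.0, 5))        # small initial DoE
y = [expensive(v) for v in X]
grid = np.linspace(0.0, 1.0, 401)

for _ in range(6):                        # sequential improvement iterations
    coef = np.polyfit(X, y, 4)            # cheap surrogate of the objective
    x_star = grid[int(np.argmin(np.polyval(coef, grid)))]
    X.append(float(x_star))               # infill only at the region of interest
    y.append(expensive(x_star))           # one extra "simulation" per iteration

x_best = X[int(np.argmin(y))]
f_best = min(y)
```

The point is the budget profile: accuracy is purchased only near the predicted optimum, one expensive run per iteration, rather than by densifying the whole design space.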

    Automatically Discovering Hidden Transformation Chaining Constraints

    Model transformations operate on models conforming to precisely defined metamodels. Consequently, it often seems relatively easy to chain them: the output of a transformation may be given as input to a second one if the metamodels match. However, this simple rule has some obvious limitations. For instance, a transformation may only use a subset of a metamodel. Therefore, chaining transformations appropriately requires more information. We present here an approach that automatically discovers more detailed information about actual chaining constraints by statically analyzing transformations. The objective is to provide developers who decide to chain transformations with more data on which to base their choices. This approach has been successfully applied to the case of a library of endogenous transformations. They all have the same source and target metamodel but have some hidden chaining constraints. In such a case, the simple metamodel matching rule given above does not provide any useful information.
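One way to picture the kind of constraint such a static analysis could expose: if each transformation's read/write "footprint" over metamodel element types is known, chaining can be checked against footprints rather than against whole metamodels. The transformation names, element types, and footprint format below are hypothetical, not taken from the paper.

```python
# Hypothetical footprints a static analysis could extract from each
# transformation: which metamodel element types it reads and which it produces.
FOOTPRINTS = {
    "flattenHierarchy": {"reads": {"Package", "Class"},   "writes": {"Class"}},
    "addAccessors":     {"reads": {"Class", "Attribute"}, "writes": {"Class", "Operation"}},
    "inlineOperations": {"reads": {"Operation", "Class"}, "writes": {"Class"}},
}

def can_chain(t1: str, t2: str) -> bool:
    """t2 may follow t1 only if every element type t2 reads is actually
    produced by t1 -- a stricter check than mere metamodel matching."""
    return FOOTPRINTS[t2]["reads"] <= FOOTPRINTS[t1]["writes"]
```

Note that all three transformations here share the same metamodel, so the naive "metamodels match" rule would accept every pairing; the footprint check rejects some of them.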

    A trajectory-based sampling strategy for sequentially refined metamodel management of metamodel-based dynamic optimization in mechatronics

    Dynamic optimization problems based on computationally expensive models that embody the dynamics of a mechatronic system can result in prohibitively long optimization runs. When facing optimization problems with static models, reduction in the computational time, and thus convergence, can be attained by means of a metamodel placed within a metamodel management scheme. This paper proposes a metamodel management scheme with a dedicated sampling strategy for computationally demanding dynamic models in a dynamic optimization context. The dedicated sampling strategy makes it possible to attain dynamically feasible solutions, with the metamodel locally refined during the optimization process until a feasibility-based stopping condition is satisfied. The samples are distributed along the iterate trajectories of the sequential direct dynamic optimization procedure. The algorithmic implementation of the trajectory-based metamodel management is detailed and applied to two case studies involving dynamic optimization problems. These numerical experiments illustrate the benefits of the presented scheme and its sampling strategy for the convergence properties. It is shown that the solution of the dynamic optimization problem can be accelerated when the metamodel's evaluation cost is lower than 90% of that of the computationally expensive model.

    Reliability-based design optimization of shells with uncertain geometry using adaptive Kriging metamodels

    Optimal design under uncertainty has gained much attention in the past ten years due to the ever-increasing need for manufacturers to build robust systems at the lowest cost. Reliability-based design optimization (RBDO) allows the analyst to minimize some cost function while ensuring some minimal performances, cast as admissible failure probabilities for a set of performance functions. In order to address real-world engineering problems in which the performance is assessed through computational models (e.g., finite element models in structural mechanics), metamodeling techniques have been developed in the past decade. This paper introduces adaptive Kriging surrogate models to solve the RBDO problem. The latter is cast in an augmented space that "sums up" the range of the design space and the aleatory uncertainty in the design parameters and the environmental conditions. The surrogate model is used (i) for evaluating robust estimates of the failure probabilities (and for enhancing the computational experimental design by adaptive sampling) in order to achieve the requested accuracy and (ii) for applying a gradient-based optimization algorithm to get optimal values of the design parameters. The approach is applied to the optimal design of ring-stiffened cylindrical shells used in submarine engineering under uncertain geometric imperfections. For this application the performance of the structure is related to buckling, which is addressed here by means of a finite element solution based on the asymptotic numerical method.
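The adaptive-Kriging ingredient (enrich a small experimental design at points that are close to the limit state and poorly predicted, then estimate the failure probability on the surrogate) can be sketched with a minimal one-dimensional Gaussian-process model. The performance function, the kernel, and the enrichment budget are illustrative assumptions; the paper's augmented-space RBDO loop and FE buckling analysis are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

def g(x):
    """Toy performance function (failure when g <= 0)."""
    return 2.5 - x

def krige(X, y, Xq, ell=1.0, nugget=1e-6):
    """Minimal Kriging (GP) predictor with a squared-exponential kernel."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    L = np.linalg.cholesky(k(X, X) + nugget * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Kq = k(Xq, X)
    mu = Kq @ alpha
    v = np.linalg.solve(L, Kq.T)
    var = np.clip(1.0 - np.sum(v ** 2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

mc = rng.normal(size=20_000)               # Monte Carlo population, X ~ N(0,1)
X = np.array([-2.0, 0.0, 2.0, 4.0])        # small initial experimental design
y = g(X)

for _ in range(5):                         # adaptive enrichment of the design
    mu, sd = krige(X, y, mc)
    U = np.abs(mu) / sd                    # small U: near limit state, uncertain
    i = int(np.argmin(U))
    X, y = np.append(X, mc[i]), np.append(y, g(mc[i]))

mu, _ = krige(X, y, mc)
pf = float(np.mean(mu < 0.0))              # failure probability on the surrogate
```

New expensive-model runs are spent only where the sign of the performance function is in doubt, which is what makes the approach affordable inside an optimization loop. (For this toy case the exact value is P(X > 2.5) ≈ 6.2e-3.)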

    Metamodel-based importance sampling for structural reliability analysis

    Structural reliability methods aim at computing the probability of failure of systems with respect to some prescribed performance functions. In modern engineering such functions usually resort to running an expensive-to-evaluate computational model (e.g. a finite element model). In this respect, simulation methods, which may require 10^3 to 10^6 runs, cannot be used directly. Surrogate models such as quadratic response surfaces, polynomial chaos expansions or kriging (which are built from a limited number of runs of the original model) are then introduced as a substitute for the original model to cope with the computational cost. In practice, though, it is almost impossible to quantify the error made by this substitution. In this paper we propose to use a kriging surrogate of the performance function as a means to build a quasi-optimal importance sampling density. The probability of failure is eventually obtained as the product of an augmented probability, computed by substituting the metamodel for the original performance function, and a correction term which ensures that there is no bias in the estimation even if the metamodel is not fully accurate. The approach is applied to analytical and finite element reliability problems and proves efficient up to 100 random variables. (Comment: 20 pages, 7 figures, 2 tables. Preprint submitted to Probabilistic Engineering Mechanics.)
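The two-factor estimator described above (an augmented probability computed from the surrogate, times a bias-removing correction term) can be demonstrated on a toy problem. The "surrogate" below is a deliberately biased analytical stand-in for a kriging model, so that the effect of the correction is visible; everything except the product structure P_f = P_f^eps * alpha is an illustrative assumption.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(3)
Phi = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / 2.0 ** 0.5)))  # normal CDF

def g(x):
    """True performance function; failure when g(x) <= 0."""
    return 3.0 - x

# Hypothetical kriging surrogate of g: biased mean, constant prediction sd
mu = lambda x: 2.9 - x
sd = 0.2
pi = lambda x: Phi(-mu(x) / sd)     # pointwise failure prob. from the surrogate

xs = rng.normal(size=200_000)       # X ~ N(0,1)
pi_xs = pi(xs)

# (1) augmented probability, computed on the surrogate only
pf_eps = pi_xs.mean()

# (2) draw from the quasi-optimal IS density h(x) ∝ pi(x) f(x) by rejection
xh = xs[rng.uniform(size=xs.size) < pi_xs]

# (3) correction term alpha = E_h[1{g <= 0} / pi]; true-model runs on xh only
alpha = float(np.mean((g(xh) <= 0.0) / pi(xh)))

pf = pf_eps * alpha                 # unbiased despite the surrogate's bias
# exact value: P(X >= 3) = 1 - Phi(3) ≈ 1.35e-3
```

The expensive model is evaluated only on the few hundred accepted importance samples `xh`, not on the full Monte Carlo population, which is where the claimed efficiency comes from.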