
    Approximation Assisted Multiobjective and Collaborative Robust Optimization under Interval Uncertainty

    Optimization of engineering systems under uncertainty often involves problems that have multiple objectives, constraints and subsystems. The main goal in these problems is to obtain solutions that are optimum and relatively insensitive to uncertainty. Such solutions are called robust optimum solutions. Two classes of such problems are considered in this dissertation. The first class involves Multi-Objective Robust Optimization (MORO) problems under interval uncertainty. In this class, an entire system optimization problem, which has multiple nonlinear objectives and constraints, is solved by a multiobjective optimizer at one level while the robustness of trial alternatives generated by the optimizer is evaluated at the other level. This bi-level (or nested) MORO approach can become computationally prohibitive as the size of the problem grows. To address this difficulty, a new and improved MORO approach under interval uncertainty is developed. Unlike previously reported bi-level MORO methods, the improved MORO performs robustness evaluation only for optimum solutions and uses this information to iteratively shrink the feasible domain and locate robust optimum solutions. Compared to the previous bi-level approach, the improved MORO significantly reduces the number of function calls needed to arrive at the solutions. To further reduce the computational cost, the improved MORO is combined with an online approximation approach. This new approach is called Approximation-Assisted MORO, or AA-MORO. The second class involves Multiobjective collaborative Robust Optimization (McRO) problems. In this class, an entire system optimization problem is decomposed hierarchically, along user-defined domain-specific boundaries, into a system optimization problem and several subsystem optimization subproblems. The dissertation presents a new Approximation-Assisted McRO (AA-McRO) approach under interval uncertainty. AA-McRO uses a single-objective optimization problem to coordinate all system and subsystem optimization problems in a Collaborative Optimization (CO) framework. The approach converts the consistency constraints of CO into penalty terms which are integrated into the subsystem objective functions. In this way, AA-McRO is able to explore the design space and obtain optimum design solutions more efficiently than a previously reported McRO. Both the AA-MORO and AA-McRO approaches are demonstrated with a variety of numerical and engineering optimization examples. It is found that the solutions from both approaches compare well with those from previously reported approaches while requiring significantly less computational cost. Finally, AA-MORO has been used in the development of a decision support system for a refinery case study in order to facilitate the integration of engineering and business decisions using an agent-based approach.
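    The lower level of the bi-level formulation can be pictured with a short sketch of a robustness check under interval uncertainty. This is a simplified illustration rather than the dissertation's implementation: the functions named objectives and constraints and the interval bounds p_lo/p_hi are hypothetical placeholders, and random sampling of the interval box stands in for whatever worst-case evaluation the actual method performs.

    ```python
    # Sketch of a lower-level robustness check under interval uncertainty.
    # A candidate design x is treated as robust only if every sampled realization
    # of the uncertain parameters p inside the interval box keeps the constraints
    # feasible and the objectives within an acceptable variation.
    import numpy as np

    def is_robust(x, objectives, constraints, p_lo, p_hi,
                  max_obj_variation, n_samples=200, seed=0):
        rng = np.random.default_rng(seed)
        P = rng.uniform(p_lo, p_hi, size=(n_samples, len(p_lo)))  # interval box samples
        f_nom = np.asarray(objectives(x, 0.5 * (np.asarray(p_lo) + np.asarray(p_hi))))
        for p in P:
            if np.any(np.asarray(constraints(x, p)) > 0.0):       # infeasible realization
                return False
            if np.any(np.abs(np.asarray(objectives(x, p)) - f_nom) > max_obj_variation):
                return False
        return True
    ```

    In the improved MORO described above, a check of this kind is applied only to the optimizer's current optimum solutions, and its outcome is used to iteratively shrink the feasible domain.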

    Robust Optimization and Sensitivity Analysis with Multi-Objective Genetic Algorithms: Single- and Multi-Disciplinary Applications

    Uncertainty is inevitable in engineering design optimization and can significantly degrade the performance of an optimized design solution, or even render a feasible solution infeasible. The problem with uncertainty can be exacerbated in multi-disciplinary optimization, where the models for several disciplines are coupled and the propagation of uncertainty has to be accounted for within and across disciplines. It is important to determine which ranges of parameter uncertainty matter most, and how to best allocate investments to partially or fully reduce uncertainty under a limited budget. To address these issues, this dissertation concentrates on a new robust optimization approach and a new sensitivity analysis approach for multi-objective and multi-disciplinary design optimization problems that have parameters with interval uncertainty. The dissertation presents models and approaches under four research thrusts. In the first thrust, an approach is presented to obtain robustly optimal solutions which are as good as possible, in a multi-objective sense, while the sensitivity of their objective and/or constraint functions remains within an acceptable range. In the second thrust, the robust optimization approach of the first thrust is extended to design optimization problems which are decomposed into multiple subproblems, each with multiple objectives and constraints. In the third thrust, a new approach for multi-objective sensitivity analysis and uncertainty reduction is presented. In the final research thrust, a metamodel-embedded Multi-Objective Genetic Algorithm (MOGA) for the solution of design optimization problems is presented. Numerous numerical and engineering examples are used to explore and demonstrate the applicability and performance of the robust optimization, sensitivity analysis and MOGA techniques developed in this dissertation. It is shown that the obtained robust optimal solutions for the test examples are conservative compared to their corresponding optimal solutions in the deterministic case. For the sensitivity analysis, it is demonstrated that the proposed method identifies the parameters whose uncertainty reduction or elimination produces the largest payoff for any given investment. Finally, it is shown that the new MOGA requires significantly fewer simulation calls than previously developed MOGA methods when used to solve multi-objective design optimization problems, while obtaining comparable solutions.
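    The metamodel-embedded MOGA idea can be sketched as follows, under assumed interfaces: a cheap surrogate trained on previously simulated designs screens each new generation so that only the most promising offspring receive expensive simulation calls. The simulate function, the archive arrays, and the ranking by summed predicted objectives are simplifying placeholders, not the dissertation's actual operators.

    ```python
    # Sketch of surrogate screening inside a multi-objective GA generation.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def screened_evaluation(offspring, archive_X, archive_F, simulate, budget):
        """Send only the `budget` offspring the surrogate predicts to be best to the simulator."""
        surrogate = GaussianProcessRegressor(normalize_y=True)
        surrogate.fit(np.asarray(archive_X), np.asarray(archive_F))   # train on evaluated designs
        pred = surrogate.predict(np.asarray(offspring))               # predicted objective values
        score = pred.sum(axis=1) if pred.ndim > 1 else pred           # crude aggregate ranking
        chosen = [offspring[i] for i in np.argsort(score)[:budget]]
        return [(x, simulate(x)) for x in chosen]                     # true simulation calls only here
    ```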

    Sensitivity Analysis Based Approaches for Mitigating the Effects of Reducible Interval Input Uncertainty on Single- and Multi-Disciplinary Systems using Multi-Objective Optimization

    Uncertainty is an unavoidable aspect of engineering systems and will often degrade system performance or perhaps even lead to system failure. As a result, uncertainty must be considered as part of the design process for all real-world engineering systems. The presence of reducible uncertainty further complicates matters, as designers must not only account for the degrading effects of uncertainty but must also determine what levels of uncertainty can be considered acceptable. For these reasons, methods for determining and effectively mitigating the effects of uncertainty are necessary for solving engineering design problems. This dissertation presents several new methods for use in the design of engineering systems under interval input uncertainty. These new approaches were developed over the course of four interrelated research thrusts, all aimed at extending the current research in the area of sensitivity-analysis-based design under reducible interval uncertainty. The first research thrust focused on developing an approach for determining optimal uncertainty reductions for multi-disciplinary engineering systems with multiple output functions at both the system and sub-system levels. The second research thrust extended the approach developed during the first thrust to use uncertainty reduction as a means for both reducing output variations and simultaneously ensuring engineering feasibility. The third research thrust looked at systems where uncertainty reduction alone is insufficient for ensuring feasibility, and developed a sensitivity analysis approach that combines uncertainty reductions with small design adjustments in order to reduce output variations and ensure feasibility. The fourth and final research thrust relaxed many of the assumptions required by the first three thrusts and developed a general sensitivity-analysis-inspired approach for determining optimal upper and lower bounds for reducible sources of input uncertainty. Multi-objective optimization techniques were used throughout this research to evaluate the tradeoffs between the benefits gained by mitigating uncertainty and the costs of making the design changes and/or uncertainty reductions required to most effectively reduce or eliminate the degrading effects of system uncertainty. The validity of the developed approaches was demonstrated using numerical and engineering example problems of varying complexity.
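    A minimal sketch of the kind of trade-off evaluation described above is given below, with hypothetical cost and output models: the decision variables are the fractions by which each interval input's half-width is reduced, and the two competing objectives are the remaining worst-case output variation and the investment needed to achieve those reductions.

    ```python
    # Sketch of a two-objective uncertainty-reduction trade-off evaluation.
    import numpy as np

    def tradeoff_objectives(reduction, p_nom, half_width, output, unit_cost,
                            n_samples=500, seed=0):
        rng = np.random.default_rng(seed)
        w = (1.0 - np.asarray(reduction)) * np.asarray(half_width)    # remaining half-widths
        P = np.asarray(p_nom) + rng.uniform(-1.0, 1.0, (n_samples, len(p_nom))) * w
        y = np.array([output(p) for p in P])                          # sampled output values
        variation = float(y.max() - y.min())          # objective 1: residual output variation
        cost = float(np.dot(unit_cost, reduction))    # objective 2: cost of the reductions
        return variation, cost
    ```

    A multi-objective optimizer over the reduction fractions then traces the Pareto front between residual output variation and investment cost.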

    Decoupled UMDO formulation for interdisciplinary coupling satisfaction under uncertainty

    At early design phases, taking uncertainty into account in the optimization of a multidisciplinary system is essential to establish the optimal system characteristics and performances. Uncertainty Multidisciplinary Design Optimization (UMDO) formulations have to efficiently organize the different disciplinary analyses, the uncertainty propagation and the optimization, but also the handling of interdisciplinary couplings under uncertainty. A decoupled UMDO formulation (Individual Discipline Feasible - Polynomial Chaos Expansion) ensuring coupling satisfaction for all instantiations of the uncertain variables is presented in this paper. Ensuring coupling satisfaction for all instantiations is essential to guarantee the equivalence between the coupled and decoupled UMDO problem formulations. The proposed approach relies on the iterative construction of surrogate models based on Polynomial Chaos Expansion in order to represent, at convergence of the optimization problem, the coupling functional relations as a coupled approach under uncertainty would. The performance of the proposed formulation is assessed on an analytic test case and on the design of a new Vega launch vehicle upper stage.
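    As an illustration of the surrogate step only (not the paper's IDF-PCE formulation itself), the sketch below fits a Polynomial Chaos Expansion of a coupling variable for a single uniform uncertain input on [-1, 1] by least squares on Legendre polynomials; the coupling_fn argument is a hypothetical stand-in for the coupled disciplinary analysis.

    ```python
    # Sketch of a one-dimensional Polynomial Chaos Expansion surrogate of a coupling variable.
    import numpy as np
    from numpy.polynomial import legendre

    def fit_pce(coupling_fn, degree=5, n_samples=50, seed=0):
        rng = np.random.default_rng(seed)
        xi = rng.uniform(-1.0, 1.0, n_samples)            # samples of the uncertain variable
        y = np.array([coupling_fn(x) for x in xi])        # coupled-analysis outputs
        coeffs = legendre.legfit(xi, y, degree)           # least-squares PCE coefficients
        return lambda x: legendre.legval(x, coeffs)       # cheap surrogate of the coupling

    # Example: surrogate of a nonlinear coupling relation, evaluated at a new point.
    surrogate = fit_pce(lambda x: np.exp(0.3 * x) + 0.5 * x**2)
    print(surrogate(0.2))
    ```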

    Conceptual multidisciplinary design via a multi-objective multi-fidelity optimisation method.

    Air travel demand and the associated fuel emissions are expected to keep increasing over the coming decades, forcing the aerospace industry to find ways to revolutionise the design process to achieve step-like performance improvements and emission reduction goals. A promising approach towards that goal is multidisciplinary design. To maximise the benefits, interdisciplinary synergies have to be investigated early in the design process. Efficient multidisciplinary optimisation tools are required to reliably identify a set of promising design directions to support engineering decision making towards the new generation of aircraft. To support these needs, a novel optimisation methodology is proposed, aiming to exploit multidisciplinary trends in the conceptual stage, explore the design space and provide a Pareto set of optimum configurations at the minimum possible cost. This is achieved by a combination of an expected-improvement surrogate-based optimisation plan, a novel Kriging modification that allows the use of multi-fidelity tools, and a multi-objective sub-optimisation process infill formulation, implemented within a multidisciplinary design optimisation architecture. A series of analytical test cases was initially used to develop the methodology and examine its performance against criteria such as global optimality, computational efficiency and dimensionality scaling. These were followed by two industrially relevant aerodynamic design cases, the RAE2822 transonic airfoil and the GARTEUR high-lift configuration, investigating the effect of the constraint handling methods and the low-fidelity tool. The cost reductions and exploration characteristics achieved by the method were quantified in realistic unconstrained, constrained and multi-objective problems. Finally, an aerostructural optimisation study of the NASA Common Research Model was used as a representative of a complex multidisciplinary design problem. The results demonstrate the framework's capabilities in industrial problems, showing improved results and design space exploration at lower cost than similarly oriented methods. The effect of the multidisciplinary architecture was also examined.
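    The infill criterion at the core of the surrogate-based optimisation plan is expected improvement; the standard single-fidelity, single-objective form is sketched below (the thesis extends it with a multi-fidelity Kriging modification and a multi-objective sub-optimisation process).

    ```python
    # Sketch of the expected-improvement infill criterion for minimisation.
    import numpy as np
    from scipy.stats import norm

    def expected_improvement(mu, sigma, f_best):
        """EI of a Kriging prediction N(mu, sigma^2) against the best observed value f_best."""
        mu, sigma = np.asarray(mu), np.maximum(np.asarray(sigma), 1e-12)
        imp = f_best - mu                     # predicted improvement over the incumbent
        z = imp / sigma
        return imp * norm.cdf(z) + sigma * norm.pdf(z)
    ```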

    Understanding Complexity in Multiobjective Optimization

    This report documents the program and outcomes of Dagstuhl Seminar 15031, "Understanding Complexity in Multiobjective Optimization". The seminar carried on the series of four previous Dagstuhl Seminars (04461, 06501, 09041 and 12041) that focused on multiobjective optimization and on strengthening the links between the Evolutionary Multiobjective Optimization (EMO) and Multiple Criteria Decision Making (MCDM) communities. The purpose of the seminar was to bring together researchers from the two communities to take part in a wide-ranging discussion about the different sources and impacts of complexity in multiobjective optimization. The outcome was a clarified view of complexity in the various facets of multiobjective optimization, leading to several research initiatives with innovative approaches for coping with complexity.

    The Meta-Model Approach for Simulation-based Design Optimization.

    The design of products and processes makes increasing use of computer simulations to predict their performance. These computer simulations are considerably cheaper than their physical equivalents, so finding the optimal design has become a real possibility. One approach to finding the optimal design using computer simulations is the meta-model approach, which approximates the behaviour of the simulation outcome using a limited number of time-consuming computer simulations. This thesis contains four main contributions, which are illustrated by industrial cases. First, a method is presented for the construction of an experimental design for computer simulations when the design space is restricted by many (nonlinear) constraints. The second contribution is a new approach for the approximation of the simulation outcome; this approximation method is particularly useful when the simulation model outcome responds in a highly nonlinear way to its inputs. Third, the meta-model-based approach is extended to a robust optimization framework. Using this framework, many uncertainties can be taken into account, including uncertainty in the simulation model outcome itself. The fourth main contribution is the extension of the approach for use in the integral design of many parts of complex systems.
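    One simple way to realise the first contribution, building an experimental design inside a constraint-restricted design space, is sketched below. This is a generic rejection-plus-maximin scheme under assumed interfaces, not necessarily the method developed in the thesis; constraints is a hypothetical list of functions that are non-positive for feasible designs.

    ```python
    # Sketch of a space-filling design of experiments in a constrained design space.
    import numpy as np

    def constrained_design(constraints, lo, hi, n_points, n_candidates=5000, seed=0):
        rng = np.random.default_rng(seed)
        cand = rng.uniform(lo, hi, size=(n_candidates, len(lo)))
        mask = [all(g(x) <= 0.0 for g in constraints) for x in cand]  # keep feasible candidates
        feas = cand[np.asarray(mask)]
        design = [feas[0]]
        while len(design) < n_points:                                 # greedy maximin selection
            d = np.linalg.norm(feas[:, None] - np.asarray(design)[None], axis=2).min(axis=1)
            design.append(feas[int(np.argmax(d))])
        return np.asarray(design)
    ```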

    Reduced Order Techniques for Sensitivity Analysis and Design Optimization of Aerospace Systems

    This work proposes a new method for using reduced order models in lieu of high-fidelity analysis during the sensitivity analysis step of gradient-based design optimization. The method reduces the computational cost of finite-difference-based sensitivity analysis in that context. It relies on interpolating reduced order models that are based on proper orthogonal decomposition. The interpolation is performed using radial basis functions and Grassmann manifold projection and does not require additional high-fidelity analyses to produce a reduced order model for new points in the design space. The interpolated models are used specifically for points in the finite difference stencil during sensitivity analysis. The proposed method is applied to an airfoil shape optimization (ASO) problem and a transport wing optimization (TWO) problem. The errors associated with the reduced order models themselves, as well as the gradients calculated from them, are evaluated. The effects of the method on the overall optimization path, computation times, and function counts are also examined. The ASO results indicate that the proposed scheme is a viable method for reducing the computational cost of these optimizations, and that the adaptive step is an effective means of improving interpolated gradient accuracy. The TWO results indicate that the interpolation accuracy can have a strong impact on the optimization search direction.
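    The proper orthogonal decomposition step that the reduced order models are built on can be sketched in a few lines: high-fidelity solution snapshots are collected column-wise and the leading left singular vectors form the reduced basis. The radial-basis-function and Grassmann manifold machinery then interpolates such bases between design points instead of recomputing them; the snippet below covers only the basis extraction, with snapshots as a hypothetical snapshot matrix.

    ```python
    # Sketch of extracting a POD basis from a snapshot matrix (columns = snapshots).
    import numpy as np

    def pod_basis(snapshots, energy=0.999):
        """Return the POD modes capturing the requested fraction of snapshot energy."""
        U, s, _ = np.linalg.svd(np.asarray(snapshots), full_matrices=False)
        cum = np.cumsum(s**2) / np.sum(s**2)             # cumulative energy of the modes
        r = int(np.searchsorted(cum, energy)) + 1        # number of retained modes
        return U[:, :r]
    ```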

    A Data Mining Methodology for Vehicle Crashworthiness Design

    This study develops a systematic design methodology based on data mining theory for decision-making in the development of crashworthy vehicles. The new data mining methodology allows the exploration of a large crash simulation dataset to discover the underlying relationships among vehicle crash responses and design variables at multiple levels, and to derive design rules based on the whole-vehicle safety requirements to guide component-level and subcomponent-level design decisions. The method addresses a major limitation of existing vehicle crashworthiness design approaches: their limited ability to extract information from large datasets, which can hamper decision-making in the design process. At the component level, two structural design approaches were implemented for detailed component design with the data mining method: a dimension-based approach and a node-based approach, handling structures with regular and irregular shapes, respectively. These two approaches were used to design a thin-walled vehicular structure, the S-shaped beam, against crash loading. A large number of design alternatives were created, and their responses under loading were evaluated by finite element simulations. The design variables and computed responses formed a large design dataset, which was then mined to build a decision tree. Based on the decision tree, the interrelationships among the design parameters were revealed, and design rules were generated to produce a set of good designs. After the data mining, the critical design parameters were identified and the design space was reduced, which simplifies the design process. To partially replace the expensive finite element simulations, a surrogate model was used to model the relationships between design variables and responses. Four machine learning algorithms suitable for surrogate model development were compared; based on the results, Gaussian process regression was determined to be the most suitable technique in the present scenario, and an optimization process was developed to tune the algorithm's hyperparameters, which govern the model structure and training process. To account for engineering uncertainty in the data mining method, a new decision tree for uncertain data was proposed based on the joint probability in uncertain spaces, and it was applied to the design of the S-shaped beam once more. The findings show that the new decision tree can produce effective decision-making rules for engineering design under uncertainty. To evaluate the new approaches developed in this work, a comprehensive case study was conducted by designing a vehicle system against frontal crash. A publicly available vehicle model was simplified and validated. Using the newly developed approaches, new component designs for this vehicle were generated and integrated back into the vehicle model so that their crash behavior could be simulated. The simulation results show that the designs obtained with the new method outperform the original design in terms of mass, intrusion and peak acceleration, confirming the performance of the new design methodology. The current study demonstrates that the new data mining method can be used in vehicle crashworthiness design and has the potential to be applied to other complex engineering systems with large amounts of design data.
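    The core data mining step can be pictured with the short sketch below: a decision tree is trained on a table of design variables labelled by whether the simulated design meets the whole-vehicle safety requirement, and its branches are read off as design rules. The column names and the scikit-learn tree are illustrative assumptions, not the study's exact implementation.

    ```python
    # Sketch of mining design rules from a crash-simulation design dataset.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    def mine_design_rules(X, meets_requirement, feature_names, max_depth=4):
        tree = DecisionTreeClassifier(max_depth=max_depth, random_state=0)
        tree.fit(np.asarray(X), np.asarray(meets_requirement))
        return export_text(tree, feature_names=list(feature_names))   # human-readable rules

    # Example call on a hypothetical S-beam dataset:
    # print(mine_design_rules(X, y, ["wall_thickness", "web_width", "flange_depth"]))
    ```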

    Meta-Modelling of Intensive Computational Models

    Engineering process design for applications that use computationally intensive nonlinear dynamical systems can be expensive in time and resources. The presented work reviews the concept of a meta-model as a way to improve the efficiency of this process. The proposed meta-model has a computational advantage over the computationally intensive model, thereby reducing the time and resources required to design an engineering process. This work proposes to meta-model a computationally intensive nonlinear dynamical system using a reduced-order linear parameter-varying (LPV) system modelling approach with local linear models in velocity-based linearization form. The parameters of the linear time-varying meta-model are blended using Gaussian process regression models. The meta-model structure is transparent and relates directly to the dynamics of the computationally intensive model, while the velocity-based local linear models faithfully reproduce the original system dynamics anywhere in the operating space of the system. The non-parametric blending of the meta-model's local linear models by Gaussian process regression is well suited to dealing with data sparsity and provides uncertainty information about the meta-model predictions. The proposed meta-model structure has been applied to second-order nonlinear dynamical systems, a small nonlinear transmission line model, a medium-sized fluid dynamics problem and the computationally intensive nonlinear transmission line model of order 5000.
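    The blending idea can be sketched as follows, under assumed variable shapes: Gaussian process regression maps the operating (scheduling) point to the parameters of a velocity-based local linear model, so the meta-model's dynamics vary smoothly over the operating space. The interfaces below are illustrative, not the thesis's implementation.

    ```python
    # Sketch of GP-blended velocity-based local linear models.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def fit_parameter_blend(rho_train, theta_train):
        """theta_train holds the flattened (A, B) entries of each local linear model."""
        gp = GaussianProcessRegressor(normalize_y=True)
        gp.fit(np.asarray(rho_train).reshape(len(rho_train), -1), np.asarray(theta_train))
        return gp

    def blended_step(gp, rho, dx, du, n_states, dt):
        theta = gp.predict(np.atleast_2d(rho))[0]       # blended parameters at this operating point
        A = theta[:n_states * n_states].reshape(n_states, n_states)
        B = theta[n_states * n_states:].reshape(n_states, -1)
        return dx + dt * (A @ dx + B @ du)              # Euler update of the velocity form
    ```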