
    Sensitivity Analysis Based Approaches for Mitigating the Effects of Reducible Interval Input Uncertainty on Single- and Multi-Disciplinary Systems using Multi-Objective Optimization

    Uncertainty is an unavoidable aspect of engineering systems and will often degrade system performance or even lead to system failure. As a result, uncertainty must be considered as part of the design process for all real-world engineering systems. The presence of reducible uncertainty further complicates matters, as designers must not only account for the degrading effects of uncertainty but must also determine what levels of uncertainty can be considered acceptable. For these reasons, methods for determining and effectively mitigating the effects of uncertainty are necessary for solving engineering design problems. This dissertation presents several new methods for the design of engineering systems under interval input uncertainty. These approaches were developed over the course of four interrelated research thrusts, with the overall goal of extending current research in sensitivity analysis based design under reducible interval uncertainty. The first research thrust developed an approach for determining optimal uncertainty reductions for multi-disciplinary engineering systems with multiple output functions at both the system and sub-system levels. The second research thrust extended this approach to use uncertainty reduction both to reduce output variations and to ensure engineering feasibility. The third research thrust considered systems where uncertainty reduction alone is insufficient for ensuring feasibility and developed a sensitivity analysis approach that combines uncertainty reductions with small design adjustments to reduce output variations and ensure feasibility. The fourth and final research thrust relaxed many of the assumptions required by the first three and developed a general sensitivity analysis inspired approach for determining optimal upper and lower bounds for reducible sources of input uncertainty. Multi-objective optimization techniques were used throughout to evaluate the trade-offs between the benefits gained by mitigating uncertainty and the costs of the design changes and/or uncertainty reductions required to reduce or eliminate its degrading effects most effectively. The validity of the developed approaches was demonstrated using numerical and engineering example problems of varying complexity.
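    The trade-off described above, between the cost of reducing interval input uncertainty and the resulting reduction in output variation, can be illustrated with a brute-force multi-objective sketch in Python. The output function, the vertex-based interval propagation and the linear cost model are illustrative assumptions, not the dissertation's algorithms.

```python
# Minimal sketch (not the dissertation's method): trade off the cost of shrinking
# interval input uncertainty against the resulting output variation, then keep
# the non-dominated (Pareto) candidates.
import numpy as np
from itertools import product

def f(x):
    # Hypothetical system output; in practice this is the engineering model.
    return x[0] ** 2 + 3.0 * x[0] * x[1]

nominal = np.array([1.0, 2.0])
half_width = np.array([0.3, 0.5])      # initial interval half-widths of the inputs

def output_variation(reduction):
    """Output range width when each interval is shrunk by `reduction` (0..1),
    estimated by evaluating f at all interval corners (vertex method)."""
    hw = half_width * (1.0 - reduction)
    corners = [nominal + np.array(s) * hw for s in product((-1.0, 1.0), repeat=2)]
    vals = [f(c) for c in corners]
    return max(vals) - min(vals)

def reduction_cost(reduction):
    # Assumed cost model: cost grows linearly with the amount of reduction.
    return float(np.sum(reduction))

# Enumerate candidate reduction levels and keep the non-dominated points.
candidates = [np.array(r) for r in product(np.linspace(0.0, 0.9, 10), repeat=2)]
points = [(reduction_cost(r), output_variation(r)) for r in candidates]
pareto_idx = [i for i, (c, v) in enumerate(points)
              if not any(points[j][0] <= c and points[j][1] <= v
                         and (points[j][0] < c or points[j][1] < v)
                         for j in range(len(points)))]

for i in sorted(pareto_idx, key=lambda k: points[k][0]):
    c, v = points[i]
    print(f"cost={c:.2f}  output variation={v:.3f}  reductions={np.round(candidates[i], 2)}")
```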

    Comparison of data-driven uncertainty quantification methods for a carbon dioxide storage benchmark scenario

    A variety of methods is available to quantify uncertainties arising within the modeling of flow and transport in carbon dioxide storage, but thorough comparisons are lacking. Usually, raw data from such storage sites can hardly be described by theoretical statistical distributions, since only very limited data are available. Hence, exact information on distribution shapes for all uncertain parameters is very rare in realistic applications. We discuss and compare four methods for data-driven uncertainty quantification based on a benchmark scenario of carbon dioxide storage. In the benchmark, for which we provide data and code, carbon dioxide is injected into a saline aquifer modeled by the nonlinear capillarity-free fractional flow formulation for two incompressible fluid phases, namely carbon dioxide and brine. To cover different aspects of uncertainty quantification, we incorporate various sources of uncertainty such as uncertainty of boundary conditions, of conceptual model definitions and of material properties. We consider recent versions of the following non-intrusive and intrusive uncertainty quantification methods: arbitrary polynomial chaos, spatially adaptive sparse grids, kernel-based greedy interpolation and hybrid stochastic Galerkin. The performance of each approach is demonstrated by assessing the expectation value and standard deviation of the carbon dioxide saturation against a reference statistic based on Monte Carlo sampling. We compare the convergence of all methods, reporting on accuracy with respect to the number of model runs and resolution. Finally, we offer suggestions about the methods' advantages and disadvantages that can guide the modeler for uncertainty quantification in carbon dioxide storage and beyond.
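    As a minimal illustration of the reference statistic mentioned above, the sketch below estimates the expectation value and standard deviation of a quantity of interest by plain Monte Carlo, resampling raw parameter data instead of fitting a theoretical distribution. The saturation model and the synthetic raw data are stand-ins for the benchmark's two-phase flow solver and site data.

```python
# Data-driven Monte Carlo reference sketch: bootstrap-resample raw parameter data
# (no assumed distribution shape) and propagate it through a stand-in model.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical raw measurements of two uncertain parameters; in the benchmark
# these would come from the provided data set.
raw_data = rng.lognormal(mean=[0.0, 1.0], sigma=0.25, size=(50, 2))

def saturation_model(theta):
    """Stand-in for the CO2 saturation at a fixed location and time."""
    k, q = theta
    return q / (q + k)

# Resample the raw data as the input ensemble for the Monte Carlo runs.
n_runs = 5000
idx = rng.integers(0, raw_data.shape[0], size=n_runs)
samples = np.array([saturation_model(raw_data[i]) for i in idx])

print(f"E[S_CO2]   ~ {samples.mean():.4f}")
print(f"Std[S_CO2] ~ {samples.std(ddof=1):.4f}")
```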

    Enablers for uncertainty quantification and management in early-stage computational design: an aircraft perspective

    Presented in this thesis are novel methods for uncertainty quantification and management (UQ&M) in computational engineering design. The research has been motivated by the industrial need for improved UQ&M techniques, particularly in response to the rapid development of the model-based approach and its application to the (early) design of complex products such as aircraft. Existing work has already addressed a number of theoretical and computational challenges, especially regarding uncertainty propagation. In this research, the contributions to knowledge are within the wider UQ&M area. The first contribution is related to requirements for an improved margin management policy, extracted from the FP7 European project, TOICA (Thermal Overall Integrated Conception of Aircraft). Margins are traditional means to mitigate the effect of uncertainty. They are better understood and less intrusive in current design practice than statistical approaches. The challenge tackled in this research has been to integrate uncertainty analysis with deterministic margin allocations, and to provide a method for exploration and trade-off studies. The proposed method incorporates sensitivity analysis, uncertainty propagation, and the set-based design paradigm. The resulting framework enables the designer to conduct systematic and interactive trade-offs between margins, performances and risks. Design case studies have been used to demonstrate the proposed method, which was partially evaluated in the TOICA project. The second contribution addresses the industrial need to properly ‘allocate’ uncertainty during the design process. The problem is to estimate how much uncertainty could be tolerated from different sources, given the acceptable level of uncertainty associated with the system outputs. Accordingly, a method for inverse uncertainty propagation has been developed. It is enabled by a fast forward propagation technique and a workflow reversal capability. This part of the research also forms a contribution to the TOICA project, where the proposed method was applied to several test-cases. Its usefulness was evaluated and confirmed through the project review process. The third contribution relates to the reduction of UQ&M computational cost, which has always been a burden in practice. To address this problem, an efficient sensitivity analysis method is proposed. It is based on the reformulation and approximation of Sobol’s indices with a quadrature technique. The objective is to reduce the number of model evaluations. The usefulness of the proposed method has been demonstrated by means of analytical and practical test-cases. Despite some limitations for several specific highly non-linear cases, the tests confirmed significant improvement in computational efficiency for high-dimensional problems, compared with traditional methods. In conclusion, this research has led to novel UQ&M tools and techniques for improved decision making in computational engineering design. The usefulness of these methods with regard to efficiency and interactivity has been demonstrated through relevant test-cases and qualitative evaluation by (industrial) experts. Finally, it is argued that future work in this field should involve research and development of a comprehensive framework which is able to accommodate uncertainty, not only with regard to computation, but also from the perspective of (expert) knowledge and assumptions.
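    For the sensitivity-analysis part, the sketch below computes first-order Sobol' indices for the well-known Ishigami test function using the standard pick-freeze Monte Carlo estimator. It illustrates the quantity being approximated, not the thesis's quadrature-based reformulation, which is designed precisely to avoid this many model evaluations.

```python
# First-order Sobol' indices via the standard pick-freeze (Saltelli 2010) estimator
# on the Ishigami function. This brute-force sampling approach is the baseline that
# quadrature-based reformulations aim to make cheaper.
import numpy as np

rng = np.random.default_rng(1)

def ishigami(x, a=7.0, b=0.1):
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

n, d = 20000, 3
A = rng.uniform(-np.pi, np.pi, size=(n, d))   # two independent sample matrices
B = rng.uniform(-np.pi, np.pi, size=(n, d))
y_A, y_B = ishigami(A), ishigami(B)
var_y = np.var(np.concatenate([y_A, y_B]))

for i in range(d):
    AB_i = A.copy()
    AB_i[:, i] = B[:, i]                      # replace column i of A by column i of B
    y_AB_i = ishigami(AB_i)
    S_i = np.mean(y_B * (y_AB_i - y_A)) / var_y   # Saltelli-form first-order estimator
    print(f"S_{i + 1} ~ {S_i:.3f}")
```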

    Decoupled UMDO formulation for interdisciplinary coupling satisfaction under uncertainty

    At early design phases, taking uncertainty into account in the optimization of a multidisciplinary system is essential to establish the optimal system characteristics and performance. Uncertainty Multidisciplinary Design Optimization (UMDO) formulations have to efficiently organize the different disciplinary analyses, the uncertainty propagation and the optimization, but also the handling of interdisciplinary couplings under uncertainty. A decoupled UMDO formulation (Individual Discipline Feasible - Polynomial Chaos Expansion) ensuring coupling satisfaction for all instantiations of the uncertain variables is presented in this paper. Ensuring coupling satisfaction for all instantiations is essential to guarantee the equivalence between the coupled and decoupled UMDO problem formulations. The proposed approach relies on the iterative construction of surrogate models based on Polynomial Chaos Expansion in order to represent, at convergence of the optimization problem, the coupling functional relations as a coupled approach under uncertainty would. The performance of the proposed formulation is assessed on an analytic test case and on the design of a new Vega launch vehicle upper stage.
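    The basic building block, fitting a Polynomial Chaos Expansion surrogate to a coupling variable by least squares, can be sketched as follows. The one-dimensional coupling function and the Hermite basis for a standard normal input are illustrative assumptions, not the paper's launch-vehicle model or its iterative IDF-PCE construction.

```python
# Least-squares Polynomial Chaos Expansion of a coupling variable y(u) for a
# standard normal uncertain input u, using the probabilists' Hermite basis.
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(2)

def coupling(u):
    """Stand-in for a converged interdisciplinary coupling variable y(u)."""
    return np.exp(0.3 * u) + 0.1 * u ** 2

degree = 4
u_train = rng.standard_normal(200)             # samples of the uncertain variable
Psi = hermevander(u_train, degree)             # Hermite basis evaluated at the samples
coeffs, *_ = np.linalg.lstsq(Psi, coupling(u_train), rcond=None)

# The PCE surrogate can now stand in for the coupled analysis at new realizations.
u_test = rng.standard_normal(5)
approx = hermevander(u_test, degree) @ coeffs
print(np.c_[coupling(u_test), approx])         # exact vs. surrogate coupling values
```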

    Quantile-based optimization under uncertainties using adaptive Kriging surrogate models

    Uncertainties are inherent to real-world systems. Taking them into account is crucial in industrial design problems, and this may be achieved through reliability-based design optimization (RBDO) techniques. In this paper, we propose a quantile-based approach to solve RBDO problems. We first transform the safety constraints, usually formulated as admissible probabilities of failure, into constraints on quantiles of the performance criteria. In this formulation, the quantile level controls the degree of conservatism of the design. Starting with the premise that industrial applications often involve high-fidelity and time-consuming computational models, the proposed approach makes use of Kriging surrogate models (a.k.a. Gaussian process modeling). Thanks to the Kriging variance (a measure of the local accuracy of the surrogate), we derive a procedure with two stages of enrichment of the design of computer experiments (DoE) used to construct the surrogate model. The first stage globally reduces the Kriging epistemic uncertainty and adds points in the vicinity of the limit-state surfaces describing the system performance to be attained. The second stage locally checks, and if necessary improves, the accuracy of the quantiles estimated along the optimization iterations. Applications to three analytical examples and to the optimal design of a car body subsystem (minimal mass under mechanical safety constraints) show the accuracy and the remarkable efficiency of the proposed procedure.
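    A minimal sketch of the core ingredients, quantile estimation through a Kriging surrogate and variance-based enrichment of the DoE, is given below. scikit-learn's Gaussian process regressor and the one-dimensional performance function are stand-ins; the paper's two-stage enrichment procedure is more elaborate.

```python
# Quantile of a performance criterion estimated on a Kriging (Gaussian process)
# surrogate, followed by one enrichment step at the point of largest Kriging variance.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

def performance(x):
    """Hypothetical expensive performance criterion g(x)."""
    return np.sin(3.0 * x) + 0.5 * x

# Initial design of computer experiments (DoE) and surrogate fit.
X = rng.uniform(0.0, 2.0, size=(8, 1))
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X, performance(X).ravel())

# Quantile of g under the input distribution, evaluated on the cheap surrogate.
x_mc = rng.uniform(0.0, 2.0, size=(10_000, 1))
mean, std = gp.predict(x_mc, return_std=True)
print(f"95% quantile (surrogate) ~ {np.quantile(mean, 0.95):.3f}")

# Enrichment: add the Monte Carlo point where the surrogate is least certain,
# then refit; the real procedure repeats this until the quantile estimate converges.
x_new = x_mc[np.argmax(std)]
X = np.vstack([X, x_new])
gp.fit(X, performance(X).ravel())
```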

    Stochastic turbulence modeling in RANS simulations via Multilevel Monte Carlo

    A multilevel Monte Carlo (MLMC) method for quantifying model-form uncertainties associated with Reynolds-Averaged Navier-Stokes (RANS) simulations is presented. Two high-dimensional stochastic extensions of the RANS equations are considered to demonstrate the applicability of the MLMC method. The first approach is based on global perturbation of the baseline eddy viscosity field using a lognormal random field. A more general second extension is considered based on the work of [Xiao et al. (2017)], where the entire Reynolds Stress Tensor (RST) is perturbed while maintaining realizability. For two fundamental flows, we show that the MLMC method based on a hierarchy of meshes is asymptotically faster than plain Monte Carlo. Additionally, we demonstrate that for some flows an optimal multilevel estimator can be obtained for which the cost scales with the same order as a single CFD solve on the finest grid level.
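    The multilevel Monte Carlo idea itself, a telescoping sum of level corrections with most samples drawn on the cheap coarse levels, can be sketched on a toy problem. The midpoint-rule "solver" and the per-level sample counts below are illustrative stand-ins for the RANS solves and the optimal sample allocation discussed in the paper.

```python
# Toy multilevel Monte Carlo estimator for E[Q]:
#   E[Q_L] = E[Q_0] + sum_{l=1..L} E[Q_l - Q_{l-1}],
# where each correction uses the same random samples on two consecutive levels.
import numpy as np

rng = np.random.default_rng(4)

def solve(omega, level):
    """Level-l approximation of Q(omega): midpoint-rule integral of exp(omega*x)
    on [0, 1] with 2**(level+1) cells (finer level = more cells = higher cost)."""
    n = 2 ** (level + 1)
    x = (np.arange(n) + 0.5) / n
    return np.mean(np.exp(omega * x))

samples_per_level = [4000, 2000, 1000, 500]    # more samples on the cheap coarse levels

estimate = 0.0
for level, n_samp in enumerate(samples_per_level):
    omegas = rng.normal(0.0, 0.5, size=n_samp)
    fine = np.array([solve(w, level) for w in omegas])
    if level == 0:
        correction = fine                      # E[Q_0]
    else:
        coarse = np.array([solve(w, level - 1) for w in omegas])
        correction = fine - coarse             # E[Q_l - Q_{l-1}] on coupled samples
    estimate += correction.mean()

print(f"MLMC estimate of E[Q] ~ {estimate:.4f}")
```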