336 research outputs found

    Multi-Information Source Fusion and Optimization to Realize ICME: Application to Dual Phase Materials

    Integrated Computational Materials Engineering (ICME) calls for the integration of computational tools into the materials and parts development cycle, while the Materials Genome Initiative (MGI) calls for accelerating the materials development cycle through the combination of experiments, simulation, and data. As they stand, however, neither ICME nor MGI prescribes how to achieve the necessary tool integration or how to efficiently exploit the computational tools, in combination with experiments, to accelerate the development of new materials and materials systems. This paper addresses the first issue by putting forward a framework for the fusion of information that exploits correlations among sources/models and between the sources and 'ground truth'. The second issue is addressed through a multi-information source optimization framework that identifies, given current knowledge, the next best information source to query and where in the input space to query it via a novel value-gradient policy. The querying decision takes into account the ability to learn correlations between information sources, the resource cost of querying an information source, and what a query is expected to provide in terms of improvement over the current state. The framework is demonstrated on the optimization of a dual-phase steel to maximize its strength-normalized strain hardening rate. The ground truth is represented by a microstructure-based finite element model, while three low-fidelity information sources (reduced-order models based on different homogenization assumptions: isostrain, isostress, and isowork) are used to efficiently and optimally query the materials design space.
    Comment: 19 pages, 11 figures, 5 tables
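    The fusion framework described above exploits correlations among information sources. As a minimal, generic sketch (standard best-linear-unbiased fusion of two correlated scalar estimates, not the paper's specific algorithm; all names illustrative), each source is weighted according to its variance and the correlation between the sources' errors:

    ```python
    import math

    def fuse_two(mu1, s1, mu2, s2, rho):
        """Best linear unbiased fusion of two correlated estimates.

        mu1, mu2: source means; s1, s2: source standard deviations;
        rho: correlation between the two sources' errors.
        Returns the fused mean and fused standard deviation.
        """
        v1, v2 = s1 * s1, s2 * s2
        c = rho * s1 * s2                      # error covariance
        denom = v1 + v2 - 2.0 * c
        w1 = (v2 - c) / denom                  # variance-minimizing weight on source 1
        mu = w1 * mu1 + (1.0 - w1) * mu2
        var = (v1 * v2 * (1.0 - rho * rho)) / denom
        return mu, math.sqrt(var)
    ```

    For rho = 0 this reduces to the familiar inverse-variance weighting; a positive correlation shrinks the variance reduction gained by fusing the second source.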

    Efficient model-correction-based reliability analysis of uncertain dynamical systems

    The scope of this paper is to apply a model-correction-based strategy for efficient reliability analysis of uncertain dynamical systems based on a low-fidelity (LF) model whose outcomes are corrected in a probabilistic sense to represent the more realistic outcomes of a high-fidelity (HF) model. In the model-correction approach utilized, the LF model is calibrated to the HF model close to the so-called most probable point in standard normal space, which allows a more realistic assessment of the considered complex dynamical system. Since only a few expensive limit state function evaluations of the HF model are required, an efficient reliability analysis is enabled. In an application example, the LF model describes an existing single-span railway bridge modelled as a simply supported Euler–Bernoulli beam subjected to moving single forces representing the axle loads of a moving train. The HF modelling approach accounts for the bridge–train interaction by modelling the passing train as a mass-spring-damper system, which, however, increases the computational effort of the limit state function evaluations. Failure probabilities evaluated with the model-correction approach are contrasted with failure probabilities of the sophisticated bridge–train interaction model evaluated with the first-order reliability method (FORM), and the differences are discussed. It is demonstrated that the efficiency of the method depends on the correlation between the LF and the HF model. A comparison of the results of FORM and the model-correction-based approach shows that the latter provides reliable failure probability predictions of the HF model while leading to a significant reduction in computational effort.
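    FORM, mentioned above, approximates the failure probability as Pf = Φ(−β), where β is the distance from the origin to the limit-state surface in standard normal space. A minimal sketch for a linear limit state (a toy case, not the bridge model; for linear g the FORM result is exact):

    ```python
    import math

    def phi_cdf(x):
        # Standard normal CDF via the complementary error function
        return 0.5 * math.erfc(-x / math.sqrt(2.0))

    def form_pf_linear(a, b):
        """FORM for a linear limit state g(u) = b + a.u in standard normal space.

        Failure is g(u) <= 0. The reliability index beta = b / ||a|| is the
        distance from the origin to the failure surface, and Pf = Phi(-beta).
        """
        norm_a = math.sqrt(sum(ai * ai for ai in a))
        beta = b / norm_a
        return beta, phi_cdf(-beta)
    ```

    For nonlinear limit states, FORM linearizes g at the most probable point, which is exactly where the model-correction approach calibrates the LF model to the HF model.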

    Analytical Benchmark Problems for Multifidelity Optimization Methods

    The paper presents a collection of analytical benchmark problems specifically selected to provide a set of stress tests for the assessment of multifidelity optimization methods. In addition, the paper discusses a comprehensive ensemble of metrics and criteria recommended for the rigorous and meaningful assessment of the performance of multifidelity strategies and algorithms.
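    A representative example of such analytical benchmarks (whether this particular pair appears in the paper is an assumption) is the widely used one-dimensional Forrester function together with its scaled, linearly shifted low-fidelity variant:

    ```python
    import math

    def f_high(x):
        # High-fidelity Forrester function on [0, 1];
        # global minimum near x ~ 0.757 with value ~ -6.02
        return (6.0 * x - 2.0) ** 2 * math.sin(12.0 * x - 4.0)

    def f_low(x, A=0.5, B=10.0, C=-5.0):
        # Low-fidelity variant: scaled high-fidelity response
        # plus a linear trend, mimicking a cheap biased model
        return A * f_high(x) + B * (x - 0.5) + C
    ```

    Pairs like this are useful stress tests because the low-fidelity model is well correlated with the high-fidelity one yet has a shifted optimum, so a method that trusts the cheap model blindly converges to the wrong point.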

    Multifidelity prediction in wildfire spread simulation: Modeling, uncertainty quantification and sensitivity analysis

    Wildfire behavior predictions typically suffer from significant uncertainty. However, wildfire modeling uncertainties remain largely unquantified in the literature, mainly due to computing constraints. New multifidelity techniques provide a promising opportunity to overcome these limitations. Therefore, this paper explores the applicability of multifidelity approaches to wildland fire spread prediction problems. Using a canonical simulation scenario, we assessed the performance of control variates Monte Carlo (MC) and multilevel MC strategies, achieving speedups of up to 100x in comparison to a standard MC method. This improvement was leveraged to quantify aleatoric uncertainties and analyze the sensitivity of the fire rate of spread (RoS) to weather and fuel parameters using a full-physics fire model, namely the Wildland-Urban Interface Fire Dynamics Simulator (WFDS), at an affordable computational cost. The proposed methodology may also be used to analyze uncertainty in other relevant fire behavior metrics such as heat transfer, fuel consumption and smoke production indicators.
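    The control variates MC strategy mentioned above exploits a cheap correlated model whose mean is known (or cheaply estimated) to cancel variance in the expensive estimate. A minimal stdlib-only sketch with toy stand-in functions (the actual study couples WFDS with reduced fire models; requires Python 3.10+ for statistics.covariance):

    ```python
    import random
    import statistics

    def control_variate_mean(f_hi, f_lo, mu_lo, n, seed=0):
        """Estimate E[f_hi(X)], X ~ N(0, 1), using f_lo as a control variate
        with known mean mu_lo."""
        rng = random.Random(seed)
        xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
        hi = [f_hi(x) for x in xs]
        lo = [f_lo(x) for x in xs]
        # Variance-minimizing coefficient: cov(hi, lo) / var(lo)
        c = statistics.covariance(hi, lo) / statistics.variance(lo)
        corrected = [h - c * (l - mu_lo) for h, l in zip(hi, lo)]
        return statistics.fmean(corrected)
    ```

    The variance reduction, and hence the speedup, grows with the squared correlation between the high- and low-fidelity outputs, mirroring the paper's observation that the gains depend on how well the cheap model tracks the expensive one.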

    Multi-Fidelity Methods for Optimization: A Survey

    Real-world black-box optimization often involves time-consuming or costly experiments and simulations. Multi-fidelity optimization (MFO) stands out as a cost-effective strategy that balances high-fidelity accuracy with computational efficiency through a hierarchical fidelity approach. This survey presents a systematic exploration of MFO, underpinned by a novel text mining framework based on a pre-trained language model. We delve into the foundational principles and methodologies of MFO, focusing on three core components: multi-fidelity surrogate models, fidelity management strategies, and optimization techniques. Additionally, this survey highlights the diverse applications of MFO across several key domains, including machine learning, engineering design optimization, and scientific discovery, showcasing the adaptability and effectiveness of MFO in tackling complex computational challenges. Furthermore, we envision several emerging challenges and prospects in the MFO landscape, spanning scalability, the composition of lower fidelities, and the integration of human-in-the-loop approaches at the algorithmic level. We also address critical issues related to benchmarking and the advancement of open science within the MFO community. Overall, this survey aims to catalyze further research and foster collaborations in MFO, setting the stage for future innovations and breakthroughs in the field.
    Comment: 47 pages, 9 figures
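    Of the three core components named above, multi-fidelity surrogate models are the easiest to illustrate. A minimal additive-correction surrogate (one common construction; the function names are illustrative) fits the HF-LF discrepancy at a few expensive HF samples and adds it to the cheap LF model:

    ```python
    def fit_linear(xs, ys):
        # Ordinary least squares for y = a*x + b (closed form)
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
        return a, my - a * mx

    def make_mf_surrogate(f_lo, x_hf, y_hf):
        """Additive-correction surrogate f_hat(x) = f_lo(x) + delta(x),
        where delta is a linear fit to the HF-LF discrepancy at the HF samples."""
        d = [y - f_lo(x) for x, y in zip(x_hf, y_hf)]
        a, b = fit_linear(x_hf, d)
        return lambda x: f_lo(x) + a * x + b
    ```

    When the true discrepancy really is linear, the surrogate reproduces the HF model exactly from just a handful of HF evaluations; in practice the discrepancy model is often a Gaussian process rather than a line.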

    A review of surrogate models and their application to groundwater modeling

    The spatially and temporally variable parameters and inputs to complex groundwater models typically result in long runtimes which hinder comprehensive calibration, sensitivity, and uncertainty analysis. Surrogate modeling aims to provide a simpler, and hence faster, model which emulates the specified output of a more complex model as a function of its inputs and parameters. In this review paper, we summarize surrogate modeling techniques in three categories: data-driven, projection, and hierarchical-based approaches. Data-driven surrogates approximate a groundwater model through an empirical model that captures the input-output mapping of the original model. Projection-based models reduce the dimensionality of the parameter space by projecting the governing equations onto a basis of orthonormal vectors. In hierarchical or multifidelity methods the surrogate is created by simplifying the representation of the physical system, such as by ignoring certain processes, or reducing the numerical resolution. In discussing the application of these methods to groundwater modeling, we note several imbalances in the existing literature: a large body of work on data-driven approaches seemingly ignores major drawbacks to the methods; only a fraction of the literature focuses on creating surrogates to reproduce outputs of fully distributed groundwater models, despite these being ubiquitous in practice; and a number of the more advanced surrogate modeling methods are yet to be fully applied in a groundwater modeling context.
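    Data-driven surrogates of the first category can be as simple as an inverse-distance-weighted emulator over sampled input-output pairs of the expensive model (a deliberately minimal sketch; practical groundwater emulators typically use polynomials, Gaussian processes, or neural networks):

    ```python
    def idw_surrogate(x_train, y_train, power=2.0):
        """Inverse-distance-weighted emulator over 1-D training samples.

        Predictions are weighted averages of training outputs, with
        weights 1/d**power, so the emulator interpolates the data exactly.
        """
        def predict(x):
            num, den = 0.0, 0.0
            for xt, yt in zip(x_train, y_train):
                d = abs(x - xt)
                if d == 0.0:
                    return yt          # exact at training points
                w = 1.0 / d ** power
                num += w * yt
                den += w
            return num / den
        return predict
    ```

    Once trained on runs of the full model, calibration or uncertainty analysis queries the cheap emulator instead, which is precisely the trade-off, speed versus fidelity to the original model, that the review weighs across its three categories.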