A PC-Kriging-HDMR integrated with an adaptive sequential sampling strategy for high-dimensional approximate modeling
High-dimensional complex multi-parameter problems are prevalent in
engineering, exceeding the capabilities of traditional surrogate models
designed for low/medium-dimensional problems. These models face the curse of
dimensionality, resulting in decreased modeling accuracy as the design
parameter space expands. Furthermore, the lack of a parameter decoupling
mechanism hinders the identification of couplings between design variables,
particularly in highly nonlinear cases. To address these challenges and enhance
prediction accuracy while reducing sample demand, this paper proposes a
PC-Kriging-HDMR approximate modeling method within the framework of Cut-HDMR.
The method leverages the precision of PC-Kriging and optimizes test point
placement through a multi-stage adaptive sequential sampling strategy. This
strategy encompasses a first-stage adaptive proportional sampling criterion and
a second-stage central-based maximum entropy criterion. Numerical tests and a
practical application involving a cantilever beam demonstrate the advantages of
the proposed method. Key findings include: (1) The performance of traditional
single-surrogate models, such as Kriging, significantly deteriorates in
high-dimensional nonlinear problems compared to combined surrogate models under
the Cut-HDMR framework (e.g., Kriging-HDMR, PCE-HDMR, SVR-HDMR, MLS-HDMR, and
PC-Kriging-HDMR); (2) The number of samples required for PC-Kriging-HDMR
modeling increases polynomially rather than exponentially as the parameter
space expands, resulting in substantial computational cost reduction; (3) Among
existing Cut-HDMR methods, no single approach outperforms the others in all
aspects. However, PC-Kriging-HDMR exhibits improved modeling accuracy and
efficiency within the desired improvement range compared to PCE-HDMR and
Kriging-HDMR, demonstrating robustness. Comment: 17 pages with 7 figures and 9 tables
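The Cut-HDMR framework underlying the proposed method can be sketched at first order: the model is expanded around a cut center, and each component function is built from samples taken along one axis at a time. The sketch below is a minimal illustration using plain linear interpolation for the component functions (the paper fits PC-Kriging surrogates instead, and adds adaptive sampling and second-order terms; all names here are hypothetical):

```python
# Minimal first-order Cut-HDMR sketch (illustrative only, not the paper's
# PC-Kriging-HDMR): f(x) ~ f0 + sum_i f_i(x_i), where f0 = f(center) and
# f_i is built from samples along axis i through the cut center.
import numpy as np

def cut_hdmr_first_order(f, center, bounds, n_axis=9):
    """Return a first-order Cut-HDMR predictor for f around `center`."""
    d = len(center)
    f0 = f(center)
    grids, values = [], []
    for i in range(d):
        xi = np.linspace(bounds[i][0], bounds[i][1], n_axis)
        vi = []
        for t in xi:
            x = center.copy()
            x[i] = t
            vi.append(f(x) - f0)            # component function f_i(x_i)
        grids.append(xi)
        values.append(np.array(vi))

    def predict(x):
        return f0 + sum(np.interp(x[i], grids[i], values[i]) for i in range(d))
    return predict

# Toy additive function: first-order Cut-HDMR reproduces it exactly.
f = lambda x: x[0] ** 2 + 3.0 * x[1] + 1.0
pred = cut_hdmr_first_order(f, center=np.array([0.0, 0.0]),
                            bounds=[(-2, 2), (-2, 2)])
print(pred(np.array([1.0, -1.0])))          # -1.0, matching f([1, -1])
```

Note that the sample budget here is `d * n_axis + 1`, growing linearly in the dimension `d`; this is the mechanism behind finding (2), where sample demand grows polynomially rather than exponentially with the parameter space.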
Multifidelity Modeling by Polynomial Chaos-Based Cokriging to Enable Efficient Model-Based Reliability Analysis of NDT Systems
This work proposes a novel multifidelity metamodeling approach, the polynomial chaos-based Cokriging (PC-Cokriging). The proposed approach is used for fast uncertainty propagation in a reliability analysis of nondestructive testing systems using model-assisted probability of detection (MAPOD). In particular, PC-Cokriging is a multivariate version of polynomial chaos-based Kriging (PC-Kriging), which aims at combining the advantages of regression-based polynomial chaos expansions and interpolation-based Kriging metamodeling. Following a process similar to Cokriging, PC-Cokriging advances PC-Kriging by enabling the incorporation of multifidelity physics information. The proposed PC-Cokriging is demonstrated on two analytical functions and three ultrasonic testing MAPOD cases. The results show that PC-Cokriging outperforms state-of-the-art metamodeling approaches given the same number of training points. Specifically, PC-Cokriging reduces the high-fidelity training sample cost of the Kriging and PCE metamodels by over one order of magnitude, and that of the PC-Kriging and conventional Cokriging multifidelity metamodels by up to 50%, to reach the same accuracy level (defined as the root mean squared error being no greater than 1% of the standard deviation of the testing points). The accuracy and robustness of the proposed method on the key MAPOD metrics across various detection thresholds are investigated, and satisfactory results are obtained.
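The accuracy level quoted above is a concrete stopping criterion that is easy to state in code. The following sketch (an assumed reading of the criterion, not code from the paper) checks whether a metamodel's test-set RMSE is within 1% of the standard deviation of the test responses:

```python
# Sketch of the accuracy criterion: a metamodel reaches the target level
# when its RMSE on test points is no greater than 1% of the standard
# deviation of the test responses. This is an assumed formalization.
import numpy as np

def meets_accuracy(y_true, y_pred, tol=0.01):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse <= tol * np.std(y_true)

y_true = np.array([1.0, 2.0, 3.0, 4.0])
print(meets_accuracy(y_true, y_true + 1e-4))  # True: error far below 1% of std
print(meets_accuracy(y_true, y_true + 0.5))   # False: error ~45% of std
```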
Development of reduced polynomial chaos-Kriging metamodel for uncertainty quantification of computational aerodynamics
2018 Summer. Includes bibliographical references. Computational fluid dynamics (CFD) simulations are a critical component of the design and development of aerodynamic bodies. However, as engineers attempt to capture more detailed physics, the computational cost of simulations increases. This limits the ability of engineers to use robust or multidisciplinary design methodologies for practical engineering applications, because the computational model is too expensive to evaluate for uncertainty quantification studies and off-design performance analysis. Metamodels (surrogate models) are closed-form mathematical approximations fit to a small number of simulation responses; they remedy this situation by estimating off-design performance and stochastic responses of the CFD simulation at far lower computational cost. The development of a reduced polynomial chaos-Kriging (RPC-K) metamodel is another step towards eliminating simulation gridlock by capturing the relevant physics of the problem in a cheap-to-evaluate metamodel using fewer CFD simulations. The RPC-K metamodel improves on existing technologies because its model reduction methodology eliminates the design parameters that contribute little variance to the problem before fitting a high-fidelity metamodel to the remaining data. The metamodel can capture non-linear physics because it combines the long-range trend information of a polynomial chaos expansion with local variations in the simulation data through Kriging. In this thesis, the RPC-K metamodel is developed, validated on a convection-diffusion-reaction problem, and applied to the NACA 4412 airfoil and aircraft engine nacelle problems. This research demonstrates the metamodel's effectiveness over existing polynomial chaos and Kriging metamodels for aerodynamics applications because of its ability to fit non-linear fluid flows with far fewer CFD simulations.
This research will allow aerospace engineers to more effectively take advantage of detailed CFD simulations in the development of next-generation aerodynamic bodies through the use of the RPC-K metamodel to save computational cost
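The model reduction step described above, dropping design parameters that contribute little variance before fitting the expensive metamodel, can be illustrated with a simple one-at-a-time screening proxy. This is a simplified stand-in for the thesis's reduction methodology, with all names and the threshold chosen here for illustration:

```python
# Illustrative variance-based parameter screening in the spirit of RPC-K's
# model reduction (a one-at-a-time proxy, not the thesis's exact method):
# parameters whose one-dimensional response variance falls below a fraction
# of the total are dropped before fitting the high-fidelity metamodel.
import numpy as np

def screen_parameters(f, center, bounds, n=16, keep_frac=0.05):
    d = len(center)
    var = np.empty(d)
    for i in range(d):
        xi = np.linspace(bounds[i][0], bounds[i][1], n)
        y = []
        for t in xi:
            x = center.copy()
            x[i] = t
            y.append(f(x))
        var[i] = np.var(y)                 # variance along axis i alone
    keep = var >= keep_frac * var.sum()    # drop low-variance parameters
    return keep, var

# x[1] contributes almost no variance, so it is screened out.
f = lambda x: 10.0 * x[0] ** 2 + 0.01 * x[1]
keep, var = screen_parameters(f, np.zeros(2), [(-1, 1), (-1, 1)])
print(keep)                                # only x[0] is retained
```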
A Multi-Fidelity Successive Response Surface Method for Crashworthiness Optimization Problems
Due to the high computational burden and the high non-linearity of the responses, crashworthiness optimizations are notoriously hard-to-solve challenges. Among various approaches, methods like the Successive Response Surface Method (SRSM) have stood out for their efficiency in
enhancing baseline designs within a few iterations. However, these methods have limitations that restrict their application. The minimum iterative resampling they require is often computationally prohibitive. Moreover, the surrogate models are conventionally constructed using Polynomial Response Surfaces (PRS), a method with limited versatility that is prone to overfitting and incapable of quantifying uncertainty. In addition, the lack of continuity between successive response surfaces results in suboptimal predictions. This paper introduces the Multi-Fidelity Successive Response Surface (MF-SRS) method, a Gaussian process-based approach that leverages a non-linear multi-fidelity formulation for more accurate and efficient predictions than SRSM. After initial testing on synthetic problems, the method is applied to a real-world crashworthiness task: optimizing a bumper cross member and crash box system. The results, benchmarked against SRSM and the Gaussian Process Successive
Response Surface (GP-SRS), a single-fidelity Gaussian process-driven extension of SRSM, show that MF-SRS offers distinct advantages. Specifically, it improves the optimal specific energy absorbed achieved by SRSM by 14%, revealing its potential for future applications.
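The core multifidelity idea, correcting a cheap low-fidelity model with a handful of expensive high-fidelity samples, can be sketched with the classic scale-plus-discrepancy form. MF-SRS itself uses a non-linear Gaussian-process coupling; the linear least-squares version below is only a minimal stand-in, with both toy models invented for the example:

```python
# Minimal multifidelity correction sketch: f_high(x) ~ rho * f_low(x) + delta(x),
# with rho and a linear discrepancy fitted by least squares. MF-SRS uses a
# non-linear Gaussian-process coupling; this is a deliberate simplification.
import numpy as np

f_low  = lambda x: np.sin(8.0 * x)                  # cheap model (toy)
f_high = lambda x: 1.2 * np.sin(8.0 * x) + 0.3 * x  # expensive truth (toy)

x_hi = np.linspace(0.0, 1.0, 6)                     # few expensive samples
A = np.column_stack([f_low(x_hi), np.ones_like(x_hi), x_hi])
coef, *_ = np.linalg.lstsq(A, f_high(x_hi), rcond=None)
rho, b0, b1 = coef                                  # scale + linear discrepancy

surrogate = lambda x: rho * f_low(x) + b0 + b1 * x
x_test = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(surrogate(x_test) - f_high(x_test)))
print(f"max error: {err:.2e}")   # near machine precision: truth lies in the basis span
```

Six expensive evaluations suffice here because the high-fidelity response happens to lie exactly in the span of the correction basis; in practice the discrepancy is non-trivial and a Gaussian process models it instead.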
Computational automation for efficient design of acoustic metamaterials
Acoustic metamaterials (AMMs) are an exciting technology because they can respond to vibrations in ways that are impossible to achieve with conventional materials. However, realizing AMMs requires engineering design to connect first-principles research with the production of parts that perform as expected. Designing AMMs is challenging because evaluating designs is costly and manufacturing metamaterials requires precise techniques with small minimum resolutions. To address these challenges, new computational tools are necessary to aid design. This work proposes three tasks that improve AMM design capabilities while remaining extensible to other engineering design automation tasks. The first task is to develop a design exploration tool that improves the computational efficiency of identifying sets of high-performing designs in a design space that is sparse and comprises mixed discrete/continuous data. The second task is to develop a process for designers to evaluate the manufacturability of difficult-to-manufacture parts and drive co-development of manufacturing methods and AMMs. In the final task, a machine learning-based method is developed to efficiently model AMMs with heterogeneous arrangements of their microstructures, for which strict homogenization is infeasible. The outcomes from completing these tasks will provide a significant and novel improvement over existing methods of designing AMMs.