Evolutionary model type selection for global surrogate modeling
Due to the scale and computational complexity of currently used simulation codes, global surrogate models (metamodels) have become indispensable tools for exploring and understanding the design space. Because of their compact formulation, they are cheap to evaluate and thus readily facilitate visualization, design space exploration, rapid prototyping, and sensitivity analysis. They can also be used as accurate building blocks in design packages or larger simulation environments. Consequently, there is great interest in techniques that facilitate the construction of such approximation models while minimizing the computational cost and maximizing model accuracy. Many surrogate model types exist (Support Vector Machines, Kriging, Neural Networks, etc.), but no type is optimal in all circumstances, nor is there any hard theory available to guide this choice. In this paper we present an automatic approach to the model type selection problem. We describe an adaptive global surrogate modeling environment with adaptive sampling, driven by speciated evolution. Different model types are evolved cooperatively using a Genetic Algorithm (heterogeneous evolution) and compete to approximate the iteratively selected data. In this way the optimal model type and complexity for a given data set or simulation code can be determined dynamically. Its utility and performance are demonstrated on a number of problems where it outperforms traditional sequential execution of each model type.
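The model-type competition described above can be illustrated with a minimal sketch. Here the genetic algorithm is replaced by a plain held-out-error comparison, the toy simulator and polynomial "model types" are hypothetical stand-ins for Kriging, SVMs, and neural networks, and no adaptive sampling is performed:

```python
import math

# Toy "simulation code": the expensive function we want to approximate.
def simulator(x):
    return math.sin(3 * x) + 0.5 * x

# Training and validation samples (in the paper these are chosen adaptively).
train = [(x, simulator(x)) for x in [i / 10 for i in range(11)]]
valid = [(x, simulator(x)) for x in [0.05 + i / 10 for i in range(10)]]

def fit_polynomial(data, degree):
    # Least-squares polynomial fit via the normal equations,
    # solved by Gaussian elimination with partial pivoting.
    n = degree + 1
    A = [[sum(x ** (i + j) for x, _ in data) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in data) for i in range(n)]
    for col in range(n):                      # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for r in reversed(range(n)):              # back substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, n))) / A[r][r]
    return lambda x: sum(c * x ** i for i, c in enumerate(coef))

def validation_error(model):
    return sum((model(x) - y) ** 2 for x, y in valid) / len(valid)

# "Heterogeneous evolution" collapsed to its core: candidate model types
# compete on held-out data, and the best approximator survives.
candidates = {f"poly{d}": fit_polynomial(train, d) for d in (1, 2, 3, 4)}
best = min(candidates, key=lambda name: validation_error(candidates[name]))
```

The evolutionary machinery in the paper additionally adapts each model's hyperparameters and re-selects data between generations; this sketch shows only the selection criterion.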
Adaptive active subspace-based metamodeling for high-dimensional reliability analysis
To address the challenges of reliability analysis in high-dimensional probability spaces, this paper proposes a new metamodeling method that couples active subspace, heteroscedastic Gaussian process, and active learning. The active subspace is leveraged to identify low-dimensional salient features of a high-dimensional computational model. A surrogate computational model is built in the low-dimensional feature space by a heteroscedastic Gaussian process. Active learning adaptively guides the surrogate model training toward the critical region that significantly contributes to the failure probability. A critical trait of the proposed method is that the three main ingredients (active subspace, heteroscedastic Gaussian process, and active learning) are coupled to adaptively optimize the feature space mapping in conjunction with the surrogate modeling. This coupling empowers the proposed method to accurately solve nontrivial high-dimensional reliability problems via low-dimensional surrogate modeling. Finally, numerical examples of a high-dimensional nonlinear function and structural engineering applications are investigated to verify the performance of the proposed method.
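The active-subspace ingredient can be sketched in a few lines: estimate C = E[&nabla;f &nabla;f^T] by Monte Carlo over the input distribution and take its dominant eigenvector as the low-dimensional feature direction. The two-dimensional test function below is a hypothetical stand-in, not an example from the paper:

```python
import math
import random

random.seed(1)

# Toy model that varies only along the direction w = (0.7, 0.3).
def grad_f(x1, x2):
    f = math.exp(0.7 * x1 + 0.3 * x2)
    return (0.7 * f, 0.3 * f)

# Monte Carlo estimate of C = E[grad f grad f^T].
C = [[0.0, 0.0], [0.0, 0.0]]
n = 2000
for _ in range(n):
    g = grad_f(random.uniform(-1, 1), random.uniform(-1, 1))
    for i in range(2):
        for j in range(2):
            C[i][j] += g[i] * g[j] / n

# Power iteration: the dominant eigenvector of C spans the active subspace.
v = (1.0, 0.0)
for _ in range(50):
    w = (C[0][0] * v[0] + C[0][1] * v[1], C[1][0] * v[0] + C[1][1] * v[1])
    norm = math.hypot(w[0], w[1])
    v = (w[0] / norm, w[1] / norm)

# v aligns with (0.7, 0.3) normalised; the heteroscedastic GP surrogate is
# then trained on the 1-D feature y = v . x instead of the full input space.
```

In the proposed method this mapping is not fixed once: it is re-optimized jointly with the surrogate as active learning adds samples near the failure region.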
MODEL-BASED PREDICTIVE ANALYTICS FOR ADDITIVE AND SMART MANUFACTURING
Qualification and certification for additive and smart manufacturing systems can be uncertain and very costly. Using available historical data can mitigate some costs of producing and testing sample parts. However, use of such data lacks the flexibility to represent specific new problems, which decreases predictive accuracy and efficiency. To address these compelling needs, this dissertation introduces modeling techniques that can proactively estimate results expected from additive and smart manufacturing processes, swiftly and with practical levels of accuracy and reliability. More specifically, this research addresses the current challenges and limitations posed by use of available data and the high costs of new data by tailoring statistics-based metamodeling techniques to enable affordable prediction of these systems.
The result is an integrated approach to customize and build predictive metamodels for the unique features of additive and smart manufacturing systems. This integrated approach is composed of five main parts that cover the broad spectrum of requirements. A domain-driven metamodeling approach uses physics-based knowledge to optimally select the most appropriate metamodeling algorithm without reliance upon statistical data. A maximum predictive error updating method iteratively improves predictability from a given dataset. A grey-box metamodeling approach combines statistics-based black-box and physics-based white-box models to significantly increase predictive accuracy with less expensive data overall. To improve computational efficiency for large datasets, a dynamic metamodeling method modifies the traditional Kriging technique to improve its efficiency and predictability for smart manufacturing systems. Finally, a super-metamodeling method optimizes results regardless of problem conditions by avoiding the challenge of selecting the most appropriate metamodeling algorithm.
To realize the benefits of all five approaches, an integrated metamodeling process was developed and implemented into a tool package to systematically select the suitable algorithm, sampling method, and combination of models. All the functions of this tool package were validated and demonstrated using two empirical datasets from additive manufacturing processes.
Digital-Twins towards Cyber-Physical Systems: A Brief Survey
Cyber-Physical Systems (CPS) are integrations of computation and physical processes. Physical processes are monitored and controlled by embedded computers and networks, which frequently have feedback loops where physical processes affect computations and vice versa. To ease the analysis of a system, the costly physical plants can be replaced by high-fidelity virtual models that provide a framework for Digital-Twins (DT). This paper aims to briefly review the state-of-the-art and recent developments in DT and CPS. Three main components of CPS, namely communication, control, and computation, are reviewed. In addition, the main tools and methodologies required for implementing practical DT are discussed, following the main applications of DT in the fourth industrial revolution in areas such as smart manufacturing, sixth-generation wireless (6G), health, production, and energy. Finally, the main limitations and directions for future work are discussed, followed by a short guideline for real-world application of DT towards CPS.
A data driven approach in less expensive robust transmitting coverage and power optimization
This paper aims to develop a new reduced-cost algorithm for multi-objective robust transmitter placement under uncertainty. Toward this end, we propose a new hybrid Kriging/Grey Wolf Optimizer (GWO) approach combined with robust design optimization to estimate the Pareto frontier by searching for robustness as well as accuracy (lower objective function values) in the design space. We consider minimization of the energy consumed for transmitting as well as maximization of signal coverage in a multi-objective robust optimization model. The reliability of the model in controlling signal overlap for multiple transmitting antennas is also provided. To reduce computational cost, rather than evaluating all receiver test points in each optimization iteration, the proposed method approximates signal coverage using Kriging interpolation to obtain optimal transmitter positions. The results demonstrate the utility and efficiency of the proposed method in rendering the robust optimal design and analyzing the sensitivity of the transmitter placement problem at substantially lower computational cost (350% and 320% less computational time than standalone GWO and NSGA-II, respectively).
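The core trick, querying a cheap Kriging interpolant inside the optimizer loop instead of the expensive coverage evaluation, can be sketched as follows. Ordinary Kriging is reduced here to plain Gaussian-kernel interpolation, the GWO search is replaced by a grid search, and the one-dimensional coverage function is a hypothetical stand-in:

```python
import math

# Expensive "coverage" evaluation (stand-in for summing signal strength
# over all receiver test points for a given transmitter position x).
def coverage(x):
    return math.exp(-(x - 0.6) ** 2 / 0.05)

# Sample the expensive model at a few transmitter positions only.
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [coverage(x) for x in xs]

# Kriging reduced to its core: Gaussian-kernel interpolation, K w = y.
def kern(a, b, theta=20.0):
    return math.exp(-theta * (a - b) ** 2)

n = len(xs)
K = [[kern(xs[i], xs[j]) + (1e-9 if i == j else 0.0) for j in range(n)]
     for i in range(n)]
w = ys[:]  # solve K w = y in place by Gaussian elimination
for col in range(n):
    piv = max(range(col, n), key=lambda r: abs(K[r][col]))
    K[col], K[piv] = K[piv], K[col]
    w[col], w[piv] = w[piv], w[col]
    for r in range(col + 1, n):
        f = K[r][col] / K[col][col]
        for c in range(col, n):
            K[r][c] -= f * K[col][c]
        w[r] -= f * w[col]
for r in reversed(range(n)):
    w[r] = (w[r] - sum(K[r][c] * w[c] for c in range(r + 1, n))) / K[r][r]

def surrogate(x):
    return sum(w[i] * kern(x, xs[i]) for i in range(n))

# The optimizer (GWO in the paper, grid search here) queries only the
# cheap surrogate, never re-evaluating every receiver test point.
grid = [i / 200 for i in range(201)]
best_x = max(grid, key=surrogate)
```

The savings reported in the abstract come from exactly this substitution: each optimizer iteration costs a handful of kernel evaluations instead of a full sweep over receiver points.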
Multiobjective Simulation Optimization Using Enhanced Evolutionary Algorithm Approaches
In today's competitive business environment, a firm's ability to make the correct, critical decisions can be translated into a great competitive advantage. Most of these critical real-world decisions involve the optimization not only of multiple objectives simultaneously, but also of conflicting objectives, where improving one objective may degrade the performance of one or more of the other objectives. Traditional approaches for solving multiobjective optimization problems typically try to scalarize the multiple objectives into a single objective. This transforms the original multiple optimization problem formulation into a single objective optimization problem with a single solution. However, the drawbacks to these traditional approaches have motivated researchers and practitioners to seek alternative techniques that yield a set of Pareto optimal solutions rather than only a single solution. The problem becomes much more complicated in stochastic environments, when the objectives take on uncertain (or noisy) values due to random influences within the system being optimized, which is the case in real-world environments. Moreover, in stochastic environments, a solution approach should be sufficiently robust and/or capable of handling the uncertainty of the objective values. This makes the development of effective solution techniques that generate Pareto optimal solutions within these problem environments even more challenging than in their deterministic counterparts. Furthermore, many real-world problems involve complicated, black-box objective functions, making a large number of solution evaluations computationally- and/or financially-prohibitive. This is often the case when complex computer simulation models are used to repeatedly evaluate possible solutions in search of the best solution (or set of solutions). Therefore, multiobjective optimization approaches capable of rapidly finding a diverse set of Pareto optimal solutions would be greatly beneficial.
This research proposes two new multiobjective evolutionary algorithms (MOEAs), called fast Pareto genetic algorithm (FPGA) and stochastic Pareto genetic algorithm (SPGA), for optimization problems with multiple deterministic objectives and stochastic objectives, respectively. New search operators are introduced and employed to enhance the algorithms' performance in terms of converging fast to the true Pareto optimal frontier while maintaining a diverse set of nondominated solutions along the Pareto optimal front. New concepts of solution dominance are defined for better discrimination among competing solutions in stochastic environments. SPGA uses a solution ranking strategy based on these new concepts. Computational results for a suite of published test problems indicate that both FPGA and SPGA are promising approaches. The results show that both FPGA and SPGA outperform the improved nondominated sorting genetic algorithm (NSGA-II), a widely considered benchmark in the MOEA research community, in terms of fast convergence to the true Pareto optimal frontier and diversity among the solutions along the front. The results also show that FPGA and SPGA require far fewer solution evaluations than NSGA-II, which is crucial in computationally expensive simulation modeling applications.
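The Pareto-dominance relation underlying all of these algorithms can be stated in a few lines (minimization assumed; the sample points are illustrative, not from the dissertation's test suite):

```python
# Pareto dominance for minimization: a dominates b if a is no worse in
# every objective and strictly better in at least one.
def dominates(a, b):
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

# A point is nondominated if no other point dominates it.
def nondominated(points):
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Four candidate solutions with two conflicting objectives:
pts = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]
front = nondominated(pts)   # (3, 3) is dominated by (2, 2)
```

The stochastic dominance concepts introduced for SPGA generalize this deterministic check so that noisy objective estimates can still be ranked reliably.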
Surrogate-assisted reliability-based design optimization: a survey and a new general framework
International audience
Rotational and ply-level uncertainty in response of composite shallow conical shells
This paper presents the quantification of rotational and ply-level uncertainty of the random natural frequency for laminated composite conical shells using a surrogate modeling approach. The stochastic eigenvalue problem is solved by using the QR iteration algorithm. Sensitivity analysis is carried out to address the influence of different input parameters on the output natural frequencies. The sampling size and computational cost are reduced by employing the present approach compared to direct Monte Carlo simulation. The stochastic mode shapes are also depicted for a typical laminate configuration. Statistical analysis is presented to illustrate the results and their performance.
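The cost saving over direct Monte Carlo can be sketched as follows: a handful of expensive eigensolver runs build a cheap surrogate, and the statistics of the random frequency are then estimated on the surrogate. The frequency response below is a hypothetical smooth stand-in, not the conical-shell model:

```python
import math
import random

random.seed(2)

# Stand-in for the expensive eigensolver: natural frequency as a
# function of a random ply angle (hypothetical smooth response).
def frequency(theta):
    return 100.0 + 10.0 * math.cos(2 * theta)

# Three expensive evaluations are enough for a quadratic surrogate.
t0, t1, t2 = -0.3, 0.0, 0.3
f0, f1, f2 = frequency(t0), frequency(t1), frequency(t2)

def surrogate(t):
    # Quadratic Lagrange interpolant through the three solver runs.
    return (f0 * (t - t1) * (t - t2) / ((t0 - t1) * (t0 - t2))
            + f1 * (t - t0) * (t - t2) / ((t1 - t0) * (t1 - t2))
            + f2 * (t - t0) * (t - t1) / ((t2 - t0) * (t2 - t1)))

# Monte Carlo statistics on the surrogate: thousands of samples
# cost only 3 expensive solver calls instead of thousands.
samples = [surrogate(random.gauss(0.0, 0.1)) for _ in range(5000)]
mean = sum(samples) / len(samples)
```

The paper's surrogate spans many stochastic inputs rather than one, but the accounting is the same: the expensive solver is called only for the design-of-experiments points.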
Computational automation for efficient design of acoustic metamaterials
Acoustic metamaterials (AMMs) are an exciting technology because they are capable of responding to vibrations in ways that are impossible to achieve with conventional materials. However, realization of AMMs requires engineering design to provide a connection between first-principles research and production of parts that perform as expected. Designing AMMs is a challenging endeavor because evaluating designs is costly and manufacturing metamaterials requires precise techniques with small minimum resolutions. To address these challenges, new computational tools are necessary to aid design. This work proposes three tasks that improve the capabilities of AMM design while being extensible to other engineering design automation tasks. The first task is to develop a design exploration tool that improves the computational efficiency of identifying sets of high-performing designs in a design space that is sparse and comprises mixed discrete/continuous data. The second task is to develop a process for designers to evaluate manufacturability of difficult-to-manufacture parts and drive co-development of manufacturing methods and AMMs. In the final task, a machine learning based method is developed to efficiently model AMMs with heterogeneous arrangements of their microstructures, for which strict homogenization is infeasible. The outcomes from completing these tasks will provide a significant and novel improvement over existing methods of designing AMMs.
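The first task, identifying sets of high-performing designs in a mixed discrete/continuous space, can be sketched with plain random exploration that retains the top designs rather than a single optimum. The lattice types, thickness range, and objective below are hypothetical stand-ins:

```python
import heapq
import random

random.seed(3)

# Hypothetical AMM unit-cell design: a discrete lattice type plus a
# continuous wall thickness; performance is a cheap stand-in objective.
LATTICES = ["honeycomb", "re-entrant", "chiral"]

def performance(lattice, thickness):
    base = {"honeycomb": 1.0, "re-entrant": 1.4, "chiral": 1.2}[lattice]
    return base - (thickness - 0.5) ** 2   # peaks at thickness 0.5

# Random exploration over the mixed space, retaining a *set* of
# high-performing designs for the designer to compare.
designs = [(random.choice(LATTICES), random.uniform(0.1, 0.9))
           for _ in range(500)]
top5 = heapq.nlargest(5, designs, key=lambda d: performance(*d))
```

The dissertation's tool replaces this brute-force sampling with a method that stays efficient when each evaluation is a costly simulation, but the output contract is the same: a diverse set of top designs, not one point.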
Robust optimal design of FOPID controller for five bar linkage robot in a cyber-physical system: a new simulation-optimization approach
This paper aims to further increase the reliability of optimal results by setting the simulation conditions as close as possible to real operation, creating a Cyber-Physical System (CPS) view for the installation of the Fractional-Order PID (FOPID) controller. For this purpose, we consider two different sources of variability in such a CPS control model. The first is the changeability of the control model's target (multiple setpoints) due to environmental noise factors, and the second is sensor anomalies arising in the feedback loop. We develop a new approach to optimize two objective functions under uncertainty, signal energy control and response error control, while achieving robustness against the sources of variability at the lowest computational cost. A new hybrid surrogate-metaheuristic approach is developed that uses Particle Swarm Optimization (PSO) to update a Gaussian Process (GP) surrogate for sequential improvement of the robust optimal result. The application of efficient global optimization is extended to estimate the surrogate prediction error at lower computational cost using a jackknife leave-one-out estimator. The paper examines the challenges of such robust multi-objective optimization for FOPID control of a five-bar linkage robot manipulator. The results show the applicability and effectiveness of the proposed method in obtaining robustness and reliability in a CPS control system while containing the required computational effort.
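The jackknife leave-one-out estimator mentioned above can be sketched independently of the GP machinery: refit the surrogate with each sample held out and score the prediction at the held-out point. A linear fit stands in for the GP surrogate, and the data are hypothetical:

```python
# Simple surrogate stand-in: least-squares line through (x, y) pairs.
def fit_line(data):
    n = len(data)
    xbar = sum(x for x, _ in data) / n
    ybar = sum(y for _, y in data) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in data)
         / sum((x - xbar) ** 2 for x, _ in data))
    a = ybar - b * xbar
    return lambda x: a + b * x

# Hypothetical design-of-experiments samples.
data = [(0.0, 0.1), (1.0, 0.9), (2.0, 2.2), (3.0, 2.9), (4.0, 4.1)]

def loo_errors(data, fit):
    # Jackknife: refit without point i, predict at the held-out x.
    errs = []
    for i, (x, y) in enumerate(data):
        model = fit(data[:i] + data[i + 1:])
        errs.append((model(x) - y) ** 2)
    return errs

mse = sum(loo_errors(data, fit_line)) / len(data)
```

This gives a prediction-error estimate without any extra expensive samples, which is why the paper uses it to drive the sequential (efficient-global-optimization-style) improvement cheaply.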