1,520 research outputs found

    Developing Efficient Strategies For Global Sensitivity Analysis Of Complex Environmental Systems Models

    Complex Environmental Systems Models (CESMs) have been developed and applied as vital tools to tackle the ecological, water, food, and energy crises that humanity faces, and have been used widely to support decision-making about the management of the quality and quantity of Earth's resources. CESMs are often controlled by many interacting and uncertain parameters and typically integrate data from multiple sources at different spatio-temporal scales, which makes them highly complex. Global Sensitivity Analysis (GSA) techniques have proven promising for deepening our understanding of model complexity and parameter interactions, and for providing helpful recommendations for further model development and data acquisition. Aside from this complexity, the computationally expensive nature of CESMs precludes effective application of existing GSA techniques to quantifying the global influence of each parameter on the variability of the CESMs' outputs, because a comprehensive sensitivity analysis often requires a very large number of model runs. There is therefore a need to break down this barrier through the development of more efficient strategies for sensitivity analysis. The research undertaken in this dissertation focuses on alleviating the computational burden associated with GSA of computationally expensive CESMs by developing efficiency-increasing strategies for robust sensitivity analysis. This is accomplished by: (1) proposing an efficient sequential sampling strategy for robust sampling-based analysis of CESMs; (2) developing an automated parameter grouping strategy for high-dimensional CESMs; (3) introducing a new robustness measure for convergence assessment of GSA methods; and (4) investigating time-saving strategies for handling simulation failures/crashes during the sensitivity analysis of computationally expensive CESMs. This dissertation provides a set of innovative numerical techniques that can be used in conjunction with any GSA algorithm and integrated into model building and systems analysis procedures in any field where models are used. A range of analytical test functions and environmental models of varying complexity and dimensionality are used throughout this research to test the performance of the proposed methods. These methods, which are embedded in the VARS-TOOL software package, can also provide information useful for diagnostic testing, parameter identifiability analysis, model simplification, model calibration, and experimental design. They can be further applied to a range of decision-making problems, such as characterizing the main causes of risk in the context of probabilistic risk assessment and exploring the CESMs' sensitivity to a wide range of plausible future changes (e.g., hydrometeorological conditions) in the context of scenario analysis.
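    As an illustrative aside (not the VARS-TOOL implementation), the sketch below estimates first-order and total-order Sobol sensitivity indices with the standard pick-freeze (Saltelli/Jansen) estimators on the Ishigami function, one of the analytical test functions commonly used in this literature; the sample size, bounds, and function parameters are assumptions chosen for demonstration only.

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Ishigami analytical test function, a common GSA benchmark."""
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

def sobol_indices(model, d, n=2**14, lo=-np.pi, hi=np.pi, seed=0):
    """First-order and total-order Sobol indices via the Saltelli/Jansen
    pick-freeze estimators (illustrative only, not the VARS approach)."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(lo, hi, size=(n, d))
    B = rng.uniform(lo, hi, size=(n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]), ddof=1)
    S1, ST = np.empty(d), np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]            # replace column i of A with column i of B
        fABi = model(ABi)
        S1[i] = np.mean(fB * (fABi - fA)) / var          # first-order effect
        ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / var    # total effect (Jansen)
    return S1, ST

S1, ST = sobol_indices(ishigami, d=3)
print("first-order:", S1.round(3), "total:", ST.round(3))
```

    The n*(d+2) model evaluations required even by this brute-force estimator illustrate why more efficient GSA strategies are needed when each run of a CESM is expensive.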

    Application of Permutation Genetic Algorithm for Sequential Model Building–Model Validation Design of Experiments

    The work presented in this paper is motivated by a complex multivariate engineering problem associated with engine mapping experiments, which require efficient Design of Experiments (DoE) strategies to minimise expensive testing. The paper describes the development and evaluation of a Permutation Genetic Algorithm (PermGA) to support an exploration-based sequential DoE strategy for complex real-life engineering problems. A known PermGA was implemented to generate uniform OLH DoEs and substantially extended to support the generation of Model Building-Model Validation (MB-MV) sequences, by generating optimal infill sets of test points as OLH DoEs that preserve good space-filling and projection properties for the merged MB + MV test plan. The algorithm was further extended to address issues with non-orthogonal design spaces, a common problem in engineering applications. The effectiveness of the PermGA algorithm for the MB-MV OLH DoE sequence was evaluated on a theoretical benchmark problem based on the Six-Hump-Camel-Back (SHCB) function, as well as on the Gasoline Direct Injection (GDI) steady-state engine mapping problem that motivated this research. The case studies show that the algorithm is effective at delivering quasi-orthogonal space-filling DoEs with good properties even after several MB-MV iterations, while the improvement in model adequacy and accuracy can be monitored by the engineering analyst. The practical importance of this work, demonstrated through the engine case study, is also that a significant reduction in the effort and cost of testing can be achieved. The research work presented in this paper was funded by the UK Technology Strategy Board (TSB) through the Carbon Reduction through Engine Optimization (CREO) project.
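    The minimal sketch below is not the paper's PermGA; it only illustrates the underlying idea of optimizing a Latin hypercube design by permuting column levels, here with simple swap-mutation hill climbing under a maximin distance criterion (the design size, dimensionality, and iteration budget are arbitrary assumptions).

```python
import numpy as np
from scipy.spatial.distance import pdist

def maximin(X):
    """Space-filling score: the minimum pairwise distance (to be maximized)."""
    return pdist(X).min()

def optimize_lh(n, d, iters=2000, seed=0):
    """Permutation-based improvement of a Latin hypercube design.
    Illustrative swap-mutation hill climbing, not the paper's PermGA."""
    rng = np.random.default_rng(seed)
    # each column is a permutation of levels 0..n-1, scaled to (0, 1)
    perms = np.column_stack([rng.permutation(n) for _ in range(d)])
    X = (perms + 0.5) / n
    best = maximin(X)
    for _ in range(iters):
        j = rng.integers(d)                          # pick a column
        a, b = rng.choice(n, size=2, replace=False)  # pick two rows
        X[[a, b], j] = X[[b, a], j]                  # swap levels (keeps the LH property)
        score = maximin(X)
        if score >= best:
            best = score
        else:
            X[[a, b], j] = X[[b, a], j]              # revert the swap
    return X, best

X, score = optimize_lh(n=20, d=3)
print("maximin distance:", round(score, 4))
```

    A full PermGA would maintain a population of such permutation encodings and recombine them, and an MB-MV infill step would score candidate points against the already merged design rather than a fresh one.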

    Probabilistic analysis of a coal mine roadway including correlation control between model input parameters

    Probabilistic analysis and numerical modelling techniques have been combined to analyse a deep coal mine roadway. Using the Monte Carlo method, a correlation control algorithm, and a FLAC 2D finite difference model, a probability distribution of roof displacements has been calculated and compared with a set of measurements from an actual mine roadway. The importance of correlation between input parameters is also considered. The results show that the analysis performs relatively well, but tends to overpredict the magnitude of displacements. Correlation between parameters is shown to be very important, particularly between the three model stresses.
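    As a hedged illustration of correlation control between Monte Carlo inputs (the paper's specific algorithm may differ, e.g. a rank-based Iman-Conover scheme), the sketch below imposes an approximate target correlation through a Gaussian copula; the marginal distributions and correlation matrix are hypothetical rock-mass parameters chosen only for demonstration.

```python
import numpy as np
from scipy import stats

def correlated_inputs(n, marginals, corr, seed=0):
    """Draw n samples from the given marginal distributions with an
    (approximate) target correlation, via a Gaussian copula."""
    rng = np.random.default_rng(seed)
    d = len(marginals)
    L = np.linalg.cholesky(np.asarray(corr))   # factor of the target correlation matrix
    z = rng.standard_normal((n, d)) @ L.T      # correlated standard normals
    u = stats.norm.cdf(z)                      # map to correlated uniforms
    return np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])

# hypothetical inputs: cohesion (MPa), friction angle (deg), modulus (GPa)
marginals = [stats.lognorm(s=0.3, scale=2.0),
             stats.norm(loc=30.0, scale=3.0),
             stats.lognorm(s=0.25, scale=10.0)]
corr = [[1.0, 0.6, 0.5],
        [0.6, 1.0, 0.4],
        [0.5, 0.4, 1.0]]
samples = correlated_inputs(5000, marginals, corr)
print(np.corrcoef(samples, rowvar=False).round(2))
```

    Each row of `samples` would then feed one realization of the numerical model (here, one FLAC 2D run), so the input correlation structure is preserved across the Monte Carlo ensemble.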

    Hydrodynamic performance optimization of semi-submersible floaters for offshore wind turbines

    Floating structures have become viable alternatives for supporting wind turbines as offshore wind projects move into deeper water. Wind resources are abundant in deep water (depths > 60 m) around the world, but current wind turbines are usually bottom-fixed, relying on concrete gravity-base foundations that are not practical at these depths, so harnessing this potential requires the design of floating platforms. Floating offshore wind offers huge potential for green energy production offshore and for the overall energy transition to zero carbon emissions. With the development of even larger wind turbines, beyond 15 MW, floating concepts become more attractive and competitive from a cost perspective. However, larger turbines and cost optimization also require a rethinking of established solutions and concepts, and new ideas and innovations are needed to further optimize floating offshore wind farms. In this thesis, an approach for the optimization of semi-submersible floaters using different surrogate models has been developed. A semi-submersible floater is selected and designed to support a 15-MW wind turbine in the North Sea. The optimization framework consists of automatic modeling and numerical simulation in open-source tools, as well as obtaining Pareto fronts using surrogate models and a Genetic Algorithm in the CAESES software. A Python-SALOME-NEMOH interface is used to obtain the hydrodynamic properties of geometries defined by various design variables. The geometries are subject to three performance constraints: static platform pitch, metacentric height, and nacelle acceleration. Wind loads in operating and parked conditions are considered. Finally, the geometries are optimized using two objective functions related to material cost and nacelle acceleration, and the results are discussed. This work contributes to developing efficient design optimization methods for floating structures.
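    A minimal sketch of the surrogate-based Pareto idea follows; it is not the thesis's CAESES/Python-SALOME-NEMOH workflow. It fits Gaussian-process surrogates to a handful of evaluated designs and extracts a non-dominated set over cheap surrogate predictions, with hypothetical design variables (column radius and spacing) and stand-in objective formulas in place of real hydrodynamic results.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)

# hypothetical design variables: column radius (m), column spacing (m)
X_train = rng.uniform([5.0, 40.0], [12.0, 70.0], size=(30, 2))

# stand-in objectives for illustration only; real values would come from
# hydrodynamic simulations (e.g. one NEMOH run per design)
cost = X_train[:, 0] ** 2 * X_train[:, 1] * 1e-2
accel = 1.0 / X_train[:, 0] + 50.0 / X_train[:, 1] + rng.normal(0, 0.01, 30)

gp_cost = GaussianProcessRegressor(normalize_y=True).fit(X_train, cost)
gp_accel = GaussianProcessRegressor(normalize_y=True).fit(X_train, accel)

# dense candidate set, evaluated cheaply on the surrogates
cand = rng.uniform([5.0, 40.0], [12.0, 70.0], size=(2000, 2))
F = np.column_stack([gp_cost.predict(cand), gp_accel.predict(cand)])

def pareto_mask(F):
    """Boolean mask of non-dominated points (both objectives minimized)."""
    mask = np.ones(len(F), dtype=bool)
    for i, f in enumerate(F):
        # i is dominated if another point is no worse in all objectives
        # and strictly better in at least one
        dominates_i = np.all(F <= f, axis=1) & np.any(F < f, axis=1)
        if dominates_i.any():
            mask[i] = False
    return mask

front = cand[pareto_mask(F)]
print(f"{len(front)} non-dominated candidate designs found")
```

    In a full workflow a genetic algorithm (rather than random candidates) would search the surrogate, and the most promising designs would be re-evaluated with the hydrodynamic solver before updating the surrogate.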

    Reactive power planning for regional power grids based on active and reactive power adjustments of DGs

    To deal with extreme overvoltage scenarios that have small probabilities in regional power grids, the traditional reactive power planning model requires a huge VAR compensator investment. Such a decision, making a large investment to cope with a low-probability event, is clearly uneconomic. Therefore, based on scenario analysis of the power outputs of distributed generations and of load consumption, a novel reactive power planning model considering the active and reactive power adjustments of distributed generations is proposed to derive the optimal allocation of VAR compensators and keep bus voltages within an acceptable range under extreme overvoltage scenarios. The objective of the proposed model is to minimize the VAR compensator investment cost and the active power adjustment cost of distributed generations. Moreover, since the proposed model is formulated as a mixed-integer nonlinear programming problem, a primal-dual interior point method-based particle swarm optimization algorithm is developed to solve it effectively. Simulations on the modified IEEE 30-bus system verify the effectiveness of the proposed reactive power planning model.
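    The sketch below is a generic penalty-based particle swarm optimizer, not the paper's primal-dual interior point / PSO hybrid or its full MINLP planning model; the linearized voltage response, bus count, and cost coefficients are invented purely to show how a VAR-sizing objective with voltage limits can be handed to PSO.

```python
import numpy as np

def pso(objective, bounds, n_particles=40, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (illustrative only)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    d = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, d))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, d))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# stand-in objective: VAR compensator sizes q (MVar) at three candidate buses,
# penalized when a hypothetical linearized voltage proxy leaves [0.95, 1.05] p.u.
def cost(q):
    v = 1.0 + 0.004 * q - 0.08                                   # assumed voltage response
    penalty = 1e6 * np.sum(np.clip(np.abs(v - 1.0) - 0.05, 0, None) ** 2)
    return np.sum(10.0 * q) + penalty                            # investment cost + penalty

best_q, best_f = pso(cost, bounds=[(0, 30)] * 3)
print("compensation sizes (MVar):", best_q.round(2), "cost:", round(best_f, 2))
```

    The paper's hybrid instead uses the interior point method for the continuous subproblem and PSO for the discrete siting/sizing decisions; the penalty formulation here is only a compact stand-in for those constraints.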

    Heterogeneous Uncertainty Quantification for Reliability-Based Design Optimization

    Uncertainty is inherent to real-world engineering systems, and reliability analysis aims to quantitatively measure the probability that an engineering system successfully performs its intended functions under various sources of uncertainty. In this dissertation, heterogeneous uncertainties, including input variation, data uncertainty, simulation model uncertainty, and time-dependent uncertainty, are taken into account in reliability analysis and reliability-based design optimization (RBDO). Input variation inherently exists in practical engineering systems and can be characterized by statistical modeling methods. Data uncertainty occurs when surrogate models are constructed to replace simulations or experiments based on a set of training data, while simulation model uncertainty is introduced when high-fidelity simulation models are built through idealizations and simplifications of real physical processes or systems. Time-dependent uncertainty is involved when considering system or component aging and deterioration. Ensuring a high level of system reliability is one of the critical targets of engineering design, and this dissertation studies effective reliability analysis and RBDO techniques to address the challenges posed by these heterogeneous uncertainties. First, a novel reliability analysis method is proposed to deal with input randomness and time-dependent uncertainty. An ensemble learning framework is designed by integrating long short-term memory (LSTM) networks and a feedforward neural network, and time-series data are utilized to construct a surrogate model that captures the time-dependent responses with respect to input variables and stochastic processes. Moreover, an RBDO framework based on the Kriging technique is presented to address time-dependent uncertainty in design optimization: limit state functions are transformed into the time-independent domain by converting the stochastic processes and the time parameter into random variables, and Kriging surrogate models are then built and enhanced by a design-driven adaptive sampling scheme to accurately identify potential instantaneous failure events. Second, an equivalent reliability index (ERI) method is proposed for handling both input variation and surrogate model uncertainty in RBDO. To account for the surrogate model uncertainty, a Gaussian mixture model (GMM) is constructed from Gaussian process model predictions; to propagate both input variation and surrogate model uncertainty into the reliability analysis, the statistical moments of the GMM are utilized to calculate an equivalent reliability index. The sensitivity of the ERI with respect to the design variables is analytically derived to facilitate the surrogate model-based product design process, leading to reliable optimum solutions. Third, effective methods are developed to handle simulation model uncertainty as well as surrogate model uncertainty. An active resource allocation framework is proposed for accurate reliability analysis using both simulation and experimental data, where a two-phase updating strategy is developed to reduce computational costs. The framework is further extended to RBDO problems, where a multi-fidelity design algorithm is presented to ensure accurate optimum designs while minimizing computational costs. To account for both the bias terms and the unknown parameters in the simulation model, a Bayesian inference method is adopted to build a validated surrogate model, and a Bayesian-based mixture modeling method is developed to ensure reliable system designs under heterogeneous uncertainties.
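    As a simplified illustration of surrogate-based reliability analysis (not the dissertation's ERI, GMM, or LSTM ensemble methods), the sketch below fits a Kriging (Gaussian process) surrogate to a stand-in limit state, estimates the failure probability by Monte Carlo on the surrogate, and converts it into a reliability index; the limit-state function and input distributions are assumptions for demonstration.

```python
import numpy as np
from scipy import stats
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def limit_state(x):
    """Stand-in limit state g(X); failure when g <= 0."""
    return x[:, 0] ** 3 + x[:, 1] ** 3 - 18.0

# small training set standing in for expensive simulations
X_train = rng.normal(loc=[10.0, 9.9], scale=5.0, size=(60, 2))
g_train = limit_state(X_train)

# Kriging (Gaussian process) surrogate of the limit state
gp = GaussianProcessRegressor(normalize_y=True).fit(X_train, g_train)

# Monte Carlo on the cheap surrogate instead of the expensive model
X_mc = rng.normal(loc=[10.0, 9.9], scale=5.0, size=(100_000, 2))
g_hat = gp.predict(X_mc)
pf = np.mean(g_hat <= 0.0)           # estimated probability of failure
beta = -stats.norm.ppf(pf)           # corresponding reliability index
print(f"Pf ~ {pf:.4f}, beta ~ {beta:.2f}")
```

    The surrogate's own prediction uncertainty is ignored in this sketch; accounting for it, for instance through a mixture of the Gaussian process predictive distributions, is precisely the gap the ERI approach described above is meant to close.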