
    Uncertainty Analysis of the Adequacy Assessment Model of a Distributed Generation System

    Due to the inherent aleatory uncertainties in renewable generators, reliability/adequacy assessments of distributed generation (DG) systems have focused on the probabilistic modeling of random behaviors, which is adequate given sufficiently informative data. However, another type of uncertainty (epistemic uncertainty) must also be accounted for in the modeling, owing to incomplete knowledge of the phenomena and imprecise evaluation of the related characteristic parameters. When informative data are scarce, this type of uncertainty calls for alternative methods of representation, propagation, analysis, and interpretation. In this study, we make a first attempt to identify, model, and jointly propagate aleatory and epistemic uncertainties in the context of DG system modeling for adequacy assessment. Probability and possibility distributions are used to model the aleatory and epistemic uncertainties, respectively. Evidence theory is used to incorporate the two uncertainties within a single framework. Based on the plausibility and belief functions of evidence theory, a hybrid propagation approach is introduced. A demonstration is given on a DG system adapted from the IEEE 34-node distribution test feeder. Compared to the pure probabilistic approach, the hybrid propagation is shown to explicitly carry the imprecision in the knowledge of the DG parameters through to the final assessed adequacy values. It also effectively captures the growth of uncertainties at higher DG penetration levels.
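    The hybrid propagation idea can be sketched in a few lines. In this illustrative example (not taken from the paper), a Weibull wind-speed model with a cubic power curve plays the aleatory part, a triangular possibility distribution plays the epistemic power coefficient, and belief/plausibility of meeting an assumed demand level are obtained by averaging over alpha-cuts:

```python
import random

def alpha_cut(a, c, b, alpha):
    """Interval of a triangular possibility distribution (a, c, b) at level alpha."""
    return (a + alpha * (c - a), b - alpha * (b - c))

def hybrid_propagation(n_mc=2000, n_alpha=11, demand=60.0, seed=0):
    """Monte Carlo over the aleatory wind speed, alpha-cut intervals over the
    epistemic power coefficient; returns (belief, plausibility) that generated
    power meets the demand. All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    bel = pl = 0.0
    for _ in range(n_mc):
        v = rng.weibullvariate(8.0, 2.0)                  # aleatory wind speed [m/s]
        lo_ok = hi_ok = 0
        for i in range(n_alpha):
            alpha = i / (n_alpha - 1)
            c_lo, c_hi = alpha_cut(0.3, 0.4, 0.5, alpha)  # epistemic coefficient
            lo_ok += c_lo * v**3 >= demand                # worst case on the cut
            hi_ok += c_hi * v**3 >= demand                # best case on the cut
        bel += lo_ok / n_alpha
        pl += hi_ok / n_alpha
    return bel / n_mc, pl / n_mc
```

    The gap between plausibility and belief is exactly the imprecision carried through from the epistemic parameter; a pure probabilistic analysis would collapse it to a single number.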

    Fuzzy Reliability Assessment of Systems with Multiple Dependent Competing Degradation Processes

    Components are often subject to multiple competing degradation processes. For multi-component systems, degradation dependencies within one component and/or among components need to be considered. Physics-based models (PBMs) and multi-state models (MSMs) are often used for component degradation processes, particularly when statistical data are limited. In this paper, we treat dependencies between degradation processes within a piecewise-deterministic Markov process (PDMP) modeling framework. Epistemic (subjective) uncertainty can arise due to incomplete or imprecise knowledge about the degradation processes and their governing parameters: to take this into account, we describe the parameters of the PDMP model as fuzzy numbers. Then, we extend the finite-volume (FV) method to quantify the (fuzzy) reliability of the system. The proposed method is tested on one subsystem of the residual heat removal system (RHRS) of a nuclear power plant, and a comparison is offered with a Monte Carlo (MC) simulation solution: the results show that our method is the more efficient.
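    As a much-simplified sketch of the fuzzy-parameter idea, the following replaces the PDMP and its finite-volume solver with a two-state Markov model whose reliability is R(t) = exp(-λt), and takes the failure rate λ as a triangular fuzzy number; all numbers are assumed:

```python
import math

def fuzzy_reliability(t, lam_tri=(1e-3, 2e-3, 4e-3), n_alpha=5):
    """Fuzzy reliability R(t) = exp(-lambda * t) for a two-state Markov model
    whose failure rate lambda is a triangular fuzzy number (a, c, b).
    Returns a list of (alpha, R_lower, R_upper) alpha-cut intervals."""
    a, c, b = lam_tri
    cuts = []
    for i in range(n_alpha):
        alpha = i / (n_alpha - 1)
        lam_lo = a + alpha * (c - a)
        lam_hi = b - alpha * (b - c)
        # R(t) is decreasing in lambda, so the bounds swap:
        cuts.append((alpha, math.exp(-lam_hi * t), math.exp(-lam_lo * t)))
    return cuts
```

    The nested alpha-cut intervals are the fuzzy analogue of the reliability value a crisp model would return; the paper's FV extension plays this role for the far richer PDMP dynamics.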

    Decision-Making under Uncertainty: Optimal Storm Sewer Network Design Considering Flood Risk

    Storm sewer systems play a very important role in urban areas. The design of a storm sewer system should be based on an appropriate level of flood prevention. This thesis focuses on issues relevant to decision-making in storm sewer network design considering flood risk. Uncertainty analysis is often required in an integrated approach to a comprehensive assessment of flood risk. The first part of this thesis discusses the understanding and representation of uncertainty in a general setting. It also develops methods for propagating uncertainty through a model in different situations, when uncertainties are represented by various mathematical languages. The decision-making process for storm sewer network design considering flood risk is then explored. The pipe sizes and slopes of the network are determined in the design. Due to the uncertain character of the flood risk, the decision made is not unique but depends on the decision maker’s attitude towards risk. A flood-risk-based storm sewer network design method incorporating a multiple-objective optimization and a “choice” process is developed with different design criteria. The storm sewer network design considering flood risk can also be formulated as a single-objective optimization, provided that the decision criterion is given a priori. A framework for this approach with single-objective optimization is developed, with a genetic algorithm (GA) adopted as the optimizer. The flood risk is evaluated with different methods, either under several design storms or using a sampling method. A method for generating samples of correlated variables is introduced. It is adapted from a method in the literature, assuming that the marginal distributions of the variables as well as the correlations between them are known. A group method is developed to facilitate the generation of correlated samples of large sizes. The method is successfully applied to the generation of rainfall event samples, which are then used for storm sewer network design where the flood risk is evaluated with these samples.
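    The generation of correlated samples with known marginals and correlations can be illustrated with a Gaussian-copula (NORTA-type) construction; the exponential depth/duration marginals and the correlation value below are illustrative assumptions, not the thesis's rainfall model:

```python
import math
import random

def phi(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def correlated_rainfall_samples(n, rho=0.6, seed=1):
    """NORTA-style generation of correlated (depth, duration) pairs:
    correlated standard normals -> uniforms via phi -> target marginals
    by inverse-CDF. Marginals and rho are illustrative assumptions."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho**2) * rng.gauss(0.0, 1.0)
        u1, u2 = phi(z1), phi(z2)
        depth = -20.0 * math.log(1.0 - u1)      # exponential marginal, mean 20 mm
        duration = -6.0 * math.log(1.0 - u2)    # exponential marginal, mean 6 h
        out.append((depth, duration))
    return out
```

    Note that the correlation in normal space is distorted slightly by the nonlinear marginal transforms, which is why such methods usually calibrate rho to hit a target correlation of the physical variables.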

    Uncertainty management in multidisciplinary design of critical safety systems

    Managing uncertainty in the multidisciplinary design of safety-critical systems requires not a single approach or methodology but a set of different strategies and scalable computational tools (for instance, exploiting the computational power of cluster and grid computing). The availability of multiple tools and approaches for dealing with uncertainties allows cross-validation of the results and increases the confidence in the performed analysis. This paper presents a unified theory and an integrated, open, general-purpose computational framework to deal with scarce data and with aleatory and epistemic uncertainties. It allows the different tasks necessary to manage uncertainty to be solved, such as uncertainty characterization, sensitivity analysis, uncertainty quantification, and robust design. The proposed computational framework is generally applicable to problems in different fields and is numerically efficient and scalable, allowing a significant reduction of the computational time required for uncertainty management and robust design. The applicability of the proposed approach is demonstrated by solving the multidisciplinary design of a critical system proposed by the NASA Langley Research Center in its multidisciplinary uncertainty quantification challenge problem.
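    To give one concrete flavor of the tasks such a framework must solve, here is a minimal sketch of variance-based sensitivity analysis, using the pick-freeze Monte Carlo estimator of a first-order Sobol index; the test function and sample size are arbitrary assumptions, not part of the paper's framework:

```python
import random

def sobol_first_order(f, n=20000, seed=4):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol index of
    input 1 of f(x1, x2), with independent U(0,1) inputs. S1 is estimated
    as Cov(Y_A, Y_B) / Var(Y), where Y_A and Y_B share x1 but resample x2."""
    rng = random.Random(seed)
    ya, yb = [], []
    for _ in range(n):
        x1 = rng.random()
        ya.append(f(x1, rng.random()))
        yb.append(f(x1, rng.random()))   # x1 frozen, x2 resampled
    my = sum(ya) / n
    var = sum(v * v for v in ya) / n - my * my
    cov = sum(a * b for a, b in zip(ya, yb)) / n - my * (sum(yb) / n)
    return cov / var
```

    For f(x1, x2) = x1 + 0.5 x2 the exact index is 0.8, so the input with the larger coefficient is correctly flagged as the dominant one.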

    Forward uncertainty quantification with special emphasis on a Bayesian active learning perspective

    Uncertainty quantification (UQ) in its broadest sense aims at quantitatively studying all sources of uncertainty arising from both computational and real-world applications. Although many subtopics appear in the UQ field, there are typically two major types of UQ problems: forward and inverse uncertainty propagation. The present study focuses on the former, which involves assessing the effects of input uncertainty, in various forms, on the output response of a computational model. In total, this thesis reports nine main developments in the context of forward uncertainty propagation, with special emphasis on a Bayesian active learning perspective. The first development is concerned with estimating the extreme value distribution and small first-passage probabilities of uncertain nonlinear structures under stochastic seismic excitations, for which a moment-generating-function-based mixture distribution approach (MGF-MD) is proposed. As the second development, a triple-engine parallel Bayesian global optimization (T-PBGO) method is presented for interval uncertainty propagation. The third contribution develops a parallel Bayesian quadrature optimization (PBQO) method for estimating the response expectation function, its variable importance and bounds when a computational model is subject to hybrid uncertainties in the form of random variables, parametric probability boxes (p-boxes) and interval models. The fourth development concerns the failure probability function when the inputs of a performance function are characterized by parametric p-boxes. To this end, an active learning augmented probabilistic integration (ALAPI) method is proposed, based on offering a partially Bayesian active learning perspective on failure probability estimation, as well as on the use of the high-dimensional model representation (HDMR) technique.
    Note that in this work we derive an upper bound on the posterior variance of the failure probability, which bounds our epistemic uncertainty about the failure probability due to a kind of numerical uncertainty, i.e., discretization error. The fifth contribution further strengthens the previously developed active learning probabilistic integration (ALPI) method in two ways: enabling the use of parallel computing and enhancing the capability of assessing small failure probabilities. The resulting method is called parallel adaptive Bayesian quadrature (PABQ). The sixth contribution presents a principled Bayesian failure probability inference (BFPI) framework, where the posterior variance of the failure probability is derived (though not in closed form). We also develop a parallel adaptive Bayesian failure probability learning (PA-BFPI) method upon the BFPI framework. For the seventh development, we propose a partially Bayesian active learning line sampling (PBAL-LS) method for assessing extremely small failure probabilities, where a partially Bayesian active learning insight is offered for the classical LS method and an upper bound on the posterior variance of the failure probability is deduced. Following the PBAL-LS method, the eighth contribution obtains the expression of the posterior variance of the failure probability in the LS framework, and a Bayesian active learning line sampling (BALLS) method is put forward. The ninth contribution provides another Bayesian active learning alternative to traditional LS, Bayesian active learning line sampling with log-normal process (BAL-LS-LP). In this method, a log-normal process prior, instead of a Gaussian process prior, is assumed for the beta function so as to account for the non-negativity constraint. The approximation error resulting from the root-finding procedure is also taken into consideration.
    In conclusion, this thesis presents a set of novel computational methods for forward UQ, especially from a Bayesian active learning perspective. The developed methods are expected to enrich our toolbox for forward UQ analysis, and the insights gained can stimulate further studies.
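    The Bayesian active learning theme running through these contributions can be illustrated with a deliberately small example: a Gaussian process surrogate of a performance function is refined where the sign of the prediction is most uncertain (the classical U learning function), and the failure probability is then read off the surrogate. The performance function, kernel, and all settings below are illustrative assumptions, not any of the methods above:

```python
import math
import random

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting (small n)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf(a, b, ell=1.0):
    return math.exp(-0.5 * ((a - b) / ell) ** 2)

def active_learning_pf(n_pool=1000, n_add=8, seed=2):
    """Estimate P(g(X) < 0), X ~ N(0, 1), with a GP surrogate refined by the
    U learning function; g and all settings are illustrative assumptions."""
    g = lambda x: 5.0 - x * x                    # failure for |x| > sqrt(5)
    rng = random.Random(seed)
    pool = [rng.gauss(0.0, 1.0) for _ in range(n_pool)]
    X = [-3.0, -1.5, 0.0, 1.5, 3.0]              # initial design
    y = [g(x) for x in X]
    for _ in range(n_add):
        n = len(X)
        K = [[rbf(X[i], X[j]) + (1e-6 if i == j else 0.0) for j in range(n)]
             for i in range(n)]
        cols = [solve(K, [1.0 if i == j else 0.0 for i in range(n)])
                for j in range(n)]               # columns of K^{-1}
        a = solve(K, y)
        best_u, best_x = float("inf"), None
        for xq in pool:
            k = [rbf(xq, xi) for xi in X]
            mu = sum(k[i] * a[i] for i in range(n))
            kik = sum(k[i] * sum(cols[j][i] * k[j] for j in range(n))
                      for i in range(n))
            var = max(1.0 - kik, 1e-12)
            u = abs(mu) / math.sqrt(var)         # U function: low = uncertain sign
            if u < best_u:
                best_u, best_x = u, xq
        X.append(best_x)                         # evaluate g where the sign of the
        y.append(g(best_x))                      # prediction is most uncertain
    a = solve([[rbf(X[i], X[j]) + (1e-6 if i == j else 0.0)
                for j in range(len(X))] for i in range(len(X))], y)
    fails = sum(sum(rbf(xq, xi) * ai for xi, ai in zip(X, a)) < 0.0
                for xq in pool)
    return fails / n_pool
```

    The exact answer here is 2 Φ(-√5) ≈ 0.025; only a handful of g-evaluations are spent, all near the failure boundary, which is the whole point of the active learning strategies discussed above.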

    Stochastic simulation methods for structural reliability under mixed uncertainties

    Uncertainty quantification (UQ) has been widely recognized as one of the most important, yet challenging, tasks in both structural and system engineering, and current research mainly concerns the proper treatment of different types of uncertainties, resulting from either natural randomness or lack of information, in all related sub-problems of UQ, such as uncertainty characterization, uncertainty propagation, sensitivity analysis, model updating, model validation, and risk and reliability analysis. It has been widely accepted that those uncertainties can be grouped as either aleatory or epistemic uncertainty, depending on whether they are reducible or not. To address this challenge, many non-traditional uncertainty characterization models have been developed; these can be grouped as either imprecise probability models (e.g., the probability-box model, evidence theory, the second-order probability model and the fuzzy probability model) or non-probabilistic models (e.g., the interval/convex model and fuzzy set theory). This thesis concerns the efficient numerical propagation of the three kinds of uncertainty characterization models; for simplicity, the precise probability model, the distributional probability-box model, and the interval model are taken as examples. The target is to develop efficient numerical algorithms for learning the functional behavior of the probabilistic responses (e.g., response moments and failure probability) with respect to the epistemic parameters of the model inputs, which is especially useful for making reliable decisions even when the available information on the model inputs is imperfect. To achieve this target, the thesis presents three main developments for improving Non-intrusive Imprecise Stochastic Simulation (NISS), a general methodological framework for propagating imprecise probability models with only one stochastic simulation.
    The first development generalizes the NISS methods to problems whose inputs include both imprecise probability models and non-probabilistic models. The algorithm is established by combining Bayes' rule and kernel density estimation. The sensitivity indices of the epistemic parameters are produced as by-products. The NASA Langley UQ challenge is then successfully solved using the generalized NISS method. The second development injects classical line sampling into the NISS framework so as to substantially improve the efficiency of the algorithm for rare failure event analysis; two strategies, based on different interpretations of line sampling, are developed. The first strategy is based on hyperplane approximations, while the second is derived from one-dimensional integrals. Both strategies can be regarded as post-processing of classical line sampling, and the results show that the resultant NISS estimators have different performance. The third development aims at further improving the efficiency of line sampling and its suitability for highly nonlinear problems, targeting complex structures and systems where one deterministic simulation may take hours. To do so, an active learning strategy based on Gaussian process regression is embedded into the line sampling procedure to accurately estimate the intersection point on each sample line with only a small number of deterministic simulations. These three developments have largely improved the suitability and efficiency of the NISS methods, especially for real-world engineering applications. Their efficiency and effectiveness are clearly illustrated with toy examples and thoroughly demonstrated by real-world test examples in system, civil, and mechanical engineering.
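    For orientation, the line sampling idea itself can be condensed as follows: each line parallel to the important direction is intersected with the limit state, and the one-dimensional Gaussian tail probability at the intersection distance is averaged over lines. The linear limit state below (for which the estimator is exact) and the bisection root-finder are illustrative assumptions:

```python
import math
import random

def phi_cdf(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def line_sampling_pf(beta=3.5, n_lines=200, seed=3):
    """Line sampling estimate of P(g(U) < 0) for the linear limit state
    g(u) = beta - u[0] in 2-D standard normal space; the important direction
    is the u[0]-axis and the exact answer is Phi(-beta). The root on each
    line is found by bisection, as one would for a black-box g."""
    g = lambda u: beta - u[0]
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_lines):
        # sample in the hyperplane orthogonal to the important direction
        u_perp = [0.0, rng.gauss(0.0, 1.0)]
        lo, hi = 0.0, 10.0                       # bracket for the root distance c
        for _ in range(60):                      # bisection: g(u_perp + c*e1) = 0
            mid = 0.5 * (lo + hi)
            if g([u_perp[0] + mid, u_perp[1]]) > 0.0:
                lo = mid
            else:
                hi = mid
        total += phi_cdf(-0.5 * (lo + hi))       # 1-D tail probability per line
    return total / n_lines
```

    For this linear case every line returns the same distance, so the estimator has zero variance; nonlinearity of the limit state is what makes the per-line root distances vary and the active-learning refinement above worthwhile.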

    Study of the dependencies between in-service degradation and key design parameters with uncertainty for mechanical components.

    The design features of a machine component can significantly impact its life while in service, and only relatively few, case-specific studies have been undertaken in this respect. Hence there is a need for a better understanding of the influence of geometric design features on the service life of a machine component. The aim of this research is to develop a methodology to assess the degradation life of a mechanical component due to the influence of geometric design in the presence of uncertainties, and to apply it to the optimisation of the component under these uncertainties. This thesis proposes a novel methodology for assessing thermal fatigue life, a degradation mechanism, based on the influence of design features in the presence of uncertainties. A novel uncertainty analysis methodology that can simultaneously handle aleatory and epistemic uncertainties is proposed for a more realistic prediction and assessment of a component's thermal fatigue degradation life estimated using finite element analysis. A design optimisation method for optimising the component's design in the presence of mixed (aleatory and epistemic) uncertainties is also proposed and developed. The performance of the proposed methodology is analysed using passenger vehicle brake discs. The novel uncertainty quantification methodology was initially applied to a solid brake disc and validated for generalisability on a vented brake disc, which has more complex design features, while the proposed optimisation method was applied to the vented brake disc. With these, this research delivers a validated set of uncertainty and optimisation methodologies for a design problem in the presence of mixed uncertainties.
    The methodologies proposed in this research provide design engineers with a means to design components that are robust, by identifying the design with the least uncertainty in its output resulting from the inherent variability of the design parameters, while simultaneously providing the design with the least uncertainty in the estimation of its life as a result of the use of surrogate models. PhD in Manufacturing.
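    The mixed-uncertainty assessment can be caricatured as a double loop: an outer scan over an epistemic interval and an inner Monte Carlo loop over an aleatory load. The Basquin-type life model and every number below are illustrative assumptions, not the thesis's finite element model:

```python
import random

def life_bounds(n_mc=5000, seed=5):
    """Double-loop sketch of mixed-uncertainty life assessment: the outer loop
    scans an epistemic interval for a thermal factor h, the inner loop does
    Monte Carlo over an aleatory stress amplitude. The Basquin-type life model
    N = C * (h / s)**m and all numbers are illustrative assumptions."""
    rng = random.Random(seed)
    loads = [max(rng.gauss(300.0, 30.0), 1.0) for _ in range(n_mc)]  # MPa, aleatory
    C, m = 1e6, 3.0
    mean_life = lambda h: sum(C * (h / s) ** m for s in loads) / n_mc
    # mean life is monotonically increasing in h, so the epistemic bounds
    # are attained at the interval endpoints
    h_lo, h_hi = 0.9, 1.1
    return mean_life(h_lo), mean_life(h_hi)
```

    The output is an interval of mean lives rather than a single value: the width reflects the epistemic ignorance about h, while the inner averaging reflects the irreducible scatter of the load.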

    Efficient random set uncertainty quantification by means of advanced sampling techniques

    In this dissertation, Random Sets and advanced sampling techniques are combined for general and efficient uncertainty quantification. Random Sets extend the traditional probabilistic framework, as they also comprise imprecision to account for scarce data, lack of knowledge, vagueness, subjectivity, etc. The generality of Random Sets in including different kinds of uncertainty comes at a very high computational price: Random Set propagation requires a min-max convolution for each sample picked by the Monte Carlo method. The min-max convolution can be sped up considerably when the system response relationship is known in analytical form. However, in a general multidisciplinary design context, the system response is very often treated as a “black box”; thus, the convolution requires the adoption of evolutionary or stochastic algorithms, which need to be deployed for each Monte Carlo sample. Therefore, the availability of very efficient sampling techniques is paramount for Random Sets to be applicable to engineering problems. In this dissertation, advanced Line Sampling methods have been generalised and extended to include Random Sets. Advanced sampling techniques make the estimation of quantiles at relevant probability levels extremely efficient, requiring significantly fewer samples than standard Monte Carlo methods. In particular, the Line Sampling method has been enhanced to link well to the Random Set representation. These developments comprise line search, line selection, direction adaptation, and data buffering. The enhanced efficiency of Line Sampling is demonstrated by means of numerical and large-scale finite element examples. With the enhanced algorithm, the connection between Line Sampling and the generalised uncertainty model has been made possible, both in a Double Loop and in a Random Set approach.
    The presented computational strategies have been implemented in OpenCossan, an open-source general-purpose software for uncertainty quantification. The general reach of the proposed strategy is demonstrated by means of applications to the structural reliability of a finite element model, to preventive maintenance, and to the NASA Langley multidisciplinary uncertainty quantification challenge.
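    The min-max convolution at the heart of Random Set propagation is easy to show on a toy problem. Because the response y = a + b below is monotone, the interval extremes sit at the focal-element endpoints and no inner optimization is needed; the focal elements, masses, and threshold are all illustrative assumptions:

```python
def random_set_belief_plausibility(threshold=7.5):
    """Bel/Pl that y = a + b exceeds a threshold when a and b are (toy)
    random sets: interval focal elements with basic mass assignments.
    Focal elements, masses, and threshold are illustrative assumptions."""
    A = [((2.0, 4.0), 0.5), ((3.0, 6.0), 0.5)]   # (interval, mass) for a
    B = [((4.0, 5.0), 0.3), ((5.0, 8.0), 0.7)]   # (interval, mass) for b
    bel = pl = 0.0
    for (alo, ahi), ma in A:
        for (blo, bhi), mb in B:
            m = ma * mb                           # independent mass combination
            y_min, y_max = alo + blo, ahi + bhi   # min-max convolution (monotone y)
            if y_min > threshold:
                bel += m    # whole focal element lies inside the event
            if y_max > threshold:
                pl += m     # focal element merely intersects the event
    return bel, pl
```

    For a black-box, non-monotone response the two endpoint evaluations per focal element become global min/max searches, which is exactly the cost the dissertation's advanced Line Sampling machinery is designed to tame.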

    Surrogate-Assisted Unified Optimization Framework for Investigating Marine Structural Design Under Information Uncertainty.

    Structural decisions made in the early stages of marine systems design can have a large impact on future acquisition, maintenance, and life-cycle costs. However, owing to the unique nature of early-stage marine system design, these critical structural decisions are often made on the basis of incomplete information or knowledge about the design. When coupled with design optimization analysis, the complex, uncertain early-stage design environment makes it very difficult to deliver a quantified trade-off analysis for decision making. This work presents a novel decision support method that integrates design optimization, high-fidelity analysis, and modeling of information uncertainty for early-stage design and analysis. To support this method, this dissertation improves design optimization methods for marine structures by proposing several novel surrogate modeling techniques and strategies. The proposed work treats the uncertainties sourced from limited information in a non-statistical, interval uncertainty form. This interval uncertainty is treated as an objective function in an optimization framework in order to explore the impact of information uncertainty on structural design performance. In this examination, the potential structural weight penalty associated with information uncertainty can be quickly identified in the early stage, avoiding costly redesign later in the design process. The dissertation then explores a balanced computational structure between fidelity and efficiency: a proposed novel variable-fidelity approach can be applied to wisely allocate expensive high-fidelity computational simulations. In achieving the proposed capabilities for design optimization, several surrogate modeling methods are developed concerning worst-case estimation, clustered multiple meta-modeling, and mixed-variable modeling techniques.
    These surrogate methods have been demonstrated to significantly improve the efficiency of the optimizer in dealing with the challenges of early-stage marine structure design. PhD, Naval Architecture and Marine Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/133365/1/yanliuch_1.pd
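    The idea of carrying interval uncertainty alongside the design objective can be sketched as follows; a closed-form toy response stands in for the surrogate model, and the thickness grid, load interval, and stress limit are illustrative assumptions, not the dissertation's structural models:

```python
def response(t, e):
    """Stand-in structural response (stress-like) for plate thickness t and
    an interval-uncertain load factor e; a toy model, not a real FE analysis."""
    return 100.0 * e / t

def interval_objective_scan(e_lo=0.8, e_hi=1.2, limit=45.0):
    """Treat the width of the response interval as a second quantity next to
    weight (proportional to thickness). Grid-scan candidate designs; keep
    those whose worst case satisfies the stress limit, and report
    (thickness, worst-case response, interval width) for trade-off study."""
    results = []
    for i in range(11):
        t = 2.0 + 0.2 * i                                 # candidate thickness
        vals = [response(t, e_lo + (e_hi - e_lo) * j / 20.0) for j in range(21)]
        worst, width = max(vals), max(vals) - min(vals)
        if worst <= limit:                                # feasible in the worst case
            results.append((t, worst, width))
    return results
```

    The scan makes the structural weight penalty of information uncertainty visible: thicker (heavier) designs are both feasible in the worst case and less sensitive to the unknown load factor.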

    Uncertainty analysis and sensitivity analysis for multidisciplinary systems design

    The objective of this research is to quantify the impact of both aleatory and epistemic uncertainties on the performance of multidisciplinary systems. Aleatory uncertainty comes from inherent natural randomness, and epistemic uncertainty comes from the lack of knowledge. Although intensive research has been conducted on aleatory uncertainty, few studies on epistemic uncertainty have been reported. In this work, the two types of uncertainty are analyzed. Aleatory uncertainty is modeled by probability distributions, while epistemic uncertainty is modeled by intervals. Probabilistic analysis (PA) and interval analysis (IA) are integrated to capture the effect of the two types of uncertainty. The First Order Reliability Method (FORM) is employed for PA, while nonlinear optimization is used for IA. The unified uncertainty analysis, which consists of PA and IA, is employed to develop new sensitivity analysis methods for mixtures of the two types of uncertainty. The methods are able to quantify the contribution of each input variable with either epistemic or aleatory uncertainty. The analysis results can then support better decisions on how to effectively mitigate the effect of uncertainty. The other major contribution of this research is the extension of the unified uncertainty analysis to reliability analysis for multidisciplinary systems --Abstract, page iv
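    The PA/IA combination can be condensed to a toy case where FORM is exact: for a linear limit state with standard normal inputs and an interval design parameter, the failure probability is monotone in the parameter, so interval analysis reduces to evaluating the endpoints. The limit state below is an illustrative assumption:

```python
import math

def phi_cdf(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def pf_interval(d_lo=2.0, d_hi=3.0):
    """Unified uncertainty analysis for the linear limit state
    g(x; d) = d - x1 - 0.5*x2, with X1, X2 standard normal (aleatory) and
    d an interval (epistemic). For a linear g, FORM is exact:
    beta(d) = d / ||(1, 0.5)|| and Pf(d) = Phi(-beta(d)). Pf is decreasing
    in d, so interval analysis reduces to the two endpoints."""
    norm = math.sqrt(1.0 + 0.5 ** 2)
    pf = lambda d: phi_cdf(-d / norm)
    return pf(d_hi), pf(d_lo)          # [Pf_min, Pf_max]
```

    In a nonlinear multidisciplinary problem, the inner FORM and the outer interval search over d are no longer available in closed form, which is where the nonlinear optimization of the abstract comes in.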