
    Tolerance analysis approach based on the classification of uncertainty (aleatory / epistemic)

    Uncertainty is ubiquitous in tolerance analysis problems. This paper deals with the formulation of tolerance analysis and, more particularly, with the uncertainty that must be taken into account in the foundation of this formulation. It presents: a brief view of the uncertainty classification, in which aleatory uncertainty stems from inherently random phenomena and epistemic uncertainty stems from a lack of knowledge; a formulation of the tolerance analysis problem based on this classification; and its development, in which aleatory uncertainty is modeled by probability distributions and epistemic uncertainty by intervals, with Monte Carlo simulation employed for the probabilistic analysis and nonlinear optimization used for the interval analysis. (“AHTOLA” project, ANR-11-MONU-013)
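    As a rough illustration of the formulation just summarized (not code from the paper), the sketch below bounds the non-conformity probability of an invented one-dimensional tolerance chain: the dimensions X1, X2, X3 carry aleatory scatter modeled by normal distributions, the mean offset delta of X3 is epistemic and only known to lie in an interval, Monte Carlo simulation handles the inner probabilistic analysis, and a bounded nonlinear optimization over delta handles the outer interval analysis. All dimensions, tolerances, and interval bounds are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def non_conformity_rate(delta, n=50_000):
    """Monte Carlo estimate of P(gap < 0) for a fixed epistemic offset delta."""
    rng = np.random.default_rng(0)           # common random numbers across calls
    x1 = rng.normal(10.00, 0.05, n)          # aleatory: part dimension 1
    x2 = rng.normal(5.00, 0.03, n)           # aleatory: part dimension 2
    x3 = rng.normal(14.90 + delta, 0.04, n)  # aleatory scatter around an epistemic mean shift
    gap = x1 + x2 - x3                       # functional requirement: gap must stay >= 0
    return float(np.mean(gap < 0.0))

# Interval analysis over the epistemic offset delta in [-0.02, 0.02]:
# bound the non-conformity rate by nonlinear optimization over the interval.
lower = minimize_scalar(non_conformity_rate, bounds=(-0.02, 0.02), method="bounded")
upper = minimize_scalar(lambda d: -non_conformity_rate(d), bounds=(-0.02, 0.02), method="bounded")
print(f"P(non-conformity) in [{lower.fun:.4f}, {-upper.fun:.4f}]")
```

    Because the non-conformity rate is monotone in delta in this toy chain, the optimizer simply drives delta to the interval end points; for non-monotone responses the optimization step is what makes the bounds meaningful.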

    Uncertainty Analysis of the Adequacy Assessment Model of a Distributed Generation System

    Due to the inherent aleatory uncertainties in renewable generators, the reliability/adequacy assessments of distributed generation (DG) systems have been particularly focused on the probabilistic modeling of random behaviors, given sufficient informative data. However, another type of uncertainty (epistemic uncertainty) must be accounted for in the modeling, due to incomplete knowledge of the phenomena and imprecise evaluation of the related characteristic parameters. In circumstances of few informative data, this type of uncertainty calls for alternative methods of representation, propagation, analysis and interpretation. In this study, we make a first attempt to identify, model, and jointly propagate aleatory and epistemic uncertainties in the context of DG system modeling for adequacy assessment. Probability and possibility distributions are used to model the aleatory and epistemic uncertainties, respectively. Evidence theory is used to incorporate the two uncertainties under a single framework. Based on the plausibility and belief functions of evidence theory, a hybrid propagation approach is introduced. A demonstration is given on a DG system adapted from the IEEE 34-node distribution test feeder. Compared to the pure probabilistic approach, it is shown that the hybrid propagation is capable of explicitly expressing the imprecision in the knowledge of the DG parameters in the final assessed adequacy values. It also effectively captures the growth of uncertainties with higher DG penetration levels.
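    The sketch below illustrates the flavor of such a hybrid propagation on a toy adequacy model; it is not the system or data used in the paper. Wind speed is aleatory and sampled by Monte Carlo, peak demand is epistemic and described by a triangular possibility distribution, and for brevity only the support (alpha = 0 cut) of that distribution is retained, so each Monte Carlo sample produces one focal interval from which belief and plausibility of a capacity deficit are read off. The cubic wind-power law, the Weibull parameters, and the demand interval are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def margin(wind_speed, demand):
    """Toy adequacy model: generation margin = crude cubic wind power - load (MW)."""
    wind_power = np.clip(0.0005 * wind_speed**3, 0.0, 2.0)
    return wind_power - demand

# Aleatory: wind speed ~ scaled Weibull. Epistemic: peak demand with a triangular
# possibility distribution whose support is [0.8, 1.3] MW; only the support is used
# here, so each Monte Carlo sample yields one focal interval of mass 1/n.
n = 20_000
wind = 8.0 * rng.weibull(2.0, n)
m_pess = margin(wind, 1.3)   # worst-case demand within the epistemic interval
m_opt = margin(wind, 0.8)    # best-case demand within the epistemic interval

# Dempster-Shafer reading of the event "capacity deficit" (margin < 0):
belief = np.mean(m_opt < 0.0)         # focal interval lies entirely inside the event
plausibility = np.mean(m_pess < 0.0)  # focal interval intersects the event
print(f"Bel(deficit) = {belief:.3f}, Pl(deficit) = {plausibility:.3f}")
```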

    Commitment to “Forbidden Questions” in Quantum Phenomena Requires a Philosophical Stand

    The theory of quantum mechanics, as formulated by the Copenhagen school, has been controversial since its inception. Heisenberg’s uncertainty principle asserts that certain aspects of reality are not simultaneously defined, forbidding certain questions. Recognition has recently been given to experimentalists who have asked these “forbidden questions”. Aephraim Steinberg at the University of Toronto conducted the double slit experiment using weak measurements to construct average trajectories of particles traveling through both slits. To an adherent of the Copenhagen view of reality, however, these average trajectories will constitute nothing more than a mathematical contrivance. Experiments like these will only prove fruitful if we are willing to reject quantum mechanics’ restrictive philosophical approach. This paper will isolate the controversial physical postulate of quantum mechanics (the postulate of wave collapse) and the philosophical approach that gave rise to it. This approach reflects an instrumentalist philosophy which claims that science must only account for the results of measurements, and has nothing to say about their underlying causes. Such an approach has put an epistemic moratorium on discovering the causes underlying quantum phenomena. Notable progress has been made by those who reject this moratorium. Steinberg et al. found the average particle trajectories by rejecting the idea that there is no underlying reality to our measurements. Bell, more notably, was able to discover details of quantum entanglement by using his concept of “beables” to question the built-in epistemology of quantum mechanics. Because quantum mechanics does not explicitly define wave collapse, prescribe what causes it, or say when it is supposed to happen, the theory cannot give explicit solutions to a certain class of experiments. This so-called measurement problem is assuaged by Zurek’s theory of decoherence, which has had great success in predicting the results of recent experiments. Despite this, decoherence contains the same philosophical oversights as the original theory; it does not propose, or even address, the issue of the underlying causes for quantum phenomena. While most scientists try to steer clear of such philosophical controversies, underlying causes cannot be discovered without the conviction that it is the job of science to discover them.

    Numerical treatment of imprecise random fields in non-linear solid mechanics

    The quantification and propagation of mixed uncertain material parameters in the context of solid mechanical finite element simulations is studied. While aleatory uncertainties appear in the form of spatially varying parameters, i.e. random fields, the epistemic character is induced by a lack of knowledge regarding the correlation length, which is therefore described by interval values. The concept and description of the resulting imprecise random fields is introduced in detail. The challenges arising from interval-valued correlation lengths are clarified. These mainly include the stochastic dimension, which can become very high under some circumstances, as well as the comparability of different correlation length scenarios with regard to the underlying truncation error of the applied Karhunen-Loève expansion. Additionally, the computation time can increase drastically if the straightforward and robust double-loop approach is applied. A sparse stochastic collocation method and a sparse polynomial chaos expansion are studied to reduce the number of required sample evaluations, i.e. the computational cost. To keep the stochastic dimension as low as possible, the random fields are described by a Karhunen-Loève expansion using a modified exponential correlation kernel, which is advantageous in terms of fast convergence while providing an analytic solution. Still, for small correlation lengths, the investigated approaches are limited by the curse of dimensionality. Furthermore, they turn out to be unsuitable for non-linear material models. As a straightforward alternative, a decoupled interpolation approach is proposed, offering a practical engineering estimate. For this purpose, the uncertain quantities only need to be propagated as a single random variable and, deterministically, in terms of the mean values. From these results, the so-called absolutely no idea probability box (ani-p-box) can be obtained, bounding the results for any interval-valued correlation length between zero and infinity. The idea is to interpolate the result for any arbitrary correlation length within this ani-p-box, exploiting prior knowledge about the statistical behaviour of the input random field as a function of the correlation length. The new approach is studied for one- and two-dimensional random fields. Furthermore, linear and non-linear finite element models are used with linear-elastic or elasto-plastic material laws, the latter including linear hardening. It appears that the approach only works satisfactorily for sufficiently smooth responses, but an improvement that also considers higher-order statistics is motivated for future research. (DFG/SPP 1886/NA330/12-1/E)
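    As a minimal sketch of the building block described above (not the thesis code), the snippet below samples a one-dimensional Gaussian random field through a truncated, discretized Karhunen-Loève expansion of a standard exponential covariance kernel and evaluates it at the two end points of an assumed interval-valued correlation length; the modified exponential kernel and its analytic eigenfunctions used in the thesis are not reproduced here, and all numerical values are illustrative.

```python
import numpy as np

def kl_sample_field(x, corr_len, sigma=1.0, n_terms=20, rng=None):
    """Sample a zero-mean Gaussian random field on grid x via a truncated
    Karhunen-Loeve expansion of an exponential covariance kernel."""
    if rng is None:
        rng = np.random.default_rng()
    # Covariance matrix C_ij = sigma^2 * exp(-|x_i - x_j| / corr_len)
    dist = np.abs(x[:, None] - x[None, :])
    cov = sigma**2 * np.exp(-dist / corr_len)
    # Discrete KL: eigendecomposition of the covariance matrix
    eigval, eigvec = np.linalg.eigh(cov)
    idx = np.argsort(eigval)[::-1][:n_terms]          # keep the n_terms largest modes
    lam, phi = eigval[idx], eigvec[:, idx]
    xi = rng.standard_normal(n_terms)                 # independent standard normal coefficients
    return phi @ (np.sqrt(np.maximum(lam, 0.0)) * xi)

x = np.linspace(0.0, 1.0, 200)
# Epistemic correlation length as an interval: evaluate at its end points.
for corr_len in (0.05, 0.5):
    field = kl_sample_field(x, corr_len, n_terms=50, rng=np.random.default_rng(42))
    print(f"corr_len = {corr_len}: sample field std ~ {field.std():.2f}")
```

    For the smaller correlation length the eigenvalue spectrum decays slowly, so many more terms are needed to reach the same truncation error, which is precisely the growth of the stochastic dimension the abstract points to.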

    Distribution-free stochastic simulation methodology for model updating under hybrid uncertainties

    In the real world, a significant challenge faced in the safe operation and maintenance of infrastructures is the lack of available information or data. This results in a large degree of uncertainty and the requirement for robust and efficient uncertainty quantification (UQ) tools in order to derive the most realistic estimates of the behavior of structures. While the probabilistic approach has long been utilized as an essential tool for the quantitative mathematical representation of uncertainty, a common criticism is that the approach often involves unsubstantiated subjective assumptions because of the scarcity or imprecision of available information. To avoid the inclusion of subjectivity, the concepts of imprecise probabilities have been developed, and the distributional probability-box (p-box) has gained the most attention among various types of imprecise probability models since it can straightforwardly provide a clear separation between aleatory and epistemic uncertainty. This thesis concerns the realistic consideration and numerically efficient calibration and propagation of aleatory and epistemic uncertainties (hybrid uncertainties) based on the distributional p-box. Recent developments, including the Bhattacharyya distance-based approximate Bayesian computation (ABC) and non-intrusive imprecise stochastic simulation (NISS) methods, have strengthened the subjective assumption-free approach for uncertainty calibration and propagation. However, these distributional p-box based methods rely on the availability of prior knowledge determining a specific distribution family for the p-box. The target of this thesis is hence to develop a distribution-free approach for the calibration and propagation of hybrid uncertainties, strengthening the subjective assumption-free UQ approach. To achieve the above target, this thesis presents five main developments to improve the Bhattacharyya distance-based ABC and NISS frameworks. The first development improves the scope of application and the efficiency of the Bhattacharyya distance-based ABC. A dimension-reduction procedure is proposed to evaluate the Bhattacharyya distance when the system under investigation is described by time-domain sequences. Moreover, an efficient Bayesian inference method within the Bayesian updating with structural reliability methods (BUS) framework is developed by combining BUS with an adaptive Kriging-based reliability method, namely AK-MCMC. The second development is the distribution-free stochastic model updating framework, based on the combined application of staircase density functions and the Bhattacharyya distance. Staircase density functions can approximate a wide range of distributions arbitrarily closely; hence this development makes it possible to perform the Bhattacharyya distance-based ABC without limiting hypotheses on the distribution families of the parameters to be updated. The aforementioned two developments are then integrated in the third development to provide a solution to the latest edition (2019) of the NASA UQ challenge problem. The model updating tasks under very challenging conditions, where prior information on the aleatory parameters is extremely limited apart from a common boundary, are successfully addressed with the above distribution-free stochastic model updating framework.
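    A minimal univariate sketch of the stochastic distance at the core of the Bhattacharyya distance-based ABC just described is given below; it only compares two sample sets on a shared histogram binning, while the dimension reduction for time-domain sequences, the multivariate binning, and the BUS/AK-MCMC inference layer mentioned above are omitted, and the Gaussian samples are invented for illustration.

```python
import numpy as np

def bhattacharyya_distance(samples_a, samples_b, n_bins=30):
    """Bhattacharyya distance between two sample sets, estimated on a shared
    histogram binning of the pooled data."""
    pooled = np.concatenate([samples_a, samples_b])
    edges = np.linspace(pooled.min(), pooled.max(), n_bins + 1)
    p, _ = np.histogram(samples_a, bins=edges)
    q, _ = np.histogram(samples_b, bins=edges)
    p = p / p.sum()
    q = q / q.sum()
    coeff = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient in [0, 1]
    return -np.log(max(coeff, 1e-300))      # distance; guard against log(0)

rng = np.random.default_rng(0)
observed = rng.normal(1.0, 0.3, 2_000)      # "measured" feature samples
for mu in (0.5, 0.9, 1.0):                  # candidate values of a parameter being updated
    simulated = rng.normal(mu, 0.3, 2_000)  # model output under that candidate
    print(f"mu = {mu}: d_B = {bhattacharyya_distance(observed, simulated):.3f}")
```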
Moreover, the NISS approach, which simplifies the high-dimensional optimization into a set of one-dimensional searches by a first-order high-dimensional model representation (HDMR) decomposition with respect to each design parameter, is developed to efficiently solve the reliability-based design optimization tasks. This challenge, at the same time, elucidates the limitations of the current developments; hence the fourth development addresses the limitation that staircase density functions are designed for univariate random variables and cannot account for parameter dependencies. In order to calibrate the joint distribution of correlated parameters, the distribution-free stochastic model updating framework is extended by characterizing the aleatory parameters using Gaussian copula functions whose marginal distributions are staircase density functions. This further strengthens the assumption-free approach for uncertainty calibration, in which no prior information on the parameter dependencies is required. Finally, the fifth development, a distribution-free uncertainty propagation framework, is based on another application of the staircase density functions to the NISS class of methods, and it is applied to efficiently solve the reliability analysis subproblem of the NASA UQ challenge 2019. The above five developments have successfully strengthened the assumption-free approach for both uncertainty calibration and propagation thanks to the ability of the staircase density functions to approximate arbitrary distributions. The efficiency and effectiveness of these developments are demonstrated on real-world applications, including the NASA UQ challenge 2019.
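    To make the recurring ingredient concrete, the sketch below implements a simple staircase (piecewise-constant) density on a fixed support, parameterized directly by bin weights rather than by the moment constraints used in the literature the thesis builds on; the support, the number of bins, and the weights are assumptions made purely for illustration.

```python
import numpy as np

class StaircaseDensity:
    """Piecewise-constant (staircase) density on [lo, hi] with free bin weights.
    The weights are the only hyper-parameters to calibrate, so no distribution
    family has to be assumed for the aleatory variable."""

    def __init__(self, lo, hi, weights):
        self.edges = np.linspace(lo, hi, len(weights) + 1)
        w = np.asarray(weights, dtype=float)
        self.probs = w / w.sum()             # probability mass per bin
        self.widths = np.diff(self.edges)

    def pdf(self, x):
        idx = np.clip(np.searchsorted(self.edges, x, side="right") - 1,
                      0, len(self.probs) - 1)
        inside = (x >= self.edges[0]) & (x <= self.edges[-1])
        return np.where(inside, self.probs[idx] / self.widths[idx], 0.0)

    def sample(self, n, rng=None):
        if rng is None:
            rng = np.random.default_rng()
        bins = rng.choice(len(self.probs), size=n, p=self.probs)
        return self.edges[bins] + self.widths[bins] * rng.random(n)

# Example: a 6-bin staircase density on an assumed common boundary [0, 2]
sd = StaircaseDensity(0.0, 2.0, weights=[1, 3, 6, 4, 2, 1])
x = sd.sample(50_000, rng=np.random.default_rng(3))
print(f"mean = {x.mean():.3f}, std = {x.std():.3f}")   # moments implied by the weights
```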