
    Meta-models for structural reliability and uncertainty quantification

    A meta-model (or surrogate model) is the modern name for what was traditionally called a response surface. It is intended to mimic the behaviour of a computational model M (e.g. a finite element model in mechanics) while being inexpensive to evaluate, in contrast to the original model, which may take hours or even days of computer processing time. In this paper, various types of meta-models that have been used in the last decade in the context of structural reliability are reviewed. More specifically, classical polynomial response surfaces, polynomial chaos expansions and kriging are addressed. It is shown how the need for error estimates and adaptivity in their construction has brought this type of approach to a high level of efficiency. A new technique that solves the problem of potential bias in the estimation of a probability of failure through the use of meta-models is finally presented.
    Comment: Keynote lecture, Fifth Asian-Pacific Symposium on Structural Reliability and its Applications (5th APSSRA), May 2012, Singapore
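    To make the surrogate idea concrete, the sketch below fits a classical quadratic response surface to a handful of evaluations of a toy limit-state function and then estimates the probability of failure by Monte Carlo sampling on the cheap surrogate. The limit-state function, input distribution and sample sizes are invented for illustration and are not taken from the paper.

```python
# Minimal sketch (not from the paper): estimate a probability of failure
# P[g(X) <= 0] by fitting a quadratic response surface to a few evaluations
# of an "expensive" limit-state function g, then running Monte Carlo on the
# cheap surrogate.  g, the input distribution and sample sizes are invented.
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Stand-in for an expensive model (e.g. a finite element analysis);
    # failure when g(x) <= 0.
    return 7.0 - x[:, 0]**2 - 2.0 * x[:, 1]

def quad_features(x):
    # Quadratic polynomial basis in two variables: 1, x1, x2, x1^2, x2^2, x1*x2.
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1**2, x2**2, x1 * x2])

# Small design of experiments: 30 evaluations of the expensive model.
x_doe = rng.normal(size=(30, 2))
y_doe = g(x_doe)

# Least-squares fit of the response surface.
coef, *_ = np.linalg.lstsq(quad_features(x_doe), y_doe, rcond=None)

# Monte Carlo on the surrogate (cheap), with direct MC shown for reference.
x_mc = rng.normal(size=(200_000, 2))
pf_surrogate = np.mean(quad_features(x_mc) @ coef <= 0.0)
pf_reference = np.mean(g(x_mc) <= 0.0)
print(f"pf (surrogate) ~ {pf_surrogate:.4f}, pf (direct MC) ~ {pf_reference:.4f}")
```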

    Sequential design of computer experiments for the estimation of a probability of failure

    This paper deals with the problem of estimating the volume of the excursion set of a function $f: \mathbb{R}^d \to \mathbb{R}$ above a given threshold, under a probability measure on $\mathbb{R}^d$ that is assumed to be known. In the industrial world, this corresponds to the problem of estimating a probability of failure of a system. When only an expensive-to-simulate model of the system is available, the budget for simulations is usually severely limited and therefore classical Monte Carlo methods ought to be avoided. One of the main contributions of this article is to derive SUR (stepwise uncertainty reduction) strategies from a Bayesian decision-theoretic formulation of the problem of estimating a probability of failure. These sequential strategies use a Gaussian process model of $f$ and aim at performing evaluations of $f$ as efficiently as possible to infer the value of the probability of failure. We compare these strategies to other strategies also based on a Gaussian process model for estimating a probability of failure.
    Comment: This is an author-generated postprint version. The published version is available at http://www.springerlink.co
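    The sketch below illustrates the general idea of a sequential, Gaussian-process-based design for excursion-set estimation. It uses a crude "ambiguity" criterion (posterior mean close to the threshold relative to the posterior standard deviation) rather than the SUR criteria derived in the paper, and the test function, threshold and evaluation budgets are invented.

```python
# Illustrative sketch only: a Gaussian-process surrogate with a simple
# uncertainty-driven sequential design.  This is NOT the SUR criterion from
# the paper; f, the threshold and all budgets below are made up.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
threshold = 1.5

def f(x):
    # Stand-in for the expensive simulator, f: R^2 -> R.
    return np.sin(3 * x[:, 0]) + x[:, 1]**2

# Candidate points drawn from the (known) input distribution.
candidates = rng.normal(size=(2000, 2))

# Initial design: 10 evaluations of the expensive simulator.
X = rng.normal(size=(10, 2))
y = f(X)

for _ in range(20):                      # 20 additional expensive evaluations
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # Pick the candidate whose excursion above the threshold is most ambiguous.
    score = np.abs(mu - threshold) / (sigma + 1e-12)
    x_new = candidates[np.argmin(score)][None, :]
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new))

# Estimate P[f(X) > threshold] from the final GP mean on a fresh MC sample.
x_mc = rng.normal(size=(100_000, 2))
mu_mc = gp.predict(x_mc)
print("estimated excursion probability:", np.mean(mu_mc > threshold))
```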

    Simultaneous Extrema in the Entropy Production for Steady-State Fluid Flow in Parallel Pipes

    Steady-state flow of an incompressible fluid in parallel pipes can simultaneously satisfy two contradictory extremum principles in the entropy production, depending on the flow conditions. For a constant total flow rate, the flow can satisfy (i) a pipe network minimum entropy production (MinEP) principle with respect to the flow rates, and (ii) the maximum entropy production (MaxEP) principle of Ziegler and Paltridge with respect to the choice of flow regime. The first principle, different from but allied to that of Prigogine, arises from the stability of the steady state compared to non-steady-state flows; it is proven for isothermal laminar and turbulent flows in parallel pipes with a constant power-law exponent, but is otherwise invalid. The second principle appears to be more fundamental, driving the formation of turbulent flow in single and parallel pipes at higher Reynolds numbers. For constant head conditions, the flow can satisfy (i) a modified maximum entropy production (MaxEPMod) principle of Županović and co-workers with respect to the flow rates, and (ii) an inversion of the Ziegler-Paltridge MaxEP principle with respect to the flow regime. The interplay between these principles is demonstrated by examples.
    Comment: Revised version 2; 5 figures
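    A small numerical check of the constant-flow-rate MinEP statement is sketched below: for two parallel pipes sharing the same power-law head-loss exponent, the flow split that minimizes the entropy production reproduces the equal-head-loss steady state. The resistances, exponent and total flow rate are arbitrary illustrative values, not taken from the paper.

```python
# Numerical sketch of the constant-total-flow-rate MinEP principle for two
# parallel pipes with power-law head loss h_i = r_i * Q_i**n (same exponent n
# in both pipes).  Entropy production is taken as proportional to the total
# dissipation sum_i Q_i * h_i at fixed temperature; all values are made up.
import numpy as np

r = np.array([1.0, 3.0])   # illustrative pipe resistances
n = 1.75                   # power-law exponent (same for both pipes)
Q_total = 1.0              # fixed total volumetric flow rate

# Brute-force search over the flow split for the minimum entropy production.
q1 = np.linspace(1e-6, Q_total - 1e-6, 100_001)
sigma = r[0] * q1**(n + 1) + r[1] * (Q_total - q1)**(n + 1)   # ~ sum_i Q_i h_i
i_min = np.argmin(sigma)
q_min = np.array([q1[i_min], Q_total - q1[i_min]])

# The physical steady state has equal head loss in both branches:
# r1*Q1^n = r2*Q2^n.  The MinEP split should reproduce it.
print("MinEP flow split:", q_min)
print("head losses:     ", r * q_min**n)   # (nearly) equal
```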

    Empirical models, rules, and optimization

    This paper considers supply decisions by firms in a dynamic setting with adjustment costs and compares the behavior of an optimal control model to that of a rule-based system which relaxes the assumption that agents are explicit optimizers. In our approach, the economic agent uses believably simple rules in coping with complex situations. We estimate rules using an artificially generated sample obtained by running repeated simulations of a dynamic optimal control model of a firm's hiring/firing decisions. We show that (i) agents using heuristics can behave as if they were seeking rationally to maximize their dynamic returns; (ii) the approach requires fewer behavioral assumptions relative to dynamic optimization, and the assumptions made are based on economically intuitive theoretical results linking rule adoption to uncertainty; (iii) the approach delineates the domain of applicability of maximization hypotheses and describes the behavior of agents in situations of economic disequilibrium. The approach adopted uses concepts from fuzzy control theory. An agent, instead of optimizing, follows Fuzzy Associative Memory (FAM) rules which, given input and output data, can be estimated and used to approximate any non-linear dynamic process. Empirical results indicate that the fuzzy rule-based system performs extremely well in approximating optimal dynamic behavior in situations with limited noise.
    Keywords: decision-making; econometric models; TMD
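    The sketch below shows a toy fuzzy rule-based decision rule in the spirit of the FAM approach described above: triangular membership functions on a demand-change input, min-max inference, and centroid defuzzification. The membership functions, rules and ranges are invented for illustration and are not estimated from any data or taken from the paper.

```python
# Toy fuzzy rule-based (FAM-style) decision rule: IF demand is falling THEN
# fire, IF flat THEN hold, IF rising THEN hire.  All membership functions,
# rule outputs and ranges are illustrative inventions.
import numpy as np

def tri(x, a, b, c):
    # Triangular membership function with support [a, c] and peak at b.
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Output universe: change in employment (arbitrary units).
u = np.linspace(-10.0, 10.0, 1001)

# Output fuzzy sets associated with each rule.
fire = tri(u, -10.0, -5.0, 0.0)
hold = tri(u, -3.0, 0.0, 3.0)
hire = tri(u, 0.0, 5.0, 10.0)

def hiring_decision(demand_change):
    # Rule activations (antecedent memberships), then min-max Mamdani inference.
    w_fall = tri(demand_change, -10.0, -5.0, 0.0)
    w_flat = tri(demand_change, -3.0, 0.0, 3.0)
    w_rise = tri(demand_change, 0.0, 5.0, 10.0)
    aggregated = np.maximum.reduce([
        np.minimum(w_fall, fire),
        np.minimum(w_flat, hold),
        np.minimum(w_rise, hire),
    ])
    # Centroid defuzzification.
    return np.sum(u * aggregated) / np.sum(aggregated)

for d in (-7.0, -1.0, 0.5, 6.0):
    print(f"demand change {d:+.1f} -> employment adjustment {hiring_decision(d):+.2f}")
```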