
    Polynomial-Chaos-based Kriging

    Computer simulation has become the standard tool in many engineering fields for designing and optimizing systems, as well as for assessing their reliability. To cope with demanding analyses such as optimization and reliability assessment, surrogate models (a.k.a. meta-models) have been increasingly investigated over the last decade. Polynomial Chaos Expansions (PCE) and Kriging are two popular non-intrusive meta-modelling techniques. PCE surrogates the computational model with a series of orthonormal polynomials in the input variables, the polynomials being chosen in accordance with the probability distributions of those input variables. Kriging, on the other hand, assumes that the computer model behaves as a realization of a Gaussian random process whose parameters are estimated from the available computer runs, i.e. input vectors and response values. These two techniques have so far been developed more or less in parallel, with little interaction between the researchers in the two fields. In this paper, PC-Kriging is derived as a new non-intrusive meta-modelling approach combining PCE and Kriging. A sparse set of orthonormal polynomials (PCE) approximates the global behavior of the computational model, whereas Kriging manages the local variability of the model output. An adaptive algorithm similar to the least angle regression algorithm determines the optimal sparse set of polynomials. PC-Kriging is validated on various benchmark analytical functions which are easy to sample for reference results. From the numerical investigations it is concluded that PC-Kriging performs better than, or at least as well as, the two distinct meta-modelling techniques. A larger gain in accuracy is obtained when the experimental design has a limited size, which is an asset when dealing with demanding computational models.
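
    As an illustration of the combination described above, a minimal Python sketch of the PC-Kriging idea follows, assuming a one-dimensional standard-normal input and a toy model function. The sparse Hermite (PCE) trend is selected with least-angle regression and a Gaussian process is fitted to the residual to capture the local variability; the paper estimates the trend and Kriging parameters jointly, so this two-step scikit-learn version is only an illustration of the concept, not the authors' implementation.

    import numpy as np
    from numpy.polynomial.hermite_e import hermevander
    from sklearn.linear_model import Lars
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def model(x):                        # hypothetical expensive computational model
        return x * np.sin(x)

    rng = np.random.default_rng(0)
    X = rng.standard_normal(30)          # experimental design (standard-normal input)
    y = model(X)

    # Sparse PCE trend: probabilists' Hermite basis, terms picked by least-angle
    # regression (normalisation of the basis is skipped for brevity).
    Phi = hermevander(X, deg=10)[:, 1:]            # drop the constant column
    lars = Lars(n_nonzero_coefs=5).fit(Phi, y)
    trend = Phi @ lars.coef_ + lars.intercept_

    # Kriging on the residual captures the local variability left by the trend.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X.reshape(-1, 1), y - trend)

    def pc_kriging_predict(x_new):
        t = hermevander(x_new, deg=10)[:, 1:] @ lars.coef_ + lars.intercept_
        return t + gp.predict(x_new.reshape(-1, 1))

    x_test = np.linspace(-2.5, 2.5, 5)
    print(pc_kriging_predict(x_test))    # surrogate prediction
    print(model(x_test))                 # true model, for comparison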

    A Method for the Combination of Stochastic Time Varying Load Effects

    The problem of evaluating the probability that a structure becomes unsafe under a combination of loads over a given time period is addressed. The loads and load effects are modeled either as pulse processes (static problem) with random occurrence time, intensity and a specified shape, or as intermittent continuous processes (dynamic problem), i.e. zero-mean Gaussian processes superimposed on a pulse process. The load coincidence method is extended to problems with both nonlinear limit states and dynamic responses, including the case of correlated dynamic responses. The technique of linearizing a nonlinear limit state, commonly used in time-invariant problems, is investigated for time-varying combination problems, with emphasis on selecting the linearization point. Results are compared with other methods, namely the upcrossing-rate method and simpler combination rules such as the Square Root of the Sum of Squares (SRSS) and Turkstra's rule. Correlated effects among dynamic loads are examined to see how the results differ from those for correlated static loads and to identify which types of load dependencies are most important, i.e. affect the exceedance probabilities the most. Application of the load coincidence method to code development is briefly discussed. (National Science Foundation Grants CME 79-18053 and CEE 82-0759)
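
    As a rough illustration of the combination problem described above, the sketch below simulates two Poisson rectangular-pulse load processes and compares the exceedance probability of their combined effect, estimated directly by Monte Carlo, with the estimate given by Turkstra's rule (lifetime maximum of one load plus the point-in-time value of the other). The occurrence rates, pulse durations, intensity distributions and threshold are invented for illustration; the paper's load coincidence method itself is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(1)
    T, dt = 50.0, 0.05               # reference period (years) and time step
    t = np.arange(0.0, T, dt)
    n_sims, threshold = 2000, 4.0    # illustrative limit-state threshold

    def pulse_process(rate, duration, mean, std):
        """One realisation of a Poisson rectangular-pulse load process on grid t."""
        load = np.zeros(t.size)
        starts = rng.uniform(0.0, T, rng.poisson(rate * T))
        for s, q in zip(starts, rng.normal(mean, std, starts.size)):
            load[(t >= s) & (t < s + duration)] = q
        return load

    exact = turkstra_1 = turkstra_2 = 0
    for _ in range(n_sims):
        x1 = pulse_process(rate=0.5, duration=0.5, mean=1.0, std=0.5)    # sustained load
        x2 = pulse_process(rate=5.0, duration=0.05, mean=1.5, std=0.8)   # short transient
        exact += np.max(x1 + x2) > threshold
        # Turkstra: lifetime maximum of one load plus the companion (point-in-time)
        # value of the other load at that instant.
        turkstra_1 += x1.max() + x2[np.argmax(x1)] > threshold
        turkstra_2 += x2.max() + x1[np.argmax(x2)] > threshold

    print("Monte Carlo exceedance probability:", exact / n_sims)
    print("Turkstra's rule estimate          :", max(turkstra_1, turkstra_2) / n_sims)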

    Optimal management of bio-based energy supply chains under parametric uncertainty through a data-driven decision-support framework

    This paper addresses the optimal management of a multi-objective bio-based energy supply chain network subject to multiple sources of uncertainty. The complexity of obtaining an optimal solution with traditional uncertainty management methods increases dramatically with the number of uncertain factors considered, so that the problem, if tractable at all, is solved only after a large computational effort. In this work a data-driven decision-making framework is therefore proposed to address this issue. The framework exploits machine learning techniques to efficiently approximate the optimal management decisions, taking as input a set of uncertain parameters that continuously influence the process behavior. A design-of-computer-experiments technique is used to combine these parameters and produce a matrix of representative information. These data are used to optimize the deterministic multi-objective bio-based energy network problem with conventional optimization methods, leading to a detailed (but elementary) map of the optimal management decisions as a function of the uncertain parameters. Afterwards, these detailed data-driven relations are identified using an Ordinary Kriging meta-model. The resulting parametric meta-models predict the optimal decision variables with very high accuracy in comparison with the traditional stochastic approach. More importantly, the computational effort required to obtain these optimal values in response to changes in the uncertain parameters is dramatically reduced. The proposed data-driven decision tool thus promotes time-effective optimal decision making, a step forward in applying data-driven strategies to large-scale, complex industrial problems.
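
    A minimal sketch of the workflow described above follows, built on a deliberately tiny invented problem: two uncertain prices and an uncertain demand are sampled with a Latin hypercube design, a deterministic two-source supply problem is solved for each sample, and a Gaussian-process regressor (used here as a stand-in for Ordinary Kriging) is fitted to map the uncertain parameters directly to the optimal decisions. None of the numbers or the toy optimisation model come from the paper.

    import numpy as np
    from scipy.stats import qmc
    from scipy.optimize import linprog
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def optimal_supply(price_a, price_b, demand):
        """Toy deterministic problem: meet demand from two biomass sources
        (capacities 60 and 80) at minimum cost."""
        res = linprog(c=[price_a, price_b],
                      A_ub=[[-1.0, -1.0]], b_ub=[-demand],
                      bounds=[(0, 60), (0, 80)])
        return res.x                               # optimal amounts from sources A and B

    # Design of computer experiments over the uncertain parameters.
    lower, upper = [10.0, 10.0, 50.0], [40.0, 40.0, 120.0]
    doe = qmc.scale(qmc.LatinHypercube(d=3, seed=1).random(100), lower, upper)
    decisions = np.array([optimal_supply(*row) for row in doe])

    # Meta-model of the optimal decisions as a function of the uncertain parameters.
    meta = GaussianProcessRegressor(kernel=RBF([10.0, 10.0, 30.0]),
                                    normalize_y=True).fit(doe, decisions)

    new_params = np.array([[25.0, 18.0, 90.0]])    # a new realisation of the uncertainty
    print("meta-model prediction:", meta.predict(new_params)[0])
    print("re-optimised solution:", optimal_supply(*new_params[0]))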

    A model checker for performance and dependability properties

    Markov chains are widely used in the context of performance and reliability evaluation of systems of various kinds. Model checking of such chains with respect to a given (branching) temporal logic formula has been proposed for both the discrete [8] and the continuous time setting [1], [3]. In this short paper, we describe the prototype model checker E⊢MC² for discrete and continuous-time Markov chains, where properties are expressed in appropriate extensions of CTL. We illustrate the general benefits of this approach and discuss the structure of the tool.
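
    To make concrete the kind of property such a model checker verifies, the sketch below computes, for a small invented discrete-time Markov chain, the probability of reaching a goal state before a failure state, i.e. the linear system that underlies an unbounded "until" formula in the probabilistic extension of CTL. This is a textbook computation, not code from the tool described above.

    import numpy as np

    # Transition matrix of a 5-state DTMC (each row sums to 1); state 3 is the
    # goal state and state 4 the failure state, both absorbing.  Values invented.
    P = np.array([
        [0.1, 0.5, 0.2, 0.1, 0.1],
        [0.0, 0.2, 0.6, 0.1, 0.1],
        [0.3, 0.0, 0.2, 0.4, 0.1],
        [0.0, 0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 0.0, 1.0],
    ])
    goal, fail = [3], [4]
    maybe = [s for s in range(P.shape[0]) if s not in goal + fail]

    # Probability of satisfying "not fail U goal": solve (I - A) x = b, where A is
    # P restricted to the undecided states and b holds the one-step probabilities
    # of jumping directly into the goal.
    A = P[np.ix_(maybe, maybe)]
    b = P[np.ix_(maybe, goal)].sum(axis=1)
    x = np.linalg.solve(np.eye(len(maybe)) - A, b)

    for s, p in zip(maybe, x):
        print(f"P(reach goal before failure | start in state {s}) = {p:.4f}")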

    Meta-models for structural reliability and uncertainty quantification

    A meta-model (or surrogate model) is the modern name for what was traditionally called a response surface. It is intended to mimic the behaviour of a computational model M (e.g. a finite element model in mechanics) while being inexpensive to evaluate, in contrast to the original model, which may take hours or even days of computer processing time. In this paper the various types of meta-models that have been used in the last decade in the context of structural reliability are reviewed. More specifically, classical polynomial response surfaces, polynomial chaos expansions and Kriging are addressed. It is shown how the need for error estimates and adaptivity in their construction has brought this type of approach to a high level of efficiency. A new technique that solves the problem of potential bias in the estimation of a probability of failure through the use of meta-models is finally presented. Comment: Keynote lecture at the Fifth Asian-Pacific Symposium on Structural Reliability and its Applications (5th APSSRA), May 2012, Singapore.
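
    As a small, self-contained example of the kind of use sketched above, the code below fits a classical quadratic polynomial response surface to 50 evaluations of a toy limit-state function and then estimates the probability of failure by Monte Carlo simulation on the inexpensive surrogate. The limit-state function and sample sizes are invented, and the bias-correction technique introduced in the paper is not reproduced.

    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    def limit_state(x):
        """Toy limit-state function g(X); failure corresponds to g <= 0."""
        return 7.0 - x[:, 0] ** 2 - 2.0 * x[:, 1]

    rng = np.random.default_rng(42)

    # Small experimental design: the "expensive" model is evaluated only 50 times.
    X_doe = rng.standard_normal((50, 2))
    g_doe = limit_state(X_doe)

    # Classical quadratic polynomial response surface fitted by least squares.
    surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    surrogate.fit(X_doe, g_doe)

    # Monte Carlo on the cheap surrogate to estimate the probability of failure.
    X_mc = rng.standard_normal((1_000_000, 2))
    pf_surrogate = np.mean(surrogate.predict(X_mc) <= 0.0)
    pf_reference = np.mean(limit_state(X_mc) <= 0.0)   # only feasible for a toy g

    print(f"P_f from response surface  : {pf_surrogate:.4e}")
    print(f"P_f from direct Monte Carlo: {pf_reference:.4e}")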

    Aging concrete structures: a review of mechanics and concepts

    The safe and cost-efficient management of our built infrastructure is a challenging task, considering an expected service life of at least 50 years. In spite of time-dependent changes in material properties, deterioration processes and changing demands from society, the structures need to satisfy many technical requirements related to serviceability, durability, sustainability and bearing capacity. This review paper summarizes the challenges associated with the safe design and maintenance of aging concrete structures and gives an overview of some concepts and approaches that are being developed to address these challenges.

    A tool for model-checking Markov chains

    Markov chains are widely used in the context of the performance and reliability modeling of various systems. Model checking of such chains with respect to a given (branching) temporal logic formula has been proposed for both discrete [34, 10] and continuous time settings [7, 12]. In this paper, we describe a prototype model checker for discrete and continuous-time Markov chains, the Erlangen-Twente Markov Chain Checker E⊢MC², where properties are expressed in appropriate extensions of CTL. We illustrate the general benefits of this approach and discuss the structure of the tool. Furthermore, we report on successful applications of the tool to some examples, highlighting lessons learned during the development and application of E⊢MC².
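
    For the continuous-time case handled by such a tool, the typical property is a time-bounded reachability probability, e.g. "the probability of reaching a failure state within t time units". The sketch below computes this quantity for a tiny invented CTMC by making the failure state absorbing and taking the matrix exponential of the generator; production model checkers such as E⊢MC² use uniformisation rather than a dense matrix exponential, so this is only a conceptual illustration.

    import numpy as np
    from scipy.linalg import expm

    # Generator matrix Q of a small CTMC (rows sum to 0): a two-component
    # repairable system with per-component failure rate 0.01/h and repair rate
    # 0.5/h; state 2 ("both components down") is the failure state.  Rates invented.
    lam, mu = 0.01, 0.5
    Q = np.array([
        [-2 * lam,      2 * lam,  0.0],
        [      mu,  -(mu + lam),  lam],
        [     0.0,          0.0,  0.0],   # failure state made absorbing
    ])

    t_bound = 1000.0                  # time bound of the until-formula (hours)
    transient = expm(Q * t_bound)     # transient state distribution at t_bound

    # Because the failure state is absorbing, its transient probability equals
    # the probability of having reached it within the time bound.
    print(f"P(failure within {t_bound:g} h) = {transient[0, 2]:.6f}")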