    Cross-entropy optimisation of importance sampling parameters for statistical model checking

    Statistical model checking avoids the exponential growth of states associated with probabilistic model checking by estimating properties from multiple executions of a system and by giving results within confidence bounds. Rare properties are often very important but pose a particular challenge for simulation-based approaches; a key objective in these circumstances is therefore to reduce the number and length of the simulations needed to reach a given level of confidence. Importance sampling is a well-established technique that achieves this; however, to maintain the advantages of statistical model checking it is necessary to find good importance sampling distributions without considering the entire state space. Motivated by the above, we present a simple algorithm that uses the notion of cross-entropy to find the optimal parameters for an importance sampling distribution. In contrast to previous work, our algorithm uses a low-dimensional vector of parameters to define this distribution and thus avoids the often intractable explicit representation of a transition matrix. We show that our parametrisation leads to a unique optimum and can produce many orders of magnitude of improvement in simulation efficiency. We demonstrate the efficacy of our methodology by applying it to models from reliability engineering and biochemistry.
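
    A minimal sketch of the cross-entropy idea may help fix intuition. It estimates p = P(X > gamma) for X ~ Exp(u), with the proposal kept in the same one-parameter family so that only a single rate v is optimised. The adaptive-level schedule is the standard multilevel CE recipe, not necessarily the paper's exact algorithm, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
gamma, u, rho, n = 20.0, 1.0, 0.1, 10_000  # threshold, nominal rate, elite fraction, batch size
v = u                                      # proposal rate, optimised by CE

for _ in range(50):
    x = rng.exponential(1.0 / v, size=n)              # sample from proposal Exp(v)
    level = min(gamma, np.quantile(x, 1.0 - rho))     # adaptive intermediate level
    w = (u / v) * np.exp(-(u - v) * x)                # likelihood ratio f_u / f_v
    elite = x >= level
    v = w[elite].sum() / (w[elite] * x[elite]).sum()  # CE update: weighted MLE of the rate
    if level >= gamma:                                # final level reached
        break

x = rng.exponential(1.0 / v, size=n)                  # final importance sampling estimate
w = (u / v) * np.exp(-(u - v) * x)
p_hat = np.mean(w * (x > gamma))
print(f"v* = {v:.4f}, p_hat = {p_hat:.3e}  (exact: {np.exp(-gamma):.3e})")
```

    With gamma = 20 the event has probability about 2e-9, far beyond the reach of plain Monte Carlo at this sample size; the CE iterations tilt the proposal toward the rare event before the final estimate is formed.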

    Chance-Constrained Outage Scheduling using a Machine Learning Proxy

    Outage scheduling aims to define, over a horizon of several months to years, when different components needing maintenance should be taken out of operation. Its objective is to minimize the expected operating cost while satisfying reliability-related constraints. We propose a distributed, scenario-based, chance-constrained optimization formulation for this problem. To tackle the tractability issues arising in large networks, we use machine learning to build a proxy for predicting outcomes of power system operation processes in this context. On the IEEE-RTS79 and IEEE-RTS96 networks, our solution obtains cheaper and more reliable plans than the other candidates.
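
    The proxy idea can be sketched as follows: train a classifier on outcomes of the expensive operation process, then use it to check the scenario-based chance constraint cheaply. Everything here (the stand-in simulator, the feature encoding, the scikit-learn classifier) is a hypothetical illustration, not the paper's implementation:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def simulate_operation(plan, scenario):
    """Stand-in for the expensive operation process; returns 1 if operation is reliable."""
    return int(plan @ scenario < 5.0)

# Offline: train the proxy on simulated (plan, scenario) -> outcome pairs.
X = rng.normal(size=(2000, 8))               # first 4 features: plan; last 4: scenario
y = np.array([simulate_operation(x[:4], x[4:]) for x in X])
proxy = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

def chance_feasible(plan, scenarios, epsilon=0.05):
    """Scenario-based chance constraint checked with the cheap proxy: the plan
    must be predicted reliable in at least a (1 - epsilon) fraction of scenarios."""
    feats = np.hstack([np.tile(plan, (len(scenarios), 1)), scenarios])
    return proxy.predict(feats).mean() >= 1.0 - epsilon

plan = rng.normal(size=4)
scenarios = rng.normal(size=(500, 4))
print(chance_feasible(plan, scenarios))
```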

    Gaussian process surrogates for failure detection: a Bayesian experimental design approach

    An important task of uncertainty quantification is to identify the probability of undesired events, in particular system failures, caused by various sources of uncertainty. In this work we consider the construction of Gaussian process surrogates for failure detection and failure probability estimation. In particular, we consider the situation where the underlying computer models are extremely expensive, and in this setting determining the sampling points in the state space is of essential importance. We formulate the problem as an optimal experimental design for Bayesian inference of the limit state (i.e., the failure boundary) and propose an efficient numerical scheme to solve the resulting optimization problem. In particular, the proposed limit-state inference method is capable of determining multiple sampling points at a time, and thus it is well suited for problems where multiple computer simulations can be performed in parallel. The accuracy and performance of the proposed method are demonstrated by both academic and practical examples.
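
    A minimal sketch of such a surrogate-based design loop, assuming a scikit-learn Gaussian process and a common sign-misclassification acquisition as a stand-in for the paper's design criterion (the toy limit-state function g and the greedy batch rule are illustrative only):

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):
    """Stand-in for the expensive computer model; failure when g(x) < 0."""
    return 1.5 - np.abs(x).sum(axis=1)

rng = np.random.default_rng(2)
X = rng.uniform(-2.0, 2.0, size=(10, 2))      # small initial design
y = g(X)

for _ in range(15):
    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True).fit(X, y)
    cand = rng.uniform(-2.0, 2.0, size=(500, 2))
    mu, sd = gp.predict(cand, return_std=True)
    # Acquisition: probability that the surrogate misclassifies the sign of g;
    # largest where the predictive distribution straddles the limit state.
    acq = norm.cdf(-np.abs(mu) / np.maximum(sd, 1e-12))
    batch = cand[np.argsort(acq)[-4:]]        # several points per round (parallel runs)
    X = np.vstack([X, batch])
    y = np.append(y, g(batch))

mc = rng.uniform(-2.0, 2.0, size=(100_000, 2))
p_fail = np.mean(gp.predict(mc) < 0.0)        # failure probability via cheap surrogate
print(f"estimated failure probability: {p_fail:.4f}")
```

    A practical batch criterion would also enforce spacing between the selected points; the greedy top-k rule above is the simplest stand-in.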

    Asymptotic optimality of the cross-entropy method for Markov chain problems

    We show a correspondence between the cross-entropy method and the zero-variance approximation for simulating rare-event problems in Markov chains. This leads to a sufficient condition under which the cross-entropy estimator is asymptotically optimal.
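
    In generic notation (not necessarily the paper's), the two objects being connected are the zero-variance change of measure and the cross-entropy projection onto a parametric family, and asymptotic optimality is the usual logarithmic-efficiency condition:

```latex
% Zero-variance change of measure for p = P(X \in A), X \sim f:
\[
  g^*(x) \;=\; \frac{f(x)\,\mathbf{1}_A(x)}{p} .
\]
% g^* is unusable directly (it contains the unknown p); the CE method picks
% the closest member of a parametric family \{f_v\} in Kullback--Leibler sense:
\[
  v^* \;=\; \arg\min_v \, D\bigl(g^* \,\big\|\, f_v\bigr)
       \;=\; \arg\max_v \, \mathbb{E}_f\bigl[\mathbf{1}_A(X)\,\ln f_v(X)\bigr] .
\]
% Asymptotic optimality (logarithmic efficiency) of the resulting importance
% sampling estimator means its second moment decays at the best possible rate:
\[
  \lim_{p \to 0}
  \frac{\ln \mathbb{E}_{f_{v^*}}\!\bigl[\mathbf{1}_A(X)\,\bigl(f(X)/f_{v^*}(X)\bigr)^{2}\bigr]}
       {\ln p^{2}} \;=\; 1 .
\]
```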

    Optimal design of water distribution systems based on entropy and topology

    A new multi-objective evolutionary optimization approach for joint topology and pipe size design of water distribution systems is presented. The algorithm proposed considers simultaneously the adequacy of flow and pressure at the demand nodes; the initial construction cost; the network topology; and a measure of hydraulic capacity reliability. The optimization procedure is based on a general measure of hydraulic performance that combines statistical entropy, network connectivity and hydraulic feasibility. The topological properties of the solutions are accounted for, and arbitrary assumptions regarding the quality of infeasible solutions are not applied. In other words, both feasible and infeasible solutions participate in the evolutionary processes; solutions survive and reproduce or perish strictly according to their Pareto-optimality. Removing artificial barriers in this way frees the algorithm to evolve optimal solutions quickly. Furthermore, any redundant binary codes that result from crossover or mutation are eliminated gradually in a seamless and generic way that avoids the arbitrary loss of potentially useful genetic material and preserves the quality of the information that is transmitted from one generation to the next. The approach proposed is entirely generic: we have not introduced any additional parameters that require calibration on a case-by-case basis. Detailed and extensive results for two test problems are included that suggest the approach is highly effective. In general, the frontier-optimal solutions achieved include topologies that are fully branched, partially- and fully-looped and, for networks with multiple sources, completely separate sub-networks.
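
    The survival rule described above (feasible and infeasible solutions competing purely by Pareto dominance, with no feasibility-based penalty) can be sketched in a few lines; the objective vector here is a generic placeholder for the paper's cost, entropy and hydraulic-performance measures:

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimised)."""
    return bool(np.all(a <= b) and np.any(a < b))

def pareto_survivors(objectives):
    """Indices of non-dominated individuals; infeasible candidates are not
    penalised but compete on the same objective vector as feasible ones."""
    n = len(objectives)
    return [i for i in range(n)
            if not any(dominates(objectives[j], objectives[i])
                       for j in range(n) if j != i)]

# Toy population: (construction cost, hydraulic-performance deficit) pairs.
pop_obj = np.random.default_rng(3).random((20, 2))
print(pareto_survivors(pop_obj))
```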

    Optimal staffing under an annualized hours regime using Cross-Entropy optimization

    This paper discusses staffing under annualized hours. Staffing is the selection of the most cost-efficient workforce to cover workforce demand. Annualized hours measure working time per year instead of per week, relaxing the restriction that employees work the same number of hours every week. To solve the underlying combinatorial optimization problem, this paper develops a Cross-Entropy optimization implementation that includes a penalty function and a repair function to guarantee feasible solutions. Our experimental results show that Cross-Entropy optimization is efficient across a broad range of instances: real-life-sized instances are solved in seconds, significantly outperforming an MILP formulation solved with CPLEX. In addition, the solution quality of Cross-Entropy closely approaches the optimal solutions obtained by CPLEX. Our Cross-Entropy implementation offers an outstanding method for real-time decision making, for example in response to unexpected staff illnesses, and for scenario analysis.
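
    A hedged sketch of the CE-with-penalty-and-repair pattern on a toy 0/1 staffing problem (employee e works week w). The demand, costs, greedy repair rule and all constants are illustrative; the paper's actual model of annualized hours is richer:

```python
import numpy as np

rng = np.random.default_rng(4)
E, W = 12, 52                          # employees, weeks in the year
demand = rng.integers(3, 8, size=W)    # staff required in each week
cost = rng.uniform(1.0, 2.0, size=E)   # weekly cost of each employee

p = np.full((E, W), 0.5)               # Bernoulli sampling probabilities
n, rho, alpha = 200, 0.1, 0.7          # sample size, elite fraction, smoothing

def repair(x):
    """Repair function: greedily add the cheapest idle staff to uncovered weeks."""
    for w in range(W):
        short = int(demand[w] - x[:, w].sum())
        if short > 0:
            idle = np.where(x[:, w] == 0)[0]
            x[idle[np.argsort(cost[idle])][:short], w] = 1
    return x

def objective(x):
    """Total wage cost plus a penalty for residual under-coverage (the repair
    above already removes it in this toy; kept to show the general pattern)."""
    shortfall = np.maximum(demand - x.sum(axis=0), 0).sum()
    return (cost[:, None] * x).sum() + 50.0 * shortfall

for _ in range(60):
    sample = np.array([repair((rng.random((E, W)) < p).astype(int))
                       for _ in range(n)])
    scores = np.array([objective(s) for s in sample])
    elite = sample[np.argsort(scores)[: int(rho * n)]]
    p = alpha * elite.mean(axis=0) + (1.0 - alpha) * p   # smoothed CE update

best = sample[np.argmin(scores)]
print(f"best cost: {objective(best):.1f}")
```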

    mfEGRA: Multifidelity Efficient Global Reliability Analysis through Active Learning for Failure Boundary Location

    This paper develops mfEGRA, a multifidelity active learning method that uses data-driven, adaptively refined surrogates to locate the failure boundary in reliability analysis. This work addresses the prohibitive cost of reliability analysis via Monte Carlo sampling for expensive-to-evaluate high-fidelity models by using cheaper-to-evaluate approximations of the high-fidelity model. The method builds on the Efficient Global Reliability Analysis (EGRA) method, a surrogate-based method that uses adaptive sampling to refine Gaussian process surrogates for failure boundary location with a single-fidelity model. Our method introduces a two-stage adaptive sampling criterion that uses a multifidelity Gaussian process surrogate to leverage multiple information sources of different fidelities. The criterion combines the expected feasibility criterion from EGRA with a one-step lookahead information gain to refine the surrogate around the failure boundary. The computational savings from mfEGRA depend on the discrepancy between the different models and on their evaluation costs relative to the high-fidelity model. We show that accurate estimation of reliability using mfEGRA leads to computational savings of approximately 46% for an analytic multimodal test problem and 24% for a three-dimensional acoustic horn problem, compared to single-fidelity EGRA. We also show the effect of using a priori drawn Monte Carlo samples in the implementation for the acoustic horn problem, where mfEGRA leads to computational savings of 45% for the three-dimensional case and 48% for a rarer-event four-dimensional case, compared to single-fidelity EGRA.
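
    The first stage of such a criterion can be illustrated with the expected feasibility function (EFF) of Bichon et al., which EGRA-style methods use to rank candidate points near the failure boundary z = 0; the fidelity-selection stage is only indicated schematically, and the relative costs are made-up numbers:

```python
import numpy as np
from scipy.stats import norm

def expected_feasibility(mu, sd, z=0.0, eps_factor=2.0):
    """Expected feasibility function (Bichon et al.) for a GP with predictive
    mean mu and standard deviation sd; large where the GP straddles z."""
    eps = eps_factor * sd
    t0 = (z - mu) / sd
    tm = (z - eps - mu) / sd
    tp = (z + eps - mu) / sd
    return ((mu - z) * (2.0 * norm.cdf(t0) - norm.cdf(tm) - norm.cdf(tp))
            - sd * (2.0 * norm.pdf(t0) - norm.pdf(tm) - norm.pdf(tp))
            + eps * (norm.cdf(tp) - norm.cdf(tm)))

# Stage 1: choose the candidate location most informative about the boundary.
mu = np.array([0.3, -0.1, 1.2])     # GP predictive means at candidate points
sd = np.array([0.5, 0.2, 0.1])      # GP predictive standard deviations
x_next = int(np.argmax(expected_feasibility(mu, sd)))

# Stage 2 (mfEGRA-specific, schematic): among the available models, evaluate
# the one with the best one-step lookahead information gain per unit cost at
# x_next; the costs below are hypothetical placeholders.
costs = {"low-fidelity": 1.0, "high-fidelity": 20.0}
print(x_next, costs)
```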