
    Approximation of Limit State Surfaces in Monotonic Monte Carlo Settings

    This article investigates the theoretical convergence properties of the estimators produced by a numerical exploration of a monotonic function with multivariate random inputs in a structural reliability framework. The quantity to be estimated is a probability, typically associated with an undesirable (unsafe) event, and the function is usually implemented as a computer model. The estimators produced by a Monte Carlo numerical design are two subsets of inputs leading to safe and unsafe situations, whose measures translate into deterministic bounds for the probability. Several situations are considered: designs that are independent and identically distributed or not, and sequential designs. As a major consequence, a consistent estimator of the (limit state) surface separating the subsets can be built under isotonicity and regularity arguments, and its convergence speed can be exhibited. This estimator is built by aggregating semi-supervised binary classifiers chosen as constrained Support Vector Machines. Numerical experiments conducted on toy examples show that they run faster than recently developed monotonic neural networks with comparable predictive power, and are therefore better suited when computational time is a key issue.
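
    A minimal sketch of the dominance bookkeeping behind such bounds may help: under a componentwise nondecreasing g, any point dominating a known failure point must fail and any point dominated by a known safe point must be safe, so a limited evaluation budget still yields deterministic bounds on the failure probability. The sketch below is an illustration under assumed names (g_toy, monotone_mc_bounds) and a uniform design on [0,1]^d, not the paper's algorithm.

    ```python
    import numpy as np

    # Minimal sketch (an illustration, not the paper's algorithm): Monte Carlo
    # bounds on p = P(g(U) > 0) for a componentwise nondecreasing g on [0,1]^d.
    # Monotonicity classifies many points for free: anything dominating a known
    # failure point fails, anything dominated by a known safe point is safe.

    def g_toy(x):
        """Toy monotone limit state: failure when the coordinate sum is large."""
        return x.sum() - 2.2

    def monotone_mc_bounds(g, n=4000, d=3, budget=200, seed=0):
        rng = np.random.default_rng(seed)
        u = rng.random((n, d))              # i.i.d. uniform Monte Carlo design
        fail_pts, safe_pts = [], []
        n_fail = n_safe = calls = 0
        for x in u:
            if any(np.all(x >= f) for f in fail_pts):
                n_fail += 1                 # dominates a known failure: fails
            elif any(np.all(x <= s) for s in safe_pts):
                n_safe += 1                 # dominated by a safe point: safe
            elif calls < budget:
                calls += 1                  # pay one (expensive) model call
                if g(x) > 0:
                    fail_pts.append(x); n_fail += 1
                else:
                    safe_pts.append(x); n_safe += 1
            # else: the point stays undetermined and widens the bounds
        # Certainly-failed and certainly-safe fractions bracket the estimate.
        return n_fail / n, 1 - n_safe / n

    lo, hi = monotone_mc_bounds(g_toy)
    print(f"failure probability bounded in [{lo:.3f}, {hi:.3f}]")
    ```

    As the evaluation budget grows, the undetermined set shrinks and the two bounds close in on the Monte Carlo estimate.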

    Failure Probability Estimation and Detection of Failure Surfaces via Adaptive Sequential Decomposition of the Design Domain

    We propose an algorithm for an optimal adaptive selection of points from the design domain of input random variables that are needed for an accurate estimation of failure probability and the determination of the boundary between the safe and failure domains. The method is particularly useful when each evaluation of the performance function g(x) is very expensive and the function can be characterized as either highly nonlinear, noisy, or even discrete-state (e.g., binary). In such cases, only a limited number of calls is feasible, and gradients of g(x) cannot be used. The input design domain is progressively segmented by expanding and adaptively refining a mesh-like, lock-free geometrical structure. The proposed triangulation-based approach effectively combines the features of simulation and approximation methods. The algorithm performs two independent tasks: (i) the estimation of probabilities through an ingenious combination of deterministic cubature rules and the application of the divergence theorem and (ii) the sequential extension of the experimental design with new points. The sequential selection of points from the design domain for future evaluation of g(x) is carried out through a new learning function, which maximizes the instantaneous information gain in terms of the probability classification of the local region. The extension may be halted at any time, e.g., when sufficiently accurate estimations are obtained. Due to the use of an exact geometric representation in the input domain, the algorithm is most effective for problems of low dimension, not exceeding eight. The method can handle random vectors with correlated non-Gaussian marginals. The estimation accuracy can be improved by employing a smooth surrogate model. Finally, we define new factors of global sensitivity to failure, based on the entire failure surface weighted by the density of the input random vector.
    Comment: 42 pages, 24 figures
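
    The triangulation-and-classification idea can be sketched compactly: triangulate the evaluated design points, classify each simplex by the signs of g at its vertices, and refine the mixed simplices that straddle the failure surface. The sketch below, built on scipy.spatial.Delaunay with a crude largest-volume refinement rule in place of the paper's learning function, is an illustration under assumed names (g_toy, adaptive_estimate), not the authors' algorithm.

    ```python
    import numpy as np
    from math import factorial
    from scipy.spatial import Delaunay

    # Illustrative sketch, not the authors' implementation: triangulate the
    # design, classify simplices by the vertex signs of g, and split the
    # largest mixed simplex at its centroid (a crude stand-in for the paper's
    # learning function). Inputs are assumed uniform on [0, 1]^d.

    def g_toy(x):
        return x[0] ** 2 + x[1] - 1.2       # failure when g(x) > 0

    def simplex_volume(verts):
        d = verts.shape[1]
        return abs(np.linalg.det(verts[1:] - verts[0])) / factorial(d)

    def adaptive_estimate(g, d=2, n_init=20, n_adapt=80, seed=0):
        rng = np.random.default_rng(seed)
        corners = np.array(np.meshgrid(*[[0, 1]] * d)).reshape(d, -1).T
        pts = np.vstack([corners, rng.random((n_init, d))])
        vals = np.array([g(p) for p in pts])
        for _ in range(n_adapt):
            tri = Delaunay(pts)
            fails = vals[tri.simplices] > 0
            mixed = ~(fails.all(axis=1) | (~fails).all(axis=1))
            if not mixed.any():
                break
            vols = np.array([simplex_volume(pts[s]) for s in tri.simplices])
            worst = tri.simplices[mixed][np.argmax(vols[mixed])]
            c = pts[worst].mean(axis=0)     # refine at the centroid
            pts = np.vstack([pts, c])
            vals = np.append(vals, g(c))
        tri = Delaunay(pts)
        fails = vals[tri.simplices] > 0
        vols = np.array([simplex_volume(pts[s]) for s in tri.simplices])
        p_fail = vols[fails.all(axis=1)].sum()    # certainly-failed volume
        p_mixed = vols[~(fails.all(axis=1) | (~fails).all(axis=1))].sum()
        return p_fail, p_fail + p_mixed     # bounds on P(g(U) > 0), U uniform

    print(adaptive_estimate(g_toy))
    ```

    The exhaustive volume recomputation keeps the sketch short; the mixed-simplex volume also shows why an exact geometric representation becomes impractical beyond a handful of dimensions.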

    Active Mean Fields for Probabilistic Image Segmentation: Connections with Chan-Vese and Rudin-Osher-Fatemi Models

    Segmentation is a fundamental task for extracting semantically meaningful regions from an image. The goal of segmentation algorithms is to accurately assign object labels to each image location. However, image noise, shortcomings of algorithms, and image ambiguities cause uncertainty in label assignment. Estimating this uncertainty is important in multiple application domains, such as segmenting tumors from medical images for radiation treatment planning. One way to estimate these uncertainties is through the computation of posteriors of Bayesian models, which is computationally prohibitive for many practical applications. On the other hand, most computationally efficient methods fail to estimate label uncertainty. In this paper we therefore propose the Active Mean Fields (AMF) approach, a technique based on Bayesian modeling that uses a mean-field approximation to efficiently compute a segmentation and its corresponding uncertainty. Based on a variational formulation, the resulting convex model combines any label-likelihood measure with a prior on the length of the segmentation boundary. A specific implementation of that model is the Chan-Vese segmentation model (CV), in which the binary segmentation task is defined by a Gaussian likelihood and a prior regularizing the length of the segmentation boundary. Furthermore, the Euler-Lagrange equations derived from the AMF model are equivalent to those of the popular Rudin-Osher-Fatemi (ROF) model for image denoising. Solutions to the AMF model can thus be obtained by directly applying highly efficient ROF solvers to log-likelihood ratio fields. We qualitatively assess the approach on synthetic data as well as on real natural and medical images; for a quantitative evaluation, we apply our approach to the icgbench dataset.
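
    The AMF-ROF connection stated above suggests a very short recipe: build the log-likelihood ratio field from the two class models, smooth it with an ROF-type total-variation solver, and map the result through a logistic function to obtain soft labels whose distance from 0.5 encodes uncertainty. The sketch below uses skimage's TV solver as a stand-in ROF solver; the class means, variance, and TV weight are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np
    from skimage import data, img_as_float
    from skimage.restoration import denoise_tv_chambolle

    # Sketch of the ROF-on-log-likelihood-ratio idea, not the authors' AMF
    # solver. Class means, variance, and the TV weight are assumptions.

    img = img_as_float(data.camera())

    mu_fg, mu_bg, sigma = 0.2, 0.7, 0.15   # assumed Gaussian class models
    # Log-likelihood ratio log p(I | fg) - log p(I | bg) for equal variances:
    llr = ((img - mu_bg) ** 2 - (img - mu_fg) ** 2) / (2 * sigma ** 2)

    # TV smoothing plays the role of the boundary-length prior; the weight
    # trades data fidelity against boundary regularity.
    smoothed = denoise_tv_chambolle(llr, weight=1.0)

    # A logistic map yields a probabilistic labeling; values near 0.5 are
    # uncertain, which is exactly the quantity AMF is designed to expose.
    prob_fg = 1.0 / (1.0 + np.exp(-smoothed))
    segmentation = prob_fg > 0.5
    uncertainty = 1.0 - 2.0 * np.abs(prob_fg - 0.5)
    ```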

    Bayesian Modelling Approaches for Quantum States -- The Ultimate Gaussian Process States Handbook

    Accurately capturing the correlations emerging between the constituents of many-body systems is one of the key challenges for the appropriate description of various systems whose properties are underpinned by quantum mechanical fundamentals. This thesis discusses novel tools and techniques for the (classical) modelling of quantum many-body wavefunctions, with the ultimate goal of introducing a universal framework for finding accurate representations from which system properties can be extracted efficiently. It is outlined how synergies with standard machine learning approaches can be exploited to enable an automated inference of the most relevant intrinsic characteristics through rigorous Bayesian regression techniques. Based on the probabilistic framework forming the foundation of the introduced ansatz, coined the Gaussian Process State, different compression techniques are explored to extract numerically feasible representations of relevant target states within stochastic schemes. By following intuitively motivated design principles, the resulting model carries a high degree of interpretability and offers an easily applicable tool for the numerical study of quantum systems, including ones that are notoriously difficult to simulate due to strong intrinsic correlation. The practical applicability of the Gaussian Process States framework is demonstrated in several benchmark applications, in particular ground state approximations for prototypical quantum lattice models, Fermi-Hubbard models and J_1-J_2 models, as well as simple ab-initio quantum chemical systems.
    Comment: PhD Thesis, King's College London, 202 pages
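
    In the spirit of the ansatz described above, a toy reading of a Gaussian Process State is a log-amplitude written as a kernel expansion over a small set of support configurations, log psi(s) = sum_j w_j k(s, s_j). The sketch below is a heavily simplified illustration with an assumed kernel and random weights (in the framework itself, the weights and support set are inferred by the Bayesian regression and compression schemes the thesis develops), not the thesis' implementation.

    ```python
    import numpy as np

    # Heavily simplified toy reading of a Gaussian Process State, not the
    # thesis' implementation: log psi(s) = sum_j w_j k(s, s_j) over a small
    # support set of spin configurations. Kernel and weights are assumptions.

    def kernel(s, support, theta=0.5):
        """RBF-flavoured overlap kernel on +/-1 spin configurations."""
        return np.exp(theta * np.sum(s * support, axis=-1))

    rng = np.random.default_rng(0)
    n_sites, n_support = 8, 4
    support = rng.choice([-1, 1], size=(n_support, n_sites))
    weights = rng.normal(scale=0.1, size=n_support)

    def log_psi(s):
        """Log wavefunction amplitude of one spin configuration s."""
        return float(np.sum(weights * kernel(s, support)))

    # In a variational Monte Carlo loop these amplitudes would drive the
    # sampling of observables; here we just evaluate one configuration.
    s = rng.choice([-1, 1], size=n_sites)
    print(log_psi(s))
    ```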