Multilevel Double Loop Monte Carlo and Stochastic Collocation Methods with Importance Sampling for Bayesian Optimal Experimental Design
An optimal experimental set-up maximizes the value of data for statistical
inferences and predictions. The efficiency of strategies for finding optimal
experimental set-ups is particularly important for experiments that are
time-consuming or expensive to perform. For instance, in the situation when the
experiments are modeled by Partial Differential Equations (PDEs), multilevel
methods have been proven to dramatically reduce the computational complexity of
their single-level counterparts when estimating expected values. For a setting
where PDEs can model experiments, we propose two multilevel methods for
estimating a popular design criterion known as the expected information gain in
simulation-based Bayesian optimal experimental design. The expected information
gain criterion is of a nested expectation form, and only a handful of
multilevel methods have been proposed for problems of such form. We propose a
Multilevel Double Loop Monte Carlo (MLDLMC), which is a multilevel strategy
with Double Loop Monte Carlo (DLMC), and a Multilevel Double Loop Stochastic
Collocation (MLDLSC), which performs a high-dimensional integration by
deterministic quadrature on sparse grids. For both methods, the Laplace
approximation is used for importance sampling that significantly reduces the
computational work of estimating inner expectations. The optimal values of the
method parameters are determined by minimizing the average computational work,
subject to satisfying the desired error tolerance. The computational
efficiencies of the methods are demonstrated by estimating the expected
information gain for Bayesian inference of the fiber orientation in composite
laminate materials from an electrical impedance tomography experiment. MLDLSC
performs better than MLDLMC when the regularity of the quantity of interest,
with respect to the additive noise and the unknown parameters, can be
exploited.
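As a toy illustration of the nested-expectation structure described above (a plain single-level double loop estimator, not the paper's MLDLMC with Laplace-based importance sampling), the sketch below estimates the expected information gain for a hypothetical linear-Gaussian model y = theta + eps, where the exact value is known in closed form; all names and sample sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_logpdf(y, mean, sigma):
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((y - mean) / sigma) ** 2

def dlmc_eig(n_outer=2000, n_inner=2000, sigma=1.0):
    """Double loop Monte Carlo estimate of the expected information gain
    for y = theta + eps, theta ~ N(0,1), eps ~ N(0, sigma^2)."""
    # Outer loop: draw (theta_i, y_i) from the joint prior-predictive
    theta = rng.standard_normal(n_outer)
    y = theta + sigma * rng.standard_normal(n_outer)
    log_like = gaussian_logpdf(y, theta, sigma)       # log p(y_i | theta_i)
    # Inner loop: estimate the evidence p(y_i) by prior sampling
    theta_inner = rng.standard_normal(n_inner)
    ll = gaussian_logpdf(y[:, None], theta_inner[None, :], sigma)
    m = ll.max(axis=1, keepdims=True)                 # log-sum-exp for stability
    log_evidence = m[:, 0] + np.log(np.exp(ll - m).mean(axis=1))
    # EIG ~ (1/N) sum_i [ log p(y_i|theta_i) - log p(y_i) ]
    return np.mean(log_like - log_evidence)

eig = dlmc_eig()
exact = 0.5 * np.log(1 + 1 / 1.0**2)  # analytic EIG for this linear-Gaussian model
```

The inner evidence estimate is where the paper's Laplace-based importance sampling would enter: replacing the prior draws `theta_inner` with draws concentrated near the posterior mode sharply reduces the inner sample size needed for a given tolerance.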
A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning
We present a tutorial on Bayesian optimization, a method of finding the
maximum of expensive cost functions. Bayesian optimization employs the Bayesian
technique of setting a prior over the objective function and combining it with
evidence to get a posterior function. This permits a utility-based selection of
the next observation to make on the objective function, which must take into
account both exploration (sampling from areas of high uncertainty) and
exploitation (sampling areas likely to offer improvement over the current best
observation). We also present two detailed extensions of Bayesian optimization,
with experiments---active user modelling with preferences, and hierarchical
reinforcement learning---and a discussion of the pros and cons of Bayesian
optimization based on our experiences.
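The prior-plus-evidence loop that the tutorial describes can be sketched in a few dozen lines: a zero-mean Gaussian process posterior over a 1D objective, with expected improvement trading off the exploitation term (mu - best) against the exploration term (sd). The objective, kernel length scale, and grid below are all illustrative choices, not anything from the tutorial itself:

```python
import math
import numpy as np

def f(x):
    # hypothetical expensive objective; its maximum is at x = 0.65
    return -(x - 0.65) ** 2

def rbf(a, b, ls=0.2):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gp_posterior(X, y, Xs, jitter=1e-8):
    """Posterior mean/std of a zero-mean GP with an RBF kernel."""
    K = rbf(X, X) + jitter * np.eye(len(X))
    Ks = rbf(Xs, X)
    mu = Ks @ np.linalg.solve(K, y)
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.sum(Ks * v.T, axis=1)  # rbf(x, x) = 1
    return mu, np.sqrt(np.maximum(var, 1e-12))

_erf = np.vectorize(math.erf)

def expected_improvement(mu, sd, best):
    z = (mu - best) / sd
    Phi = 0.5 * (1.0 + _erf(z / math.sqrt(2.0)))
    phi = np.exp(-0.5 * z**2) / math.sqrt(2.0 * math.pi)
    return (mu - best) * Phi + sd * phi  # exploitation + exploration

grid = np.linspace(0.0, 1.0, 201)
X = np.array([0.1, 0.5, 0.9])  # initial design
y = f(X)
for _ in range(12):
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sd, y.max()))]
    if np.any(np.isclose(X, x_next)):  # acquisition has converged
        break
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))
x_best = X[np.argmax(y)]
```

Maximizing the acquisition over a fixed grid is only practical in low dimension; real implementations optimize it with a gradient-based or multi-start method.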
Statistical extraction of process zones and representative subspaces in fracture of random composite
We propose to identify process zones in heterogeneous materials by tailored
statistical tools. The process zone is redefined as the part of the structure
where the random process cannot be correctly approximated in a low-dimensional
deterministic space. Such a low-dimensional space is obtained by a spectral
analysis performed on pre-computed solution samples. A greedy algorithm is
proposed to identify both process zone and low-dimensional representative
subspace for the solution in the complementary region. In addition to the
novelty of the tools proposed in this paper for the analysis of localised
phenomena, we show that the reduced space generated by the method is a valid
basis for the construction of a reduced order model.
Comment: Submitted for publication in International Journal for Multiscale Computational Engineering
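A much-simplified sketch of the underlying idea (spectral analysis of precomputed solution samples, then flagging the region where a low-dimensional approximation fails) can be written with a plain truncated SVD and a pointwise residual threshold; the paper's greedy algorithm is replaced here by that threshold, and the synthetic snapshot data is entirely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 200, 50                      # spatial points, solution samples
x = np.linspace(0.0, 1.0, n)

# Synthetic snapshots: a rank-2 smooth field everywhere, plus an
# uncorrelated random component localized in x in (0.4, 0.5),
# standing in for a "process zone".
smooth = (rng.standard_normal((m, 1)) * np.sin(np.pi * x)
          + rng.standard_normal((m, 1)) * np.cos(np.pi * x))
zone = (x > 0.4) & (x < 0.5)
local = np.zeros((m, n))
local[:, zone] = rng.standard_normal((m, zone.sum()))
S = (smooth + local).T              # snapshot matrix, n_dof x n_samples

# Spectral analysis: truncated SVD gives the low-dimensional subspace
U, s, Vt = np.linalg.svd(S, full_matrices=False)
r = 2
approx = U[:, :r] @ np.diag(s[:r]) @ Vt[:r]

# Pointwise RMS residual of the rank-r approximation; points where the
# random process is poorly approximated are flagged as the process zone
resid = np.sqrt(np.mean((S - approx) ** 2, axis=1))
detected = resid > 0.5 * resid.max()
```

The flagged points concentrate where the snapshots genuinely cannot be compressed, while the complementary region is well represented by the r retained modes, which is what licenses using them as a reduced-order basis there.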