In calculating expected information gain in optimal Bayesian experimental
design, the inner loop of the classical double-loop Monte Carlo estimator
requires a large number of samples and suffers from numerical underflow when
too few samples are used. These drawbacks can be avoided by using an
importance sampling approach. We present a computationally efficient method for
optimal Bayesian experimental design that introduces importance sampling based
on the Laplace method to the inner loop. We derive the optimal values of the
method parameters that minimize the average computational cost for a desired
error tolerance. We use three numerical examples to
demonstrate the computational efficiency of our method compared with the
classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo
method that uses the Laplace method as an approximation of the return value of
the inner loop. The first example is a scalar problem that is linear in the
uncertain parameter. The second example is a nonlinear scalar problem. The
third example deals with the optimal sensor placement for an electrical
impedance tomography experiment to recover the fiber orientation in laminate
composites.

Comment: 42 pages, 35 figures
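To make the inner-loop issue concrete, the following is a minimal sketch of the classical double-loop Monte Carlo estimator of expected information gain, applied to a hypothetical linear-Gaussian toy model chosen for illustration (the model, parameter values, and function names are assumptions, not taken from the paper). The log-sum-exp stabilization in the inner average illustrates the underflow risk the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-Gaussian toy model (illustrative assumption):
# y = g * theta + eps, with theta ~ N(0, 1) and eps ~ N(0, sigma^2).
g, sigma = 2.0, 0.5

def log_like(y, theta):
    """Gaussian log-likelihood log p(y | theta) for the toy model."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((y - g * theta) / sigma) ** 2

def dlmc_eig(N=2000, M=2000):
    """Classical double-loop Monte Carlo estimate of expected information gain:
    EIG = E_{theta, y}[ log p(y|theta) - log p(y) ],
    with log p(y) approximated by an inner Monte Carlo average over the prior."""
    theta = rng.standard_normal(N)                   # outer samples from the prior
    y = g * theta + sigma * rng.standard_normal(N)   # one synthetic observation each
    theta_inner = rng.standard_normal((N, M))        # fresh prior samples for the inner loop
    ll_inner = log_like(y[:, None], theta_inner)
    # Log-sum-exp stabilization: averaging exp(ll_inner) directly underflows
    # when the likelihood is concentrated, which is the failure mode noted above.
    c = ll_inner.max(axis=1, keepdims=True)
    log_evidence = c[:, 0] + np.log(np.exp(ll_inner - c).mean(axis=1))
    return np.mean(log_like(y, theta) - log_evidence)

# For this linear-Gaussian model the EIG has the closed form 0.5 * log(1 + g^2 / sigma^2),
# which can be used to check the estimator.
exact = 0.5 * np.log(1 + g**2 / sigma**2)
print(dlmc_eig(), exact)
```

The inner average over `M` prior samples is what makes the estimator expensive: its bias decays only as O(1/M), so tight error tolerances force both loops to be large, which motivates replacing the inner loop with a Laplace-based importance sampler.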