This paper concerns Bayesian inference for parametric statistical models defined
by a stochastic simulator that specifies how data are generated. Exact sampling
from such models is possible, but evaluating the likelihood function is
typically prohibitively expensive. Approximate Bayesian Computation (ABC) is a
framework for performing approximate inference in such situations. While basic
ABC algorithms are widely applicable, they are notoriously slow, and much
research has focused on increasing their efficiency. Optimisation Monte Carlo (OMC) has
recently been proposed as an efficient and embarrassingly parallel method that
leverages optimisation to accelerate the inference. In this paper, we
demonstrate an important, previously unrecognised failure mode of OMC: it
generates strongly overconfident approximations by collapsing regions of
similar or near-constant likelihood into a single point. We propose an
efficient, robust generalisation of OMC that corrects this. It makes fewer
assumptions, retains the main benefits of OMC, and can be performed either as
post-processing to OMC or as a stand-alone computation. We demonstrate the
effectiveness of the proposed Robust OMC on toy examples and tasks in
inverse-graphics where we perform Bayesian inference with a complex image
renderer.

Comment: 8 pages + 6-page appendix; v2: made clarifications, added a second
possible algorithm implementation and its results; v3: small clarifications,
to be published in AISTATS 202
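For readers unfamiliar with the ABC framework the abstract builds on, the basic rejection-ABC scheme can be sketched as follows. This is an illustrative toy example, not the paper's method: the Gaussian simulator, the uniform prior, the mean summary statistic, and the tolerance `eps` are all assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy simulator: data are 10 draws from N(theta, 1).
# Sampling is cheap, but imagine the likelihood were intractable.
def simulate(theta, n=10):
    return rng.normal(theta, 1.0, size=n)

observed = simulate(1.5)  # stand-in for the observed dataset

# Rejection ABC: draw theta from the prior, run the simulator, and
# accept draws whose summary statistic lies within eps of the
# observed one. The accepted thetas approximate the posterior.
def rejection_abc(observed, n_samples=200, eps=0.2):
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.uniform(-5.0, 5.0)             # prior draw
        x = simulate(theta)                        # forward simulation
        if abs(x.mean() - observed.mean()) < eps:  # summary distance
            accepted.append(theta)
    return np.array(accepted)

posterior_samples = rejection_abc(observed)
print(posterior_samples.mean())  # concentrates near the true theta = 1.5
```

The slowness the abstract mentions is visible here: most prior draws are rejected, so many simulator calls are wasted, which motivates optimisation-based accelerations such as OMC.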