On Nesting Monte Carlo Estimators
Many problems in machine learning and statistics involve nested expectations
and thus do not permit conventional Monte Carlo (MC) estimation. For such
problems, one must nest estimators, such that terms in an outer estimator
themselves involve the calculation of a separate, nested estimator. We
investigate the statistical implications of nesting MC estimators, including
cases of multiple levels of nesting, and establish the conditions under which
they converge. We derive corresponding rates of convergence and provide
empirical evidence that these rates are observed in practice. We further
establish a number of pitfalls that can arise from naive nesting of MC
estimators, provide guidelines about how these can be avoided, and lay out
novel methods for reformulating certain classes of nested expectation problems
into single expectations, leading to improved convergence rates. We demonstrate
the applicability of our work by using our results to develop a new estimator
for discrete Bayesian experimental design problems and derive error bounds for
a class of variational objectives.
Comment: To appear at International Conference on Machine Learning 201
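The nesting described above can be illustrated with a minimal sketch, assuming a toy problem of the form E_y[ f( y, E_z[ g(y, z) ] ) ]: an inner MC average is computed for each outer sample before the outer function is applied. All names here (`nested_mc`, `sample_y`, `sample_z_given_y`) are hypothetical and not from the paper.

```python
import random

def nested_mc(f, g, sample_y, sample_z_given_y, n_outer, n_inner):
    """Naive nested Monte Carlo estimator of E_y[ f(y, E_z[g(y, z)]) ].

    For each outer sample y, an inner MC estimate of E_z[g(y, z)] is
    formed from n_inner samples, then passed through f. For nonlinear f,
    this estimator is biased, with bias shrinking as n_inner grows.
    """
    total = 0.0
    for _ in range(n_outer):
        y = sample_y()
        # Inner, nested estimate for this particular outer sample.
        inner = sum(g(y, sample_z_given_y(y)) for _ in range(n_inner)) / n_inner
        total += f(y, inner)
    return total / n_outer

# Toy usage (hypothetical example): y, z ~ Uniform(0, 1),
# estimating E_y[(E_z[y + z])^2] = E_y[(y + 0.5)^2] = 13/12.
random.seed(0)
est = nested_mc(
    f=lambda y, m: m ** 2,
    g=lambda y, z: y + z,
    sample_y=random.random,
    sample_z_given_y=lambda y: random.random(),
    n_outer=2000,
    n_inner=100,
)
```

Because f is nonlinear here, the estimate carries an O(1/n_inner) bias on top of the usual MC variance, which is one of the pitfalls the abstract refers to.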