The efficient importance sampling (EIS) method is a general principle for the
numerical evaluation of high-dimensional integrals that uses the sequential
structure of target integrands to build variance-minimising importance
samplers. Despite a number of successful applications in high dimensions, it is
well known that importance sampling strategies are subject to an exponential
growth in variance as the dimension of the integral increases. We solve this
problem by recognising that the EIS framework has an offline sequential Monte
Carlo interpretation. The particle EIS method is based on non-standard
resampling weights that take into account the look-ahead construction of the
importance sampler. We apply the method to a range of univariate and bivariate
stochastic volatility specifications. We also develop a new application of the
EIS approach to state space models with Student's t state innovations. Our
results show that the particle EIS method strongly outperforms both the
standard EIS method and particle filters for likelihood evaluation in high
dimensions. Moreover, the ratio between the variances of the particle EIS and
particle filter methods remains stable as the time series dimension increases.
We illustrate the efficiency of the method for Bayesian inference using the
particle marginal Metropolis-Hastings and importance sampling squared
algorithms.