We study a nonparametric approach to Bayesian computation via feature means,
where the expectation of prior features is updated to yield expected posterior
features, based on regression from kernel or neural net features of the
observations. All quantities involved in the Bayesian update are learned from
observed data, making the method entirely model-free. The resulting algorithm
is a novel instance of a kernel Bayes' rule (KBR). Our approach is based on
importance weighting, and consequently enjoys greater numerical stability than
the existing KBR approach, which requires operator inversion.
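To make the contrast concrete, one plausible form of the importance-weighted update, sketched here under the assumption that it follows standard importance-weighted kernel ridge regression (our reconstruction, not a verbatim statement of the algorithm): given joint samples $(x_i, y_i)_{i=1}^n$ with $x_i$ drawn from a sampling distribution $q$, a target prior $\pi$, and importance weights $w_i \approx (d\pi/dq)(x_i)$, the posterior feature mean may be estimated as
\[
\hat{\mu}_{X \mid y} \;=\; \sum_{i=1}^{n} \alpha_i(y)\,\phi(x_i),
\qquad
\alpha(y) \;=\; W \bigl(K_{yy} W + n\lambda I\bigr)^{-1} k_y(y),
\]
where $\phi$ is the feature map on states, $W = \operatorname{diag}(w_1, \dots, w_n)$, $K_{yy}$ is the kernel matrix on observations, $k_y(y)_i = k(y_i, y)$, and $\lambda > 0$ is a regularization parameter; the update is a single regularized linear solve rather than an operator inversion.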
We prove convergence of the estimator via a novel consistency analysis of
the importance-weighting estimator in the infinity norm. We evaluate our KBR on
challenging synthetic benchmarks, including a filtering problem with a
state-space model with high-dimensional image observations. The proposed
method yields uniformly better empirical performance than the existing KBR,
and is competitive with alternative methods.
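As an illustration, below is a minimal runnable sketch of the importance-weighted update on a toy conjugate-Gaussian posterior. Everything here is an assumption for illustration: the helper names (gaussian_kernel, iw_kbr_alpha), the closed-form density-ratio weights, and the bandwidth and regularization settings are ours, not the paper's.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    """Pairwise Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def iw_kbr_alpha(Y, y_obs, w, sigma_y, lam):
    """Importance-weighted posterior weights: E[f(X) | y_obs] ~ sum_i alpha_i f(x_i)."""
    n = len(Y)
    Kyy = gaussian_kernel(Y, Y, sigma_y)
    ky = gaussian_kernel(Y, y_obs.reshape(1, -1), sigma_y)[:, 0]
    W = np.diag(w)
    # A single regularized linear solve; no operator inversion is required.
    return W @ np.linalg.solve(Kyy @ W + n * lam * np.eye(n), ky)

rng = np.random.default_rng(0)
n = 2000
# Joint samples: x_i from a broad sampling distribution q = N(0, 2^2),
# observations y_i | x_i ~ N(x_i, 0.5^2); the target prior is pi = N(0, 1).
x = rng.normal(0.0, 2.0, size=(n, 1))
y = x + rng.normal(0.0, 0.5, size=(n, 1))
# Importance weights w_i = (d pi / d q)(x_i), available in closed form here.
log_w = -0.5 * x[:, 0]**2 + 0.5 * (x[:, 0] / 2.0)**2 + np.log(2.0)
w = np.exp(log_w)

y_obs = np.array([1.0])
alpha = iw_kbr_alpha(y, y_obs, w, sigma_y=1.0, lam=1e-3)
post_mean = alpha @ x[:, 0]        # estimated posterior mean E[X | y_obs]
exact = y_obs[0] / 1.25            # analytic posterior mean for this conjugate model
print(f"IW-KBR posterior mean: {post_mean:.3f}  (exact: {exact:.3f})")
```

With enough samples and a reasonable bandwidth, the estimated mean should approach the conjugate-Gaussian answer $0.8\,y$; the same weights $\alpha$ give posterior expectations of any test function of the state.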