We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design,
PENs are invariant to block-switch transformations, which characterize the
partial exchangeability properties of conditionally Markovian processes.
Moreover, we show that any block-switch invariant function has a PEN-like
representation. The DeepSets architecture is a special case of PEN, so we can also target fully exchangeable data. We employ PENs to learn summary
statistics in approximate Bayesian computation (ABC). When comparing PENs to
previous deep learning methods for learning summary statistics, we obtain highly competitive results for both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
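
For intuition, the PEN-like representation mentioned above can be sketched as follows, using notation that is assumed here rather than stated in the abstract ($d$ the Markov order of the data $x_{1:M}$, $\phi$ an inner network, $\rho$ an outer network):

$$ f(x_{1:M}) \;=\; \rho\Big( x_{1:d},\; \sum_{i=1}^{M-d} \phi(x_{i:i+d}) \Big), $$

where taking $d = 0$ drops the first argument and recovers the DeepSets form $\rho\big(\sum_{i=1}^{M} \phi(x_i)\big)$, consistent with DeepSets being a special case of PEN.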

Comment: Forthcoming in the Proceedings of ICML 2019. New comparisons with several different networks. We now use the Wasserstein distance to produce comparisons. Code available on GitHub. 16 pages, 5 figures, 21 tables.