We present Surjective Sequential Neural Likelihood (SSNL) estimation, a novel
method for simulation-based inference in models where the likelihood function
is intractable and only a simulator that can generate synthetic data is
available. SSNL fits a dimensionality-reducing surjective normalizing flow
model and uses it as a surrogate likelihood function, which allows for
conventional Bayesian inference using either Markov chain Monte Carlo methods
or variational inference. By embedding the data in a low-dimensional space,
SSNL solves several issues that previous likelihood-based methods face when
applied to high-dimensional data sets that, for instance, contain
non-informative data dimensions or lie along a lower-dimensional manifold. We
evaluate SSNL on a wide variety of experiments and show that it generally
outperforms contemporary methods used in simulation-based inference, for
instance, on a challenging real-world example from astrophysics that models
the magnetic field strength of the Sun using a solar dynamo model.
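To make the sequential surrogate-likelihood idea concrete, the sketch below
runs it on a toy problem. It is a minimal stand-in under stated assumptions,
not the SSNL implementation: the learned dimensionality-reducing surjective
flow is replaced by a fixed PCA projection followed by a conditional Gaussian
fit by least squares, and the simulator, prior, and helper names
(`simulator`, `fit_surrogate`, `metropolis`) are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, rng):
    # Hypothetical toy simulator: two informative dimensions plus
    # eight dimensions of pure, uninformative noise.
    signal = np.array([theta[0] + theta[1], theta[0] - 2.0 * theta[1]])
    return np.concatenate([signal + 0.1 * rng.normal(size=2),
                           rng.normal(size=8)])

def log_prior(theta):
    # Uniform prior on [-3, 3]^2.
    return 0.0 if np.all(np.abs(theta) <= 3.0) else -np.inf

def fit_surrogate(thetas, ys, latent_dim=2):
    # Stand-in for the surjective flow: embed the data with PCA, then
    # fit a conditional Gaussian q(z | theta) by linear least squares.
    mu_y = ys.mean(axis=0)
    _, _, vt = np.linalg.svd(ys - mu_y, full_matrices=False)
    proj = vt[:latent_dim]                       # (latent_dim, data_dim)
    zs = (ys - mu_y) @ proj.T
    X = np.column_stack([thetas, np.ones(len(thetas))])
    W, *_ = np.linalg.lstsq(X, zs, rcond=None)   # linear mean model
    var = (zs - X @ W).var(axis=0) + 1e-6        # diagonal residual variance

    def log_q(y, theta):
        z = (y - mu_y) @ proj.T
        mean = np.append(theta, 1.0) @ W
        return -0.5 * np.sum((z - mean) ** 2 / var + np.log(2 * np.pi * var))

    return log_q

def metropolis(log_post, init, n_steps=2000, step=0.2):
    # Random-walk Metropolis targeting the surrogate posterior.
    theta, lp = init.copy(), log_post(init)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal(size=theta.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples)

# Sequential loop: simulate, refit the surrogate on all data gathered so
# far, sample the approximate posterior, and reuse those samples as the
# next round's proposal.
theta_true = np.array([1.0, -0.5])
y_obs = simulator(theta_true, rng)
thetas = np.empty((0, 2))
ys = np.empty((0, 10))
proposal = rng.uniform(-3.0, 3.0, size=(500, 2))  # round 1: the prior
for _ in range(3):
    thetas = np.vstack([thetas, proposal])
    ys = np.vstack([ys, np.array([simulator(t, rng) for t in proposal])])
    log_q = fit_surrogate(thetas, ys)
    log_post = lambda t, lq=log_q: log_prior(t) + lq(y_obs, t)
    samples = metropolis(log_post, np.zeros(2))
    proposal = samples[rng.integers(len(samples), size=500)]
print("posterior mean:", samples[len(samples) // 2:].mean(axis=0))
```

Aggregating simulations across rounds before refitting mirrors standard
sequential neural likelihood practice; the fixed PCA embedding here is only a
crude analogue of the surjection that SSNL learns jointly with the
conditional density.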