The Bayesian perspective on inverse problems has attracted much mathematical
attention in recent years. Particular interest has centred on Bayesian
inverse problems (BIPs) in which the parameter to be inferred lies in an
infinite-dimensional space, a typical example being a scalar or tensor field
coupled to some observed data via an ODE or PDE. This article gives an
introduction to the framework of well-posed BIPs in infinite-dimensional
parameter spaces, as advocated by Stuart (Acta Numer. 19:451--559, 2010) and
others. This framework has the advantage of ensuring uniformly well-posed
inference problems independently of the finite-dimensional discretisation used
for numerical solution. Recently, this framework has been extended to the case
of a heavy-tailed prior measure in the family of stable distributions, such as
an infinite-dimensional Cauchy distribution, for which polynomial moments are
infinite or undefined. It is shown that analogues of the Karhunen--Lo\`eve
expansion for square-integrable random variables can be used to sample such
measures on quasi-Banach spaces. Furthermore, under weaker regularity
assumptions than those used to date, the Bayesian posterior measure is shown to
depend Lipschitz continuously in the Hellinger and total variation metrics upon
perturbations of the misfit function and observed data.
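For orientation, the framework referred to above characterises the posterior measure $\mu^{y}$ through its density with respect to the prior $\mu_{0}$: with $\Phi$ denoting the misfit (negative log-likelihood) and $y$ the observed data,
\[
  \frac{\mathrm{d}\mu^{y}}{\mathrm{d}\mu_{0}}(u)
  = \frac{\exp\bigl(-\Phi(u; y)\bigr)}{Z(y)},
  \qquad
  Z(y) = \int \exp\bigl(-\Phi(u; y)\bigr) \, \mu_{0}(\mathrm{d}u),
\]
with the integral taken over the (possibly infinite-dimensional) parameter space. This is the standard formulation of the cited framework, stated here only for context.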
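As an illustration of the series-expansion sampling mentioned above, the following is a minimal sketch in Python/NumPy. The specific choices (a sine basis on $[0,1]$, algebraically decaying weights, and i.i.d. standard Cauchy coefficients) are illustrative assumptions rather than the construction of the paper itself.

    import numpy as np

    def sample_cauchy_series_field(n_modes=256, n_grid=512, decay=2.0, rng=None):
        # Draw one sample of a random field on [0, 1] via a truncated
        # Karhunen-Loeve-type series  u(x) = sum_k gamma_k * xi_k * psi_k(x)
        # with i.i.d. standard Cauchy coefficients xi_k, algebraically
        # decaying weights gamma_k = k**(-decay), and a sine basis psi_k.
        rng = np.random.default_rng() if rng is None else rng
        x = np.linspace(0.0, 1.0, n_grid)
        k = np.arange(1, n_modes + 1)
        gamma = k.astype(float) ** (-decay)                   # decaying weights
        xi = rng.standard_cauchy(n_modes)                     # heavy-tailed coefficients
        psi = np.sqrt(2.0) * np.sin(np.outer(k, np.pi * x))   # basis functions on the grid
        return x, (gamma * xi) @ psi

    x, u = sample_cauchy_series_field()

Since a Cauchy random variable has no finite mean or variance, the usual mean-square convergence argument for Karhunen--Lo\`eve expansions is unavailable; convergence of such series must be established by other means, which is the role of the analogues referred to in the abstract.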
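The stability statement in the final sentence can be read, schematically, as Lipschitz bounds of the form
\[
  d_{\mathrm{Hell}}\bigl(\mu^{y}, \mu^{y'}\bigr) \le C \, \| y - y' \|,
  \qquad
  d_{\mathrm{TV}}\bigl(\mu^{y}, \mu^{y'}\bigr) \le C \, \| y - y' \|,
\]
for data $y, y'$ in bounded sets, together with analogous bounds for perturbations of the misfit $\Phi$. The norms, constants and hypotheses shown here are schematic; the precise statements are those of the paper.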
Comment: To appear in the proceedings of the 88th Annual Meeting of the
International Association of Applied Mathematics and Mechanics (GAMM), Weimar
2017. This preprint differs from the final published version in pagination
and typographical detail.