11 research outputs found
A variational Bayesian method for inverse problems with impulsive noise
We propose a novel numerical method for solving inverse problems subject to
impulsive noise, which may contain a large number of outliers. The
approach is of Bayesian type, and it exploits a heavy-tailed t distribution for
data noise to achieve robustness with respect to outliers. A hierarchical model
with all hyper-parameters automatically determined from the given data is
described. A variational algorithm is developed by minimizing the
Kullback-Leibler divergence between the true posterior distribution and a
separable approximation. The numerical method is illustrated on several one-
and two-dimensional linear and nonlinear inverse problems arising from heat
conduction, including estimating boundary temperature, heat flux and heat
transfer coefficient. The results show its robustness to outliers and the fast
and steady convergence of the algorithm.
Comment: 20 pages, to appear in J. Comput. Phys.
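The robustness conferred by a heavy-tailed t likelihood can be seen in a much simpler setting than the paper's hierarchical variational model. The sketch below is our own illustration of the general mechanism (an EM-style reweighting for a Student-t location estimate); the degrees of freedom nu, scale sigma, and the toy data are arbitrary assumptions.

```python
import numpy as np

# A sketch of why a heavy-tailed t likelihood is robust to outliers
# (EM-style reweighting, not the paper's variational algorithm);
# nu, sigma and the toy data below are illustrative assumptions.

def t_robust_mean(y, nu=3.0, sigma=1.0, iters=50):
    """Estimate a location parameter under i.i.d. Student-t noise via EM."""
    mu = np.median(y)                                # robust initial guess
    for _ in range(iters):
        r = y - mu
        w = (nu + 1.0) / (nu + (r / sigma) ** 2)     # outliers get tiny weight
        mu = np.sum(w * y) / np.sum(w)               # weighted-mean update
    return mu

rng = np.random.default_rng(0)
y = 2.0 + 0.1 * rng.standard_normal(200)             # clean data around 2.0
y[:20] += 50.0                                       # 10% gross outliers

err_gauss = abs(np.mean(y) - 2.0)                    # Gaussian (L2) estimate
err_t = abs(t_robust_mean(y) - 2.0)                  # t-likelihood estimate
print(err_gauss, err_t)
```

The weight w shrinks toward zero for large residuals, so gross outliers barely influence the update; a Gaussian likelihood gives every point unit weight and is ruined by the same data.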
Convergence Rates for Inverse Problems with Impulsive Noise
We study inverse problems F(f) = g with perturbed right hand side g^{obs}
corrupted by so-called impulsive noise, i.e. noise which is concentrated on a
small subset of the domain of definition of g. It is well known that
Tikhonov-type regularization with an L^1 data fidelity term yields
significantly more accurate results than Tikhonov regularization with classical
L^2 data fidelity terms for this type of noise. The purpose of this paper is to
provide a convergence analysis explaining this remarkable difference in
accuracy. Our error estimates significantly improve previous error estimates
for Tikhonov regularization with L^1-fidelity term in the case of impulsive
noise. We present numerical results which are in good agreement with the
predictions of our analysis.
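The accuracy gap this analysis explains can be reproduced on a toy denoising example. The sketch below is our own illustration (not the paper's experiments): the grid size, regularization weight alpha, and spike amplitude are arbitrary, and the L^1-fidelity problem is solved by iteratively reweighted least squares.

```python
import numpy as np

# Toy illustration of the L1-vs-L2 data-fidelity gap for impulsive noise
# (our own sketch, not the paper's experiments); n, alpha and the spike
# amplitude are arbitrary. The L1 problem is solved by IRLS.

n = 200
t = np.linspace(0.0, 1.0, n)
f_true = np.sin(2 * np.pi * t)

rng = np.random.default_rng(1)
g = f_true.copy()
idx = rng.choice(n, size=20, replace=False)     # noise concentrated on a
g[idx] += rng.choice([-5.0, 5.0], size=20)      # small subset of the domain

D = np.diff(np.eye(n), axis=0)                  # first-difference operator
alpha = 2.0
A = alpha * D.T @ D                             # smoothness penalty

f_l2 = np.linalg.solve(np.eye(n) + A, g)        # classical L2 data fidelity

w = np.ones(n)                                  # IRLS for the L1 fidelity
for _ in range(50):
    f_l1 = np.linalg.solve(np.diag(w) + A, w * g)
    w = 1.0 / np.maximum(np.abs(f_l1 - g), 1e-6)

err_l2 = np.max(np.abs(f_l2 - f_true))
err_l1 = np.max(np.abs(f_l1 - f_true))
print(err_l2, err_l1)
```

Because the noise is zero on most of the domain, the L^1 fidelity pins the reconstruction to the exact data there and effectively ignores the spiked points, while the L^2 fit spreads each spike into its neighbourhood.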
Convergence Rates for Exponentially Ill-Posed Inverse Problems with Impulsive Noise
This paper is concerned with exponentially ill-posed operator equations with
additive impulsive noise on the right hand side, i.e. the noise is large on a
small part of the domain and small or zero outside. It is well known that
Tikhonov regularization with an L^1 data fidelity term outperforms Tikhonov
regularization with an L^2 fidelity term in this case. This effect has
recently been explained and quantified for the case of finitely smoothing
operators. Here we extend this analysis to the case of infinitely smoothing
forward operators under standard Sobolev smoothness assumptions on the
solution, i.e. exponentially ill-posed inverse problems. It turns out that high
order polynomial rates of convergence in the size of the support of large noise
can be achieved rather than the poor logarithmic convergence rates typical for
exponentially ill-posed problems. The main tools of our analysis are Banach
spaces of analytic functions and interpolation-type inequalities for such
spaces. We discuss two examples, the (periodic) backwards heat equation and an
inverse problem in gradiometry.
Comment: to appear in SIAM J. Numer. Anal.
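The exponential ill-posedness of the periodic backwards heat equation is easiest to see in Fourier space, where the forward map damps the k-th mode by exp(-(2*pi*k)^2 T). The sketch below is our own illustration (not the paper's analysis): the grid size, final time T, noise level, and regularization parameter alpha are arbitrary, and plain spectral Tikhonov filtering stands in for the regularization schemes studied in the paper.

```python
import numpy as np

# Spectral view of the periodic backwards heat equation (our illustration;
# n, T, the noise level and alpha are arbitrary, and this is plain Tikhonov
# filtering in Fourier space, not the paper's method).

n, T = 64, 0.01
k = np.fft.fftfreq(n, d=1.0 / n)                 # integer frequencies
damp = np.exp(-(2 * np.pi * k) ** 2 * T)         # heat semigroup, mode-wise

x = np.linspace(0.0, 1.0, n, endpoint=False)
f = np.sin(2 * np.pi * x) + 0.5 * np.sin(4 * np.pi * x)   # initial state

g = np.fft.ifft(damp * np.fft.fft(f)).real       # final state (the data)
rng = np.random.default_rng(2)
g_obs = g + 1e-3 * rng.standard_normal(n)        # small measurement noise

naive = np.fft.ifft(np.fft.fft(g_obs) / damp).real   # amplifies noise by e^(k^2)
alpha = 1e-4
filt = damp / (damp ** 2 + alpha)                # Tikhonov spectral filter
rec = np.fft.ifft(filt * np.fft.fft(g_obs)).real

err_naive = np.max(np.abs(naive - f))
err_tik = np.max(np.abs(rec - f))
print(err_naive, err_tik)
```

Even a 1e-3 noise level destroys the naive inversion, because the highest modes are amplified by a factor that grows like exp(k^2); the Tikhonov filter caps that amplification at the price of a small bias.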
Expectation Propagation for Poisson Data
The Poisson distribution arises naturally when dealing with data involving
counts, and it has found many applications in inverse problems and imaging. In
this work, we develop an approximate Bayesian inference technique based on
expectation propagation for approximating the posterior distribution formed
from the Poisson likelihood function and a Laplace type prior distribution,
e.g., the anisotropic total variation prior. The approach iteratively yields a
Gaussian approximation, and at each iteration, it updates the Gaussian
approximation to one factor of the posterior distribution by moment matching.
We derive explicit update formulas in terms of one-dimensional integrals, and
also discuss stable and efficient quadrature rules for evaluating these
integrals. The method is showcased on two-dimensional PET images.
Comment: 25 pages, to be published in Inverse Problems
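The one-dimensional core of such an update, matching the moments of a tilted distribution, can be sketched as follows. This is our own illustration under assumed conventions, not the paper's algorithm: we use a log-intensity parametrization y ~ Poisson(exp(x)), a Gaussian cavity N(m, v), and a simple grid quadrature in place of the paper's quadrature rules.

```python
import numpy as np
from math import lgamma

# One-dimensional moment matching for a Poisson factor (an illustrative
# sketch: log-intensity parametrization y ~ Poisson(exp(x)), a Gaussian
# cavity N(m, v), and a plain grid quadrature; all are our assumptions).

def tilted_moments(y, m, v, n_quad=400):
    """Mean/variance of the tilted density prop. to N(x; m, v) Poi(y | e^x)."""
    x = m + np.sqrt(v) * np.linspace(-8.0, 8.0, n_quad)    # cavity-centred grid
    log_p = (-(x - m) ** 2 / (2.0 * v)                     # Gaussian cavity
             + y * x - np.exp(x) - lgamma(y + 1))          # Poisson log-lik.
    p = np.exp(log_p - log_p.max())                        # stabilised density
    p /= p.sum()                                           # normalised weights
    mean = np.sum(x * p)
    var = np.sum((x - mean) ** 2 * p)
    return mean, var

mean, var = tilted_moments(y=5, m=0.0, v=4.0)
print(mean, var)
```

Subtracting the maximum of the log-density before exponentiating keeps the quadrature stable for large counts, the same concern the paper's dedicated quadrature rules address more carefully.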
Expectation Propagation for Nonlinear Inverse Problems -- with an Application to Electrical Impedance Tomography
In this paper, we study a fast approximate inference method based on
expectation propagation for exploring the posterior probability distribution
arising from the Bayesian formulation of nonlinear inverse problems. It is
capable of efficiently delivering reliable estimates of the posterior mean and
covariance, thereby providing an inverse solution together with quantified
uncertainties. Some theoretical properties of the iterative algorithm are
discussed, and the efficient implementation for an important class of problems
of projection type is described. The method is illustrated with one typical
nonlinear inverse problem, electrical impedance tomography with complete
electrode model, under sparsity constraints. Numerical results for real
experimental data are presented and compared with those obtained by Markov
chain Monte Carlo. The results indicate that the method is accurate and
computationally very efficient.
Comment: Journal of Computational Physics, to appear
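EP's cheap posterior mean and covariance invite comparison with other Gaussian approximations. As a much simpler baseline, and emphatically not the paper's EP algorithm, the sketch below computes a Laplace approximation via Gauss-Newton for a toy scalar nonlinear problem; the cubic forward map and all parameter values are assumptions made for the demo.

```python
# A Laplace-approximation baseline for a scalar nonlinear inverse problem
# y = F(x) + noise with F(x) = x**3 (a toy sketch for comparison with EP;
# the forward map, noise level and prior are arbitrary assumptions).

def F(x):  return x ** 3
def dF(x): return 3 * x ** 2

sigma2, prior_var = 0.01, 1.0        # noise variance, Gaussian prior N(0, 1)
x_true = 0.8
y = F(x_true)                        # noise-free datum for the demo

x = 0.5                              # initial guess
for _ in range(50):
    J = dF(x)
    # Gauss-Newton step on the negative log-posterior
    grad = (F(x) - y) * J / sigma2 + x / prior_var
    hess = J * J / sigma2 + 1.0 / prior_var
    x -= grad / hess

# Laplace covariance: inverse curvature at the MAP point
post_var = 1.0 / (dF(x) ** 2 / sigma2 + 1.0 / prior_var)
print(x, post_var)
```

Like EP, this yields a Gaussian (mean, variance) summary with quantified uncertainty; unlike EP, it is purely local, which is one motivation for the moment-based approximation studied in the paper.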
A variational Bayesian approach for inverse problems with skew-t error distributions
In this work, we develop a novel robust Bayesian approach to inverse problems with data errors following a skew-t distribution. A hierarchical Bayesian model is developed in the inverse problem setup. The Bayesian approach contains a natural mechanism for regularization in the form of a prior distribution, and a LASSO-type prior distribution is used to strongly induce sparseness. We propose a variational-type algorithm by minimizing the Kullback-Leibler divergence between the true posterior distribution and a separable approximation. The proposed method is illustrated on several two-dimensional linear and nonlinear inverse problems, e.g., the Cauchy problem and a permeability estimation problem.
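The sparsity mechanism of a LASSO-type (Laplace) prior is easiest to see at the MAP level, where it reduces to an l1 penalty whose proximal operator is soft-thresholding. A minimal sketch (our illustration, not the paper's variational algorithm; the input vector and threshold are arbitrary):

```python
import numpy as np

# Soft-thresholding: the proximal operator of lam*|x|, i.e. the MAP-level
# footprint of a LASSO-type prior (a sketch; z and lam are arbitrary).

def soft_threshold(z, lam):
    """Closed-form argmin_x 0.5*(x - z)**2 + lam*abs(x)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

z = np.array([-2.0, -0.3, 0.0, 0.4, 3.0])
out = soft_threshold(z, 0.5)
print(out)    # entries smaller than lam are set exactly to zero
```

This exact-zeroing behaviour is what "strongly induce sparseness" refers to: small coefficients are eliminated rather than merely shrunk, as an l2 penalty would do.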