Convergence Rates for Exponentially Ill-Posed Inverse Problems with Impulsive Noise
This paper is concerned with exponentially ill-posed operator equations with
additive impulsive noise on the right hand side, i.e. the noise is large on a
small part of the domain and small or zero outside. It is well known that
Tikhonov regularization with an L1 data fidelity term outperforms Tikhonov
regularization with an L2 fidelity term in this case. This effect has
recently been explained and quantified for the case of finitely smoothing
operators. Here we extend this analysis to the case of infinitely smoothing
forward operators under standard Sobolev smoothness assumptions on the
solution, i.e. exponentially ill-posed inverse problems. It turns out that high
order polynomial rates of convergence in the size of the support of large noise
can be achieved rather than the poor logarithmic convergence rates typical for
exponentially ill-posed problems. The main tools of our analysis are Banach
spaces of analytic functions and interpolation-type inequalities for such
spaces. We discuss two examples, the (periodic) backwards heat equation and an
inverse problem in gradiometry.
Comment: to appear in SIAM J. Numer. Anal.
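The effect the abstract quantifies can be seen in a toy computation. The sketch below is our own illustrative construction, not the paper's setup: the forward operator is a Gaussian blur, the regularization parameter and the IRLS solver for the L1-fidelity problem are ad hoc choices. It compares Tikhonov reconstructions with L2 and L1 data fidelity when the noise consists of a few large spikes.

```python
# Toy illustration (our own construction, not the paper's setup) of why an
# L1 data fidelity outperforms the classical L2 fidelity under impulsive
# noise. A is a smoothing convolution operator; the noise is large on a
# few data points and zero elsewhere.
import numpy as np

rng = np.random.default_rng(0)
n = 100
t = np.linspace(0.0, 1.0, n)
x_true = np.sin(2 * np.pi * t)

# Smoothing forward operator: row-normalized Gaussian kernel (ill-posed).
K = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * 0.05 ** 2))
A = K / K.sum(axis=1, keepdims=True)

# Impulsive noise: spikes of size 2 on 5 of the 100 data points.
y = A @ x_true
spikes = rng.choice(n, size=5, replace=False)
y[spikes] += 2.0 * rng.choice([-1.0, 1.0], size=5)

alpha = 1e-3  # ad hoc regularization parameter

# Tikhonov with L2 data fidelity: closed-form normal equations.
x_l2 = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

# Tikhonov with L1 data fidelity, min ||Ax - y||_1 + alpha ||x||_2^2,
# solved by iteratively reweighted least squares (IRLS).
x_l1 = x_l2.copy()
for _ in range(50):
    r = A @ x_l1 - y
    w = 1.0 / np.maximum(np.abs(r), 1e-4)  # downweight large (spike) residuals
    x_l1 = np.linalg.solve(A.T @ (w[:, None] * A) + alpha * np.eye(n),
                           A.T @ (w * y))

err_l2 = np.linalg.norm(x_l2 - x_true)
err_l1 = np.linalg.norm(x_l1 - x_true)
print(err_l2, err_l1)
```

Because the data are exact away from the spikes, the reweighting effectively ignores the contaminated points, and the L1-fidelity reconstruction error comes out far smaller than the L2 one.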
Sparse Signal Inversion with Impulsive Noise by Dual Spectral Projected Gradient Method
We consider sparse signal inversion with impulsive noise. There are three major ingredients. The first is regularizing properties: we discuss convergence rates of regularized solutions. The second is the numerical solution, which is challenging because both the fidelity and the regularization term lack differentiability; moreover, for ill-conditioned problems, sparsity regularization is often unstable. We propose a novel dual spectral projected gradient (DSPG) method, which combines the dual problem of multiparameter regularization with the spectral projected gradient method to solve the nonsmooth l1+l1 optimization functional. We show that one can overcome the nondifferentiability and instability by adding a smooth l2 regularization term to the original optimization functional. The advantage of the proposed functional is that its convex dual reduces to a smooth, constrained functional; moreover, it is stable even for ill-conditioned problems. The spectral projected gradient algorithm is used to compute the minimizers, and we prove its convergence. The third ingredient is numerical simulation. Experiments on compressed sensing and image inpainting demonstrate the efficiency of the proposed approach.
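The spectral projected gradient building block that DSPG applies to the dual problem can be sketched in a few lines. The objective and constraint set below (a box-constrained least-squares problem) are illustrative stand-ins of our own choosing, not the paper's actual dual functional; the method itself is the standard combination of a Barzilai-Borwein step, projection, and a nonmonotone Armijo line search.

```python
# Minimal spectral projected gradient (SPG) sketch: Barzilai-Borwein
# (spectral) step length, projection onto the feasible set, and a
# nonmonotone Armijo line search. The box-constrained least-squares
# problem below is an illustrative stand-in, not the paper's dual.
import numpy as np

def spg(f, grad, project, x0, n_iter=300, memory=10):
    x = project(np.asarray(x0, dtype=float))
    g = grad(x)
    lam = 1.0                                 # spectral (BB) step length
    f_hist = [f(x)]
    for _ in range(n_iter):
        d = project(x - lam * g) - x          # projected search direction
        f_ref = max(f_hist[-memory:])         # nonmonotone reference value
        step = 1.0
        while f(x + step * d) > f_ref + 1e-4 * step * (g @ d) and step > 1e-12:
            step *= 0.5                       # Armijo backtracking
        x_new = x + step * d
        g_new = grad(x_new)
        s, yv = x_new - x, g_new - g
        sy = s @ yv
        # Safeguarded BB1 step for the next iteration.
        lam = np.clip((s @ s) / sy, 1e-10, 1e10) if sy > 1e-12 else 1e10
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad = lambda x: A.T @ (A @ x - b)
project = lambda x: np.clip(x, -0.1, 0.1)     # projection onto a box

x_star = spg(f, grad, project, np.zeros(10))
```

At a solution, x_star is a fixed point of the projected gradient map, i.e. project(x_star - t * grad(x_star)) == x_star for any t > 0, which is the natural stopping criterion for this class of methods.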
Computational Inverse Problems
Inverse problems typically deal with the identification of unknown quantities from indirect measurements and appear in many areas of technology, medicine, biology, finance, and econometrics. The computational solution of such problems is a very active, interdisciplinary field with close connections to optimization, control theory, differential equations, asymptotic analysis, statistics, and probability. The focus of this workshop was on hybrid methods, model reduction, regularization in Banach spaces, and statistical approaches.