Inverse problems generally require a regularizer or prior for a good
solution. A recent trend is to train a convolutional net to denoise images, and
use this net as a prior when solving the inverse problem. Several proposals
depend on a singular value decomposition of the forward operator, and several
others backpropagate through the denoising net at runtime. Here we propose a
simpler approach that combines the traditional gradient-based minimization of
reconstruction error with denoising. Noise is also added at each step, so the
iterative dynamics resembles a Langevin or diffusion process. Both the level of
added noise and the size of the denoising step decay exponentially with time.
We apply our method to the problem of tomographic reconstruction from electron
micrographs acquired at multiple tilt angles. In empirical studies with
simulated tilt views, we find parameter settings for our method that produce
good reconstructions. We show that high accuracy can be achieved with as few as 50
denoising steps. We also compare with DDRM and DPS, more complex diffusion
methods of the kinds mentioned above. These methods are less accurate (as
measured by MSE and SSIM) for our tomography problem, even after the generation
hyperparameters are optimized. Finally we extend our method to reconstruction
of arbitrary-sized images and show results on 128 × 1568 pixel images.
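
As a rough illustration of the iterative scheme described above, the sketch below alternates a gradient step on the reconstruction error with a denoising step, then injects noise so the dynamics resembles a Langevin or diffusion process; both the added-noise level and the denoising step size decay exponentially. The function and parameter names (reconstruct, lr, sigma0, lambda0, decay) and the exact form of the update rules and schedules are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def reconstruct(y, A, AT, denoiser, x0, n_steps=50,
                lr=1.0, sigma0=1.0, lambda0=1.0, decay=0.9):
    """Hedged sketch of the iterative reconstruction loop.

    y        : measured projections (e.g., tilt views)
    A, AT    : forward operator and its adjoint (callables)
    denoiser : pretrained denoising net, called as denoiser(x, sigma)
    x0       : initial reconstruction
    Schedules and step sizes here are placeholders; the paper tunes them empirically.
    """
    x = x0.copy()
    for t in range(n_steps):
        sigma_t = sigma0 * decay ** t    # added-noise level, decays exponentially
        lambda_t = lambda0 * decay ** t  # denoising step size, decays exponentially

        # 1. Gradient step on the data-fidelity term ||A x - y||^2
        x = x - lr * AT(A(x) - y)

        # 2. Denoising step: move x toward the denoiser's output
        x = (1 - lambda_t) * x + lambda_t * denoiser(x, sigma_t)

        # 3. Inject noise so the iterates follow Langevin/diffusion-like dynamics
        x = x + sigma_t * np.random.randn(*x.shape)
    return x
```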