10,721 research outputs found

    Accelerated graph-based nonlinear denoising filters

    Denoising filters, such as bilateral, guided, and total variation filters, applied to images on general graphs may require repeated application if the noise is not small enough. We formulate two techniques for accelerating the resulting iterations: the conjugate gradient method and Nesterov's acceleration. We numerically demonstrate the efficiency of the accelerated nonlinear filters for image denoising, showing a 2-12 times speed-up, i.e., the acceleration techniques reduce the number of iterations required to reach a given peak signal-to-noise ratio (PSNR) by a factor of 2-12.
    Comment: 10 pages, 6 figures, to appear in Procedia Computer Science, vol. 80, 2016, International Conference on Computational Science, San Diego, CA, USA, June 6-8, 2016
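    The abstract describes accelerating the fixed-point iteration u_{k+1} = D(u_k), where D is one application of a nonlinear denoising filter. Below is a minimal NumPy sketch of the Nesterov-style variant under stated assumptions: graph_filter_step is a hypothetical stand-in (a simple 4-neighbour bilateral-like smoothing with a data-fidelity pull), not the paper's actual filters, and the momentum schedule (k-1)/(k+2) is one common choice, not necessarily the authors' exact scheme.

```python
import numpy as np

def graph_filter_step(u, noisy, sigma=0.1, lam=0.5):
    # One application of a simple 4-neighbour graph-based smoothing filter:
    # each pixel moves toward a weighted average of its neighbours, with
    # bilateral-style weights exp(-|u_i - u_j|^2 / sigma^2).  This is a
    # hypothetical stand-in for the bilateral/guided/TV filters in the paper.
    num = np.zeros_like(u)
    wsum = np.zeros_like(u)
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        v = np.roll(u, shift, axis=(0, 1))
        w = np.exp(-(u - v) ** 2 / sigma ** 2)
        num += w * v
        wsum += w
    smoothed = num / np.maximum(wsum, 1e-12)
    # Keep a data-fidelity pull toward the noisy observation.
    return (1 - lam) * noisy + lam * smoothed

def nesterov_filter(noisy, n_iter=30, **kw):
    # Accelerate the repeated filter application u_{k+1} = D(u_k) with
    # Nesterov-style momentum: extrapolate, then filter the extrapolated point.
    u_prev = noisy.copy()
    u = graph_filter_step(noisy, noisy, **kw)
    for k in range(1, n_iter):
        y = u + (k - 1) / (k + 2) * (u - u_prev)
        u_prev, u = u, graph_filter_step(y, noisy, **kw)
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
    noisy = clean + 0.2 * rng.standard_normal(clean.shape)
    denoised = nesterov_filter(noisy, n_iter=30)
    psnr = 10 * np.log10(1.0 / np.mean((denoised - clean) ** 2))
    print(f"PSNR after accelerated filtering: {psnr:.2f} dB")
```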

    Plug-and-Play Methods Provably Converge with Properly Trained Denoisers

    Plug-and-play (PnP) is a non-convex framework that integrates modern denoising priors, such as BM3D or deep learning-based denoisers, into ADMM or other proximal algorithms. An advantage of PnP is that one can use pre-trained denoisers when there is not sufficient data for end-to-end training. Although PnP has been recently studied extensively with great empirical success, theoretical analysis addressing even the most basic question of convergence has been insufficient. In this paper, we theoretically establish convergence of PnP-FBS and PnP-ADMM, without using diminishing stepsizes, under a certain Lipschitz condition on the denoisers. We then propose real spectral normalization, a technique for training deep learning-based denoisers to satisfy the proposed Lipschitz condition. Finally, we present experimental results validating the theory.Comment: Published in the International Conference on Machine Learning, 201
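    The PnP-FBS scheme mentioned in the abstract replaces the proximal step of forward-backward splitting with a denoiser: x_{k+1} = D(x_k - alpha * grad f(x_k)). Below is a minimal sketch under stated assumptions: pnp_fbs, box_denoiser, A, and At are illustrative names; the box filter merely stands in for the pre-trained, spectrally normalized deep denoiser the paper assumes, and the random-mask inpainting problem is just an example data term.

```python
import numpy as np

def pnp_fbs(b, A, At, denoiser, alpha=1.0, n_iter=30):
    # Plug-and-play forward-backward splitting (sketch).
    # Data term f(x) = 0.5 * ||A x - b||^2; each iteration takes a gradient
    # step on f, then applies the plugged-in denoiser in place of a prox:
    #     x_{k+1} = D( x_k - alpha * A^T (A x_k - b) ).
    x = At(b)
    for _ in range(n_iter):
        grad = At(A(x) - b)
        x = denoiser(x - alpha * grad)
    return x

def box_denoiser(x, k=3):
    # Hypothetical stand-in denoiser: a k x k box filter.  The paper's theory
    # instead assumes a learned denoiser satisfying a Lipschitz condition
    # enforced via real spectral normalization.
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x)
    for i in range(k):
        for j in range(k):
            out += xp[i:i + x.shape[0], j:j + x.shape[1]]
    return out / (k * k)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64)); clean[20:44, 20:44] = 1.0
    mask = rng.random(clean.shape) > 0.4           # observe ~60% of pixels
    b = mask * (clean + 0.05 * rng.standard_normal(clean.shape))
    A = lambda x: mask * x                         # masking forward operator
    At = lambda y: mask * y                        # its adjoint
    x_hat = pnp_fbs(b, A, At, box_denoiser, alpha=1.0, n_iter=30)
    print("reconstruction MSE:", float(np.mean((x_hat - clean) ** 2)))
```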