
FAST NON-LOCAL FILTERING BY RANDOM SAMPLING: IT WORKS, ESPECIALLY FOR LARGE IMAGES

By Stanley H. Chan, Todd Zickler and Yue M. Lu

Abstract

Non-local means (NLM) is a popular denoising scheme. Although conceptually simple, the algorithm is computationally intensive for large images. We propose to speed up NLM by using random sampling. Our algorithm picks, uniformly at random, a small number of columns of the weight matrix, and uses these "representatives" to compute an approximate result. It also incorporates an extra column normalization of the sampled columns, a form of symmetrization that often boosts the denoising performance on real images. Using statistical large deviation theory, we analyze the proposed algorithm and provide guarantees on its performance. We show that the probability of having a large approximation error decays exponentially as the image size increases. Thus, for large images, the random estimates generated by the algorithm are tightly concentrated around their limit values, even if the sampling ratio is small. Numerical results confirm our theoretical analysis: the proposed algorithm reduces the run time of NLM and, thanks to the symmetrization step, actually provides some improvement in peak signal-to-noise ratios.

Index Terms: Non-local means, random sampling, Sinkhorn-Knopp balancing scheme, image denoising
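This record contains only the abstract, not the authors' code. The following is a minimal NumPy sketch of the sampled-column idea the abstract describes; the function name nlm_random_sampling and the parameters patch_radius, h, sample_ratio, and seed are illustrative assumptions, not the authors' implementation. It builds only a randomly chosen subset of weight-matrix columns, column-normalizes them (the symmetrization step), and then applies the usual NLM row normalization.

    import numpy as np

    def nlm_random_sampling(noisy, patch_radius=3, h=0.15, sample_ratio=0.05, seed=None):
        # Approximate NLM by computing only a random subset of weight-matrix columns.
        rng = np.random.default_rng(seed)
        H, W = noisy.shape
        n = H * W
        k = 2 * patch_radius + 1

        # Flattened patch around every pixel, with reflective padding at the borders.
        pad = np.pad(noisy, patch_radius, mode="reflect")
        patches = np.empty((n, k * k))
        for idx in range(n):
            i, j = divmod(idx, W)
            patches[idx] = pad[i:i + k, j:j + k].ravel()

        # Uniformly sample m "representative" pixels (columns of the weight matrix).
        m = max(1, int(sample_ratio * n))
        cols = rng.choice(n, size=m, replace=False)

        # Squared patch distances to the sampled pixels via
        # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b, giving an (n, m) matrix.
        sq = (patches ** 2).sum(axis=1)
        d2 = sq[:, None] + sq[cols][None, :] - 2.0 * patches @ patches[cols].T
        d2 = np.maximum(d2, 0.0)

        # Gaussian patch-similarity weights for the sampled columns only.
        Wmat = np.exp(-d2 / (h ** 2 * k * k))

        # Column-normalize the sampled columns (the symmetrization step),
        # then apply the standard NLM row normalization.
        Wmat /= Wmat.sum(axis=0, keepdims=True)
        y = noisy.ravel()[cols]
        return ((Wmat @ y) / Wmat.sum(axis=1)).reshape(H, W)

For example, on a 128x128 image with sample_ratio=0.05 this forms roughly 800 columns instead of 16,384; the abstract's concentration result states that such a sampled estimate stays close to the full NLM output with probability approaching one as the image size grows.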

Year: 2013
OAI identifier: oai:CiteSeerX.psu:10.1.1.352.8592
Provided by: CiteSeerX