From Rank Estimation to Rank Approximation: Rank Residual Constraint for Image Restoration
In this paper, we propose a novel approach to the rank minimization problem,
termed the rank residual constraint (RRC) model. Unlike existing low-rank-based
approaches, such as the well-known nuclear norm minimization (NNM) and the
weighted nuclear norm minimization (WNNM), which estimate the underlying
low-rank matrix directly from the corrupted observations, we progressively
approximate the underlying low-rank matrix by minimizing the rank residual.
By integrating the image nonlocal self-similarity (NSS) prior with the
proposed RRC model, we apply it to image restoration tasks, including image
denoising and image compression artifact reduction. To this end, we first
obtain a good reference of the original image groups using the image NSS
prior, and then minimize the rank residual of the image groups between this
reference and the degraded image to achieve a better estimate of the desired
image. In this manner, both the reference and the estimated image are updated
gradually and jointly in each iteration. Based on the group-based sparse
representation model, we further provide a theoretical analysis of the
feasibility of the proposed RRC model. Experimental results demonstrate that
the proposed RRC model outperforms many state-of-the-art schemes in both
objective and perceptual quality.
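As a hedged illustration of the baselines the abstract contrasts against, the sketch below denoises a low-rank "patch group" matrix by (optionally weighted) singular value soft-thresholding, the proximal step behind NNM and WNNM. This is a minimal NumPy sketch, not the authors' RRC code; the matrix sizes, noise level, and threshold `tau` are arbitrary choices for the toy example.

```python
import numpy as np

def svt(Y, tau, weights=None):
    """Soft-threshold the singular values of Y by tau, optionally with
    per-singular-value weights as in WNNM; this is the proximal operator
    of the (weighted) nuclear norm."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    w = np.ones_like(s) if weights is None else weights
    return (U * np.maximum(s - tau * w, 0.0)) @ Vt

# Toy example: a rank-2 "patch group" corrupted by Gaussian noise.
rng = np.random.default_rng(0)
L = rng.standard_normal((64, 2)) @ rng.standard_normal((2, 32))  # low-rank truth
Y = L + 0.1 * rng.standard_normal(L.shape)                       # noisy observation
X = svt(Y, tau=1.0)                                              # NNM-style estimate
```

Thresholding the small singular values suppresses most of the noise energy while keeping the two dominant components, so the estimate `X` lies closer to the clean matrix than the observation `Y` does.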
Sparse Recovery and Representation Learning
This dissertation focuses on sparse representation and dictionary learning, covering three related topics. First, in Chapter 1, we study the problem of low-rank matrix recovery in the presence of prior information. We begin with the Null Space Property, a necessary and sufficient condition for exact recovery of low-rank matrices from compressively sampled measurements via nuclear norm minimization, and provide an alternative theoretical analysis of the bound on the number of random Gaussian measurements needed for the condition to be satisfied with high probability. We then study low-rank matrix recovery when prior information is available: we analyze an existing algorithm, give necessary and sufficient conditions for exact recovery, and show that the algorithm is limited in certain cases. We propose an alternative recovery algorithm that addresses this drawback and provide sufficient recovery conditions for it. In Chapter 2, we study the problem of learning a sparsifying dictionary for a set of data, focusing on dictionaries that admit fast transforms. Inspired by the Fast Fourier Transform, we propose a learning algorithm that parameterizes a structured linear transformation matrix. Empirically, our algorithm produces dictionaries that yield lower numerical sparsity for the sparse representation of images than the Discrete Fourier Transform (DFT). Additionally, due to its structure, the learned dictionary can recover the original signal from the sparse representation via a fast transform. In Chapter 3, we study the representation learning problem in a more complex setting, applying the concept of dictionary learning within a deep generative model.
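To make the sparsity measure mentioned for Chapter 2 concrete, the snippet below computes numerical sparsity, taken here as ||x||_1^2 / ||x||_2^2 (one common definition, assumed for this sketch rather than taken from the thesis), for a two-tone signal in the standard basis and under the DFT. The signal and length are arbitrary illustrative choices.

```python
import numpy as np

def numerical_sparsity(x):
    """Numerical sparsity s(x) = ||x||_1^2 / ||x||_2^2 -- a smooth proxy
    for support size; it equals k for a vector with k equal-magnitude
    nonzero entries."""
    x = np.abs(np.ravel(x))
    return (x.sum() ** 2) / (x ** 2).sum()

# A sum of two sinusoids is dense in time but 4-sparse under the DFT
# (bins 5, 12 and their conjugates 251, 244 for n = 256).
n = 256
t = np.arange(n)
x = np.sin(2 * np.pi * 5 * t / n) + 0.5 * np.sin(2 * np.pi * 12 * t / n)
s_time = numerical_sparsity(x)              # large: energy spread over all samples
s_freq = numerical_sparsity(np.fft.fft(x))  # small: energy in 4 frequency bins
```

A good sparsifying transform drives this quantity down, which is the sense in which the thesis compares its learned dictionaries against the DFT baseline.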
Motivated by an application in the computer gaming industry, where designers need an urban layout generation tool that allows fast generation and modification, we present a novel solution for synthesizing high-quality building placements using conditional generative latent optimization together with adversarial training. The capability of the proposed method is demonstrated in various examples. Inference runs in near real time, so the tool helps designers iterate on their designs of virtual cities quickly.
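To give a feel for the latent-optimization idea, here is a minimal, hedged sketch: per-sample latent codes and a shared "generator" (here just a linear map) are fit jointly by alternating least squares on a reconstruction loss. This is a linear, unconditional toy, not the paper's conditional, adversarially trained method; all sizes are arbitrary.

```python
import numpy as np

# Generative latent optimization in miniature: learn a latent code z_i per
# sample together with a shared decoder W by alternately minimizing the
# reconstruction loss ||Z W - X||_F^2 in closed form.
rng = np.random.default_rng(2)
n, d, k = 100, 20, 4
X = rng.standard_normal((n, k)) @ rng.standard_normal((k, d))  # data on a k-dim subspace
W = rng.standard_normal((k, d))                                # decoder init
Z = np.zeros((n, k))                                           # latent codes
for _ in range(10):
    Z = X @ np.linalg.pinv(W)   # best codes for the current decoder
    W = np.linalg.pinv(Z) @ X   # best decoder for the current codes
recon_err = np.linalg.norm(Z @ W - X) / np.linalg.norm(X)
```

Because the toy data lie exactly on a k-dimensional subspace, the alternating updates recover it essentially exactly; the paper's setting replaces the linear decoder with a deep generator, adds a conditioning signal, and trains adversarially.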