
    Denoising by Higher Order Statistics

    A standard approach for deriving a variational denoising method is the maximum a posteriori strategy. Here, the denoising result is chosen such that it maximizes the conditional density function of the reconstruction given its observed noisy version. Unfortunately, this approach does not imply that the empirical distribution of the reconstructed noise components follows the statistics of the assumed noise model. In this paper, we propose to overcome this drawback by applying an additional transformation to the random vector modeling the noise. This transformation is then incorporated into the standard denoising approach and leads to a more sophisticated data fidelity term, which forces the removed noise components to have the desired statistical properties. The favorable properties of our new approach are demonstrated for additive Gaussian noise by numerical examples. Our method proves to be especially well suited for data containing high-frequency structures, where other denoising methods that assume a certain smoothness of the signal cannot restore the small structures.
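    For orientation, a minimal sketch of the standard MAP formulation the abstract starts from, assuming additive Gaussian noise with variance $\sigma^2$ and a generic regularizer $R$ (both the regularizer and the specific transformation of the noise vector proposed in the paper are assumptions here, not spelled out in the abstract):

    \[
        \hat{u} \;=\; \operatorname*{arg\,min}_{u} \; \frac{1}{2\sigma^2}\,\|u - f\|_2^2 \;+\; \lambda\, R(u),
    \]

    where $f$ is the noisy observation, the quadratic term is the negative log-likelihood of the Gaussian noise model, and $R(u)$ encodes the prior on the image. The drawback named in the abstract is that a minimizer $\hat{u}$ of this functional carries no guarantee that the removed component $f - \hat{u}$ is itself distributed like the assumed noise; the proposed data fidelity term additionally enforces such statistics on $f - \hat{u}$.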

    Homogeneous Penalizers and Constraints in Convex Image Restoration

    Recently, convex optimization models have been successfully applied to solving various problems in image analysis and restoration. In this paper, we are interested in relations between convex constrained optimization problems of the form $\min\{\Phi(x) \text{ subject to } \Psi(x)\le\tau\}$ and their non-constrained, penalized counterparts $\min\{\Phi(x)+\lambda\Psi(x)\}$. We start with general considerations of the topic and provide a novel proof which ensures that a solution of the constrained problem with given $\tau$ is also a solution of the non-constrained problem for a certain $\lambda$. Then we deal with the special setting that $\Psi$ is a semi-norm and $\Phi=\phi(Hx)$, where $H$ is a linear, not necessarily invertible operator and $\phi$ is essentially smooth and strictly convex. In this case we can prove via the dual problems that there exists a bijective function which maps $\tau$ from a certain interval to $\lambda$ such that the solutions of the constrained problem coincide with those of the non-constrained problem if and only if $\tau$ and $\lambda$ are in the graph of this function. We illustrate the relation between $\tau$ and $\lambda$ by various problems arising in image processing. In particular, we demonstrate the performance of the constrained model in restoration tasks of images corrupted by Poisson noise and in inpainting models with a constrained nuclear norm. Such models can be useful if we have a priori knowledge on the image rather than on the noise level.
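    As a concrete instance of the pair of problems studied (chosen here purely for illustration and not necessarily one treated in the paper), take $\Phi(x) = \frac{1}{2}\|x - f\|_2^2$, i.e. $H$ the identity and $\phi$ strictly convex, and let $\Psi$ be a semi-norm such as the total variation:

    \[
        \min_x \; \tfrac{1}{2}\|x - f\|_2^2 \quad \text{subject to} \quad \Psi(x) \le \tau
        \qquad \text{versus} \qquad
        \min_x \; \tfrac{1}{2}\|x - f\|_2^2 + \lambda\,\Psi(x).
    \]

    The result described in the abstract then provides a bijection $\tau \mapsto \lambda(\tau)$ on a suitable interval such that the two problems share their solutions exactly when $\lambda = \lambda(\tau)$; the constrained form is the natural choice when a bound $\tau$ on $\Psi(x)$ is known a priori, the penalized form when a noise-level-driven choice of $\lambda$ is available.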
