Neural Gradient Regularizer
Owing to their significant success, priors imposed on gradient maps have
consistently been a subject of great interest in the field of image processing.
Total variation (TV), one of the most representative regularizers, is known for
its ability to capture the sparsity of gradient maps. Nonetheless, TV and its
variants often underestimate the gradient maps, leading to the weakening of
edges and details whose gradients should not be zero in the original image.
Recently, total deep variation (TDV) has been introduced, assuming the sparsity
of feature maps, which provides a flexible regularization learned from
large-scale datasets for a specific task. However, TDV requires retraining when
the image or task changes, limiting its versatility. In this paper, we propose
a neural gradient regularizer (NGR) that expresses the gradient map as the
output of a neural network. Unlike existing methods, NGR does not rely on the
sparsity assumption, thereby avoiding the underestimation of gradient maps. NGR
is applicable to various image types and different image processing tasks,
functioning in a zero-shot learning fashion, making it a versatile and
plug-and-play regularizer. Extensive experimental results demonstrate the
superior performance of NGR over state-of-the-art counterparts across a range
of different tasks, further validating its effectiveness and versatility.
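To make the contrast concrete, the following is a minimal sketch (not the paper's NGR) of the classical isotropic total variation functional the abstract discusses: it sums per-pixel gradient magnitudes, so minimizing it drives gradients toward zero, which is the underestimation of edges that NGR is designed to avoid. All function and variable names here are illustrative.

```python
import numpy as np

def total_variation(img):
    """Isotropic TV of a 2-D image: sum of per-pixel gradient magnitudes."""
    # Forward differences give the horizontal and vertical gradient maps.
    dx = np.diff(img, axis=1)
    dy = np.diff(img, axis=0)
    # Zero-pad so both gradient maps share the image's shape.
    dx = np.pad(dx, ((0, 0), (0, 1)))
    dy = np.pad(dy, ((0, 1), (0, 0)))
    # Summing the magnitudes rewards sparse gradient maps; as a penalty,
    # this shrinks true edge gradients as well as noise.
    return np.sqrt(dx**2 + dy**2).sum()

flat = np.zeros((8, 8))
edge = np.zeros((8, 8))
edge[:, 4:] = 1.0  # a single vertical unit edge
print(total_variation(flat))  # 0.0
print(total_variation(edge))  # 8.0: one unit gradient per row
```

A TV-regularized restoration trades data fidelity against this sum, so the nonzero edge gradients above are penalized exactly like noise, illustrating why TV and its variants tend to weaken edges and details.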
A Constrained Convex Optimization Approach to Hyperspectral Image Restoration with Hybrid Spatio-Spectral Regularization
We propose a new constrained optimization approach to hyperspectral (HS)
image restoration. Most existing methods restore a desirable HS image by
solving some optimization problem, which consists of a regularization term(s)
and a data-fidelity term(s). These methods must handle the regularization and
data-fidelity terms simultaneously in one objective function, so the
hyperparameter(s) balancing these terms must be carefully controlled. However,
setting such hyperparameters is often troublesome because their suitable
values depend strongly on the regularization terms adopted and the noise
intensity of a given observation. Our proposed method
is formulated as a convex optimization problem, where we utilize a novel hybrid
regularization technique named Hybrid Spatio-Spectral Total Variation (HSSTV)
and incorporate data-fidelity as hard constraints. HSSTV has a strong ability
of noise and artifact removal while avoiding oversmoothing and spectral
distortion, without combining other regularizations such as low-rank
modeling-based ones. In addition, the constraint-type data-fidelity enables us
to translate the hyperparameters that balance between regularization and
data-fidelity to the upper bounds of the degree of data-fidelity that can be
set in a much easier manner. We also develop an efficient algorithm based on
the alternating direction method of multipliers (ADMM) to solve the
optimization problem. Through comprehensive experiments, we illustrate the
advantages of the proposed method over various HS image restoration methods
including state-of-the-art ones.
Comment: 20 pages, 4 tables, 10 figures, submitted to MDPI Remote Sensing
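The key mechanism behind the constraint-type data fidelity can be sketched as follows. This is a minimal illustration, not the paper's full ADMM solver: replacing a weighted fidelity term with the hard constraint ||y - x||_2 <= epsilon means ADMM handles data fidelity by projection onto an l2 ball around the observation, and epsilon (an upper bound tied to the noise level) replaces the balancing hyperparameter. Names here are illustrative.

```python
import numpy as np

def project_fidelity_ball(x, y, epsilon):
    """Project estimate x onto the feasible set {z : ||y - z||_2 <= epsilon}."""
    r = x - y
    norm = np.linalg.norm(r)
    if norm <= epsilon:
        return x  # already satisfies the data-fidelity constraint
    # Shrink the residual radially onto the ball's surface.
    return y + (epsilon / norm) * r

y = np.zeros(4)                  # hypothetical observation
x = np.array([3.0, 0.0, 0.0, 0.0])  # estimate violating the constraint
z = project_fidelity_ball(x, y, epsilon=1.0)
print(np.linalg.norm(z - y))  # 1.0: residual clipped to the bound
```

Because epsilon bounds the residual directly, it can be set from an estimate of the observation's noise intensity, which is the "much easier manner" of tuning the abstract refers to, in contrast to a weight whose suitable value shifts with the regularizer.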