A Non-Local Structure Tensor Based Approach for Multicomponent Image Recovery Problems
Non-Local Total Variation (NLTV) has emerged as a useful tool in variational
methods for image recovery problems. In this paper, we extend the NLTV-based
regularization to multicomponent images by taking advantage of the Structure
Tensor (ST) resulting from the gradient of a multicomponent image. The proposed
approach allows us to penalize the non-local variations, jointly for the
different components, through various matrix norms.
To facilitate the choice of the hyper-parameters, we adopt a constrained convex
optimization approach in which we minimize the data fidelity term subject to a
constraint involving the ST-NLTV regularization. The resulting convex
optimization problem is solved with a novel epigraphical projection method.
This formulation can be efficiently implemented thanks to the flexibility
offered by recent primal-dual proximal algorithms. Experiments are carried out
for multispectral and hyperspectral images. The results demonstrate the
benefit of introducing a non-local structure tensor regularization and show
that the proposed approach yields significant improvements in convergence
speed over current state-of-the-art methods.
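The structure-tensor penalty described above can be illustrated with a minimal NumPy sketch. For brevity this uses plain local finite differences rather than the paper's non-local weights, and a Schatten p-norm of the per-pixel Jacobian as the joint multicomponent penalty; the function and parameter names are ours, not the authors'.

```python
import numpy as np

def st_penalty(img, p=1):
    """Joint structure-tensor gradient penalty for a multicomponent image.

    img: (H, W, C) array. For each pixel we form the C x 2 Jacobian
    (finite-difference gradients of every component) and sum the
    Schatten p-norms of these matrices over the image -- a local
    analogue of the ST-(NL)TV regularizer sketched in the abstract.
    """
    # Forward differences; duplicating the last row/column gives zero
    # gradient at the boundary.
    gx = np.diff(img, axis=1, append=img[:, -1:, :])
    gy = np.diff(img, axis=0, append=img[-1:, :, :])
    # Per-pixel Jacobian, shape (H, W, C, 2)
    J = np.stack([gx, gy], axis=-1)
    # Singular values per pixel (batched SVD), shape (H, W, min(C, 2))
    s = np.linalg.svd(J, compute_uv=False)
    # Schatten p-norm per pixel, summed over the image
    return float(((s ** p).sum(axis=-1) ** (1.0 / p)).sum())
```

A constant image has zero penalty, while any inter-pixel variation in any component increases it jointly across components.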
Non-smooth Non-convex Bregman Minimization: Unification and new Algorithms
We propose a unifying algorithm for non-smooth non-convex optimization. The
algorithm approximates the objective function by a convex model function and
finds an approximate (Bregman) proximal point of the convex model. This
approximate minimizer of the model function yields a descent direction, along
which the next iterate is found. Complemented with an Armijo-like line search
strategy, we obtain a flexible algorithm for which we prove (subsequential)
convergence to a stationary point under weak assumptions on the growth of the
model function error. Special instances of the algorithm with a Euclidean
distance function include Gradient Descent, Forward-Backward
Splitting, and ProxDescent, without the common requirement of a "Lipschitz
continuous gradient". In addition, we consider a broad class of Bregman
distance functions (generated by Legendre functions) replacing the Euclidean
distance. The algorithm has a wide range of applications including many linear
and non-linear inverse problems in signal/image processing and machine
learning.
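The scheme in this abstract can be sketched in its simplest Euclidean instance, where it reduces to gradient descent with an Armijo-type line search: the convex model at x is the linearization m(z) = f(x) + ⟨∇f(x), z − x⟩, whose proximal point argmin_z m(z) + ‖z − x‖²/(2t) is z = x − t∇f(x), giving the descent direction d = z − x. All function and parameter names below are ours, not the paper's.

```python
import numpy as np

def model_descent(f, grad, x0, t=1.0, gamma=0.5, delta=1e-4, iters=50):
    """Euclidean instance of the model-function scheme (= gradient
    descent with Armijo backtracking).

    f, grad: objective and its gradient; x0: starting point;
    t: prox step for the model; gamma, delta: backtracking parameters.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        d = -t * g  # direction from the model's proximal point
        alpha = 1.0
        # Armijo: shrink alpha until a sufficient-decrease test holds
        while f(x + alpha * d) > f(x) + delta * alpha * g.dot(d):
            alpha *= gamma
            if alpha < 1e-12:
                break
        x = x + alpha * d
    return x
```

On the toy objective f(x) = ‖x‖², the iteration drives the iterate to the minimizer at the origin.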
- …