Efficient Reconstruction of Piecewise Constant Images Using Nonsmooth Nonconvex Minimization
We consider the restoration of piecewise constant images where the number of regions and their
values are not fixed in advance and neighboring regions take clearly distinct constant values, from noisy data obtained at the output of a linear operator (e.g., a blurring kernel or
a Radon transform). Thus we also address the generic problem of unsupervised segmentation in the
context of linear inverse problems. The segmentation and the restoration tasks are solved jointly
by minimizing an objective function (an energy) composed of a quadratic data-fidelity term and a
nonsmooth nonconvex regularization term. The relevance of such an energy is ensured by the analytical properties of its minimizers. However, its practical use has been limited by the difficulty
of the computational stage, which requires a nonsmooth nonconvex minimization. Indeed, the existing
methods are unsatisfactory since they (implicitly or explicitly) involve a smooth approximation
of the regularization term and often get stuck in shallow local minima. The goal of this paper is to
design a method that efficiently handles the nonsmooth nonconvex minimization. More precisely,
we propose a continuation method where one tracks the minimizers along a sequence of approximate
nonsmooth energies {Jε}, the first of which is strictly convex and the last of which is the original energy
to minimize. Knowing the importance of the nonsmoothness of the regularization term for the segmentation task, each Jε is nonsmooth and is expressed as the sum of an l1 regularization term and
a smooth nonconvex function. Furthermore, the local minimization of each Jε is reformulated as the
minimization of a smooth function subject to a set of linear constraints. The latter problem is solved
by a modified primal-dual interior point method, which guarantees a descent direction at each
step. Experimental results are presented and show the effectiveness and the efficiency of the proposed
method. Comparison with simulated annealing methods further shows the advantage of our method.
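The continuation idea described above can be illustrated on a toy sparse-recovery energy. The following sketch is an assumption-laden illustration, not the paper's actual algorithm: the specific smooth nonconvex term h_eps, the ε schedule, and the use of ISTA as the inner solver are all choices made here for simplicity.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def minimize_J_eps(A, y, lam, eps, delta, u0, steps=500):
    """Proximal gradient (ISTA) on one member of the family
       J_eps(u) = 0.5*||A u - y||^2 + lam*||u||_1 + sum_i h_eps(u_i),
       where h_eps(t) = -eps*lam*t^2/(t^2 + delta) is a smooth nonconvex
       term (an assumed form) that deepens the nonconvexity as eps grows."""
    # Step size from the Lipschitz constant of the smooth part.
    lr = 1.0 / (np.linalg.norm(A, 2) ** 2 + 2.0 * eps * lam / delta)
    u = u0.copy()
    for _ in range(steps):
        grad_h = -2.0 * eps * lam * delta * u / (u ** 2 + delta) ** 2
        grad = A.T @ (A @ u - y) + grad_h
        u = soft_threshold(u - lr * grad, lr * lam)
    return u

def continuation(A, y, lam, delta=0.1,
                 eps_path=(0.0, 0.25, 0.5, 0.75, 1.0)):
    # Track minimizers along {J_eps}: the first energy (eps = 0) is
    # convex (plain l1), later ones approach the nonconvex target;
    # each inner solve warm-starts from the previous minimizer.
    u = np.zeros(A.shape[1])
    for eps in eps_path:
        u = minimize_J_eps(A, y, lam, eps, delta, u)
    return u
```

Each Jε here is, as in the abstract, the sum of an l1 term and a smooth nonconvex function, so the inner nonsmooth step (soft-thresholding) is exact while the nonconvex part is handled through its gradient.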
A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
We propose a new first-order primal-dual optimization framework for a convex
optimization template with broad applications. Our optimization algorithms
feature optimal convergence guarantees under a variety of common structure
assumptions on the problem template. Our analysis relies on a novel combination
of three classic ideas applied to the primal-dual gap function: smoothing,
acceleration, and homotopy. The algorithms due to the new approach achieve the
best known convergence rate results, in particular when the template consists
of only non-smooth functions. We also outline a restart strategy for the
acceleration to significantly enhance the practical performance. We demonstrate
relations with the augmented Lagrangian method and show how to exploit the
strongly convex objectives with rigorous convergence rate guarantees. We
provide numerical evidence with two examples and illustrate that the new
methods can outperform the state-of-the-art, including Chambolle-Pock, and the
alternating direction method-of-multipliers algorithms.Comment: 35 pages, accepted for publication on SIAM J. Optimization. Tech.
Report, Oct. 2015 (last update Sept. 2016
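The three ingredients named in the abstract (smoothing, acceleration, homotopy) can be sketched on a simple nonsmooth objective. This is a minimal toy instance, not the paper's framework: the problem min ||Ax - b||_1, the Huber-type smoothing, and the 1/k smoothing schedule are all assumptions made for illustration.

```python
import numpy as np

def grad_smoothed_l1(z, beta):
    # Gradient of the beta-smoothed |.| (Huber-type smoothing):
    # z/beta inside [-beta, beta], sign(z) outside.
    return np.clip(z / beta, -1.0, 1.0)

def smoothed_accel_homotopy(A, b, x0, iters=300, beta0=1.0):
    """Toy instance of smoothing + acceleration + homotopy applied to
       min_x ||A x - b||_1: smooth the nonsmooth objective with
       parameter beta_k, take accelerated (Nesterov) gradient steps,
       and shrink beta_k so the smoothed problems approach the
       original nonsmooth one."""
    x, x_prev = x0.copy(), x0.copy()
    L_A = np.linalg.norm(A, 2) ** 2              # ||A||^2
    for k in range(1, iters + 1):
        beta = beta0 / k                         # homotopy: beta_k -> 0
        v = x + (k - 1.0) / (k + 2.0) * (x - x_prev)   # acceleration
        grad = A.T @ grad_smoothed_l1(A @ v - b, beta)
        # The smoothed gradient is (L_A/beta)-Lipschitz, so step = beta/L_A.
        x_prev, x = x, v - (beta / L_A) * grad
    return x
```

Decreasing the smoothing parameter inside the accelerated loop, rather than fixing it in advance, is the homotopy element: early iterations see a well-conditioned smooth surrogate, late iterations a near-exact one.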
A Primal-Dual Augmented Lagrangian
Nonlinearly constrained optimization problems can be solved by minimizing a sequence of simpler unconstrained or linearly constrained subproblems. In this paper, we discuss the formulation of subproblems in which the objective is a primal-dual generalization of the Hestenes-Powell augmented Lagrangian function. This generalization has the crucial feature that it is minimized with respect to both the primal and the dual variables simultaneously. A benefit of this approach is that the quality of the dual variables is monitored explicitly during the solution of the subproblem. Moreover, each subproblem may be regularized by imposing explicit bounds on the dual variables. Two primal-dual variants of conventional primal methods are proposed: a primal-dual bound constrained Lagrangian (pdBCL) method and a primal-dual ℓ1 linearly constrained Lagrangian (pdℓ1-LCL) method.
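The key feature above, joint minimization over primal and dual variables, can be sketched on a toy equality-constrained problem. The functional form of M below is one common way to write a primal-dual augmented Lagrangian and is an assumption here; the problem, the penalty parameter mu, and the plain gradient-descent inner solver are likewise illustrative choices, not the paper's method.

```python
import numpy as np

# Toy problem: minimize f(x) = 0.5*||x||^2  s.t.  c(x) = x[0] + x[1] - 1 = 0
# (true solution x* = (0.5, 0.5), multiplier y* = 0.5).
#
# Assumed primal-dual augmented Lagrangian (mu: penalty parameter,
# y_e: current dual estimate):
#   M(x, y) = f(x) - c(x)*y_e + (1/(2*mu))*c(x)**2
#             + (1/(2*mu))*(c(x) + mu*(y - y_e))**2,
# minimized with respect to x AND y simultaneously.

a = np.array([1.0, 1.0])         # gradient of the (linear) constraint c

def grad_M(x, y, y_e, mu):
    cx = x[0] + x[1] - 1.0
    r = cx + mu * (y - y_e)      # primal-dual residual term
    gx = x - y_e * a + (cx / mu) * a + (r / mu) * a
    gy = r                       # dM/dy = c(x) + mu*(y - y_e)
    return gx, gy

def solve(mu=1.0, outer=15, inner=200, lr=0.2):
    x, y, y_e = np.zeros(2), 0.0, 0.0
    for _ in range(outer):
        for _ in range(inner):   # joint gradient descent on M in (x, y)
            gx, gy = grad_M(x, y, y_e, mu)
            x = x - lr * gx
            y = y - lr * gy
        y_e = y                  # outer update of the dual estimate
    return x, y
```

Because y is an optimization variable of each subproblem rather than a fixed parameter, its quality can be monitored during the inner solve, which is the benefit the abstract highlights.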