Quasinonexpansive Iterations on the Affine Hull of Orbits: From Mann's Mean Value Algorithm to Inertial Methods
Fixed point iterations play a central role in the design and the analysis of
a large number of optimization algorithms. We study a new iterative scheme in
which the update is obtained by applying a composition of quasinonexpansive
operators to a point in the affine hull of the orbit generated up to the
current iterate. This investigation unifies several algorithmic constructs,
including Mann's mean value method, inertial methods, and multi-layer
memoryless methods. It also provides a framework for the development of new
algorithms, such as those we propose for solving monotone inclusion and
minimization problems.
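A minimal numerical sketch of the inertial special case of such orbit-based schemes, in which the operator is applied to an affine combination of the last two iterates. The operator T and all parameter values below are toy choices for illustration, not taken from the paper:

```python
import numpy as np

def inertial_mann(T, x0, alpha=0.3, lam=0.5, iters=200):
    """Inertial Krasnosel'skii-Mann iteration: the update is applied to an
    extrapolated point in the affine hull of the last two iterates."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        y = x + alpha * (x - x_prev)          # inertial extrapolation step
        x_prev, x = x, y + lam * (T(y) - y)   # relaxed fixed-point update
    return x

# toy quasinonexpansive operator: an averaged map with fixed point c
c = np.array([1.0, -2.0])
T = lambda z: 0.5 * z + 0.5 * c
x_star = inertial_mann(T, np.zeros(2))
```

With these toy parameters the iteration converges linearly to the fixed point of T; setting alpha=0 recovers the classical relaxed Mann iteration.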
Generalized Forward-Backward Splitting
This paper introduces the generalized forward-backward splitting algorithm
for minimizing convex functions of the form f + h_1 + ... + h_n, where f
has a Lipschitz-continuous gradient and the h_i's are simple in the sense
that their Moreau proximity operators are easy to compute. While the
forward-backward algorithm cannot deal with more than one non-smooth
function, our method generalizes it to the case of an arbitrary number n of
them. Our method makes explicit use of the regularity of f in the forward
step, and the proximity operators of the h_i's are applied in parallel in
the backward step. This allows the generalized forward-backward to
efficiently address an important class of convex problems. We prove its
convergence in infinite dimension, and its robustness to errors on the
computation of the proximity operators and of the gradient of f. Examples
on inverse problems in imaging demonstrate the advantage of the proposed
method in comparison to other splitting algorithms.
Comment: 24 pages, 4 figures
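A minimal sketch of the generalized forward-backward iteration with equal weights, one gradient step on the smooth part and parallel prox steps on each simple term. The toy problem (a quadratic plus an l1 penalty plus a nonnegativity constraint) and all parameter values are illustrative, not from the paper:

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def gfb(grad_f, proxes, x0, gamma=0.5, lam=1.0, iters=200):
    """Generalized forward-backward: a forward (gradient) step on the smooth
    term, then backward (prox) steps on each simple term, in parallel."""
    n = len(proxes)
    z = [x0.copy() for _ in range(n)]
    x = x0.copy()
    for _ in range(iters):
        g = grad_f(x)
        for i, prox in enumerate(proxes):
            # prox of (gamma / omega_i) h_i, with equal weights omega_i = 1/n
            z[i] = z[i] + lam * (prox(2 * x - z[i] - gamma * g, n * gamma) - x)
        x = sum(z) / n
    return x

b = np.array([3.0, -1.0, 0.2])
mu = 0.5
grad_f = lambda x: x - b                          # f(x) = 0.5||x - b||^2
prox_l1 = lambda v, t: soft_threshold(v, mu * t)  # h1 = mu ||x||_1
prox_pos = lambda v, t: np.maximum(v, 0.0)        # h2 = indicator of x >= 0
x = gfb(grad_f, [prox_l1, prox_pos], np.zeros(3))
```

For this separable toy problem the minimizer is max(b - mu, 0) componentwise, which the iteration recovers; the step gamma must stay below 2/L for the Lipschitz constant L of grad_f.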
HIPAD - A Hybrid Interior-Point Alternating Direction algorithm for knowledge-based SVM and feature selection
We consider classification tasks in the regime of scarce labeled training
data in high dimensional feature space, where specific expert knowledge is also
available. We propose a new hybrid optimization algorithm that solves the
elastic-net support vector machine (SVM) through an alternating direction
method of multipliers in the first phase, followed by an interior-point method
for the classical SVM in the second phase. Both SVM formulations are adapted to
knowledge incorporation. Our proposed algorithm addresses the challenges of
automatic feature selection, high optimization accuracy, and algorithmic
flexibility for taking advantage of prior knowledge. We demonstrate the
effectiveness and efficiency of our algorithm and compare it with existing
methods on a collection of synthetic and real-world data.
Comment: Proceedings of the 8th Learning and Intelligent OptimizatioN (LION8)
Conference, 201
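To illustrate the alternating-direction phase, here is a generic ADMM solver for an elastic-net-penalized least-squares problem. This is only a sketch in the spirit of the first phase described above, not the authors' HIPAD formulation (which targets the elastic-net SVM with knowledge incorporation); the problem instance is a toy:

```python
import numpy as np

def st(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_elastic_net(A, b, mu1, mu2, rho=1.0, iters=200):
    """ADMM for min 0.5||Ax - b||^2 + mu1||x||_1 + 0.5*mu2||x||^2,
    splitting the smooth quadratic from the l1 term via a consensus x = z."""
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    M = np.linalg.inv(AtA + (mu2 + rho) * np.eye(n))  # cached solve
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))   # smooth quadratic subproblem
        z = st(x + u, mu1 / rho)        # l1 subproblem: soft-thresholding
        u = u + x - z                   # dual update on the consensus
    return z

A = np.eye(3)
b = np.array([2.0, -0.3, -1.5])
x = admm_elastic_net(A, b, mu1=0.5, mu2=1.0)
```

With A equal to the identity the minimizer has the closed form soft_threshold(b, mu1) / (1 + mu2), which makes the sketch easy to check.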
Convex optimization problem prototyping for image reconstruction in computed tomography with the Chambolle-Pock algorithm
The primal-dual optimization algorithm developed in Chambolle and Pock (CP),
2011 is applied to various convex optimization problems of interest in computed
tomography (CT) image reconstruction. This algorithm allows for rapid
prototyping of optimization problems for the purpose of designing iterative
image reconstruction algorithms for CT. The primal-dual algorithm is briefly
summarized in the article, and its potential for prototyping is demonstrated by
explicitly deriving CP algorithm instances for many optimization problems
relevant to CT. An example application modeling breast CT with low-intensity
X-ray illumination is presented.
Comment: Resubmitted to Physics in Medicine and Biology. Text has been
modified according to referee comments, and typos in the equations have been
corrected.
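The prototyping workflow rests on instantiating one generic primal-dual iteration for each problem min_x F(Kx) + G(x). A minimal sketch of that iteration follows; the toy instance (K = I, F an l1 norm, G a quadratic) is chosen only because its minimizer is known in closed form, and is not one of the CT problems from the paper:

```python
import numpy as np

def chambolle_pock(K, prox_g, prox_fstar, x0, tau, sigma, iters=300):
    """Chambolle-Pock primal-dual iteration for min_x F(Kx) + G(x);
    the steps must satisfy tau * sigma * ||K||^2 <= 1."""
    x = x0.copy()
    x_bar = x0.copy()
    y = np.zeros(K.shape[0])
    for _ in range(iters):
        y = prox_fstar(y + sigma * (K @ x_bar), sigma)  # dual proximal step
        x_new = prox_g(x - tau * (K.T @ y), tau)        # primal proximal step
        x_bar = 2 * x_new - x                           # extrapolation
        x = x_new
    return x

# toy instance: K = I, F = mu||.||_1, G = 0.5||x - b||^2, whose minimizer
# is the soft-thresholding of b at level mu
mu, b = 0.5, np.array([2.0, 0.1, -1.0])
prox_fstar = lambda v, s: np.clip(v, -mu, mu)   # F* is an l-inf ball indicator
prox_g = lambda v, t: (v + t * b) / (1 + t)
x = chambolle_pock(np.eye(3), prox_g, prox_fstar, np.zeros(3), tau=0.9, sigma=0.9)
```

Prototyping a different problem amounts to swapping in the K, prox_g, and prox_fstar of that problem, which is what makes the algorithm attractive for exploring CT reconstruction formulations.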
Accelerated Projected Gradient Method for Linear Inverse Problems with Sparsity Constraints
Regularization of ill-posed linear inverse problems via ℓ1 penalization
has been proposed for cases where the solution is known to be (almost) sparse.
One way to obtain the minimizer of such an ℓ1-penalized functional is via
an iterative soft-thresholding algorithm. We propose an alternative
implementation of ℓ1-constraints, using a gradient method, with
projection on ℓ1-balls. The corresponding algorithm uses again iterative
soft-thresholding, now with a variable thresholding parameter. We also propose
accelerated versions of this iterative method, using ingredients of the
(linear) steepest descent method. We prove convergence in norm for one of these
projected gradient methods, without and with acceleration.
Comment: 24 pages, 5 figures. v2: added reference, some amendments, 27 pages
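A sketch of the basic (non-accelerated) scheme: projection onto the ℓ1-ball is itself a soft-thresholding with a data-dependent threshold, here computed by the standard sorting procedure, and the outer loop is a plain projected gradient method. The toy instance and step size are illustrative, not from the paper:

```python
import numpy as np

def project_l1_ball(v, R):
    """Euclidean projection onto {x : ||x||_1 <= R}: soft-thresholding with a
    variable threshold theta chosen so the result has l1 norm exactly R."""
    if np.abs(v).sum() <= R:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    cssv = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, len(v) + 1) > cssv - R)[0][-1]
    theta = (cssv[k] - R) / (k + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def projected_gradient(A, b, R, step, iters=100):
    """Projected gradient for min 0.5||Ax - b||^2 s.t. ||x||_1 <= R."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = project_l1_ball(x - step * A.T @ (A @ x - b), R)
    return x

A = np.eye(3)
b = np.array([2.0, -1.0, 0.5])
x = projected_gradient(A, b, R=1.0, step=1.0)
```

With A equal to the identity the constrained minimizer is just the projection of b onto the ℓ1-ball, which makes the sketch easy to verify; the accelerated variants in the paper modify the step-size selection, not the projection.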
A general iterative algorithm for monotone operators with λ-hybrid mappings in Hilbert spaces