203 research outputs found

    Quasinonexpansive Iterations on the Affine Hull of Orbits: From Mann's Mean Value Algorithm to Inertial Methods

    Fixed point iterations play a central role in the design and analysis of a large number of optimization algorithms. We study a new iterative scheme in which the update is obtained by applying a composition of quasinonexpansive operators to a point in the affine hull of the orbit generated up to the current iterate. This investigation unifies several algorithmic constructs, including Mann's mean value method, inertial methods, and multi-layer memoryless methods. It also provides a framework for the development of new algorithms, such as those we propose for solving monotone inclusion and minimization problems.
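
    To make the abstract scheme concrete, here is a minimal numerical sketch (ours, not from the paper): each update applies an operator $T$ to an affine combination of the orbit generated so far, and an inertial method corresponds to the particular weights $1+\beta$ on $x_n$ and $-\beta$ on $x_{n-1}$. The helper names and the toy operator are illustrative assumptions.

    ```python
    import numpy as np

    def affine_orbit_iteration(T, x0, weights, n_iter=200):
        """Hypothetical sketch: x_{n+1} = T(sum_k a_{n,k} x_k), where the
        a_{n,k} are affine weights (summing to 1) over the orbit so far."""
        orbit = [np.asarray(x0, dtype=float)]
        for n in range(n_iter):
            a = weights(n, orbit)
            assert abs(sum(a) - 1.0) < 1e-10, "weights must sum to 1"
            y = sum(c * xk for c, xk in zip(a, orbit))  # point in the affine hull
            orbit.append(T(y))
        return orbit[-1]

    def inertial_weights(beta):
        """Weights realizing the inertial update y_n = x_n + beta*(x_n - x_{n-1})."""
        def w(n, orbit):
            a = [0.0] * len(orbit)
            if len(orbit) > 1:
                a[-1], a[-2] = 1.0 + beta, -beta
            else:
                a[-1] = 1.0
            return a
        return w

    # Toy quasinonexpansive operator: the proximity operator of 0.5*||.||_1
    # (soft-thresholding), whose unique fixed point is the origin.
    soft = lambda y: np.sign(y) * np.maximum(np.abs(y) - 0.5, 0.0)
    print(affine_orbit_iteration(soft, [3.0, -2.0], inertial_weights(0.3)))
    ```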

    Generalized Forward-Backward Splitting

    This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form $F + \sum_{i=1}^n G_i$, where $F$ has a Lipschitz-continuous gradient and the $G_i$'s are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than $n = 1$ non-smooth function, our method generalizes it to the case of arbitrary $n$. Our method makes explicit use of the regularity of $F$ in the forward step, and the proximity operators of the $G_i$'s are applied in parallel in the backward step. This allows the generalized forward-backward algorithm to efficiently address an important class of convex problems. We prove its convergence in infinite dimension, and its robustness to errors in the computation of the proximity operators and of the gradient of $F$. Examples on inverse problems in imaging demonstrate the advantage of the proposed methods in comparison to other splitting algorithms. Comment: 24 pages, 4 figures.
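
    As a rough sketch of this structure (our code, not the authors'): one auxiliary variable per non-smooth term, a shared forward gradient step on $F$, and backward proximal steps that could run in parallel. The example below takes $n = 2$ with $F$ a least-squares term, $G_1 = \mu\|\cdot\|_1$, and $G_2$ the indicator of the nonnegative orthant; the step size and unit relaxation are illustrative choices, so consult the paper for the exact conditions.

    ```python
    import numpy as np

    # min_x F(x) + G_1(x) + G_2(x) with F(x) = 0.5*||Ax - b||^2 (smooth),
    # G_1 = mu*||x||_1, and G_2 the indicator of {x >= 0}.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    b = rng.standard_normal(40)
    mu = 0.1

    grad_F = lambda x: A.T @ (A @ x - b)
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad F
    gamma = 1.0 / L                        # step size in (0, 2/L)

    prox = [
        lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * mu, 0.0),  # prox of t*G_1
        lambda v, t: np.maximum(v, 0.0),                                # prox of t*G_2
    ]
    w = [0.5, 0.5]                         # positive weights summing to 1

    x = np.zeros(100)
    z = [x.copy() for _ in prox]           # one auxiliary variable per G_i
    for _ in range(500):
        g = grad_F(x)                      # shared forward (gradient) step
        for i, p in enumerate(prox):       # backward steps, parallelizable
            z[i] = z[i] + p(2 * x - z[i] - gamma * g, gamma / w[i]) - x
        x = sum(wi * zi for wi, zi in zip(w, z))
    ```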

    HIPAD - A Hybrid Interior-Point Alternating Direction algorithm for knowledge-based SVM and feature selection

    We consider classification tasks in the regime of scarce labeled training data in high-dimensional feature space, where specific expert knowledge is also available. We propose a new hybrid optimization algorithm that solves the elastic-net support vector machine (SVM) through an alternating direction method of multipliers in the first phase, followed by an interior-point method for the classical SVM in the second phase. Both SVM formulations are adapted to knowledge incorporation. Our proposed algorithm addresses the challenges of automatic feature selection, high optimization accuracy, and algorithmic flexibility for taking advantage of prior knowledge. We demonstrate the effectiveness and efficiency of our algorithm and compare it with existing methods on a collection of synthetic and real-world data. Comment: Proceedings of the 8th Learning and Intelligent OptimizatioN (LION8) Conference, 2014.
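
    For reference, the elastic-net SVM mentioned above is commonly stated as the hinge-loss problem below (our notation; per the abstract, the knowledge-based variants adapt this formulation to incorporate expert knowledge):

    $\min_{w,\,b} \; \sum_{i=1}^{m} \max\bigl(0,\, 1 - y_i(w^\top x_i + b)\bigr) + \lambda_1 \|w\|_1 + \tfrac{\lambda_2}{2} \|w\|_2^2$

    Here the $\ell_1$ term drives automatic feature selection and the $\ell_2$ term stabilizes the solution; the ADMM phase targets this composite objective, and the interior-point phase then solves a classical SVM to high accuracy.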

    Convex optimization problem prototyping for image reconstruction in computed tomography with the Chambolle-Pock algorithm

    The primal-dual optimization algorithm developed by Chambolle and Pock (CP) in 2011 is applied to various convex optimization problems of interest in computed tomography (CT) image reconstruction. This algorithm allows for rapid prototyping of optimization problems for the purpose of designing iterative image reconstruction algorithms for CT. The primal-dual algorithm is briefly summarized in the article, and its potential for prototyping is demonstrated by explicitly deriving CP algorithm instances for many optimization problems relevant to CT. An example application modeling breast CT with low-intensity X-ray illumination is presented. Comment: Resubmitted to Physics in Medicine and Biology. Text has been modified according to referee comments, and typos in the equations have been corrected.
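
    To indicate what such prototyping looks like, here is a minimal sketch (ours) of the generic CP iteration for $\min_x G(x) + F(Kx)$, instantiated purely for illustration with a random matrix standing in for the CT system matrix, a least-squares data term, and a nonnegativity constraint:

    ```python
    import numpy as np

    # Generic Chambolle-Pock iteration for min_x G(x) + F(Kx), here with
    # F(y) = 0.5*||y - b||^2 (data fidelity) and G the indicator of {x >= 0}.
    # A random K stands in for a real CT projection operator.
    rng = np.random.default_rng(1)
    K = rng.standard_normal((60, 80))
    b = K @ np.maximum(rng.standard_normal(80), 0.0)

    Lnorm = np.linalg.norm(K, 2)           # ||K||_2
    sigma = tau = 0.99 / Lnorm             # need sigma*tau*||K||^2 <= 1
    theta = 1.0

    x = np.zeros(80); x_bar = x.copy(); y = np.zeros(60)
    for _ in range(300):
        # dual step: prox of sigma*F^*, where F(y) = 0.5*||y - b||^2
        y = (y + sigma * (K @ x_bar - b)) / (1.0 + sigma)
        x_old = x
        # primal step: prox of tau*G = projection onto {x >= 0}
        x = np.maximum(x - tau * (K.T @ y), 0.0)
        x_bar = x + theta * (x - x_old)    # extrapolation
    ```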

    Accelerated Projected Gradient Method for Linear Inverse Problems with Sparsity Constraints

    Regularization of ill-posed linear inverse problems via $\ell_1$ penalization has been proposed for cases where the solution is known to be (almost) sparse. One way to obtain the minimizer of such an $\ell_1$-penalized functional is via an iterative soft-thresholding algorithm. We propose an alternative approach via $\ell_1$-constraints, using a gradient method with projection onto $\ell_1$-balls. The corresponding algorithm again uses iterative soft-thresholding, now with a variable thresholding parameter. We also propose accelerated versions of this iterative method, using ingredients of the (linear) steepest descent method. We prove convergence in norm for one of these projected gradient methods, with and without acceleration. Comment: 24 pages, 5 figures. v2: added reference, some amendments, 27 pages.
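
    Below is a minimal sketch (ours) of the unaccelerated projected gradient iteration this abstract builds on, for $\min_x \tfrac{1}{2}\|Ax - b\|^2$ subject to $\|x\|_1 \le R$. The projection onto the $\ell_1$-ball reduces to soft-thresholding with a data-dependent threshold, which is the "variable thresholding parameter" mentioned above; the paper's accelerated variants additionally adapt the step length in the spirit of steepest descent.

    ```python
    import numpy as np

    def project_l1_ball(v, radius):
        """Euclidean projection onto the l1-ball: soft-thresholding with a
        threshold theta chosen so the result has l1-norm equal to radius
        (sort-based algorithm; see e.g. Duchi et al., 2008)."""
        if np.abs(v).sum() <= radius:
            return v.copy()
        u = np.sort(np.abs(v))[::-1]
        cssv = np.cumsum(u)
        rho = np.nonzero(u * np.arange(1, len(v) + 1) > cssv - radius)[0][-1]
        theta = (cssv[rho] - radius) / (rho + 1.0)
        return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

    rng = np.random.default_rng(2)
    A = rng.standard_normal((50, 200))
    b = rng.standard_normal(50)
    R = 5.0
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # fixed, unaccelerated step size

    x = np.zeros(200)
    for _ in range(400):
        x = project_l1_ball(x - step * A.T @ (A @ x - b), R)
    ```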