97 research outputs found

    Accelerating two projection methods via perturbations with application to Intensity-Modulated Radiation Therapy

    Full text link
    Constrained convex optimization problems arise naturally in many real-world applications. One strategy to solve them approximately is to translate them into a sequence of convex feasibility problems via the recently developed level set scheme and then solve each feasibility problem using projection methods. However, if the problem is ill-conditioned, projection methods often show zigzagging behavior and therefore converge slowly. To address this issue, we exploit the bounded perturbation resilience of the projection methods and introduce two new perturbations which avoid zigzagging behavior. The first perturbation is in the spirit of k-step methods and uses gradient information from previous iterates. The second uses the approach of surrogate constraint methods combined with relaxed, averaged projections. We apply two different projection methods, in both their unperturbed and perturbed versions, to linear feasibility problems and to nonlinear optimization problems arising from intensity-modulated radiation therapy (IMRT) treatment planning. We demonstrate that for all the considered problems the perturbations can significantly accelerate the convergence of the projection methods and hence the overall procedure of the level set scheme. For the IMRT optimization problems the perturbed projection methods found an approximate solution up to 4 times faster than the unperturbed methods while at the same time achieving objective function values which were 0.5 to 5.1% lower. Comment: Accepted for publication in Applied Mathematics & Optimization
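
    The two perturbations above are specific to the paper; as a rough illustration of the general pattern, the sketch below interlaces a simultaneous (Cimmino-type) projection sweep for a linear feasibility problem Ax <= b with a generic bounded perturbation step whose step sizes are summable. The objective gradient used for the perturbation and all parameter choices are illustrative assumptions, not the paper's k-step or surrogate-constraint perturbations.

    import numpy as np

    def project_halfspace(x, a, b):
        # Orthogonal projection of x onto the half-space {y : a.y <= b}.
        violation = a @ x - b
        return x if violation <= 0 else x - (violation / (a @ a)) * a

    def perturbed_simultaneous_projection(A, b, x0, grad_f=None, iters=200,
                                          beta0=1.0, kappa=0.9):
        # Cimmino-type simultaneous projections with an optional bounded
        # perturbation along -grad_f; the step sizes beta0 * kappa**k are
        # summable, so the perturbations stay bounded (superiorization-style,
        # illustrative only).
        x = x0.astype(float)
        for k in range(iters):
            if grad_f is not None:
                g = grad_f(x)
                g_norm = np.linalg.norm(g)
                if g_norm > 0:
                    x = x - beta0 * kappa**k * g / g_norm
            # simultaneous projection onto all half-spaces a_i . x <= b_i
            projected = np.array([project_halfspace(x, A[i], b[i])
                                  for i in range(A.shape[0])])
            x = projected.mean(axis=0)
        return x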

    Bounded perturbation resilience of projected scaled gradient methods

    Full text link
    We investigate projected scaled gradient (PSG) methods for convex minimization problems. These methods perform a descent step along a diagonally scaled gradient direction followed by a feasibility-regaining step via orthogonal projection onto the constraint set. This constitutes a generalized algorithmic structure that encompasses as special cases the gradient projection method, the projected Newton method, the projected Landweber-type methods and the generalized Expectation-Maximization (EM)-type methods. We prove the convergence of the PSG methods in the presence of bounded perturbations. This resilience to bounded perturbations is what makes it possible to apply the recently developed superiorization methodology to PSG methods, in particular to the EM algorithm. Comment: Computational Optimization and Applications, accepted for publication
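
    The generic PSG iteration described above is x_{k+1} = P_C(x_k - t_k D_k grad_f(x_k)). The sketch below is a minimal illustration with a box constraint set and a Jacobi-like diagonal scaling; both choices, and the small least-squares example, are assumptions made here for illustration only.

    import numpy as np

    def projected_scaled_gradient(grad_f, x0, lo, hi, diag_scaling, step=1.0, iters=500):
        # x_{k+1} = P_C( x_k - step * D_k * grad_f(x_k) ), where D_k is a
        # positive diagonal scaling and P_C is the projection onto the box
        # [lo, hi]^n, implemented here by clipping.
        x = np.clip(x0.astype(float), lo, hi)
        for k in range(iters):
            D = diag_scaling(x, k)          # diagonal entries of D_k
            x = x - step * D * grad_f(x)    # scaled gradient descent step
            x = np.clip(x, lo, hi)          # feasibility-regaining projection
        return x

    # Example: minimize ||Ax - b||^2 over the box [0, 1]^2 with a Jacobi-like
    # scaling D = 1 / diag(2 A^T A); the unit step size suits this scaling.
    A = np.array([[2.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    grad = lambda x: 2 * A.T @ (A @ x - b)
    scaling = lambda x, k: 1.0 / (2.0 * np.sum(A * A, axis=0))
    x_star = projected_scaled_gradient(grad, np.zeros(2), 0.0, 1.0, scaling)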

    Bounded perturbation resilience of extragradient-type methods and their applications

    Full text link
    In this paper we study the bounded perturbation resilience of the extragradient and the subgradient extragradient methods for solving the variational inequality (VI) problem in real Hilbert spaces. This is an important property of algorithms because it guarantees convergence of the scheme under summable errors, meaning that an inexact version of the methods can also be considered. Moreover, once an algorithm is proved to be bounded perturbation resilient, superiorization can be used; this allows flexibility in choosing the bounded perturbations in order to obtain a superior solution, as explained in the paper. We also discuss some inertial extragradient methods. Under mild and standard assumptions of monotonicity and Lipschitz continuity of the VI's associated mapping, convergence of the perturbed extragradient and subgradient extragradient methods is proved. In addition we show that the perturbed algorithms converge at a rate of O(1/t). Numerical illustrations are given to demonstrate the performance of the algorithms. Comment: Accepted for publication in The Journal of Inequalities and Applications. arXiv admin note: text overlap with arXiv:1711.01936 and with arXiv:1507.07302 by other authors
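
    For concreteness, here is a minimal sketch of the unperturbed extragradient iteration for VI(F, C): y_k = P_C(x_k - tau F(x_k)), x_{k+1} = P_C(x_k - tau F(y_k)), with tau < 1/L for L the Lipschitz constant of F. The Euclidean-ball constraint set and the affine monotone mapping below are illustrative assumptions; the paper's perturbed and subgradient extragradient variants modify these steps.

    import numpy as np

    def project_ball(x, radius=1.0):
        # Projection onto the closed Euclidean ball of the given radius
        # (an illustrative choice for the constraint set C).
        n = np.linalg.norm(x)
        return x if n <= radius else radius * x / n

    def extragradient(F, x0, tau, iters=1000, project=project_ball):
        # Unperturbed extragradient method of Korpelevich:
        #   y_k = P_C(x_k - tau F(x_k));  x_{k+1} = P_C(x_k - tau F(y_k)).
        x = x0.astype(float)
        for _ in range(iters):
            y = project(x - tau * F(x))   # prediction step
            x = project(x - tau * F(y))   # correction step evaluates F at y
        return x

    # Example: F(x) = Mx + q with positive definite symmetric part (monotone),
    # Lipschitz constant L = ||M||_2, and step size tau < 1/L.
    M = np.array([[1.0, -1.0], [1.0, 1.0]])
    q = np.array([0.5, -0.3])
    F = lambda x: M @ x + q
    L = np.linalg.norm(M, 2)
    x_star = extragradient(F, np.zeros(2), tau=0.9 / L)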

    Convergence and Perturbation Resilience of Dynamic String-Averaging Projection Methods

    Full text link
    We consider the convex feasibility problem (CFP) in Hilbert space and concentrate on the study of string-averaging projection (SAP) methods for the CFP, analyzing their convergence and their perturbation resilience. In the past, SAP methods were formulated with a single predetermined set of strings and a single predetermined set of weights. Here we extend the scope of the family of SAP methods to allow iteration-index-dependent variable strings and weights and term such methods dynamic string-averaging projection (DSAP) methods. The bounded perturbation resilience of DSAP methods is relevant and important for their possible use in the framework of the recently developed superiorization heuristic methodology for constrained minimization problems. Comment: Computational Optimization and Applications, accepted for publication
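
    As a rough illustration of one DSAP sweep for half-space constraints a_i . x <= b_i: projections are applied sequentially along each string, and the string end points are then averaged with convex weights, where both the strings and the weights may depend on the iteration index k. The particular data structures and example choices below are assumptions for illustration only.

    import numpy as np

    def project_halfspace(x, a, b):
        # Orthogonal projection of x onto the half-space {y : a.y <= b}.
        violation = a @ x - b
        return x if violation <= 0 else x - (violation / (a @ a)) * a

    def dsap_step(x, A, b, strings, weights):
        # Apply projections sequentially along each string, then take the
        # convex combination of the string end points with the given weights.
        endpoints = []
        for string in strings:
            y = x.copy()
            for i in string:
                y = project_halfspace(y, A[i], b[i])
            endpoints.append(y)
        return sum(w * y for w, y in zip(weights, endpoints))

    def dsap(A, b, x0, strings_of, weights_of, iters=100):
        # strings_of(k) and weights_of(k) may change with the iteration
        # index k -- the "dynamic" extension analyzed in the paper.
        x = x0.astype(float)
        for k in range(iters):
            x = dsap_step(x, A, b, strings_of(k), weights_of(k))
        return x

    # Purely illustrative iteration-dependent strings and weights:
    # strings_of = lambda k: [[0, 1], [2, 0]] if k % 2 == 0 else [[2], [1, 0]]
    # weights_of = lambda k: [0.5, 0.5]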

    Superiorization and Perturbation Resilience of Algorithms: A Continuously Updated Bibliography

    Full text link
    This document presents a (mostly) chronologically ordered bibliography of scientific publications on the superiorization methodology and perturbation resilience of algorithms, which is compiled and continuously updated by us at: http://math.haifa.ac.il/yair/bib-superiorization-censor.html. We have tried to trace the work published on this topic since its inception. To the best of our knowledge this bibliography represents all available publications on this topic to date. While the URL is continuously updated, we will revise this document and bring it up to date on arXiv approximately once a year. Abstracts of the cited works, and some links and downloadable files of preprints or reprints, are available on the above-mentioned Internet page. If you know of a related scientific work in any form that should be included here, kindly write to me at [email protected] with full bibliographic details, a DOI if available, and a PDF copy of the work if possible. The Internet page was initiated on March 7, 2015, and was last updated on March 12, 2020. Comment: Original report: June 13, 2015, contained 41 items. First revision: March 9, 2017, contained 64 items. Second revision: March 8, 2018, contained 76 items. Third revision: March 11, 2019, contained 90 items. Fourth revision: March 16, 2020, contains 112 items

    Weak and Strong Superiorization: Between Feasibility-Seeking and Minimization

    Get PDF
    We review the superiorization methodology, which can be thought of, in some cases, as lying between feasibility-seeking and constrained minimization. It does not quite try to solve the full-fledged constrained minimization problem; rather, the task is to find a feasible point which is superior, with respect to the objective function value, to one returned by a feasibility-seeking-only algorithm. We distinguish between two research directions within the superiorization methodology that stem from the same general principle, weak superiorization and strong superiorization, and clarify their nature. Comment: Revised version. Presented at the Tenth Workshop on Mathematical Modelling of Environmental and Life Sciences Problems, October 16-19, 2014, Constantza, Romania. http://www.ima.ro/workshop/tenth_workshop
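
    A minimal sketch of the common principle behind both directions, under the assumption that the basic algorithm is available as a single operator T and that the objective f is differentiable: objective-reducing perturbations with summable step sizes are interlaced with applications of T, relying on T's bounded perturbation resilience. The names and parameter choices are illustrative, not from the paper.

    import numpy as np

    def superiorize(T, grad_f, x0, iters=200, beta0=1.0, kappa=0.95):
        # Interlace objective-reducing perturbations (normalized negative
        # gradient of f, with summable step sizes beta0 * kappa**k) with
        # applications of the feasibility-seeking operator T; bounded
        # perturbation resilience of T preserves the feasibility-seeking
        # behavior of the basic algorithm.
        x = np.asarray(x0, dtype=float)
        for k in range(iters):
            g = grad_f(x)
            g_norm = np.linalg.norm(g)
            if g_norm > 0:
                x = x - beta0 * kappa**k * g / g_norm
            x = T(x)
        return x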