
    A Picard-S Iterative Scheme for Approximating Fixed Point of Weak-Contraction Mappings

    We study the convergence of the Picard-S iteration method for a particular class of weak-contraction mappings. Furthermore, we prove a data dependence result for the fixed points of this class of weak-contraction mappings with the help of the Picard-S iteration method.
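
    As a rough illustration, here is a minimal Python sketch of the Picard-S scheme in its common three-step form (z_n = (1-b_n) x_n + b_n T x_n, y_n = (1-a_n) T x_n + a_n T z_n, x_{n+1} = T y_n), using constant control sequences for simplicity. The function and parameter names are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def picard_s(T, x0, alpha=0.5, beta=0.5, tol=1e-10, max_iter=1000):
        """Approximate a fixed point of T via the Picard-S iteration.

        Uses constant control sequences alpha_n = alpha, beta_n = beta
        (an illustrative choice; the analysis allows general sequences
        in (0, 1))."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            Tx = T(x)
            z = (1 - beta) * x + beta * Tx        # z_n = (1-b_n) x_n + b_n T x_n
            y = (1 - alpha) * Tx + alpha * T(z)   # y_n = (1-a_n) T x_n + a_n T z_n
            x_next = T(y)                         # x_{n+1} = T y_n
            if np.max(np.abs(x_next - x)) < tol:
                return x_next
            x = x_next
        return x

    # Example: T(x) = cos(x) is a contraction on [0, 1] with a unique
    # fixed point at roughly 0.739085.
    print(picard_s(np.cos, 0.5))
    ```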

    Hybrid Deterministic-Stochastic Methods for Data Fitting

    Many structured data-fitting applications require the solution of an optimization problem involving a sum over a potentially large number of measurements. Incremental gradient algorithms offer inexpensive iterations by sampling a subset of the terms in the sum. These methods can make great progress initially, but often slow as they approach a solution. In contrast, full-gradient methods achieve steady convergence at the expense of evaluating the full objective and gradient on each iteration. We explore hybrid methods that exhibit the benefits of both approaches. Rate-of-convergence analysis shows that by controlling the sample size in an incremental gradient algorithm, it is possible to maintain the steady convergence rates of full-gradient methods. We detail a practical quasi-Newton implementation based on this approach. Numerical experiments illustrate its potential benefits.

    Comment: 26 pages. Revised proofs of Theorems 2.6 and 3.1, results unchanged.
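
    The paper's implementation is quasi-Newton; the sketch below instead uses a plain subsampled gradient step on a least-squares sum, purely to illustrate the controlled sample-size idea: cheap, incremental-style iterations early on, and near-full-gradient iterations (hence steady convergence) later. All names and parameters (hybrid_gradient, step, batch0, growth) are hypothetical, not the paper's.

    ```python
    import numpy as np

    def hybrid_gradient(A, b, x0, step=0.3, batch0=50, growth=1.5,
                        max_iter=60, seed=0):
        """Gradient descent for min_x (1/2m) ||A x - b||^2, where the
        gradient is estimated from a sample of the m measurements and
        the sample size grows geometrically each iteration."""
        rng = np.random.default_rng(seed)
        m = A.shape[0]
        x = np.asarray(x0, dtype=float)
        batch = float(batch0)
        for _ in range(max_iter):
            k = min(int(batch), m)
            idx = rng.choice(m, size=k, replace=False)  # sampled terms of the sum
            g = A[idx].T @ (A[idx] @ x - b[idx]) / k    # subsampled gradient estimate
            x = x - step * g                            # fixed step, illustrative only
            batch *= growth                             # controlled sample-size growth
        return x

    # Demo on a synthetic least-squares problem.
    rng = np.random.default_rng(1)
    A = rng.normal(size=(5000, 20))
    x_true = rng.normal(size=20)
    b = A @ x_true + 0.01 * rng.normal(size=5000)
    x_hat = hybrid_gradient(A, b, x0=np.zeros(20))
    print(np.linalg.norm(x_hat - x_true))  # error near the noise level
    ```

    Growing the batch geometrically means the total work is dominated by the last few (near-full) iterations, while the early cheap steps still remove most of the initial error.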