
By Luc Pronzato (F-NICE-IS) and Anatoly A. Zhigljavsky (4-CARD-SM)

Abstract

Gradient algorithms for quadratic optimization with fast convergence rates. (English summary) Comput. Optim. Appl. 50 (2011), no. 3, 597–617. ISSN 1573-2894.

Summary: “We propose a family of gradient algorithms for minimizing a quadratic function f(x) = (Ax, x)/2 − (x, y) in R^d or a Hilbert space, with simple rules for choosing the step-size at each iteration. We show that when the step-sizes are generated by a dynamical system with ergodic distribution having the arcsine density on a subinterval of the spectrum of A, the asymptotic rate of convergence of the algorithm can approach the (tight) bound on the rate of convergence of a conjugate gradient algorithm stopped before d iterations, with d ≤ ∞ the space dimension.”
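
The summary describes a plain gradient iteration x_{k+1} = x_k − γ_k (A x_k − y) whose step-sizes are tied to an arcsine law on part of the spectrum of A. The sketch below is only an illustration of that idea, not the authors' algorithm: it draws the reciprocal step-sizes i.i.d. from the arcsine density on an interval [a, b] (the paper instead generates them with a dynamical system whose ergodic distribution is this arcsine law), and the matrix, interval, and iteration count are arbitrary choices made for the demo.

    import numpy as np

    # Illustrative sketch only: gradient descent on f(x) = (Ax, x)/2 - (x, y)
    # with step-sizes 1/lambda_k, where the lambda_k are drawn from the arcsine
    # density on an interval [a, b] inside the spectrum of A.  The paper generates
    # the lambda_k with an ergodic dynamical system; i.i.d. sampling is used here
    # only to keep the demonstration short.
    rng = np.random.default_rng(0)

    d = 50
    eigenvalues = np.linspace(1.0, 10.0, d)            # spectrum chosen for the demo
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    A = Q @ np.diag(eigenvalues) @ Q.T                 # symmetric positive definite
    y = rng.standard_normal(d)
    x_star = np.linalg.solve(A, y)                     # minimizer of f

    a, b = eigenvalues[0], eigenvalues[-1]             # here: the whole spectrum
    x = np.zeros(d)
    for _ in range(500):
        # Arcsine sample on [a, b]: lambda = a + (b - a) * sin^2(pi * U / 2).
        u = rng.uniform()
        lam = a + (b - a) * np.sin(np.pi * u / 2.0) ** 2
        gradient = A @ x - y                           # gradient of the quadratic
        x = x - gradient / lam                         # step-size gamma_k = 1/lambda_k

    print("relative error:", np.linalg.norm(x - x_star) / np.linalg.norm(x_star))

Averaged over many iterations, the log of the error-contraction factor under the arcsine law is negative for every eigenvalue of A, which is the intuition behind the fast asymptotic rate; the i.i.d. version above only approximates the behaviour analysed in the paper.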

Year: 2013
OAI identifier: oai:CiteSeerX.psu:10.1.1.352.3490
Provided by: CiteSeerX
Download PDF:
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://citeseerx.ist.psu.edu/v... (external link)

