
    Studying the rate of convergence of gradient optimisation algorithms via the theory of optimal experimental design

    The most common class of methods for solving quadratic optimisation problems is the class of gradient algorithms, the most famous of which is the Steepest Descent algorithm. The development of a particular gradient algorithm, the Barzilai-Borwein algorithm, has sparked considerable research in the area in recent years, and many algorithms now exist with faster rates of convergence than that of the Steepest Descent algorithm. The tools for effectively analysing and comparing the asymptotic rates of convergence of gradient algorithms are, however, limited, and so it is somewhat unclear from the literature which algorithms possess the faster rates of convergence. In this thesis, methodology is developed to enable better analysis of the asymptotic rates of convergence of gradient algorithms applied to quadratic optimisation problems. This methodology stems from a link with the theory of optimal experimental design. It is established that gradient algorithms can be related to algorithms for constructing optimal experimental designs for linear regression models. Furthermore, the asymptotic rates of convergence of these gradient algorithms can be expressed through the asymptotic behaviour of multiplicative algorithms for constructing optimal experimental designs. The described connection to optimal experimental design has also guided the creation of several new gradient algorithms that would not otherwise have been conceived. The asymptotic rates of convergence of these algorithms are studied extensively, and insight is given into how some gradient algorithms are able to converge faster than others. It is demonstrated that the worst rates are obtained when the corresponding multiplicative procedure for updating the designs converges to the optimal design. Simulations reveal that the asymptotic rates of convergence of some of these new algorithms compare favourably with those of existing gradient-type algorithms such as the Barzilai-Borwein algorithm.
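
    The contrast between the two step-length rules named in the abstract is easy to make concrete. The following is a minimal Python sketch, not taken from the thesis: it applies the exact line-search (Steepest Descent) step γ_k = g_kᵀg_k / g_kᵀA g_k and the Barzilai-Borwein (BB1) step γ_k = s_{k-1}ᵀs_{k-1} / s_{k-1}ᵀy_{k-1} to the quadratic f(x) = (1/2) xᵀAx - bᵀx with a symmetric positive-definite A. All function and variable names are illustrative.

```python
import numpy as np

def quad_grad(A, b, x):
    """Gradient of f(x) = 0.5 * x^T A x - b^T x."""
    return A @ x - b

def steepest_descent_step(A, g):
    """Exact line-search step on a quadratic: gamma = g^T g / g^T A g."""
    return (g @ g) / (g @ (A @ g))

def barzilai_borwein_step(s, y):
    """BB1 step: gamma = s^T s / s^T y, with s = x_k - x_{k-1}, y = g_k - g_{k-1}."""
    return (s @ s) / (s @ y)

def minimise(A, b, x0, n_iter=50, method="sd"):
    """Run the gradient iteration x_{k+1} = x_k - gamma_k * g_k on the quadratic."""
    x = x0.astype(float)
    g = quad_grad(A, b, x)
    x_prev = g_prev = None
    for _ in range(n_iter):
        if method == "bb" and x_prev is not None:
            gamma = barzilai_borwein_step(x - x_prev, g - g_prev)
        else:  # first BB iteration falls back to a Steepest Descent step
            gamma = steepest_descent_step(A, g)
        x_prev, g_prev = x, g
        x = x - gamma * g
        g = quad_grad(A, b, x)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((5, 5))
    A = Q @ Q.T + 5 * np.eye(5)   # symmetric positive definite
    b = rng.standard_normal(5)
    for method in ("sd", "bb"):
        x = minimise(A, b, np.zeros(5), method=method)
        print(method, np.linalg.norm(quad_grad(A, b, x)))
```

    On ill-conditioned quadratics the BB1 step typically converges much faster than Steepest Descent, though non-monotonically, which is consistent with the comparisons described in the abstract.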

    Optimal experimental design and quadratic optimisation

    No full text
    A well-known gradient-type algorithm for solving quadratic optimization problems is the method of Steepest Descent. Here the Steepest Descent algorithm is generalized to a broader family of gradient algorithms, where the step-length γ_k is chosen in accordance with a particular procedure. The asymptotic rate of convergence of this family is studied. To facilitate the investigation we re-write the algorithms in a normalized form, which enables us to exploit a link with the theory of optimum experimental design.
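
    The design-theory side of this link involves the multiplicative algorithms for constructing optimal experimental designs mentioned in the first abstract. The sketch below, again illustrative rather than taken from the paper, shows the standard multiplicative update for D-optimal design weights: w_i ← w_i · d_i(w)/p, where d_i(w) = x_iᵀ M(w)⁻¹ x_i and M(w) = Σ_i w_i x_i x_iᵀ is the information matrix. Since Σ_i w_i d_i(w) = trace(M(w)⁻¹ M(w)) = p, the weights remain a probability distribution after every update.

```python
import numpy as np

def multiplicative_d_optimal(X, n_iter=200):
    """Multiplicative algorithm for D-optimal design weights.

    X: (n, p) matrix whose rows are the candidate design points x_i.
    Update: w_i <- w_i * d_i(w) / p, where d_i(w) = x_i^T M(w)^{-1} x_i.
    """
    n, p = X.shape
    w = np.full(n, 1.0 / n)           # start from the uniform design
    for _ in range(n_iter):
        M = X.T @ (w[:, None] * X)    # information matrix M(w)
        d = np.einsum("ij,jk,ik->i", X, np.linalg.inv(M), X)  # variances d_i(w)
        w *= d / p                    # multiplicative update; weights stay normalised
    return w
```

    The design weights play the role of the normalized quantities in the re-written gradient algorithms, which is what allows the asymptotic behaviour of one family to describe the other.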

    The Pool

    No full text