
    An Accelerated Method for Minimizing a Convex Function of Two Variables

    The author considers the problem of minimizing a convex function of two variables without computing the derivatives or (in the nondifferentiable case) the subgradients of the function, and suggests two algorithms for doing this. Such algorithms could form an integral part of new methods for minimizing a convex function of many variables based on the solution of a two-dimensional minimization problem at each step (rather than on line searches, as in most existing algorithms). This is a contribution to research on nonsmooth optimization currently underway in the System and Decision Sciences Program.
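    The abstract does not describe the two algorithms themselves. As a rough illustration of what derivative-free minimization of a convex function of two variables can look like, the sketch below uses a nested golden-section search over a bounding box: for each x the inner search minimizes over y, and the resulting one-variable function (still convex) is minimized over x. This is an assumed, generic technique for illustration only, not the paper's method.

    ```python
    from math import sqrt

    PHI = (sqrt(5) - 1) / 2  # golden-ratio interval reduction factor


    def golden_section(h, lo, hi, tol=1e-6):
        """Minimize a unimodal (e.g. convex) one-variable function h on [lo, hi]."""
        a, b = lo, hi
        c, d = b - PHI * (b - a), a + PHI * (b - a)
        while b - a > tol:
            if h(c) < h(d):
                b, d = d, c
                c = b - PHI * (b - a)
            else:
                a, c = c, d
                d = a + PHI * (b - a)
        return (a + b) / 2


    def minimize_2d(f, box, tol=1e-6):
        """Derivative-free minimization of a convex f(x, y) over a rectangular box.

        Nested search: for each x, minimize over y; the partial minimum
        g(x) = min_y f(x, y) is again convex, so it can be minimized over x
        with the same one-dimensional search. No derivatives or subgradients
        are evaluated at any point.
        """
        (x_lo, x_hi), (y_lo, y_hi) = box

        def g(x):
            y_star = golden_section(lambda y: f(x, y), y_lo, y_hi, tol)
            return f(x, y_star)

        x_best = golden_section(g, x_lo, x_hi, tol)
        y_best = golden_section(lambda y: f(x_best, y), y_lo, y_hi, tol)
        return x_best, y_best


    if __name__ == "__main__":
        # Convex but nondifferentiable in y; minimum at (1, -2).
        f = lambda x, y: (x - 1) ** 2 + abs(y + 2)
        print(minimize_2d(f, ((-5.0, 5.0), (-5.0, 5.0))))
    ```

    A sketch like this is slow compared with subgradient-based methods, but it only requires function values, which is the setting the abstract describes.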

    Variational Analysis Down Under Open Problem Session

    © 2018, Springer Science+Business Media, LLC, part of Springer Nature. We state the problems discussed in the open problem session at the Variational Analysis Down Under conference, held in honour of Prof. Asen Dontchev on 19–21 February 2018 at Federation University Australia.