
    Step control for direct stochastic-programming methods


    Minimization Algorithms Based On Supervisor and Searcher Co-operation

    In the present work, we explore a general framework for the design of new minimization algorithms with desirable characteristics, namely, supervisor-searcher cooperation. We propose a class of algorithms within this framework and examine a gradient algorithm in the class. Global convergence is established for the deterministic, noise-free case, and the convergence rate is studied. Both theoretical analysis and numerical tests show that the algorithm is efficient in the deterministic case. Furthermore, the absence of a line search procedure appears to strengthen the algorithm's robustness, so that it effectively tackles test problems with strong stochastic noise. The numerical results for both deterministic and stochastic test problems illustrate the appealing attributes of the algorithm.
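    To make the supervisor-searcher idea concrete, the following is a minimal, hypothetical Python sketch of one way such a gradient iteration could look: the searcher proposes a plain gradient step, and the supervisor monitors progress and adapts the step length instead of performing a line search. The function and parameter names (ssc_gradient_minimize, step0, shrink, grow) and the update rules are illustrative assumptions, not the algorithm specified in the paper.

    import numpy as np

    def ssc_gradient_minimize(f, grad, x0, step0=1.0, shrink=0.5, grow=1.1,
                              max_iter=500, tol=1e-8):
        """Hypothetical sketch of a supervisor-searcher style gradient method.

        The 'searcher' proposes a pure gradient step; the 'supervisor'
        adjusts the step size based on observed progress rather than
        running a line search. All update rules here are assumptions
        made for illustration.
        """
        x = np.asarray(x0, dtype=float)
        step = step0
        fx = f(x)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            # Searcher: propose a plain gradient step (no line search).
            x_trial = x - step * g
            f_trial = f(x_trial)
            # Supervisor: accept and cautiously enlarge the step on progress,
            # otherwise shrink the step and keep the current iterate.
            if f_trial < fx:
                x, fx = x_trial, f_trial
                step *= grow
            else:
                step *= shrink
        return x, fx

    # Illustrative usage on a simple quadratic.
    if __name__ == "__main__":
        f = lambda x: np.sum((x - 3.0) ** 2)
        grad = lambda x: 2.0 * (x - 3.0)
        x_min, f_min = ssc_gradient_minimize(f, grad, x0=np.zeros(2))
        print(x_min, f_min)

    Because no line search is performed, each iteration needs only function and gradient evaluations at the current and trial points, which is one plausible reason a scheme of this kind can remain usable when the function values are contaminated by noise.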