
    Nonsmooth Optimization; Proceedings of an IIASA Workshop, March 28 - April 8, 1977

    Optimization, a central methodological tool of systems analysis, is used in many of IIASA's research areas, including the Energy Systems and Food and Agriculture Programs. IIASA's activity in the field of optimization is strongly connected with nonsmooth or nondifferentiable extremal problems, which consist of searching for conditional or unconditional minima of functions that, due to their complicated internal structure, have no continuous derivatives. Particularly significant for these kinds of extremal problems in systems analysis is the strong link between nonsmooth or nondifferentiable optimization and the decomposition approach to large-scale programming. This volume contains the report of the IIASA workshop held from March 28 to April 8, 1977, entitled Nondifferentiable Optimization. However, the title was changed to Nonsmooth Optimization for publication of this volume, as we are concerned not only with optimization without derivatives, but also with problems having functions for which gradients exist almost everywhere but are not continuous, so that the usual gradient-based methods fail. Because of the small number of participants and the unusual length of the workshop, a substantial exchange of information was possible. As a result, details of the main developments in nonsmooth optimization are summarized in this volume, which might also be considered a guide for inexperienced users. Eight papers are presented: three on subgradient optimization, four on descent methods, and one on applicability. The report also includes a set of nonsmooth optimization test problems and a comprehensive bibliography.
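
    The workshop's concern, functions whose gradients exist almost everywhere but are not continuous, can be illustrated with a minimal sketch (not taken from the volume): a subgradient method with diminishing step sizes applied to a piecewise-linear convex function f(x) = max_i (a_i'x + b_i), where the gradient of any active piece serves as a subgradient. All names and problem data below are invented for illustration.

        import numpy as np

        # Minimal subgradient descent sketch (illustrative only, not a method from the volume).
        # f(x) = max_i (a_i . x + b_i) is convex but nonsmooth: its gradient jumps wherever two
        # pieces meet, yet a subgradient is always available, namely the gradient a_k of any
        # piece k that attains the maximum at x.

        def subgradient_descent(A, b, x0, steps=500):
            x = x0.astype(float)
            best_x, best_f = x.copy(), np.max(A @ x + b)
            for t in range(1, steps + 1):
                k = np.argmax(A @ x + b)        # index of an active (maximizing) piece
                g = A[k]                        # a_k is a subgradient of f at x
                x = x - (1.0 / np.sqrt(t)) * g  # diminishing step size, standard for subgradient methods
                f = np.max(A @ x + b)
                if f < best_f:                  # keep the best iterate seen so far
                    best_x, best_f = x.copy(), f
            return best_x, best_f

        # Example: f(x) = max(|x1|, |x2|), written as the maximum of four linear pieces.
        A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
        b = np.zeros(4)
        x_best, f_best = subgradient_descent(A, b, np.array([3.0, -2.0]))
        print(x_best, f_best)                   # f_best should be close to the optimal value 0

    Tracking the best point seen so far matters because, unlike in smooth gradient descent, the objective values along subgradient iterates need not decrease monotonically.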

    A deep cut ellipsoid algorithm for convex programming

    This paper proposes a deep cut version of the ellipsoid algorithm for solving a general class of continuous convex programming problems. Constructing these deep cuts requires no more computational effort per step than the corresponding central cut version. Rules that prevent some of the numerical instabilities and theoretical drawbacks usually associated with the algorithm are also provided. Moreover, for a large class of convex programs a simple proof of its rate of convergence is given, and the relation to previously known results is discussed. Finally, some computational results of the deep and central cut versions of the algorithm applied to a min-max stochastic queue location problem are reported.
    Keywords: location theory; convex programming; deep cut ellipsoid algorithm; min-max programming; rate of convergence
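
    As a sketch of the idea (assuming the standard deep cut ellipsoid update formulas, not necessarily the exact rules or stability safeguards proposed in the paper; the helper names and the toy problem are invented): one iteration maintains an ellipsoid E = {x : (x - c)'P^{-1}(x - c) <= 1} and cuts it with g'(x - c) <= -alpha*sqrt(g'Pg), where a depth of alpha = 0 recovers the central cut.

        import numpy as np

        # One deep cut ellipsoid update (illustrative sketch; the paper's exact stability
        # rules are not reproduced).  The ellipsoid is E = {x : (x-c)' P^{-1} (x-c) <= 1};
        # a subgradient g at the center c yields the cut g'(x - c) <= -alpha * sqrt(g'Pg),
        # with cut depth alpha in [0, 1).  Setting alpha = 0 gives the usual central cut,
        # so constructing the deep cut costs essentially no extra work per step.

        def deep_cut_update(c, P, g, alpha):
            n = c.size
            gamma = np.sqrt(g @ P @ g)           # length of g in the metric induced by P
            gt = (P @ g) / gamma                 # update direction
            c_new = c - ((1 + n * alpha) / (n + 1)) * gt
            P_new = (n * n * (1 - alpha * alpha) / (n * n - 1)) * (
                P - (2 * (1 + n * alpha)) / ((n + 1) * (1 + alpha)) * np.outer(gt, gt)
            )
            return c_new, P_new

        # Toy example: minimize f(x) = |x1| + |x2| with objective cuts whose depth grows
        # with the gap between the current center value and the best value found so far.
        f = lambda x: np.abs(x).sum()
        subgrad = lambda x: np.sign(x)
        c, P = np.array([4.0, -7.0]), 100.0 * np.eye(2)
        best = f(c)
        for _ in range(60):
            fc = f(c)
            best = min(best, fc)
            g = subgrad(c)
            gamma = np.sqrt(g @ P @ g)
            alpha = min((fc - best) / gamma, 0.99)   # clamp the depth for numerical safety
            c, P = deep_cut_update(c, P, g, alpha)
        print(c, best)                               # best approaches the optimal value 0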

    System and Decision Sciences at IIASA 1973-1980

    This report contains a brief history of the past achievements of the System and Decision Sciences Area at IIASA, and a summary of its current and future research directions. There is a comprehensive list of the scientific staff of the Area since 1973, together with a list of their publications; abstracts of the most recent reports and biographies of the scholars working in the Area in 1980 are also included.

    Accelerating Stochastic Composition Optimization

    Consider the stochastic composition optimization problem where the objective is a composition of two expected-value functions. We propose a new stochastic first-order method, namely the accelerated stochastic compositional proximal gradient (ASC-PG) method, which updates based on queries to the sampling oracle using two different timescales. ASC-PG is the first proximal gradient method for the stochastic composition problem that can deal with a nonsmooth regularization penalty. We show that ASC-PG exhibits faster convergence than the best known algorithms, and that it achieves the optimal sample-error complexity in several important special cases. We further demonstrate the application of ASC-PG to reinforcement learning and conduct numerical experiments.
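
    A rough sketch of the two-timescale idea follows (a generic stochastic compositional proximal gradient step, not the authors' exact ASC-PG update rules; the toy problem, dimensions, and step sequences are all invented): an auxiliary vector tracks the inner expectation with one step sequence, while the decision variable takes proximal gradient steps with another.

        import numpy as np

        # Sketch: minimize f(E_w[g_w(x)]) + h(x), with h(x) = lam * ||x||_1 handled by a
        # proximal (soft-thresholding) step.  Here g_w(x) = (A + noise) x and
        # f(y) = 0.5 * ||y - b||^2, so the noiseless objective is a lasso problem.

        rng = np.random.default_rng(0)
        d, m = 10, 20
        A = rng.normal(size=(m, d))
        b = rng.normal(size=m)
        lam = 0.1

        def sample_inner(x):
            """One noisy query of the inner map and its Jacobian."""
            Aw = A + 0.1 * rng.normal(size=A.shape)
            return Aw @ x, Aw

        def prox_l1(x, t):
            """Proximal operator of t * lam * ||.||_1 (soft thresholding)."""
            return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

        x = np.zeros(d)
        y = np.zeros(m)                          # auxiliary estimate of the inner value E_w[g_w(x)]
        for t in range(1, 2001):
            eta = 0.02 / np.sqrt(t)              # step size for the decision variable x
            beta = min(1.0, 2.0 / t ** 0.75)     # averaging weight for y (a different timescale)
            g_val, J = sample_inner(x)
            y = (1 - beta) * y + beta * g_val    # track the inner expectation
            grad = J.T @ (y - b)                 # chain-rule estimate of the composite gradient
            x = prox_l1(x - eta * grad, eta)     # proximal step handles the nonsmooth penalty

        print(0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())

    In this sketch the proximal step is what accommodates the nonsmooth regularization penalty, while the separate y-update avoids having to sample the inner expectation to high accuracy at every iteration.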

    Stochastic Quasigradient Methods and their Implementation

    A number of stochastic quasigradient methods are discussed from the point of view of implementation. The discussion revolves around the interactive package of stochastic optimization routines (STO) recently developed by the Adaptation and Optimization group at IIASA. (This package is based on the stochastic and nondifferentiable optimization package (NDO) developed at the V. Glushkov Institute of Cybernetics in Kiev.) The IIASA implementation is described and its use is illustrated by application to three problems that have arisen in various IIASA projects.
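
    A minimal sketch of the basic projected stochastic quasigradient iteration (a generic textbook-style illustration, not the STO or NDO software described here; the newsvendor-type problem, demand distribution, costs, and step sizes are invented):

        import numpy as np

        # Problem: minimize F(x) = E[ max(h*(x - D), p*(D - x)) ] over x in [0, cap],
        # a newsvendor-type cost where the random demand D is available only through samples.
        # F is convex but nonsmooth, and only noisy subgradients (quasigradients) are observable.

        rng = np.random.default_rng(1)
        h, p, cap = 1.0, 3.0, 200.0               # holding cost, shortage penalty, capacity bound

        def sample_quasigradient(x):
            """A stochastic quasigradient: a subgradient of the sampled cost at x."""
            D = rng.gamma(shape=4.0, scale=20.0)  # one random demand observation
            return h if x >= D else -p

        x, avg = 100.0, 0.0
        for k in range(1, 20001):
            xi = sample_quasigradient(x)
            x = min(max(x - (50.0 / k) * xi, 0.0), cap)  # projected step, diminishing step size
            avg += (x - avg) / k                         # running average of the iterates

        # For this cost the minimizer is the p/(h+p) quantile of the demand distribution.
        print(avg)
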
    • …