    Nonsmooth Optimization; Proceedings of an IIASA Workshop, March 28 - April 8, 1977

    Optimization, a central methodological tool of systems analysis, is used in many of IIASA's research areas, including the Energy Systems and Food and Agriculture Programs. IIASA's activity in the field of optimization is strongly connected with nonsmooth or nondifferentiable extreme problems, which consist of searching for conditional or unconditional minima of functions that, due to their complicated internal structure, have no continuous derivatives. Particularly significant for these kinds of extreme problems in systems analysis is the strong link between nonsmooth or nondifferentiable optimization and the decomposition approach to large-scale programming. This volume contains the report of the IIASA workshop held from March 28 to April 8, 1977, entitled Nondifferentiable Optimization. However, the title was changed to Nonsmooth Optimization for publication of this volume, as we are concerned not only with optimization without derivatives, but also with problems having functions for which gradients exist almost everywhere but are not continuous, so that the usual gradient-based methods fail. Because of the small number of participants and the unusual length of the workshop, a substantial exchange of information was possible. As a result, details of the main developments in nonsmooth optimization are summarized in this volume, which might also be considered a guide for inexperienced users. Eight papers are presented: three on subgradient optimization, four on descent methods, and one on applicability. The report also includes a set of nonsmooth optimization test problems and a comprehensive bibliography.
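The failure mode mentioned above can be seen on a function as simple as f(x) = |x|: its gradient exists everywhere except x = 0 but jumps from -1 to +1 there. An illustrative sketch, not taken from the volume, of the basic subgradient method with diminishing steps, which still converges in this situation:

```python
# Illustrative sketch (not from the volume): the basic subgradient method
# with diminishing steps t_k = c / k on f(x) = |x|.  A fixed-step gradient
# method oscillates across the kink at x = 0; diminishing steps converge.

def subgradient_method(f, subgrad, x0, iters=2000, c=1.0):
    x, best_x, best_f = x0, x0, f(x0)
    for k in range(1, iters + 1):
        x = x - (c / k) * subgrad(x)     # any subgradient will do
        if f(x) < best_f:                # f need not decrease monotonically,
            best_x, best_f = x, f(x)     # so track the best iterate seen
    return best_x, best_f

f = abs
subgrad = lambda x: 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
x_best, f_best = subgradient_method(f, subgrad, x0=5.0)
```

Because the objective can increase between iterations, the method returns the best point seen rather than the last iterate.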

    Adaptive Nonmonotonic Methods With Averaging of Subgradients

    Numerical methods of nondifferentiable optimization are used for solving decision analysis problems in economics, engineering, the environment, and agriculture. This paper is devoted to adaptive nonmonotonic methods with averaging of the subgradients. A unified approach is suggested for the construction of new deterministic subgradient methods, their stochastic finite-difference analogs, and a posteriori estimates of the accuracy of the solution.
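The paper's exact update rule is not given in the abstract; the following is only a hedged sketch of the general averaging idea — stepping along a running average of past subgradients rather than the raw subgradient, which damps the zig-zagging near kinks. The exponential weight beta and the step rule c / k are illustrative assumptions:

```python
# Hedged sketch of subgradient averaging (not the paper's algorithm):
#   z_k = beta * z_{k-1} + (1 - beta) * g_k,   x_{k+1} = x_k - (c / k) * z_k
# where g_k is any subgradient at x_k.

def averaged_subgradient(subgrad, x0, iters=3000, c=1.0, beta=0.5):
    x, z = x0, 0.0
    for k in range(1, iters + 1):
        z = beta * z + (1.0 - beta) * subgrad(x)  # running average of subgradients
        x = x - (c / k) * z                       # diminishing step along the average
    return x

# Demo on the nonsmooth f(x) = |x|:
subgrad_abs = lambda x: 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
x_final = averaged_subgradient(subgrad_abs, x0=4.0)
```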

    Methods of Nondifferentiable and Stochastic Optimization and Their Applications

    Optimization methods are of great practical importance in systems analysis. They allow us to find the best behavior of a system, determine its optimal structure, compute the optimal parameters of the control system, and so on. The development of nondifferentiable optimization and of differentiable and nondifferentiable stochastic optimization allows us to state and effectively solve new, complex optimization problems that could not be solved by classical optimization methods. The main purpose of this article is to review briefly some important applications of nondifferentiable and stochastic optimization and to characterize the principal directions of research. Clearly, the interests of the author have influenced the content of this article.

    Bibliography on Nondifferentiable Optimization

    This is a research bibliography, with all the advantages and shortcomings that this implies. The author has used it as a bibliographical database when writing papers, and it is therefore largely a reflection of his own personal research interests. However, it is hoped that this bibliography will nevertheless be of use to others interested in nondifferentiable optimization.

    Blind Source Separation with Compressively Sensed Linear Mixtures

    This work studies the problem of simultaneously separating and reconstructing signals from compressively sensed linear mixtures. We assume that all source signals share a common sparse representation basis. The approach combines classical Compressive Sensing (CS) theory with a linear mixing model. It allows the mixtures to be sampled independently of each other; if samples are acquired in the time domain, this means that the sensors need not be synchronized. Since Blind Source Separation (BSS) from a linear mixture is only possible up to permutation and scaling, factoring out these ambiguities leads to a minimization problem on the so-called oblique manifold. To solve the problem, we develop a geometric conjugate subgradient method that scales to large systems. Numerical results demonstrate the promising performance of the proposed algorithm compared to several state-of-the-art methods.
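As a rough illustration of the measurement model the abstract describes — sources sharing a common sparse basis, mixed linearly, with each mixture then sampled by its own independent sensing matrix — one might set up synthetic data as follows. All dimensions, the choice of the standard basis, and Gaussian sensing matrices are hypothetical assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_src, m = 256, 3, 100   # signal length, number of sources/mixtures, CS samples

# Sources sharing a common sparse representation basis (here simply the
# standard basis, i.e. the sources are sparse in time) -- a modeling assumption.
k = 10                      # nonzeros per source
S = np.zeros((n_src, n))
for i in range(n_src):
    idx = rng.choice(n, size=k, replace=False)
    S[i, idx] = rng.standard_normal(k)

A = rng.standard_normal((n_src, n_src))   # unknown mixing matrix
X = A @ S                                 # linear mixtures, one per row

# Each mixture gets its OWN sensing matrix, which is why, for time-domain
# sampling, the sensors need not be synchronized:
Phis = [rng.standard_normal((m, n)) / np.sqrt(m) for _ in range(n_src)]
Y = [Phi @ X[i] for i, Phi in enumerate(Phis)]
```

The joint task is then to recover both A and S from the compressed measurements Y, up to the permutation and scaling ambiguities inherent in BSS.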

    A descent subgradient method using Mifflin line search for nonsmooth nonconvex optimization

    We propose a descent subgradient algorithm for minimizing a real function, assumed to be locally Lipschitz, but not necessarily smooth or convex. To find an effective descent direction, the Goldstein subdifferential is approximated through an iterative process. The method enjoys a new two-point variant of Mifflin line search in which the subgradients are arbitrary; thus, the line search procedure is easy to implement. Moreover, in comparison to bundle methods, the quadratic subproblems have a simple structure, and the proposed method requires no algorithmic modification to handle nonconvexity. We study the global convergence of the method and prove that any accumulation point of the generated sequence is Clarke stationary, assuming that the objective f is weakly upper semismooth. We illustrate the efficiency and effectiveness of the proposed algorithm on a collection of academic and semi-academic test problems.
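The abstract's core device — approximating the Goldstein subdifferential and extracting a descent direction from it — can be sketched as follows. This is an illustrative reconstruction, not the authors' algorithm: it samples subgradients in an eps-ball around the current point and takes the minimum-norm element of their convex hull (the quadratic subproblem), here solved by a simple Frank-Wolfe loop:

```python
import numpy as np

def min_norm_in_hull(G, iters=200):
    """Minimum-norm element of conv{columns of G}, via Frank-Wolfe on the
    simplex: minimize 0.5 * ||G @ lam||^2 over convex weights lam."""
    m = G.shape[1]
    lam = np.full(m, 1.0 / m)
    for _ in range(iters):
        d = G @ lam
        j = int(np.argmin(G.T @ d))      # vertex minimizing the linearization
        e = np.zeros(m)
        e[j] = 1.0
        step = G @ (e - lam)
        denom = step @ step
        if denom < 1e-16:                # no progress possible
            break
        t = np.clip(-(d @ step) / denom, 0.0, 1.0)   # exact line search
        lam = lam + t * (e - lam)
    return G @ lam

def goldstein_direction(subgrad, x, eps=0.1, n_samples=8, rng=None):
    """Approximate the Goldstein eps-subdifferential at x by sampling
    subgradients in an eps-ball; the negated minimum-norm element of their
    convex hull serves as a descent direction (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    pts = [x] + [x + eps * rng.uniform(-1.0, 1.0, size=x.shape)
                 for _ in range(n_samples - 1)]
    G = np.stack([subgrad(p) for p in pts], axis=1)
    return -min_norm_in_hull(G)

# Away from the kinks of f(x, y) = |x| + 2|y|, every sampled subgradient
# equals (1, 2), so the direction is simply -(1, 2):
sg = lambda p: np.array([np.sign(p[0]), 2.0 * np.sign(p[1])])
d = goldstein_direction(sg, np.array([1.0, 1.0]), eps=0.1,
                        rng=np.random.default_rng(0))
```

Near a kink the sampled hull contains subgradients of opposite sign, the minimum-norm element shrinks toward zero, and a small norm signals approximate stationarity — which is what makes the Goldstein construction useful as a stopping and descent test.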