
    Proximal bundle method for contact shape optimization problem

    From the mathematical point of view, contact shape optimization is a nonlinear optimization problem with a specific structure that can be exploited in its solution. In this paper, we show how to overcome the difficulties related to the nonsmooth cost function by using proximal bundle methods. We describe all steps of the solution, including linearization, construction of a descent direction, line search, and the stopping criterion. To illustrate the performance of the presented algorithm, we solve a shape optimization problem associated with the discretized two-dimensional contact problem with Coulomb's friction.
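The solution steps listed above (a cutting-plane model built from linearizations, a proximal subproblem yielding the next trial point, serious/null steps, and a stopping test on the predicted decrease) can be sketched on a one-dimensional toy objective. This is a minimal illustration of the generic proximal bundle scheme, not the paper's contact-problem algorithm; the objective, the parameters `t` and `m`, and the grid search standing in for the usual quadratic-programming subproblem are all our own simplifying assumptions.

```python
import numpy as np

# Toy nonsmooth convex objective (our own choice, not from the paper):
# f(x) = |x - 1| + 0.5|x + 2|, minimized at x = 1 with f(1) = 1.5.
def f(x):
    return abs(x - 1.0) + 0.5 * abs(x + 2.0)

def subgrad(x):
    return np.sign(x - 1.0) + 0.5 * np.sign(x + 2.0)

def proximal_bundle(x0, t=1.0, m=0.1, tol=1e-6, iters=100):
    x_hat = x0                                   # stability centre (best point so far)
    bundle = [(x0, f(x0), subgrad(x0))]          # cuts: (point, value, subgradient)
    grid = np.linspace(-5.0, 5.0, 40001)         # grid search replaces the QP solver
    for _ in range(iters):
        # piecewise-linear cutting-plane model of f built from the bundle
        model = np.max([fv + g * (grid - xi) for xi, fv, g in bundle], axis=0)
        # proximal subproblem: model plus a quadratic pull toward the centre
        j = int(np.argmin(model + (grid - x_hat) ** 2 / (2.0 * t)))
        y, model_y = grid[j], model[j]
        predicted = f(x_hat) - model_y           # predicted decrease
        if predicted < tol:
            break                                # stopping criterion
        if f(y) <= f(x_hat) - m * predicted:
            x_hat = y                            # serious step: move the centre
        bundle.append((y, f(y), subgrad(y)))     # a null step just enriches the model
    return x_hat
```

Each null step adds a cut that sharpens the model near the rejected trial point, so the next proximal subproblem predicts a smaller decrease; the loop stops once the model can no longer predict meaningful progress.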

    A Non-Monotone Conjugate Subgradient Type Method for Minimization of Convex Functions

    We suggest a conjugate subgradient type method without any line search for the minimization of convex nondifferentiable functions. Unlike the customary methods of this class, it does not require a monotone decrease of the goal function and substantially reduces the implementation cost of each iteration. At the same time, its step-size procedure takes into account the behavior of the method along the iteration points. Preliminary results of computational experiments confirm the efficiency of the proposed modification.
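A minimal sketch of the ingredients the abstract describes: a predefined step-size rule in place of a line search, a conjugate-style direction that mixes the new subgradient with the previous direction, and tracking the best value because the iterates need not decrease monotonically. The toy objective, the divergent-series steps 1/k, and the fixed mixing weight `beta` are our own assumptions, not the paper's exact rules.

```python
import numpy as np

# Toy nonsmooth convex objective (our own choice): f(x) = ||x||_1.
def f(x):
    return np.abs(x).sum()

def subgrad(x):
    return np.sign(x)

def conjugate_subgradient(x0, iters=500, beta=0.5):
    x = np.asarray(x0, dtype=float)
    d = -subgrad(x)
    best = f(x)
    for k in range(1, iters + 1):
        # predefined divergent-series step sizes: no line search is performed
        x = x + (1.0 / k) * d / max(np.linalg.norm(d), 1e-12)
        # mix the new negative subgradient with the previous direction;
        # the fixed weight beta is our simplification of the paper's rule
        d = -subgrad(x) + beta * d
        best = min(best, f(x))   # iterate values need not decrease monotonically
    return best
```

Because each iteration costs only one subgradient evaluation and a vector update, the per-iteration cost stays low even though individual steps may temporarily increase the objective.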

    Aggregate subgradient method for nonsmooth DC optimization

    The aggregate subgradient method is developed for solving unconstrained nonsmooth difference-of-convex (DC) optimization problems. The proposed method shares some similarities with both the subgradient and the bundle methods. Aggregate subgradients are defined as a convex combination of subgradients computed at null steps between two serious steps. At each iteration, search directions are found using only two subgradients: the aggregate subgradient and a subgradient computed at the current null step. It is proved that the proposed method converges to a critical point of the DC optimization problem and that the number of null steps between two serious steps is finite. The new method is tested on some academic test problems and compared with several other nonsmooth DC optimization solvers. © 2020, Springer-Verlag GmbH Germany, part of Springer Nature.
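The two-subgradient direction-finding step described above can be sketched as the minimum-norm convex combination of the aggregate subgradient and the subgradient from the current null step, with the search direction taken as its negative. The closed-form solution below is a standard one-dimensional projection and is our own illustration, not the paper's exact subproblem.

```python
import numpy as np

def aggregate(v_agg, v_new):
    """Minimum-norm convex combination t*v_agg + (1-t)*v_new, t in [0, 1].

    Minimizing ||v_new + t*(v_agg - v_new)||^2 over t gives the closed form
    t = -(v_new . diff) / ||diff||^2 clipped to [0, 1], with diff = v_agg - v_new.
    """
    diff = v_agg - v_new
    denom = float(diff @ diff)
    if denom < 1e-16:                      # the two subgradients coincide
        return v_agg.copy()
    t = float(np.clip(-(v_new @ diff) / denom, 0.0, 1.0))
    return t * v_agg + (1.0 - t) * v_new

# Search direction built from only two subgradients, as in the method's iteration:
def direction(v_agg, v_new):
    return -aggregate(v_agg, v_new)
```

For `v_agg = (2, 0)` and `v_new = (0, 2)` the minimum-norm point is `(1, 1)`; when one subgradient dominates, the clipped combination collapses to an endpoint of the segment.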

    Submodular relaxation for inference in Markov random fields

    In this paper we address the problem of finding the most probable state of a discrete Markov random field (MRF), also known as the MRF energy minimization problem. The task is known to be NP-hard in general, and its practical importance motivates numerous approximate algorithms. We propose a submodular relaxation approach (SMR) based on a Lagrangian relaxation of the initial problem. Unlike the dual decomposition approach of Komodakis et al. (2011), SMR does not decompose the graph structure of the initial problem but constructs a submodular energy that is minimized within the Lagrangian relaxation. Our approach is applicable to both pairwise and high-order MRFs and makes it possible to take into account global potentials of certain types. We study the theoretical properties of the proposed approach and evaluate it experimentally. Accepted for publication in IEEE Transactions on Pattern Analysis and Machine Intelligence.
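The outer loop of such a Lagrangian relaxation, maximizing a concave nonsmooth dual lower bound by supergradient ascent on the multipliers, can be illustrated on a tiny separable problem. This shows only the generic dual machinery, not the SMR submodular construction itself; the two-variable problem, the unary costs, and the step rule are our own assumptions.

```python
import numpy as np

# Tiny separable toy problem (our own example, not the paper's MRF):
# minimize u1[x1] + u2[x2] over x1, x2 in {0, 1} subject to x1 == x2.
# Dualizing the coupling constraint with multiplier lam yields the bound
#   g(lam) = min_x1 (u1[x1] + lam*x1) + min_x2 (u2[x2] - lam*x2) <= optimum,
# and x1*(lam) - x2*(lam) is a supergradient of the concave dual g.
u1 = np.array([0.0, 2.0])
u2 = np.array([3.0, 0.0])

def dual(lam):
    x1 = int(np.argmin(u1 + lam * np.arange(2)))   # inner minimization by enumeration
    x2 = int(np.argmin(u2 - lam * np.arange(2)))
    g = u1[x1] + lam * x1 + u2[x2] - lam * x2
    return g, x1 - x2

def maximize_dual(iters=100):
    lam, best = 0.0, float("-inf")
    for k in range(1, iters + 1):
        g, sg = dual(lam)
        best = max(best, g)                        # track the best lower bound
        lam += sg / k                              # supergradient ascent, steps 1/k
    return best
```

Here the dual maximum matches the constrained optimum min(u1[0] + u2[0], u1[1] + u2[1]) = 2, i.e. the relaxation is tight for this toy instance; in general the dual value is only a lower bound on the minimum energy.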