189 research outputs found

    Kernel-Based Interior-Point Methods for Cartesian P*(κ)-Linear Complementarity Problems over Symmetric Cones

    We present an interior-point method (IPM) for Cartesian P*(κ)-Linear Complementarity Problems over Symmetric Cones (SCLCPs). Cartesian P*(κ)-SCLCPs were recently introduced as a generalization of the more commonly known and more widely used monotone SCLCPs. The IPM is based on barrier functions defined by a large class of univariate functions, called eligible kernel functions, which have recently been used successfully to design new IPMs for various optimization problems. The eligible barrier (kernel) functions are used in calculating the Nesterov-Todd search directions and the default step size, which leads to very good complexity results for the method. For some specific eligible kernel functions we match the best known iteration bound for long-step methods, while for short-step methods the best known iteration bound is matched in all cases.
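
    A minimal sketch, for illustration only (not taken from the abstract): the classical logarithmic kernel is a standard example of an eligible kernel function, and the induced barrier on the symmetric cone is obtained by applying it to the eigenvalues of the scaled variable,
    \[
        \psi(t) = \frac{t^2 - 1}{2} - \ln t, \quad t > 0, \qquad
        \Psi(v) = \sum_{i=1}^{r} \psi\bigl(\lambda_i(v)\bigr),
    \]
    where the \lambda_i(v) are the Jordan-algebraic eigenvalues of v; the Nesterov-Todd search direction and the proximity measure of the method are then derived from the gradient of \Psi.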

    A second order cone formulation of continuous CTA model

    In this paper we consider a minimum-distance Controlled Tabular Adjustment (CTA) model for statistical disclosure limitation (control) of tabular data. The goal of the CTA model is to find the closest safe table to an original tabular data set that contains sensitive information. Closeness is usually measured using the l1 or l2 norm, each measure having its advantages and disadvantages. Recently, in [4], a regularization of l1-CTA using the Pseudo-Huber function was introduced in an attempt to combine the positive characteristics of both l1-CTA and l2-CTA. All three models can be solved using appropriate versions of Interior-Point Methods (IPM). It is known that IPMs generally work better on well-structured problems such as conic optimization problems; thus, reformulating these CTA models as conic optimization problems may be advantageous. We present reformulations of Pseudo-Huber-CTA and l1-CTA as Second-Order Cone (SOC) optimization problems and test the validity of the approach on a small example of a two-dimensional tabular data set.
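
    As an illustration of the reformulation idea (the paper's exact formulation may differ), the Pseudo-Huber function admits a well-known second-order cone representation:
    \[
        \phi_\delta(z) = \delta^2\Bigl(\sqrt{1 + (z/\delta)^2} - 1\Bigr) = \delta\sqrt{\delta^2 + z^2} - \delta^2,
    \]
    so minimizing \sum_{i=1}^{n} \phi_\delta(z_i) is equivalent to
    \[
        \min \; \delta \sum_{i=1}^{n} t_i - n\delta^2 \quad \text{s.t.} \quad \bigl\|(\delta,\, z_i)\bigr\|_2 \le t_i, \quad i = 1,\dots,n,
    \]
    where each constraint \|(\delta, z_i)\|_2 \le t_i is a second-order cone constraint, so the problem can be handed to a conic IPM solver.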

    Full Newton-Step Interior-Point Method for Linear Complementarity Problems

    In this paper we consider an Infeasible Full Newton-step Interior-Point Method (IFNS-IPM) for monotone Linear Complementarity Problems (LCP). The method does not require a strictly feasible starting point. In addition, it avoids step-size calculation and instead takes a full Newton step at each iteration. Iterates are kept close to the central path by a suitable choice of parameters. The algorithm is globally convergent, and the iteration bound matches the best known iteration bound for these types of methods.
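
    A minimal sketch in standard notation (assumed, not quoted from the abstract): for the monotone LCP s = Mx + q, x \ge 0, s \ge 0, x^{\top}s = 0, one full Newton step toward the target value \mu solves
    \[
        \begin{aligned}
            M\,\Delta x - \Delta s &= -(Mx + q - s),\\
            S\,\Delta x + X\,\Delta s &= \mu e - XSe,
        \end{aligned}
    \]
    where X = \mathrm{diag}(x), S = \mathrm{diag}(s), and e is the all-ones vector; the update x \leftarrow x + \Delta x, s \leftarrow s + \Delta s is taken without any line search, and \mu is reduced by a fixed factor at every iteration so that iterates stay in a small neighborhood of the central path.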

    Conic Optimization Theory: Convexification Techniques and Numerical Algorithms

    Optimization is at the core of control theory and appears in several areas of this field, such as optimal control, distributed control, system identification, robust control, state estimation, model predictive control, and dynamic programming. Recent advances in various topics of modern optimization have also been revamping the area of machine learning. Motivated by the crucial role of optimization theory in the design, analysis, control, and operation of real-world systems, this tutorial paper offers a detailed overview of some major advances in this area, namely conic optimization and its emerging applications. First, we discuss the importance of conic optimization in different areas. Then, we explain seminal results on the design of hierarchies of convex relaxations for a wide range of nonconvex problems. Finally, we study different numerical algorithms for large-scale conic optimization problems.
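
    As standard background (not specific to this tutorial), the generic conic optimization problem and its dual are
    \[
        \min_{x} \; c^{\top}x \;\; \text{s.t.} \;\; Ax = b,\; x \in \mathcal{K},
        \qquad
        \max_{y,\,s} \; b^{\top}y \;\; \text{s.t.} \;\; A^{\top}y + s = c,\; s \in \mathcal{K}^{*},
    \]
    where \mathcal{K} is a closed convex cone and \mathcal{K}^{*} is its dual cone; linear, second-order cone, and semidefinite programming are the special cases obtained by taking \mathcal{K} to be the nonnegative orthant, the second-order cone, or the positive semidefinite cone, respectively.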

    International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book

    The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. The ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions. This book comprises the full conference program. It contains, in particular, the scientific program both in survey style and in full detail, as well as information on the social program, the venue, special meetings, and more.