
    A Feasible Method for Optimization with Orthogonality Constraints

    Minimization with orthogonality constraints (e.g., X'X = I) and/or spherical constraints (e.g., ||x||_2 = 1) has wide applications in polynomial optimization, combinatorial optimization, eigenvalue problems, sparse PCA, p-harmonic flows, 1-bit compressive sensing, matrix rank minimization, etc. These problems are difficult because the constraints are not only non-convex but also numerically expensive to preserve during iterations. To deal with these difficulties, we propose a Crank-Nicolson-like update scheme that preserves the constraints and, based on it, develop curvilinear search algorithms with lower per-iteration cost than those based on projections and geodesics. The efficiency of the proposed algorithms is demonstrated on a variety of test problems. In particular, for the maxcut problem, it exactly solves a decomposition formulation for the SDP relaxation. For polynomial optimization, nearest correlation matrix estimation, and extreme eigenvalue problems, the proposed algorithms run very fast and return solutions no worse than those from state-of-the-art algorithms. For the quadratic assignment problem, a gap of 0.842% to the best known solution on the largest problem "256c" in QAPLIB can be reached in 5 minutes on a typical laptop.
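The constraint-preserving update described in this abstract can be illustrated with a Cayley-transform step: for a skew-symmetric matrix A built from the gradient, the update maps a feasible point to another feasible point exactly. A minimal NumPy sketch (function name and step size are illustrative, not from the paper):

```python
import numpy as np

def cayley_step(X, G, tau):
    """One Crank-Nicolson-like (Cayley) update that preserves X^T X = I.

    X : (n, p) current feasible point, X^T X = I
    G : (n, p) gradient of the objective at X
    tau : step size along the curve
    """
    n, _ = X.shape
    A = G @ X.T - X @ G.T                      # skew-symmetric by construction
    I = np.eye(n)
    # Y(tau) = (I + tau/2 A)^{-1} (I - tau/2 A) X is orthogonal whenever A
    # is skew-symmetric, so the constraint holds exactly at every step.
    return np.linalg.solve(I + (tau / 2) * A, (I - (tau / 2) * A) @ X)
```

This naive version solves an n-by-n system for clarity; the lower per-iteration cost claimed in the abstract would come from exploiting the low-rank structure of A (a matrix-inversion-lemma reduction to a much smaller system), which is omitted here.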

    Scalable Semidefinite Programming

    Semidefinite programming (SDP) is a powerful framework from convex optimization that has striking potential for data science applications. This paper develops a provably correct algorithm for solving large SDP problems by economizing on both the storage and the arithmetic costs. Numerical evidence shows that the method is effective for a range of applications, including relaxations of MaxCut, abstract phase retrieval, and quadratic assignment. Running on a laptop, the algorithm can handle SDP instances where the matrix variable has over 10¹³ entries.
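One way such storage economy is achieved is by never materializing the n-by-n matrix variable X, instead maintaining only a random sketch S = X Ω and pushing each rank-one update through it. A minimal NumPy sketch of that idea (function and variable names are illustrative assumptions, not the paper's API):

```python
import numpy as np

def sketched_rank1_update(S, Omega, v, eta):
    """Track the implicit update X <- (1 - eta) X + eta v v^T through the
    sketch S = X Omega, without ever forming the n x n matrix X.

    S     : (n, k) current sketch, k << n
    Omega : (n, k) fixed random test matrix
    v     : (n,)   direction of the rank-one update
    eta   : step size in [0, 1]
    """
    # Linearity of the sketch: (a X + b v v^T) Omega = a S + b v (v^T Omega).
    return (1 - eta) * S + eta * np.outer(v, v @ Omega)
```

Only O(nk) numbers are stored per iterate instead of O(n²); an approximate solution can later be recovered from the sketch by a Nyström-type reconstruction.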

    Entropy Penalized Semidefinite Programming

    Low-rank methods for semidefinite programming (SDP) have gained a lot of interest recently, especially in machine learning applications. Their analysis often involves determinant-based or Schatten-norm penalties, which are hard to implement in practice due to their high computational cost. In this paper, we propose Entropy Penalized Semidefinite Programming (EP-SDP), which provides a unified framework for a wide class of penalty functions used in practice to promote a low-rank solution. We show that EP-SDP problems admit an efficient numerical algorithm whose gradient iteration has (almost) linear time complexity, which makes it useful for many machine learning and optimization problems. We illustrate the practical efficiency of our approach on several combinatorial optimization and machine learning problems.
    Comment: 28th International Joint Conference on Artificial Intelligence, 2019
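To see why an entropy penalty promotes low rank: the entropy of the normalized eigenvalue distribution of a PSD matrix is zero when one eigenvalue dominates (rank one) and maximal when the spectrum is flat (full rank), so penalizing it steers the solution toward few nonzero eigenvalues. A minimal illustrative computation (this is a generic eigenvalue-entropy penalty, not necessarily the exact functional used in the paper):

```python
import numpy as np

def entropy_penalty(X, eps=1e-12):
    """Entropy of the normalized spectrum of a symmetric PSD matrix X.

    Returns ~0 for (near) rank-one X and up to log(n) for a flat spectrum,
    so minimizing it favors low-rank solutions.
    """
    lam = np.linalg.eigvalsh(X)
    lam = np.clip(lam, eps, None)      # guard against log(0) on zero eigenvalues
    p = lam / lam.sum()                # normalized eigenvalue distribution
    return -np.sum(p * np.log(p))
```

The appeal noted in the abstract is that, unlike determinant or Schatten-norm penalties, such smooth spectral penalties can be handled with cheap gradient iterations.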