
    A relaxed interior point method for low-rank semidefinite programming problems with applications to matrix completion

    A new relaxed variant of the interior point method for low-rank semidefinite programming problems is proposed in this paper. The method is a step outside the usual interior point framework. In anticipation of converging to a low-rank primal solution, a special nearly low-rank form of all primal iterates is imposed. To accommodate such a (restrictive) structure, the first-order optimality conditions have to be relaxed and are therefore approximated by solving an auxiliary least-squares problem. The relaxed interior point framework opens numerous possibilities for how the primal and dual approximate Newton directions can be computed. In particular, it admits the application of both first- and second-order methods in this context. The convergence of the method is established. A prototype implementation is discussed and encouraging preliminary computational results are reported for solving the SDP reformulation of matrix-completion problems.
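    For context, the matrix-completion experiments mentioned at the end of the abstract refer to the standard SDP reformulation of nuclear-norm minimization; a sketch of that formulation (standard in the literature, not quoted from the paper):

        \min_{X} \; \|X\|_{*} \quad \text{s.t.} \quad X_{ij} = M_{ij}, \;\; (i,j) \in \Omega
        \qquad\Longleftrightarrow\qquad
        \min_{X,\, W_{1},\, W_{2}} \; \tfrac{1}{2}\bigl(\operatorname{tr} W_{1} + \operatorname{tr} W_{2}\bigr)
        \quad \text{s.t.} \quad
        \begin{bmatrix} W_{1} & X \\ X^{\top} & W_{2} \end{bmatrix} \succeq 0, \;\;
        X_{ij} = M_{ij}, \;\; (i,j) \in \Omega.

    A nearly low-rank primal iterate can then be modelled, for example, as X = U U^{\top} + \mu I with a tall matrix U and a small \mu > 0; the abstract does not specify the exact form imposed in the paper, so this parametrization is only an illustrative assumption.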

    An integer programming approach for the satisfiability problems.

    by Lui Oi Lun Irene. Thesis (M.Phil.)--Chinese University of Hong Kong, 2001. Includes bibliographical references (leaves 128-132). Abstracts in English and Chinese.
    Contents: List of Figures; List of Tables.
    Chapter 1: Introduction -- 1.1 Satisfiability Problem; 1.2 Motivation of the Research; 1.3 Overview of the Thesis.
    Chapter 2: Constraint Satisfaction Problem and Satisfiability Problem -- 2.1 Constraint Programming; 2.2 Satisfiability Problem; 2.3 Methods in Solving the SAT Problem (2.3.1 Davis-Putnam-Loveland Procedure; 2.3.2 SATZ by Chu-Min Li; 2.3.3 Local Search for SAT; 2.3.4 Integer Linear Programming Method for SAT; 2.3.5 Semidefinite Programming Method); 2.4 Software for SAT (2.4.1 SAT01; 2.4.2 SATZ and SATZ213, contributed by Chu-Min Li; 2.4.3 Others).
    Chapter 3: Integer Programming -- 3.1 Introduction (3.1.1 Formulation of IPs and BIPs; 3.1.2 Binary Search Tree); 3.2 Methods in Solving the IP Problem (3.2.1 Branch-and-Bound Method; 3.2.2 Cutting-Plane Method; 3.2.3 Duality in Integer Programming; 3.2.4 Heuristic Algorithm); 3.3 Zero-One Optimization and Continuous Relaxation (3.3.1 Introduction; 3.3.2 The Roof Dual Expressed in Terms of Lagrangian Relaxation; 3.3.3 Determining the Existence of a Duality Gap); 3.4 Software for Solving Integer Programs.
    Chapter 4: Integer Programming Formulation for the SAT Problem -- 4.1 From 3-CNF SAT Clauses to Zero-One IP Constraints; 4.2 From m-Constrained IP Problem to Singly-Constrained IP Problem (4.2.1 Example).
    Chapter 5: A Basic Branch-and-Bound Algorithm for the Zero-One Polynomial Maximization Problem -- 5.1 Reason for Choosing the Branch-and-Bound Method; 5.2 Searching Algorithm (5.2.1 Branch Rule; 5.2.2 Bounding Rule; 5.2.3 Fathoming Test; 5.2.4 Example).
    Chapter 6: Revised Bound Rule for the Branch-and-Bound Algorithm -- 6.1 Revised Bound Rule (6.1.1 CPLEX); 6.2 Example; 6.3 Conclusion.
    Chapter 7: Revised Branch Rule for the Branch-and-Bound Algorithm -- 7.1 Revised Branch Rule; 7.2 Comparison between Branch Rule and Revised Branch Rule; 7.3 Example; 7.4 Conclusion.
    Chapter 8: Experimental Results and Analysis -- 8.1 Experimental Results; 8.2 Statistical Analysis (8.2.1 Analysis of Search Techniques; 8.2.2 Discussion of the Performance of SATZ).
    Chapter 9: Concluding Remarks -- 9.1 Conclusion; 9.2 Suggestions for Future Research.
    Appendix A: Searching Procedures for Solving the Constraint Satisfaction Problem (CSP) -- A.1 Notation; A.2 Procedures for Solving CSP (A.2.1 Generate and Test; A.2.2 Standard Backtracking; A.2.3 Forward Checking; A.2.4 Looking Ahead).
    Appendix B: Complete Results for Experiments -- B.1 Complete Result for SATZ (n = 5, 10, 30); B.2 Complete Result for Basic Branch-and-Bound Algorithm (n = 5, 10, 30); B.3 Complete Result for Revised Bound Rule (n = 5, 10, 30); B.4 Complete Result for Revised Branch-and-Bound Algorithm (n = 5, 10, 30).
    Bibliography.
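    Chapter 4.1 of the thesis covers the standard translation from 3-CNF clauses to zero-one integer programming constraints: a clause such as (x1 OR NOT x2 OR x3) is satisfied exactly when x1 + (1 - x2) + x3 >= 1 over binary variables. A minimal Python sketch of this well-known encoding (illustrative only; the function names are hypothetical and this is not the thesis' implementation):

        from typing import List, Tuple

        # A clause is a list of non-zero integers: +i means variable x_i, -i means its negation.
        Clause = List[int]

        def clause_to_constraint(clause: Clause) -> Tuple[dict, int]:
            """Encode a CNF clause as the 0-1 linear constraint  sum(coeffs[i] * x_i) >= rhs.

            Each positive literal x_i contributes +1 * x_i; each negated literal
            contributes (1 - x_i), i.e. coefficient -1 with the constant 1 moved
            to the right-hand side. The clause holds iff the inequality holds.
            """
            coeffs: dict = {}
            rhs = 1
            for lit in clause:
                var = abs(lit)
                if lit > 0:
                    coeffs[var] = coeffs.get(var, 0) + 1
                else:
                    coeffs[var] = coeffs.get(var, 0) - 1
                    rhs -= 1  # constant from (1 - x_i) absorbed into the RHS
            return coeffs, rhs

        def satisfies(assignment: dict, clauses: List[Clause]) -> bool:
            """Check a 0/1 assignment against the IP constraints derived from the clauses."""
            for clause in clauses:
                coeffs, rhs = clause_to_constraint(clause)
                if sum(c * assignment[v] for v, c in coeffs.items()) < rhs:
                    return False
            return True

        # Example: (x1 or not x2 or x3) and (not x1 or x2)
        clauses = [[1, -2, 3], [-1, 2]]
        print(clause_to_constraint([1, -2, 3]))        # ({1: 1, 2: -1, 3: 1}, 0), i.e. x1 - x2 + x3 >= 0
        print(satisfies({1: 1, 2: 1, 3: 0}, clauses))  # True

    Per the table of contents, the thesis then aggregates these m clause constraints into a single constraint (Chapter 4.2) and solves the resulting zero-one problem by branch and bound; that aggregation step is not reproduced in this sketch.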

    Conic Optimization: Optimal Partition, Parametric, and Stability Analysis

    A linear conic optimization problem consists of the minimization of a linear objective function over the intersection of an affine space and a closed convex cone. In recent years, linear conic optimization has received significant attention, partly because it can be used to reformulate and approximate intractable optimization problems. Steady advances in computational optimization have enabled us to approximately solve a wide variety of linear conic optimization problems in polynomial time. Nevertheless, preprocessing methods, rounding procedures, and sensitivity analysis tools are still missing from conic optimization solvers. Given the output of a conic optimization solver, we need methodologies to generate approximate complementary solutions or to speed up the convergence to an exact optimal solution. A preprocessing method reduces the size of a problem by finding the minimal face of the cone which contains the set of feasible solutions; however, such a method assumes knowledge of an exact solution. More importantly, we need robust sensitivity and post-optimal analysis tools for an optimal solution of a linear conic optimization problem. Motivated by the vital importance of linear conic optimization, we take active steps to fill this gap.
    This thesis is concerned with several aspects of a linear conic optimization problem, from algorithms through solution identification to parametric analysis, which have not been fully addressed in the literature. We focus on three special classes of linear conic optimization problems, namely semidefinite and second-order conic optimization, and their common generalization, symmetric conic optimization. We propose a polynomial-time algorithm for symmetric conic optimization problems. We show how to approximate or identify the optimal partition of semidefinite optimization and second-order conic optimization, a concept which has its origin in linear optimization. Further, we use the optimal partition information either to generate an approximate optimal solution or to speed up the convergence of a solution identification process to the unique optimal solution of the problem. Finally, we study the parametric analysis of semidefinite and second-order conic optimization problems, investigating the behavior of the optimal partition and the optimal set mapping under perturbation of the objective function vector.
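    For concreteness, the linear conic optimization problem described in the opening sentence, together with its dual, can be written in the standard primal-dual form (generic notation, not specific to this thesis):

        \min_{x} \; \langle c, x \rangle \quad \text{s.t.} \quad Ax = b, \;\; x \in \mathcal{K},
        \qquad\qquad
        \max_{y,\, s} \; b^{\top} y \quad \text{s.t.} \quad A^{\top} y + s = c, \;\; s \in \mathcal{K}^{*},

    where \mathcal{K} is a closed convex cone (the positive semidefinite cone, a product of second-order cones, or more generally a symmetric cone) and \mathcal{K}^{*} is its dual cone. The optimal partition mentioned above generalizes the linear-optimization notion of splitting the variables according to which of x^{*} and s^{*} is nonzero in a maximally complementary optimal solution.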