
    Bin Packing and Related Problems: General Arc-flow Formulation with Graph Compression

    We present an exact method, based on an arc-flow formulation with side constraints, for solving bin packing and cutting stock problems, including multi-constraint variants, by simply representing all the patterns in a very compact graph. Our method includes a graph compression algorithm that usually reduces the size of the underlying graph substantially without weakening the model. In contrast to our method, which provides strong models, conventional models are usually highly symmetric and provide very weak lower bounds. Our formulation is equivalent to Gilmore and Gomory's, thus providing a very strong linear relaxation. However, instead of using column generation in an iterative process, the method constructs a graph in which every valid packing pattern is represented by a path from the source to the target node. The same method, without any problem-specific parameterization, was used to solve a large variety of instances from several different cutting and packing problems. In this paper, we deal with vector packing, graph coloring, bin packing, cutting stock, cardinality constrained bin packing, cutting stock with cutting knife limitation, cutting stock with binary patterns, bin packing with conflicts, and cutting stock with binary patterns and forbidden pairs. We report computational results obtained with many benchmark test data sets, all of them showing a large advantage of this formulation with respect to the traditional ones.
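    As a rough illustration of the underlying construction (not the authors' formulation, which handles multiple constraints and applies graph compression), the Python sketch below builds a naive arc-flow graph for one-dimensional bin packing: nodes are partial bin loads 0 to C, each item size w adds arcs (d, d + w), and loss arcs connect every node to the sink, so that each source-to-sink path encodes a feasible packing pattern. This uncompressed graph can contain several paths encoding the same pattern and is much larger than the compressed graphs described in the abstract; the function names and the path-counting routine are purely illustrative.

```python
# Naive arc-flow graph for one-dimensional bin packing (illustrative only).
from collections import defaultdict

def build_arc_flow_graph(capacity, item_sizes):
    """Return arcs (tail, head, size) of an uncompressed arc-flow graph.

    Nodes are partial bin loads 0..capacity; size == 0 marks a loss arc
    that lets a path terminate at the sink without filling the bin.
    """
    arcs = set()
    reachable = {0}
    for d in range(capacity + 1):
        if d not in reachable:
            continue
        for w in sorted(set(item_sizes)):
            if d + w <= capacity:
                arcs.add((d, d + w, w))
                reachable.add(d + w)
    for d in reachable:
        if d < capacity:
            arcs.add((d, capacity, 0))
    return sorted(arcs)

def count_paths(capacity, arcs):
    """Count source-to-sink paths; each path encodes a packing pattern,
    though this naive graph may encode the same pattern more than once."""
    out = defaultdict(list)
    for tail, head, _ in arcs:
        out[tail].append(head)
    paths = defaultdict(int)
    paths[capacity] = 1
    for d in range(capacity - 1, -1, -1):
        paths[d] = sum(paths[h] for h in out[d])
    return paths[0]

if __name__ == "__main__":
    C, sizes = 7, [2, 3, 5]                  # toy instance
    graph = build_arc_flow_graph(C, sizes)
    print(len(graph), "arcs,", count_paths(C, graph), "source-to-sink paths")
```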

    A Riemannian low-rank method for optimization over semidefinite matrices with block-diagonal constraints

    We propose a new algorithm to solve optimization problems of the form min f(X) for a smooth function f under the constraints that X is positive semidefinite and the diagonal blocks of X are small identity matrices. Such problems often arise as the result of relaxing a rank constraint (lifting). In particular, many estimation tasks involving phases, rotations, orthonormal bases or permutations fit in this framework, and so do certain relaxations of combinatorial problems such as Max-Cut. The proposed algorithm exploits the facts that (1) such formulations admit low-rank solutions, and (2) their rank-restricted versions are smooth optimization problems on a Riemannian manifold. Combining insights from both the Riemannian and the convex geometries of the problem, we characterize when second-order critical points of the smooth problem reveal KKT points of the semidefinite problem. We compare against state-of-the-art, mature software and find that, on certain interesting problem instances, what we call the staircase method is orders of magnitude faster, more accurate, and scales better. Code is available.
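    A minimal numpy sketch of the low-rank idea, under strong simplifying assumptions, for the simplest block size d = 1 (a Max-Cut-type relaxation: minimize <C, X> with X positive semidefinite and unit diagonal): write X = Y Y^T with unit-norm rows of Y, run Riemannian gradient descent over that product of spheres, and increase the rank p when the computed factor is not rank-deficient. The step size, iteration count, and rank test below are ad hoc stand-ins, not the paper's second-order machinery or optimality certificates.

```python
# Low-rank factorization sketch for the d = 1 case (Max-Cut-type SDP).
import numpy as np

def rgd(C, p, iters=500, step=1e-2, seed=0):
    """Riemannian gradient descent for min <C, Y Y^T> with unit-norm rows of Y."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    Y = rng.standard_normal((n, p))
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)
    for _ in range(iters):
        G = 2.0 * C @ Y                                       # Euclidean gradient
        grad = G - np.sum(G * Y, axis=1, keepdims=True) * Y   # tangent projection
        Y = Y - step * grad
        Y /= np.linalg.norm(Y, axis=1, keepdims=True)         # retraction to spheres
    return Y

def staircase(C, p0=2):
    """Increase the rank p until the computed factor looks rank-deficient."""
    n = C.shape[0]
    p = p0
    while p <= n:
        Y = rgd(C, p)
        if np.linalg.matrix_rank(Y, tol=1e-6) < p:    # crude stand-in for the
            break                                     # paper's optimality check
        p += 1
    return Y

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((20, 20))
    C = (A + A.T) / 2                                 # symmetric cost matrix
    Y = staircase(C)
    print("rank:", np.linalg.matrix_rank(Y, tol=1e-6),
          "objective:", float(np.trace(Y.T @ C @ Y)))
```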

    Finding a low-rank basis in a matrix subspace

    For a given matrix subspace, how can we find a basis that consists of low-rank matrices? This is a generalization of the sparse vector problem. It turns out that when the subspace is spanned by rank-1 matrices, the rank-1 basis matrices can be obtained via the tensor CP decomposition. For the higher-rank case, the situation is not as straightforward. In this work we present an algorithm based on a greedy process applicable to higher-rank problems. Our algorithm first estimates the minimum rank by applying soft singular value thresholding to a nuclear norm relaxation, and then computes a matrix with that rank using the method of alternating projections. We provide local convergence results, and compare our algorithm with several alternative approaches. Applications include data compression beyond the classical truncated SVD, computing accurate eigenvectors of a near-multiple eigenvalue, image separation, and graph Laplacian eigenproblems.
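    As a hedged illustration of the second stage only, the sketch below runs alternating projections between the set of rank-r matrices (via the truncated SVD) and a given matrix subspace (via least-squares projection of the vectorized matrices). The rank estimation by soft singular value thresholding and the greedy deflation over basis elements described in the abstract are omitted, and the renormalization step is an ad hoc safeguard against collapsing to the zero matrix, which lies in both sets.

```python
# Alternating projections between rank-r matrices and a matrix subspace.
import numpy as np

def truncate_rank(X, r):
    """Best rank-r approximation of X via the truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def project_onto_span(X, basis_flat):
    """Orthogonal projection of X onto the span of the (vectorized) basis."""
    coeffs, *_ = np.linalg.lstsq(basis_flat, X.ravel(), rcond=None)
    return (basis_flat @ coeffs).reshape(X.shape)

def alternating_projections(basis, r, iters=200, seed=0):
    """Look for a (near) rank-r matrix inside span{A_1, ..., A_m}."""
    rng = np.random.default_rng(seed)
    basis_flat = np.column_stack([A.ravel() for A in basis])
    X = sum(rng.standard_normal() * A for A in basis)   # random start in the span
    for _ in range(iters):
        X = project_onto_span(truncate_rank(X, r), basis_flat)
        nrm = np.linalg.norm(X)
        if nrm > 0:
            X /= nrm             # ad hoc safeguard: 0 lies in both sets
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    u, v = rng.standard_normal((2, 6))
    B1, B2 = np.outer(u, u), np.outer(v, v)       # two hidden rank-1 matrices
    basis = [B1 + B2, B1 - B2]                    # same span, mixed basis
    X = alternating_projections(basis, r=1)
    print("singular values:", np.round(np.linalg.svd(X, compute_uv=False), 3))
```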