
    Efficient Semidefinite Branch-and-Cut for MAP-MRF Inference

    We propose a Branch-and-Cut (B&C) method for solving general MAP-MRF inference problems. The core of our method is a very efficient bounding procedure, which combines scalable semidefinite programming (SDP) with a cutting-plane method for finding violated constraints. To further speed up the computation, several strategies are employed, including model reduction, warm starts, and the removal of inactive constraints. We analyze the performance of the proposed method under different settings and demonstrate that it either outperforms or performs on par with state-of-the-art approaches; in particular, when the connectivity is dense or the relative magnitudes of the unary costs are low, we achieve the best reported results. Experiments show that the proposed algorithm achieves better approximations than state-of-the-art methods within a variety of time budgets on challenging non-submodular MAP-MRF inference problems. (Comment: 21 pages)
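    To make the bounding idea concrete, the following is a minimal sketch (not the authors' implementation) of an SDP lower bound tightened by a cutting-plane pass over triangle inequalities, for a toy binary pairwise MRF with ±1 labels. It assumes cvxpy with its default SDP-capable solver; the coupling matrix Q, the number of rounds, and the tolerance are illustrative choices, not values from the paper.

```python
# Minimal sketch (not the authors' code): SDP lower bound with a
# cutting-plane pass for a toy binary pairwise MRF with labels in {-1,+1}.
# Assumes cvxpy (with its default SDP-capable solver) is installed.
import itertools
import numpy as np
import cvxpy as cp

def sdp_bound_with_cuts(Q, rounds=3, tol=1e-4):
    """Lower-bound min_{x in {-1,1}^n} x^T Q x via the SDP relaxation
    min <Q, Y> s.t. Y PSD, diag(Y) = 1, tightened by violated triangle
    inequalities found in a cutting-plane loop."""
    n = Q.shape[0]
    Y = cp.Variable((n, n), PSD=True)
    base = [cp.diag(Y) == 1]
    cuts = []
    for _ in range(rounds):
        prob = cp.Problem(cp.Minimize(cp.trace(Q @ Y)), base + cuts)
        prob.solve()
        Yv = Y.value
        # Seek violated triangle inequalities of the cut polytope:
        #   s_i s_j Y_ij + s_j s_k Y_jk + s_i s_k Y_ik >= -1.
        new_cuts = []
        for i, j, k in itertools.combinations(range(n), 3):
            for si, sj, sk in [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]:
                val = si*sj*Yv[i, j] + sj*sk*Yv[j, k] + si*sk*Yv[i, k]
                if val < -1 - tol:
                    new_cuts.append(
                        si*sj*Y[i, j] + sj*sk*Y[j, k] + si*sk*Y[i, k] >= -1)
        if not new_cuts:
            break            # no violated constraints: bound is stable
        cuts += new_cuts     # add cuts and re-solve
    return prob.value

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 6))
    Q = (A + A.T) / 2        # symmetric toy coupling matrix
    print("SDP lower bound:", sdp_bound_with_cuts(Q))
```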

    A Novel Method for the Absolute Pose Problem with Pairwise Constraints

    Absolute pose estimation is a fundamental problem in computer vision and a typical parameter estimation problem, meaning that any effort to solve it must cope with outlier-contaminated data. Conventionally, for a fixed dimensionality d and N measurements, a robust estimation problem cannot be solved faster than O(N^d), and it is almost impossible to remove d from the exponent of the runtime of a globally optimal algorithm. However, absolute pose estimation is a geometric parameter estimation problem and thus has special constraints. In this paper, we consider pairwise constraints and propose a globally optimal algorithm for solving the absolute pose estimation problem. The proposed algorithm has linear complexity in the number of correspondences at a given outlier ratio. Concretely, we first decouple the rotation and translation subproblems by utilizing the pairwise constraints, then solve the rotation subproblem using a branch-and-bound algorithm, and finally estimate the translation from the known rotation using another branch-and-bound algorithm. The advantages of our method are demonstrated via thorough testing on both synthetic and real-world data. (Comment: 10 pages, 7 figures)
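    The rotation and translation searches above are specialized branch-and-bound procedures; as a toy illustration of the underlying bounding pattern (not the paper's algorithm), the sketch below runs an interval branch-and-bound for robust 1-D translation estimation by consensus maximization. The function names, the inlier threshold eps, and the stopping resolution are hypothetical choices for illustration.

```python
# Toy illustration: interval branch-and-bound for robust 1-D translation
# estimation by consensus maximization (maximize the inlier count).
import heapq
import numpy as np

def bnb_consensus_1d(cands, eps, lo, hi, resolution=1e-3):
    """Find t in [lo, hi] maximizing #{i : |t - cands[i]| <= eps}."""
    def upper(a, b):   # bound: inliers whose eps-ball meets [a, b]
        return int(np.sum((cands >= a - eps) & (cands <= b + eps)))
    def lower(t):      # feasible value at a concrete t
        return int(np.sum(np.abs(cands - t) <= eps))

    best_t = (lo + hi) / 2
    best_val = lower(best_t)
    heap = [(-upper(lo, hi), lo, hi)]       # max-heap on the upper bound
    while heap:
        neg_ub, a, b = heapq.heappop(heap)
        if -neg_ub <= best_val:             # prune: cannot beat incumbent
            continue
        mid = (a + b) / 2
        if lower(mid) > best_val:           # update incumbent
            best_val, best_t = lower(mid), mid
        if b - a > resolution:              # branch: bisect the interval
            heapq.heappush(heap, (-upper(a, mid), a, mid))
            heapq.heappush(heap, (-upper(mid, b), mid, b))
    return best_t, best_val

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    inliers = 2.0 + 0.01 * rng.standard_normal(40)   # true translation ~2.0
    outliers = rng.uniform(-10, 10, 20)
    t, n_in = bnb_consensus_1d(np.concatenate([inliers, outliers]),
                               eps=0.05, lo=-10, hi=10)
    print(f"estimated translation {t:.3f} with {n_in} inliers")
```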

    Certification of inequalities involving transcendental functions: combining SDP and max-plus approximation

    We consider the problem of certifying an inequality of the form f(x) ≥ 0 for all x ∈ K, where f is a multivariate transcendental function and K is a compact semialgebraic set. We introduce a certification method combining semialgebraic optimization and max-plus approximation. We assume that f is given by a syntax tree whose constituents involve semialgebraic operations as well as some transcendental functions such as cos, sin, exp, etc. We bound some of these constituents by suprema or infima of quadratic forms (the max-plus approximation method, initially introduced in optimal control), leading to semialgebraic optimization problems which we solve by semidefinite relaxations. The max-plus approximation is iteratively refined and combined with branch-and-bound techniques to reduce the relaxation gap. Illustrative examples of the application of this algorithm are provided, explaining how we solved tight inequalities arising from the Flyspeck project (one of the main purposes of which is to certify numerical inequalities used in the proof of the Kepler conjecture by Thomas Hales). (Comment: 7 pages, 3 figures, 3 tables. Appears in the Proceedings of the European Control Conference ECC'13, July 17-19, 2013, Zurich, pp. 2244-2250, copyright EUCA 2013)
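    As a worked instance of the quadratic max-plus under-approximation (a sketch of the general idea, not a formula taken from the paper): assume a transcendental constituent g has a second derivative bounded below by -c on K, as is the case for g = sin with c = 1. Then every sample point a_i yields a quadratic minorant, and the maximum of finitely many such minorants is a semialgebraic lower bound that can be substituted for g before applying the semidefinite relaxation.

```latex
% Max-plus lower approximation by quadratic forms (illustrative sketch).
% If g'' >= -c on K, then g + (c/2) x^2 is convex, hence lies above its
% tangents; rearranging gives, for each sample point a_i,
%   g(x) >= g(a_i) + g'(a_i)(x - a_i) - (c/2)(x - a_i)^2,
% and taking the maximum over a_1, ..., a_m gives the max-plus bound:
\[
  g(x) \;\ge\; \max_{1 \le i \le m}
  \Bigl( g(a_i) + g'(a_i)\,(x - a_i) - \tfrac{c}{2}\,(x - a_i)^2 \Bigr),
  \qquad \text{whenever } g'' \ge -c \text{ on } K .
\]
% Refining the sample points a_i tightens this semialgebraic minorant and
% hence shrinks the gap of the subsequent semidefinite relaxation.
```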

    BPGrad: Towards Global Optimality in Deep Learning via Branch and Pruning

    Understanding global optimality in deep learning (DL) has been attracting increasing attention recently. Conventional DL solvers, however, have not been designed to seek such global optimality. In this paper we propose a novel approximation algorithm, BPGrad, for optimizing deep models globally via branch and pruning. BPGrad is based on the assumption of Lipschitz continuity in DL; as a result, it can adaptively determine the step size for the current gradient given the history of previous updates, such that, theoretically, no smaller step can attain the global optimum. We prove that, by repeating this branch-and-pruning procedure, a global optimum can be located within finitely many iterations. Empirically, we also propose an efficient BPGrad-based solver for DL, which outperforms conventional DL solvers such as Adagrad, Adadelta, RMSProp, and Adam on object recognition, detection, and segmentation tasks.
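    A hedged sketch of a Lipschitz-guided step-size rule in this spirit (not the authors' solver): assuming the objective f is Lipschitz with a known constant L and that f_hat is a guess of the global minimum, each step length is the largest radius within which, by Lipschitz continuity, no point can fall below f_hat. The names f_hat and lipschitz_guided_descent are illustrative assumptions.

```python
# Hedged sketch: Lipschitz-guided step sizes in the spirit of branch and
# pruning. Assumes f is Lipschitz with known constant L and that f_hat is
# an (assumed) estimate of the global minimum value.
import numpy as np

def lipschitz_guided_descent(f, grad, x0, L, f_hat, steps=200):
    """Take gradient steps whose length is the largest radius that provably
    contains no point with value below f_hat: by Lipschitz continuity,
    f(y) >= f(x) - L*||y - x||, so any y with ||y - x|| < (f(x) - f_hat)/L
    still satisfies f(y) > f_hat."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        gn = np.linalg.norm(g)
        if gn < 1e-12:
            break
        radius = max(f(x) - f_hat, 0.0) / L   # "safe" branch radius
        if radius == 0.0:
            # The estimate has been reached; a branch-and-pruning scheme
            # would adjust f_hat and continue. Here we simply stop.
            break
        x = x - radius * g / gn               # step of length `radius`
    return x

if __name__ == "__main__":
    # Toy quadratic: f(x) = ||x||^2 on the unit ball, Lipschitz with L = 2.
    f = lambda x: float(x @ x)
    grad = lambda x: 2 * x
    x_star = lipschitz_guided_descent(f, grad, x0=[0.8, -0.5], L=2.0,
                                      f_hat=0.0, steps=500)
    print("final point:", x_star, "value:", f(x_star))
```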