2,205 research outputs found

    A fast branch-and-bound algorithm for non-convex quadratic integer optimization subject to linear constraints using ellipsoidal relaxations

    We propose two exact approaches for non-convex quadratic integer minimization subject to linear constraints, where lower bounds are computed from ellipsoidal relaxations of the feasible set. In the first approach, we intersect the ellipsoids with the feasible linear subspace. In the second approach, we penalize the linear constraints exactly. We investigate the connection between the two approaches theoretically. Experimental results show that the penalty approach significantly outperforms CPLEX on problems with small or medium-sized variable domains. © 2015 Elsevier B.V. All rights reserved.
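
    The exact-penalty idea behind the second approach can be illustrated on a toy instance. The sketch below is not the paper's algorithm (which embeds ellipsoidal relaxations in branch-and-bound); it only checks, by brute force, that for a large enough penalty parameter rho the box-constrained minimizer of the penalized objective coincides with the constrained optimum. All data (Q, c, A, b, rho) are made up for illustration.

```python
# Exact quadratic penalty for a tiny non-convex quadratic integer program
#   min x'Qx + c'x  s.t.  Ax = b,  x in {lo,...,hi}^n.
# For sufficiently large rho, minimizing x'Qx + c'x + rho*||Ax - b||^2 over
# the box alone recovers the constrained optimum, because every infeasible
# integer point pays a penalty of at least rho.
import itertools
import numpy as np

Q = np.array([[1.0, 2.0], [2.0, -3.0]])   # indefinite -> non-convex objective
c = np.array([-1.0, 0.5])
A = np.array([[1.0, 1.0]])
b = np.array([3.0])
lo, hi, rho = -5, 5, 1e4

def penalized(x):
    r = A @ x - b
    return x @ Q @ x + c @ x + rho * (r @ r)

best = min((np.array(x) for x in itertools.product(range(lo, hi + 1), repeat=2)),
           key=penalized)
print("minimizer:", best, "feasible:", np.allclose(A @ best, b))
```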

    An Active Set Algorithm for Robust Combinatorial Optimization Based on Separation Oracles

    We address combinatorial optimization problems with uncertain coefficients varying over ellipsoidal uncertainty sets. The robust counterpart of such a problem can be rewritten as a second-order cone program (SOCP) with integrality constraints. We propose a branch-and-bound algorithm where dual bounds are computed by means of an active set algorithm. The latter is applied to the Lagrangian dual of the continuous relaxation, where the feasible set of the combinatorial problem is assumed to be given by a separation oracle. The method benefits from the closed-form solution of the active set subproblems and from a smart update of pseudo-inverse matrices. We present numerical experiments on randomly generated instances and on instances from different combinatorial problems, including the shortest path and traveling salesman problems, showing that our new algorithm consistently outperforms the state-of-the-art mixed-integer SOCP solver of Gurobi.
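
    For readers unfamiliar with the SOCP reformulation, the display below is the textbook derivation of the robust counterpart under ellipsoidal uncertainty, not something specific to the proposed active set method; the notation (nominal cost c_0 and ellipsoid factor L) is assumed for illustration.

```latex
% Standard reformulation sketch: with the cost vector ranging over the
% ellipsoid E = { c_0 + L u : ||u||_2 <= 1 }, the inner maximum of c^T x has
% a closed form, so the robust counterpart is an SOCP over the integer set X.
\[
  \min_{x \in X} \;\max_{c \in E}\; c^\top x
  \;=\;
  \min_{x \in X} \; c_0^\top x + \bigl\lVert L^\top x \bigr\rVert_2 ,
  \qquad X \subseteq \{0,1\}^n ,
\]
% since \max_{\|u\|_2 \le 1} (c_0 + L u)^\top x = c_0^\top x + \|L^\top x\|_2.
```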

    Robust Monotonic Optimization Framework for Multicell MISO Systems

    The performance of multiuser systems is difficult both to measure fairly and to optimize. Most resource allocation problems are non-convex and NP-hard, even under simplifying assumptions such as perfect channel knowledge, homogeneous channel properties among users, and simple power constraints. We establish a general optimization framework that systematically solves these problems to global optimality. The proposed branch-reduce-and-bound (BRB) algorithm handles general multicell downlink systems with single-antenna users, multiantenna transmitters, arbitrary quadratic power constraints, and robustness to channel uncertainty. A robust fairness-profile optimization (RFO) problem is solved at each iteration; it is a quasi-convex problem and a novel generalization of max-min fairness. The BRB algorithm is computationally costly, but it shows better convergence than the previously proposed outer polyblock approximation algorithm. Our framework is suitable for computing benchmarks in general multicell systems with or without channel uncertainty. We illustrate this by deriving and evaluating a zero-forcing solution to the general problem. Comment: Published in IEEE Transactions on Signal Processing, 16 pages, 9 figures, 2 tables
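
    As a rough illustration of how a quasi-convex fairness-level problem is typically handled, the sketch below bisects on a common fairness level with a generic feasibility oracle. The oracle feasible(t) is a hypothetical placeholder; in the paper each such check would itself be a convex (e.g. second-order cone) feasibility problem, and the actual RFO/BRB machinery is considerably more elaborate.

```python
# Generic bisection sketch for a quasi-convex fairness-level problem (not the
# paper's RFO solver): find the largest level t for which feasible(t) holds,
# assuming feasibility is monotone in t.
def bisect_fairness(feasible, t_lo=0.0, t_hi=10.0, tol=1e-6):
    """Largest t in [t_lo, t_hi] with feasible(t) True, assuming monotonicity."""
    while t_hi - t_lo > tol:
        t_mid = 0.5 * (t_lo + t_hi)
        if feasible(t_mid):
            t_lo = t_mid      # level t_mid achievable -> search higher
        else:
            t_hi = t_mid      # not achievable -> search lower
    return t_lo

# Toy oracle: pretend any level up to 3.7 is achievable.
print(bisect_fairness(lambda t: t <= 3.7))
```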

    Branch-and-lift algorithm for deterministic global optimization in nonlinear optimal control

    This paper presents a branch-and-lift algorithm for solving optimal control problems with smooth nonlinear dynamics and potentially nonconvex objective and constraint functionals to guaranteed global optimality. The algorithm features a direct sequential method and builds upon a generic, spatial branch-and-bound algorithm. A new operation, called lifting, is introduced, which refines the control parameterization via a Gram-Schmidt orthogonalization process while simultaneously eliminating control subregions that are either infeasible or that provably cannot contain any global optimum. Conditions are given under which the image of the control parameterization error in the state space contracts exponentially as the parameterization order is increased, thereby making the lifting operation efficient. A computational technique based on ellipsoidal calculus is also developed that satisfies these conditions. The practical applicability of branch-and-lift is illustrated in a numerical example. © 2013 Springer Science+Business Media New York.
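
    The lifting operation grows an orthonormal basis for the control parameterization. The snippet below is only a generic illustration of Gram-Schmidt orthonormalization of monomial basis functions sampled on a time grid; the paper's construction, and the exponential contraction result, involve much more than this.

```python
# Illustrative sketch only: build an orthonormal (on a discrete time grid)
# basis of increasing order with classical Gram-Schmidt, as one might do when
# "lifting" a control parameterization by one more basis function.
import numpy as np

def lift_basis(t, order):
    """Return an (order+1) x len(t) array of orthonormal basis functions."""
    basis = []
    for k in range(order + 1):
        v = t ** k                           # new candidate direction
        for q in basis:                      # remove components along old ones
            v = v - (q @ v) * q
        basis.append(v / np.linalg.norm(v))  # normalize in the discrete L2 norm
    return np.array(basis)

t = np.linspace(0.0, 1.0, 101)
B = lift_basis(t, 3)
print(np.round(B @ B.T, 6))                  # ~ identity: orthonormal on the grid
```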

    Submodular Optimization with Submodular Cover and Submodular Knapsack Constraints

    We investigate two new optimization problems -- minimizing a submodular function subject to a submodular lower bound constraint (submodular cover) and maximizing a submodular function subject to a submodular upper bound constraint (submodular knapsack). We are motivated by a number of real-world applications in machine learning, including sensor placement and data subset selection, which require maximizing a certain submodular function (like coverage or diversity) while simultaneously minimizing another (like cooperative cost). These problems are often posed as minimizing the difference between submodular functions [14, 35], which is inapproximable in the worst case. We show, however, that by phrasing them as constrained optimization, which is more natural for many applications, we achieve a number of bounded approximation guarantees. We also show that the two problems are closely related, and that an approximation algorithm for one can be used to obtain an approximation guarantee for the other. We provide hardness results for both problems, showing that our approximation factors are tight up to log factors. Finally, we empirically demonstrate the performance and good scalability properties of our algorithms. Comment: 23 pages. A short version of this appeared in Advances of NIPS-201
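
    For context, the classical greedy heuristic for submodular set cover (modular cost, submodular coverage), which the paper's submodular cover problem generalizes, fits in a few lines. This is not the paper's algorithm, and the toy coverage function and unit costs below are made up for illustration.

```python
# Classical greedy for submodular set cover: repeatedly add the element with
# the best marginal coverage per unit cost until the coverage target is met.
# Assumes the target is achievable from the ground set V.
def greedy_submodular_cover(V, f, cost, target):
    S = set()
    while f(S) < target:
        gain = lambda e: (f(S | {e}) - f(S)) / cost[e]
        e_best = max((e for e in V - S if f(S | {e}) > f(S)), key=gain)
        S.add(e_best)
    return S

# Toy instance: cover the ground points {1,...,6} with unit-cost sets.
sets = {'a': {1, 2, 3}, 'b': {3, 4}, 'c': {4, 5, 6}, 'd': {1, 6}}
f = lambda S: len(set().union(*(sets[e] for e in S))) if S else 0
cost = {e: 1.0 for e in sets}
print(greedy_submodular_cover(set(sets), f, cost, target=6))
```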