
    Stabilized Benders methods for large-scale combinatorial optimization, with application to data privacy

    The Cell Suppression Problem (CSP) is a challenging Mixed-Integer Linear Problem arising in statistical tabular data protection. Medium-sized instances of CSP involve thousands of binary variables and millions of continuous variables and constraints. However, CSP has the typical structure that allows application of the renowned Benders’ decomposition method: once the “complicating” binary variables are fixed, the problem decomposes into a large set of linear subproblems on the “easy” continuous ones. This makes it possible to project away the easy variables, reducing the problem to a master problem in the complicating ones, where the value functions of the subproblems are approximated with the standard cutting-plane approach. Hence, Benders’ decomposition suffers from the same drawbacks as the cutting-plane method, i.e., oscillation and slow convergence, compounded by the fact that the master problem is combinatorial. To overcome these drawbacks we present a stabilized Benders decomposition whose master problem is restricted to a neighborhood of successful candidates by local branching constraints, which are dynamically adjusted, and even dropped, during the iterations. Our experiments with randomly generated and real-world CSP instances with up to 3600 binary variables, 90M continuous variables, and 15M inequality constraints show that our approach is competitive with both the current state-of-the-art (cutting-plane-based) code for cell suppression and the Benders implementation in CPLEX 12.7. On some instances, stabilized Benders is able to quickly provide a very good solution in less than one minute, while the other approaches are not able to find any feasible solution within one hour.
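    As a minimal sketch of the stabilization device (standard local-branching notation, not quoted from the paper): given an incumbent binary vector \bar{y} with support \bar{S} = \{ i : \bar{y}_i = 1 \}, the master problem is restricted to the Hamming neighborhood
        \Delta(y, \bar{y}) \;=\; \sum_{i \in \bar{S}} (1 - y_i) \;+\; \sum_{i \notin \bar{S}} y_i \;\le\; k,
    where the radius k is adjusted dynamically over the iterations and the constraint is eventually dropped once the iterates stabilize.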

    Reformulation and decomposition of integer programs

    In this survey we examine ways to reformulate integer and mixed-integer programs. Typically, but not exclusively, one reformulates so as to obtain stronger linear programming relaxations, and hence better bounds for use in a branch-and-bound based algorithm. First we cover in detail reformulations based on decomposition, such as Lagrangean relaxation, Dantzig-Wolfe column generation, and the resulting branch-and-price algorithms. This is followed by an examination of Benders’-type algorithms based on projection. Finally we discuss in detail extended formulations involving additional variables that are based on problem structure; these can often be used to provide strengthened a priori formulations. Reformulations obtained by adding cutting planes in the original variables are not treated here.
    Keywords: integer program, Lagrangean relaxation, column generation, branch-and-price, extended formulation, Benders’ algorithm
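    As a minimal sketch of the decomposition bound being discussed (generic notation, not taken from the survey): relaxing the complicating constraints Ax \le b of \min \{ cx : Ax \le b,\; x \in X \} with multipliers \lambda \ge 0 gives the Lagrangean dual
        z_{LR}(\lambda) \;=\; \min_{x \in X} \{\, cx + \lambda (Ax - b) \,\}, \qquad z_{LD} \;=\; \max_{\lambda \ge 0} z_{LR}(\lambda),
    whose bound is at least as strong as that of the linear programming relaxation; Dantzig-Wolfe column generation computes the same bound z_{LD} by convexifying X in a master problem and pricing columns from the subproblem over X.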

    Frequency-Aware Model Predictive Control

    Transferring solutions found by trajectory optimization to robotic hardware remains a challenging task. When the optimization fully exploits the provided model to perform dynamic tasks, the presence of unmodeled dynamics renders the motion infeasible on the real system. Model errors can be a result of model simplifications, but they also naturally arise when deploying the robot in unstructured and nondeterministic environments. Predominantly, compliant contacts and actuator dynamics lead to bandwidth limitations. While classical control methods provide tools to synthesize controllers that are robust to a class of model errors, such a notion is missing in modern trajectory optimization, which is solved in the time domain. We propose frequency-shaped cost functions to achieve robust solutions in the context of optimal control for legged robots. Through simulation and hardware experiments we show that motion plans can be made compatible with the bandwidth limits set by actuators and contact dynamics. The smoothness of the model predictive solutions can be continuously tuned without compromising the feasibility of the problem. Experiments with the quadrupedal robot ANYmal, which is driven by highly compliant series elastic actuators, showed significantly improved tracking performance of the planned motion, torque, and force trajectories, and enabled the machine to walk robustly on terrain with unmodeled compliance.
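    One common way to write such a frequency-shaped quadratic cost (a generic textbook form, assumed here for illustration; the paper’s exact shaping may differ) is to weight the state and input spectra with frequency-dependent matrices,
        J \;=\; \frac{1}{2\pi} \int_{-\infty}^{\infty} \big( x^{*}(j\omega)\, Q(\omega)\, x(j\omega) + u^{*}(j\omega)\, R(\omega)\, u(j\omega) \big)\, d\omega,
    so that letting R(\omega) grow at high frequencies penalizes control action beyond the actuator bandwidth and yields smoother, more easily tracked input trajectories.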

    A branch, price, and cut approach to solving the maximum weighted independent set problem

    The maximum weighted independent set problem (MWISP) is one of the most well-known and well-studied NP-hard problems in the field of combinatorial optimization. In the first part of the dissertation, I explore efficient branch-and-price (B&P) approaches to solve MWISP exactly. B&P is a useful integer-programming tool for solving NP-hard optimization problems. Specifically, I look at vertex- and edge-disjoint decompositions of the underlying graph. MWISPs on the resulting subgraphs are less challenging, on average, to solve. I use the B&P framework to solve MWISP on the original graph G, using these specially constructed subproblems to generate columns. I demonstrate that the vertex-disjoint partitioning scheme gives an effective approach for relatively sparse graphs. I also show that the edge-disjoint approach is less effective than the vertex-disjoint scheme because the associated Dantzig-Wolfe Decomposition (DWD) reformulation of the edge-disjoint approach entails a slow rate of convergence. In the second part of the dissertation, I address convergence properties associated with DWD. I discuss prevalent methods for improving the rate of convergence of DWD. I also implement specific methods in application to the edge-disjoint B&P scheme and show that these methods improve the rate of convergence. In the third part of the dissertation, I focus on identifying new cut-generation methods within the B&P framework; such methods have not been explored in the literature. I present two new methodologies for generating generic cutting planes within the B&P framework. These techniques are not limited to MWISP and can be used in general applications of B&P. The first methodology generates cuts by identifying faces (facets) of subproblem polytopes and lifting the associated inequalities; the second methodology computes Lift-and-Project (L&P) cuts within B&P. I successfully demonstrate the feasibility of both approaches and present preliminary computational tests of each.
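    A minimal sketch of the pricing step in such a branch-and-price scheme (generic notation, my own rather than the dissertation’s exact model): with duals \pi_v on the constraints linking a subgraph G_k to the rest of the graph and \mu_k on its convexity constraint, a candidate column is an independent set S \subseteq V_k maximizing the reduced cost
        \bar{c}(S) \;=\; \sum_{v \in S} (w_v - \pi_v) \;-\; \mu_k,
    and the column is added to the restricted master problem whenever \bar{c}(S) > 0; the pricing problem is thus itself a smaller maximum weighted independent set problem on G_k.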

    A Column Generation Approach for Pure Parsimony Haplotyping


    Optimal Power Flow in Stand-alone DC Microgrids

    Direct-current microgrids (DC-MGs) can operate in either grid-connected or stand-alone mode. In particular, stand-alone DC-MGs have many distinct applications. However, the optimal power flow (OPF) problem of a stand-alone DC-MG is inherently non-convex. In this paper, the OPF problem of DC-MGs is investigated via a convex relaxation based on second-order cone programming (SOCP). Mild assumptions are proposed to guarantee the exactness of the relaxation, requiring only uniform nodal voltage upper bounds and positive network loss. Furthermore, it is shown that the exactness of the SOCP relaxation of DC-MGs relies on neither the topology nor the operating mode of the DC-MG, and that an optimal solution, if it exists, must be unique. If line constraints are considered, the exactness of the SOCP relaxation may not hold; in this regard, two heuristic methods are proposed to obtain approximate solutions. Simulations are conducted to confirm the theoretical results.
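    A minimal sketch of the relaxation involved (a generic branch-flow form for DC networks, assumed for illustration rather than quoted from the paper): writing v_i for the squared voltage at node i, \ell_{ij} for the squared current on line (i,j), and P_{ij} for the power sent into the line at node i, the non-convex coupling \ell_{ij} = P_{ij}^2 / v_i is relaxed to the rotated second-order cone constraint
        P_{ij}^{2} \;\le\; \ell_{ij}\, v_i,
    and the relaxation is called exact when every optimal solution attains this constraint with equality, which is what assumptions such as uniform voltage upper bounds and positive network loss are meant to guarantee.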

    Standard Bundle Methods: Untrusted Models and Duality

    We review the basic ideas underlying the vast family of algorithms for nonsmooth convex optimization known as “bundle methods”. In a nutshell, these approaches are based on constructing models of the function, but the lack of continuity of first-order information implies that these models cannot be trusted, not even close to an optimum. Therefore, many different forms of stabilization have been proposed to try to avoid being led to areas where the model is so inaccurate as to result in almost useless steps. In the development of these methods, duality arguments are useful, if not outright necessary, to better analyze the behaviour of the algorithms. Also, in many relevant applications the function at hand is itself a dual one, so that duality makes it possible to map algorithmic concepts and results back into a “primal space” where they can be exploited; in turn, structure in that space can be exploited to improve the algorithms’ behaviour, e.g., by developing better models. We present an updated picture of the many developments around the basic idea along at least three different axes: form of the stabilization, form of the model, and approximate evaluation of the function.
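    As a minimal sketch of the best-known stabilization (the proximal variant, in standard notation not specific to this survey): given a bundle B = \{ (x_i, f(x_i), g_i) \} of iterates, function values, and subgradients, the cutting-plane model and the stabilized master problem read
        \check{f}_B(x) \;=\; \max_{i \in B} \big\{ f(x_i) + \langle g_i, x - x_i \rangle \big\}, \qquad x_{+} \;=\; \arg\min_{x} \Big\{ \check{f}_B(x) + \tfrac{1}{2t} \lVert x - \bar{x} \rVert^{2} \Big\},
    where \bar{x} is the current stability center and t > 0 limits how far the next iterate may move from it; the candidate x_{+} replaces \bar{x} (a “serious step”) only when the actual decrease of f is a sufficient fraction of the decrease predicted by the model.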