LP Relaxation and Tree Packing for Minimum k-cuts
Karger used spanning tree packings [Karger, 2000] to derive a near-linear-time randomized algorithm for the global minimum cut problem as well as a bound on the number of approximate minimum cuts. This is a different approach from his well-known random contraction algorithm [Karger, 1995; Karger and Stein, 1996]. Thorup developed a fast deterministic algorithm for the minimum k-cut problem via greedy recursive tree packings [Thorup, 2008].
In this paper we revisit properties of an LP relaxation for k-cut proposed by Naor and Rabani [Naor and Rabani, 2001] and analyzed in [Chekuri et al., 2006]. We show that the dual of the LP yields a tree packing which, when combined with an upper bound on the integrality gap for the LP, easily and transparently extends Karger's analysis for mincut to the k-cut problem. In addition to the simplicity of the algorithm and its analysis, this allows us to improve the running time of Thorup's algorithm by a factor of n. We also improve the bound on the number of alpha-approximate k-cuts. Second, we give a simple proof that the integrality gap of the LP is 2(1-1/n). Third, we show that an optimum solution to the LP relaxation, for all values of k, is fully determined by the principal sequence of partitions of the input graph. This allows us to relate the LP relaxation to the Lagrangean relaxation approach of Barahona [Barahona, 2000] and Ravi and Sinha [Ravi and Sinha, 2008]; it also shows that the idealized recursive tree packing considered by Thorup gives an optimum dual solution to the LP. This work arose from an effort to understand and simplify the results of Thorup [Thorup, 2008].
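For context, the LP relaxation in question can be stated over edge variables x_e with edge costs c_e; the sketch below is our reading of the formulation studied in the works cited above, in which every spanning tree must carry at least k-1 units of fractional cut:

```latex
\min \sum_{e \in E} c_e x_e
\quad \text{subject to} \quad
\sum_{e \in T} x_e \;\ge\; k - 1 \quad \text{for every spanning tree } T \text{ of } G,
\qquad 0 \le x_e \le 1 .
```

Assigning a dual variable y_T >= 0 to each spanning-tree constraint makes the dual a fractional packing of spanning trees, which is the connection to Karger's tree-packing analysis.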
Fast and Deterministic Approximations for k-Cut
In an undirected graph, a k-cut is a set of edges whose removal breaks the graph into at least k connected components. The minimum weight k-cut can be computed in n^O(k) time, but when k is treated as part of the input, computing the minimum weight k-cut is NP-hard [Goldschmidt and Hochbaum, 1994]. For poly(m,n,k)-time algorithms, the best possible approximation factor is essentially 2 under the small set expansion hypothesis [Manurangsi, 2017]. Saran and Vazirani [1995] showed that a (2 - 2/k)-approximately minimum weight k-cut can be computed via O(k) minimum cuts, which implies an O~(km) randomized running time via the nearly linear time randomized min-cut algorithm of Karger [2000]. Nagamochi and Kamidoi [2007] showed that a (2 - 2/k)-approximately minimum weight k-cut can be computed deterministically in O(mn + n^2 log n) time. These results prompt two basic questions. The first concerns the role of randomization: is there a deterministic algorithm for 2-approximate k-cuts matching the randomized running time of O~(km)? The second question qualitatively compares minimum cut to 2-approximate minimum k-cut: can 2-approximate k-cuts be computed as fast as the minimum cut, i.e., in O~(m) randomized time?
We give a deterministic approximation algorithm that computes (2 + eps)-approximate minimum k-cuts in O(m log^3 n / eps^2) time, via a (1 + eps)-approximation for an LP relaxation of k-cut.
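The Saran-Vazirani splitting scheme described above, which repeatedly applies the cheapest available minimum cut until the graph falls into k components, can be sketched as follows. This is a toy illustration, not the authors' implementation: the graph is a plain edge list, the min-cut subroutine is brute force over bipartitions (a real implementation would use a fast min-cut algorithm such as Karger's), and all helper names are made up.

```python
from itertools import combinations

def cut_weight(edges, side):
    # total weight of edges with exactly one endpoint in `side`
    return sum(w for u, v, w in edges if (u in side) != (v in side))

def min_cut_bruteforce(vertices, edges):
    # exhaustive global min cut of one connected component (tiny inputs only)
    verts = sorted(vertices)
    best_w, best_side = float("inf"), None
    for r in range(1, len(verts)):
        for side in combinations(verts, r):
            s = set(side)
            w = cut_weight(edges, s)
            if w < best_w:
                best_w, best_side = w, s
    return best_w, best_side

def greedy_k_cut(vertices, edges, k):
    # Greedy splitting: while fewer than k components exist, find the
    # cheapest min cut inside any current component and apply it.
    components = [set(vertices)]
    total = 0.0
    while len(components) < k:
        choices = []
        for comp in components:
            if len(comp) < 2:
                continue
            sub = [(u, v, w) for u, v, w in edges if u in comp and v in comp]
            w, side = min_cut_bruteforce(comp, sub)
            choices.append((w, comp, side))
        w, comp, side = min(choices, key=lambda t: t[0])
        total += w
        components.remove(comp)
        components.extend([side, comp - side])
    return total, components
```

For example, on a unit-weight 4-cycle the sketch returns total weight 2 for k = 2 and 3 for k = 3, matching the optimal k-cuts of that instance.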
Approximating the Held-Karp Bound for Metric TSP in Nearly Linear Time
We give a nearly linear time randomized approximation scheme for the
Held-Karp bound [Held and Karp, 1970] for metric TSP. Formally, given an
undirected edge-weighted graph G on m edges and eps > 0, the algorithm
outputs in nearly linear time, with high probability, a
(1 + eps)-approximation to the Held-Karp bound on the metric TSP instance
induced by the shortest path metric on G. The algorithm can also be used to
output a corresponding solution to the Subtour Elimination LP. We
substantially improve upon the running time achieved previously by Garg and
Khandekar. The LP solution can be used to obtain a fast randomized
(3/2 + eps)-approximation for metric TSP which improves upon the running
time of previous implementations of Christofides' algorithm.
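For reference, the Held-Karp bound is the optimum of the subtour elimination LP; a standard statement over edge variables x_e (with delta(S) denoting the set of edges crossing a vertex set S, and x(F) the sum of x_e over F) is:

```latex
\min \sum_{e \in E} c_e x_e
\quad \text{subject to} \quad
x(\delta(v)) = 2 \ \ \forall v \in V, \qquad
x(\delta(S)) \ge 2 \ \ \forall\, \emptyset \ne S \subsetneq V, \qquad
x_e \ge 0 .
```

The degree constraints force every vertex to be visited; the exponentially many subtour constraints forbid fractional solutions that split into disjoint cycles.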
Zero-one IP problems: Polyhedral descriptions & cutting plane procedures
A systematic way of tightening an IP formulation is to employ classes of linear inequalities that define facets of the convex hull of the feasible integer points of the respective problem. Describing as well as identifying these inequalities helps improve the efficiency of LP-based cutting plane methods. In this report, we review classes of inequalities that partially describe zero-one polytopes such as the 0-1 knapsack polytope, the set packing polytope and the travelling salesman polytope. Facets or valid inequalities derived from the 0-1 knapsack and the set packing polytopes are algorithmically identified.
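To make one such class concrete: for a knapsack constraint sum_j a_j x_j <= b over 0-1 variables, any minimal cover C (a set of items whose total weight exceeds the capacity, but fits again if any single item is dropped) yields the valid inequality sum over j in C of x_j <= |C| - 1. A brute-force sketch follows; the function names are illustrative, and practical codes separate violated covers heuristically rather than enumerating all of them.

```python
from itertools import combinations

def minimal_covers(weights, capacity):
    # A cover is an index set whose total weight exceeds the capacity;
    # it is minimal if dropping any single index makes it fit again.
    n = len(weights)
    covers = []
    for r in range(1, n + 1):
        for c in combinations(range(n), r):
            total = sum(weights[j] for j in c)
            if total > capacity and all(total - weights[j] <= capacity for j in c):
                covers.append(set(c))
    return covers

def cover_inequality(cover):
    # The valid inequality: sum of x_j over the cover is at most |C| - 1.
    return sorted(cover), len(cover) - 1
```

For instance, with weights (3, 5, 4, 6) and capacity 9, the minimal covers are {1,3}, {2,3} and {0,1,2}, each giving a cutting plane such as x_1 + x_3 <= 1.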
Reformulation and decomposition of integer programs
In this survey we examine ways to reformulate integer and mixed integer programs. Typically, but not exclusively, one reformulates so as to obtain stronger linear programming relaxations, and hence better bounds for use in a branch-and-bound algorithm. First we cover in detail reformulations based on decomposition, such as Lagrangean relaxation, Dantzig-Wolfe column generation and the resulting branch-and-price algorithms. This is followed by an examination of Benders'-type algorithms based on projection. Finally we discuss in detail extended formulations involving additional variables that are based on problem structure. These can often be used to provide strengthened a priori formulations. Reformulations obtained by adding cutting planes in the original variables are not treated here.
Keywords: integer program, Lagrangean relaxation, column generation, branch-and-price, extended formulation, Benders' algorithm
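As a minimal illustration of the Lagrangean relaxation idea surveyed above, the sketch below dualizes the single constraint of a toy 0-1 knapsack and minimizes the resulting dual bound by a textbook subgradient method. The instance, function name, and step rule are assumptions chosen for illustration, not anything from the survey itself.

```python
def lagrangean_bound(c, a, b, iters=2000):
    # Dualize the "complicating" constraint a.x <= b over X = {0,1}^n.
    # For every lam >= 0,
    #   L(lam) = lam*b + sum_j max(0, c_j - lam*a_j)
    # upper-bounds max{ c.x : a.x <= b, x in {0,1}^n } (weak duality).
    # We minimize L over lam >= 0 by projected subgradient steps.
    n = len(c)
    lam, best = 0.0, float("inf")
    for t in range(1, iters + 1):
        # x maximizing the Lagrangean for this lam (separable in x_j)
        x = [1 if c[j] - lam * a[j] > 0 else 0 for j in range(n)]
        value = lam * b + sum((c[j] - lam * a[j]) * x[j] for j in range(n))
        best = min(best, value)
        subgrad = b - sum(a[j] * x[j] for j in range(n))  # dL/dlam at lam
        lam = max(0.0, lam - (1.0 / t) * subgrad)  # diminishing step size
    return best
```

Since X = {0,1}^n is an integral polytope here, the Lagrangean dual value coincides with the LP relaxation bound; e.g. for max 10x1 + 7x2 subject to 3x1 + 2x2 <= 4 the bound converges to 41/3.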
Half-integrality, LP-branching and FPT Algorithms
A recent trend in parameterized algorithms is the application of polytope
tools (specifically, LP-branching) to FPT algorithms (e.g., Cygan et al., 2011;
Narayanaswamy et al., 2012). However, although interesting results have been
achieved, the methods require the underlying polytope to have very restrictive
properties (half-integrality and persistence), which are known only for few
problems (essentially Vertex Cover (Nemhauser and Trotter, 1975) and Node
Multiway Cut (Garg et al., 1994)). Taking a slightly different approach, we
view half-integrality as a \emph{discrete} relaxation of a problem, e.g., a
relaxation of the search space from {0,1}^V to {0,1/2,1}^V such that
the new problem admits a polynomial-time exact solution. Using tools from CSP
(in particular Thapper and \v{Z}ivn\'y, 2012) to study the existence of such
relaxations, we provide a much broader class of half-integral polytopes with
the required properties, unifying and extending previously known cases.
In addition to the insight into problems with half-integral relaxations, our
results yield a range of new and improved FPT algorithms, including an
O^*(|\Sigma|^{2k})-time algorithm for node-deletion Unique Label Cover with
label set \Sigma and an O^*(4^k)-time algorithm for Group Feedback Vertex
Set, including the setting where the group is only given by oracle access. All
these significantly improve on previous results. The latter result also implies
the first single-exponential time FPT algorithm for Subset Feedback Vertex Set,
answering an open question of Cygan et al. (2012).
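The discrete-relaxation view described above can be illustrated on the vertex cover LP, the classical half-integral case: searching the discrete space {0, 1/2, 1}^V already attains the LP optimum. The brute-force search below is purely illustrative (the point of the work above is that such relaxations can be solved in polynomial time), and the names are made up.

```python
from itertools import product

def vc_lp_over_half_integers(n, edges):
    # Search the discrete relaxed space {0, 1/2, 1}^V for the cheapest
    # fractional vertex cover.  By half-integrality of the vertex cover LP
    # (Nemhauser and Trotter, 1975), the optimum over this discrete grid
    # equals the true LP optimum.
    best = float("inf")
    for x in product((0.0, 0.5, 1.0), repeat=n):
        if all(x[u] + x[v] >= 1.0 for u, v in edges):
            best = min(best, sum(x))
    return best
```

On a 5-cycle, for instance, the cheapest integral cover costs 3, while the half-integral search finds the all-halves solution of cost 2.5, the LP optimum; the integrality gap comes entirely from the 1/2 values.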
Additionally, we propose a network flow-based approach to solve some cases of
the relaxation problem. This gives the first linear-time FPT algorithm to
edge-deletion Unique Label Cover.
Comment: Added results on linear-time FPT algorithms (not present in SODA paper).
Combinatorial persistency criteria for multicut and max-cut
In combinatorial optimization, partial variable assignments are called
persistent if they agree with some optimal solution. We propose persistency
criteria for the multicut and max-cut problem as well as fast combinatorial
routines to verify them. The criteria that we derive are based on mappings that
improve feasible multicuts, respectively cuts. Our elementary criteria can be
checked enumeratively. The more advanced ones rely on fast algorithms for upper
and lower bounds for the respective cut problems and max-flow techniques for
auxiliary min-cut problems. Our methods can be used as a preprocessing
technique for reducing problem sizes or for computing partial optimality
guarantees for solutions output by heuristic solvers. We show the efficacy of
our methods on instances of both problems from computer vision, biomedical
image analysis, and statistical physics.
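The elementary, enumeratively checkable criteria mentioned above can be illustrated by a brute-force persistency check on a toy max-cut instance. The helper names and the "two vertices on the same side" form of partial assignment are illustrative assumptions; the paper's actual criteria avoid full enumeration.

```python
from itertools import product

def optimal_cuts(n, weighted_edges):
    # Enumerate all 2-colorings and keep those attaining the maximum cut value.
    best, argmax = float("-inf"), []
    for col in product((0, 1), repeat=n):
        value = sum(w for u, v, w in weighted_edges if col[u] != col[v])
        if value > best:
            best, argmax = value, [col]
        elif value == best:
            argmax.append(col)
    return best, argmax

def persistent_same_side(n, weighted_edges, u, v):
    # The partial assignment "u and v lie on the same side" is persistent
    # iff it agrees with at least one optimal cut.
    _, opts = optimal_cuts(n, weighted_edges)
    return any(col[u] == col[v] for col in opts)
```

On a triangle with weights w(0,1) = w(1,2) = 1 and w(0,2) = 2, every optimal cut separates 0 from 2, so "0 and 2 on the same side" is not persistent, while "1 and 2 on the same side" is.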