Evaluating a 2-approximation algorithm for edge-separators in planar graphs
In this paper we report on results obtained by an implementation of a 2-approximation algorithm for edge separators in planar graphs. For 374 out of the 435 instances the algorithm returned the optimum solution; for the remaining instances the solution returned was never more than 10.6% away from the lower bound on the optimum separator. We also improve the worst-case running time of the algorithm and present techniques which improve the running time significantly in practice.
A Polynomial-time Bicriteria Approximation Scheme for Planar Bisection
Given an undirected graph with edge costs and node weights, the minimum
bisection problem asks for a partition of the nodes into two parts of equal
weight such that the sum of edge costs between the parts is minimized. We give
a polynomial time bicriteria approximation scheme for bisection on planar
graphs.
Specifically, let W be the total weight of all nodes in a planar graph G. For any constant ε > 0, our algorithm outputs a bipartition of the nodes such that each part weighs at most (1/2 + ε)W and the total cost of edges crossing the partition is at most (1 + ε) times the total cost of the optimal bisection. The previously best known approximation for planar minimum bisection, even with unit node weights, was O(log n). Our algorithm actually solves a more general problem where the input may include a target weight for the smaller side of the bipartition.
Comment: To appear in STOC 2015
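A bicriteria solution of this kind can be checked directly: given node weights, edge costs, and a proposed two-part split, one computes each side's weight and the total cost of crossing edges. The following minimal Python sketch (toy data and a helper name of our own, not the paper's algorithm) does exactly that:

```python
# Sketch: evaluate a candidate bipartition of a node-weighted,
# edge-costed graph. Toy data; not the paper's algorithm.

def evaluate_bipartition(weights, edge_costs, side_a):
    """weights: {node: weight}; edge_costs: {(u, v): cost}; side_a: set of
    nodes on one side. Returns (weight of A, weight of B, crossing cost)."""
    weight_a = sum(w for v, w in weights.items() if v in side_a)
    weight_b = sum(weights.values()) - weight_a
    cut_cost = sum(c for (u, v), c in edge_costs.items()
                   if (u in side_a) != (v in side_a))
    return weight_a, weight_b, cut_cost

# Toy planar graph: a 4-cycle with unit node weights and edge costs.
weights = {1: 1, 2: 1, 3: 1, 4: 1}
edge_costs = {(1, 2): 1, (2, 3): 1, (3, 4): 1, (4, 1): 1}
print(evaluate_bipartition(weights, edge_costs, {1, 2}))  # (2, 2, 2)
```

A bisection algorithm's output would be fed to such a check to confirm both the weight balance and the cut-cost guarantee.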
On the computational tractability of a geographic clustering problem arising in redistricting
Redistricting is the problem of dividing a state into a number of
regions, called districts. Voters in each district elect a representative. The
primary criteria are: each district is connected, district populations are
equal (or nearly equal), and districts are "compact". There are multiple
competing definitions of compactness, usually minimizing some quantity.
One measure that has been recently promoted by Duchin and others is the number of
cut edges. In redistricting, one is given atomic regions out of which each
district must be built. The populations of the atomic regions are given.
Consider the graph with one vertex per atomic region (with weight equal to the
region's population) and an edge between atomic regions that share a boundary.
A districting plan is a partition of vertices into k parts, each connected,
of nearly equal weight. The districts are considered compact to the extent that
the plan minimizes the number of edges crossing between different parts.
Consider two problems: find the most compact districting plan, and sample
districting plans under a compactness constraint uniformly at random. Both
problems are NP-hard so we restrict the input graph to have branchwidth at most
w. (A planar graph's branchwidth is bounded by its diameter.) If both k and w
are bounded by constants, the problems are solvable in polynomial time.
Assume vertices have weight 1. One would like algorithms whose running times
are of the form O(f(k, w) n^c) for some constant c independent of k and w,
in which case the problems are said to be fixed-parameter tractable with
respect to k and w. We show that, under a complexity-theoretic assumption,
no such algorithms exist. However, we do give algorithms with running time
O(f(w) n^{k+1}). Thus if the diameter of the graph is moderately small and the
number of districts is very small, our algorithm is usable.
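The graph model used here is easy to make concrete. Below is a minimal Python sketch (toy data; `is_connected` and `cut_edges` are our own illustrative helpers, not from the paper) that checks that a plan's parts are connected and counts its cut edges:

```python
# Sketch: validate a districting plan (k connected, nearly equal-weight
# parts) and count its cut edges. Toy data; helpers are our own.
from collections import deque

def is_connected(nodes, adj):
    """BFS check that the induced subgraph on `nodes` is connected."""
    nodes = set(nodes)
    seen, queue = set(), deque([next(iter(nodes))])
    while queue:
        v = queue.popleft()
        if v in seen:
            continue
        seen.add(v)
        queue.extend(u for u in adj[v] if u in nodes)
    return seen == nodes

def cut_edges(parts, edges):
    """Number of edges whose endpoints lie in different parts."""
    part_of = {v: i for i, part in enumerate(parts) for v in part}
    return sum(1 for u, v in edges if part_of[u] != part_of[v])

# A 2x3 grid of atomic regions (unit populations), split into k = 2 districts.
edges = [(0, 1), (1, 2), (3, 4), (4, 5), (0, 3), (1, 4), (2, 5)]
adj = {v: set() for v in range(6)}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

plan = [{0, 1, 2}, {3, 4, 5}]
print(all(is_connected(p, adj) for p in plan), cut_edges(plan, edges))  # True 3
```

Finding the plan minimizing this cut-edge count (or sampling plans under a bound on it) is the hard part; the check above only scores a given plan.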
Mimicking Networks and Succinct Representations of Terminal Cuts
Given a large edge-weighted network G with k terminal vertices, we wish
to compress it and store, using little memory, the value of the minimum cut (or
equivalently, maximum flow) between every bipartition of terminals. One
appealing methodology to implement a compression of G is to construct a
"mimicking network": a small network G' with the same k terminals, in
which the minimum cut value between every bipartition of terminals is the same
as in G. This notion was introduced by Hagerup, Katajainen, Nishimura, and
Ragde [JCSS '98], who proved that such G' of size at most 2^{2^k} always
exists. Obviously, by having access to the smaller network G', certain
computations involving cuts can be carried out much more efficiently.
We provide several new bounds, which together narrow the previously known gap
from doubly-exponential to only singly-exponential, both for planar and for
general graphs. Our first and main result is that every k-terminal planar
network admits a mimicking network of size O(k^2 2^{2k}), which is
moreover a minor of G. On the other hand, some planar networks require
size 2^{Ω(k)}. For general networks, we show that certain bipartite
graphs only admit mimicking networks of size 2^{Ω(k)}, and
moreover, every data structure that stores the minimum cut value between all
bipartitions of the terminals must use 2^{Ω(k)} machine words.
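To make the compressed object concrete: a mimicking network must reproduce the entire table of minimum cut values over terminal bipartitions. The sketch below computes that table by brute force (one Edmonds-Karp max-flow computation per bipartition) on a toy star graph; the function names and data are ours for illustration, not the paper's construction:

```python
# Sketch: the table of terminal min-cut values that a mimicking
# network must preserve, computed by brute force on a toy graph.
# Illustration only -- this is not the paper's construction.
from collections import deque
from itertools import combinations

def max_flow(cap, s, t):
    """Edmonds-Karp on a residual-capacity dict-of-dicts (modified in place).
    Returns the max-flow value from s to t, which equals the min-cut value."""
    flow = 0
    while True:
        parent, queue = {s: None}, deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow
        path, v = [], t  # recover the augmenting path found by BFS
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(cap[u][v] for u, v in path)
        for u, v in path:  # push the bottleneck, update residuals
            cap[u][v] -= aug
            cap[v][u] = cap[v].get(u, 0) + aug
        flow += aug

def terminal_cut_values(edges, terminals):
    """edges: {(u, v): capacity}, undirected. Returns {S: min cut separating
    terminal subset S from the rest}, one entry per bipartition."""
    INF = float('inf')
    t0, rest = terminals[0], terminals[1:]
    table = {}
    for r in range(len(rest) + 1):
        for extra in combinations(rest, r):
            side = frozenset((t0,) + extra)
            if side == frozenset(terminals):
                continue  # not a bipartition
            cap = {}
            def add(u, v, c):
                cap.setdefault(u, {})
                cap[u][v] = cap[u].get(v, 0) + c
            for (u, v), c in edges.items():
                add(u, v, c); add(v, u, c)
            for x in terminals:  # attach super-source 'S' / super-sink 'T'
                if x in side:
                    add('S', x, INF); add(x, 'S', 0)
                else:
                    add(x, 'T', INF); add('T', x, 0)
            table[side] = max_flow(cap, 'S', 'T')
    return table

# Star network: center 0 joined to terminals 1, 2, 3 with capacities 1, 2, 3.
edges = {(0, 1): 1, (0, 2): 2, (0, 3): 3}
table = terminal_cut_values(edges, [1, 2, 3])
print(table[frozenset({1})], table[frozenset({1, 2})])  # 1 3
```

A mimicking network G' is any small network whose corresponding table, computed the same way, is identical entry for entry; the paper's results bound how small such a G' can be.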
On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting
Redistricting is the problem of dividing a state into a given number k of regions (called districts) where the voters in each district are to elect a representative. The three primary criteria are: that each district be connected, that the populations of the districts be equal (or nearly equal), and that the districts be "compact". There are multiple competing definitions of compactness, usually minimizing some quantity.
One measure that has recently been used is the number of cut edges. In this formulation of redistricting, one is given atomic regions out of which each district must be built (e.g., in the U.S., census blocks). The populations of the atomic regions are given. Consider the graph with one vertex per atomic region and an edge between atomic regions with a shared boundary of positive length. Define the weight of a vertex to be the population of the corresponding region. A districting plan is a partition of vertices into k pieces so that the parts have nearly equal weights and each part is connected. The districts are considered compact to the extent that the plan minimizes the number of edges crossing between different parts.
There are two natural computational problems: find the most compact districting plan, and sample districting plans (possibly under a compactness constraint) uniformly at random.
Both problems are NP-hard so we consider restricting the input graph to have branchwidth at most w. (A planar graph's branchwidth is bounded, for example, by its diameter.) If both k and w are bounded by constants, the problems are solvable in polynomial time. In this paper, we give lower and upper bounds that characterize the complexity of these problems in terms of the parameters k and w. For simplicity of notation, assume that each vertex has unit weight. We would ideally like algorithms whose running times are of the form O(f(k,w) n^c) for some constant c independent of k and w (in which case the problems are said to be fixed-parameter tractable with respect to those parameters). We show that, under standard complexity-theoretic assumptions, no such algorithms exist. However, the problems are fixed-parameter tractable with respect to each of these parameters individually: there exist algorithms with running times of the form O(f(k) n^{O(w)}) and O(f(w) n^{k+1}). The first result was previously known. The new one, however, is more relevant to the application to redistricting, at least for coarse instances. Indeed, we have implemented a version of the algorithm and have used it to successfully find optimally compact solutions to all redistricting instances for France (except Paris, which operates under different rules) under various population-balance constraints. For these instances, the values for w are modest and the values for k are very small.