Brief Announcement: Almost-Tight Approximation Distributed Algorithm for Minimum Cut
In this short paper, we present an improved algorithm for approximating the
minimum cut on distributed (CONGEST) networks. Let $\lambda$ be the minimum
cut. Our algorithm can compute $\lambda$ exactly in
$\tilde{O}((\sqrt{n}+D)\,\mathrm{poly}(\lambda))$ time, where $n$ is the number of nodes
(processors) in the network, $D$ is the network diameter, and $\tilde{O}$ hides
$\mathrm{poly}\log n$ factors. By a standard reduction, we can convert this algorithm into a
$(1+\epsilon)$-approximation $\tilde{O}((\sqrt{n}+D)/\mathrm{poly}(\epsilon))$-time
algorithm. The latter result improves over the previous
$(2+\epsilon)$-approximation $\tilde{O}((\sqrt{n}+D)/\mathrm{poly}(\epsilon))$-time
algorithm of Ghaffari and Kuhn [DISC 2013]. Due to the lower bound of
$\tilde{\Omega}(\sqrt{n}+D)$ by Das Sarma et al. [SICOMP 2013], this running
time is {\em tight} up to a $\mathrm{poly}\log n$ factor. Our algorithm is an extremely
simple combination of Thorup's tree packing theorem [Combinatorica 2007],
Kutten and Peleg's tree partitioning algorithm [J. Algorithms 1998], and
Karger's dynamic programming [JACM 2000].
Comment: To appear as a brief announcement at PODC 201
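To make the quantity $\lambda$ concrete, here is a deliberately naive sketch that computes the minimum cut of a tiny graph by enumerating all vertex bipartitions. It illustrates only the definition of the edge connectivity $\lambda$, not the distributed algorithm of the abstract; the function name and example graph are made up for illustration.

```python
import itertools

def min_cut_value(n, edges):
    """Exact global minimum cut of an undirected weighted graph,
    found by enumerating every proper vertex bipartition.
    Exponential time; usable only for tiny graphs."""
    best = float("inf")
    for r in range(1, n // 2 + 1):
        for side in itertools.combinations(range(n), r):
            s = set(side)
            # Total weight of edges crossing the bipartition (s, V \ s).
            cut = sum(w for u, v, w in edges if (u in s) != (v in s))
            best = min(best, cut)
    return best

# A 4-cycle with unit weights: every vertex has degree 2, so lambda = 2.
edges = [(0, 1, 1), (1, 2, 1), (2, 3, 1), (3, 0, 1)]
print(min_cut_value(4, edges))  # -> 2
```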
Pricing for Online Resource Allocation: Intervals and Paths
We present pricing mechanisms for several online resource allocation problems
which obtain tight or nearly tight approximations to social welfare. In our
settings, buyers arrive online and purchase bundles of items; buyers' values
for the bundles are drawn from known distributions. This problem is closely
related to the so-called prophet-inequality of Krengel and Sucheston and its
extensions in recent literature. Motivated by applications to cloud economics,
we consider two kinds of buyer preferences. In the first, items correspond to
different units of time at which a resource is available; the items are
arranged in a total order and buyers desire intervals of items. The second
corresponds to bandwidth allocation over a tree network; the items are edges in
the network and buyers desire paths.
Because buyers' preferences have complementarities in the settings we
consider, recent constant-factor approximations via item prices do not apply,
and indeed strong negative results are known. We develop static, anonymous
bundle pricing mechanisms.
For the interval preferences setting, we show that static, anonymous bundle
pricings achieve a sublogarithmic competitive ratio, which is optimal (within
constant factors) over the class of all online allocation algorithms, truthful
or not. For the path preferences setting, we obtain a nearly-tight logarithmic
competitive ratio. Both of these results exhibit an exponential improvement
over item pricings for these settings. Our results extend to settings where the
seller has multiple copies of each item, with the competitive ratio decreasing
linearly with supply. Such a gradual tradeoff between supply and the
competitive ratio for welfare was previously known only for the single item
prophet inequality.
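The single-item prophet inequality that this line of work builds on can be checked with a small exact computation: posting the static price $E[\max]/2$ guarantees at least half of the prophet's expected welfare. A minimal sketch with hypothetical two-point value distributions (all numbers are illustrative):

```python
import itertools

def expected_welfare_at_price(price, dists):
    """Exact expected welfare of a static anonymous price: buyers arrive
    in order and the first one whose value meets the price buys.
    Each dist is a list of (value, probability) pairs."""
    welfare = 0.0
    for outcome in itertools.product(*dists):
        prob = 1.0
        for _, p in outcome:
            prob *= p
        sold = next((v for v, _ in outcome if v >= price), 0.0)
        welfare += prob * sold
    return welfare

def expected_max(dists):
    """The prophet's benchmark: E[max of all buyer values]."""
    total = 0.0
    for outcome in itertools.product(*dists):
        prob = 1.0
        for _, p in outcome:
            prob *= p
        total += prob * max(v for v, _ in outcome)
    return total

# Buyer 1 has value 1 for sure; buyer 2 has value 4 with probability 1/4.
dists = [[(1.0, 1.0)], [(4.0, 0.25), (0.0, 0.75)]]
opt = expected_max(dists)        # E[max] = 0.25*4 + 0.75*1 = 1.75
price = opt / 2                  # classic half-of-E[max] threshold
alg = expected_welfare_at_price(price, dists)
print(alg, opt)                  # -> 1.0 1.75 (ratio above 1/2)
```

The guarantee `alg >= opt / 2` holds for any product distribution, which is exactly the single-item tradeoff the abstract says was previously the only known gradual supply/ratio tradeoff.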
Tight Bounds for Gomory-Hu-like Cut Counting
By a classical result of Gomory and Hu (1961), in every edge-weighted graph
$G=(V,E,w)$, the minimum $st$-cut values, when ranging over all $s,t \in V$,
take at most $|V|-1$ distinct values. That is, these $\binom{|V|}{2}$ instances
exhibit redundancy factor $\Omega(|V|)$. They further showed how to construct
from $G$ a tree $(T,w')$ that stores all minimum $st$-cut values. Motivated
by this result, we obtain tight bounds for the redundancy factor of several
generalizations of the minimum $st$-cut problem.
1. Group-Cut: Consider the minimum $(A,B)$-cut, ranging over all subsets
$A,B \subseteq V$ of given sizes $|A|=\alpha$ and $|B|=\beta$. The redundancy
factor is .
2. Multiway-Cut: Consider the minimum cut separating every two vertices of
$S \subseteq V$, ranging over all subsets of a given size $|S|=k$. The
redundancy factor is .
3. Multicut: Consider the minimum cut separating every demand-pair in
$D \subseteq V \times V$, ranging over collections of $k$ demand pairs. The
redundancy factor is . This result is a bit surprising, as
the redundancy factor is much larger than in the first two problems.
A natural application of these bounds is to construct small data structures
that store all relevant cut values, like the Gomory-Hu tree. We initiate this
direction by giving some upper and lower bounds.
Comment: This version contains additional references to previous work (which
have some overlap with our results); see Bibliographic Update 1.
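The Gomory-Hu redundancy phenomenon is easy to verify directly on a small graph: computing all $\binom{|V|}{2}$ minimum $st$-cut values via max-flow yields at most $|V|-1$ distinct values. A sketch using a plain Edmonds-Karp max-flow; the example graph and names are illustrative, not from the paper:

```python
from collections import deque
from itertools import combinations

def max_flow(n, cap, s, t):
    """Edmonds-Karp maximum flow; by max-flow/min-cut the returned value
    equals the minimum s-t cut. `cap` maps ordered pairs (u, v) to
    capacity and is stored symmetrically for the undirected graph."""
    flow = {e: 0 for e in cap}
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in range(n):
                if v not in parent and cap.get((u, v), 0) - flow.get((u, v), 0) > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return total
        # Trace the path back from t and push the bottleneck amount.
        path = []
        v = t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[(u, v)] - flow[(u, v)] for u, v in path)
        for u, v in path:
            flow[(u, v)] += push
            flow[(v, u)] -= push
        total += push

# Illustrative unit-capacity graph: a 5-cycle plus the chord (1, 3).
n = 5
cap = {}
for u, v in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]:
    cap[(u, v)] = cap.get((u, v), 0) + 1
    cap[(v, u)] = cap.get((v, u), 0) + 1

values = {max_flow(n, cap, s, t) for s, t in combinations(range(n), 2)}
print(sorted(values))  # 10 vertex pairs, yet at most n - 1 = 4 distinct values
```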
Fat Polygonal Partitions with Applications to Visualization and Embeddings
Let $\mathcal{T}$ be a rooted and weighted tree, where the weight of any node
is equal to the sum of the weights of its children. The popular Treemap
algorithm visualizes such a tree as a hierarchical partition of a square into
rectangles, where the area of the rectangle corresponding to any node in
$\mathcal{T}$ is equal to the weight of that node. The aspect ratio of the
rectangles in such a rectangular partition necessarily depends on the weights
and can become arbitrarily high.
We introduce a new hierarchical partition scheme, called a polygonal
partition, which uses convex polygons rather than just rectangles. We present
two methods for constructing polygonal partitions, both having guarantees on
the worst-case aspect ratio of the constructed polygons; in particular, both
methods guarantee a bound on the aspect ratio that is independent of the
weights of the nodes.
We also consider rectangular partitions with slack, where the areas of the
rectangles may differ slightly from the weights of the corresponding nodes. We
show that this makes it possible to obtain partitions with constant aspect
ratio. This result generalizes to hyper-rectangular partitions in
$\mathbb{R}^d$. We use these partitions with slack for embedding ultrametrics
into $d$-dimensional Euclidean space: we give a
$\mathrm{polylog}(\Delta)$-approximation algorithm for embedding $n$-point
ultrametrics into $\mathbb{R}^d$ with minimum distortion, where $\Delta$
denotes the spread of the metric, i.e., the ratio between the largest and the
smallest distance between two points. The previously best-known approximation
ratio for this problem was polynomial in $n$. This is the first algorithm for
embedding a non-trivial family of weighted-graph metrics into a space of
constant dimension that achieves polylogarithmic approximation ratio.
Comment: 26 pages
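The aspect-ratio problem with plain rectangular partitions can be seen in a few lines: a classic slice-and-dice Treemap layout preserves areas exactly, but its strips become arbitrarily thin under skewed weights. A minimal one-level sketch (function name and weights are illustrative; a full treemap would recurse into each rectangle, alternating the cut direction):

```python
def slice_and_dice(weights, x, y, w, h, horizontal=True):
    """Split the rectangle (x, y, w, h) into strips whose areas are
    proportional to the weights. Returns (x, y, w, h) tuples."""
    total = sum(weights)
    rects = []
    offset = 0.0
    for wt in weights:
        frac = wt / total
        if horizontal:   # side-by-side vertical strips
            rects.append((x + offset, y, w * frac, h))
            offset += w * frac
        else:            # stacked horizontal strips
            rects.append((x, y + offset, w, h * frac))
            offset += h * frac
    return rects

# Unit square with skewed weights: areas match the weight fractions,
# but the thin strips' aspect ratio grows linearly with the skew.
weights = [98, 1, 1]
rects = slice_and_dice(weights, 0.0, 0.0, 1.0, 1.0)
areas = [rw * rh for _, _, rw, rh in rects]
aspect = max(max(rw / rh, rh / rw) for _, _, rw, rh in rects)
print(areas, aspect)  # each thin strip is 0.01 wide and 1.0 tall
```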
Minimum Cuts in Near-Linear Time
We significantly improve known time bounds for solving the minimum cut
problem on undirected graphs. We use a ``semi-duality'' between minimum cuts
and maximum spanning tree packings combined with our previously developed
random sampling techniques. We give a randomized algorithm that finds a minimum
cut in an m-edge, n-vertex graph with high probability in O(m log^3 n) time. We
also give a simpler randomized algorithm that finds all minimum cuts with high
probability in O(n^2 log n) time. This variant has an optimal RNC
parallelization. Both variants improve on the previous best time bound of O(n^2
log^3 n). Other applications of the tree-packing approach are new, nearly tight
bounds on the number of near minimum cuts a graph may have and a new data
structure for representing them in a space-efficient manner.
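For context, the contraction primitive underlying Karger's earlier randomized min-cut work can be sketched in a few lines. This is the basic Monte Carlo contraction algorithm, not the $O(m \log^3 n)$ tree-packing algorithm of this paper; the example graph is illustrative.

```python
import random

def karger_min_cut(edges, n, trials=300, seed=0):
    """Karger's random contraction: repeatedly merge the endpoints of a
    uniformly random edge until two supernodes remain; the surviving
    edges form a cut. Repeating `trials` times finds the minimum cut
    with high probability. Unweighted multigraph as an edge list."""
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(trials):
        parent = list(range(n))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path compression
                x = parent[x]
            return x

        remaining = n
        pool = list(edges)
        while remaining > 2:
            u, v = pool[rng.randrange(len(pool))]
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                remaining -= 1
            # Discard self-loops created by the contraction.
            pool = [(a, b) for a, b in pool if find(a) != find(b)]
        best = min(best, len(pool))  # edges crossing the final 2-way cut
    return best

# Two triangles joined by a single bridge edge: the minimum cut is 1.
edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3), (2, 3)]
print(karger_min_cut(edges, 6))  # finds the bridge with high probability
```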