Fast and Deterministic Approximations for k-Cut
In an undirected graph, a k-cut is a set of edges whose removal breaks the graph into at least k connected components. The minimum weight k-cut can be computed in n^O(k) time, but when k is treated as part of the input, computing the minimum weight k-cut is NP-hard [Goldschmidt and Hochbaum, 1994]. For poly(m,n,k)-time algorithms, the best possible approximation factor is essentially 2 under the small set expansion hypothesis [Manurangsi, 2017]. Saran and Vazirani [1995] showed that a (2 - 2/k)-approximately minimum weight k-cut can be computed via O(k) minimum cuts, which implies an O~(km) randomized running time via the nearly linear time randomized min-cut algorithm of Karger [2000]. Nagamochi and Kamidoi [2007] showed that a (2 - 2/k)-approximately minimum weight k-cut can be computed deterministically in O(mn + n^2 log n) time. These results prompt two basic questions. The first concerns the role of randomization: is there a deterministic algorithm for 2-approximate k-cuts matching the randomized running time of O~(km)? The second qualitatively compares minimum cut to 2-approximate minimum k-cut: can 2-approximate k-cuts be computed as fast as the minimum cut, i.e., in O~(m) randomized time?
We give a deterministic approximation algorithm that computes (2 + eps)-approximate minimum k-cuts in O(m log^3 n / eps^2) time, via a (1 + eps)-approximation for an LP relaxation of k-cut.
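To make the Saran-Vazirani reduction concrete, here is a minimal sketch of the greedy (2 - 2/k)-approximation it analyzes: repeatedly compute a minimum cut inside each current component and apply the cheapest one until the graph falls into k pieces. The sketch assumes networkx's Stoer-Wagner min-cut routine and illustrative function names; it is not the deterministic algorithm of this paper.

```python
# A minimal sketch of the Saran-Vazirani greedy (2 - 2/k)-approximation,
# assuming networkx; not the deterministic algorithm of this paper.
import networkx as nx

def greedy_k_cut(G: nx.Graph, k: int) -> float:
    """Total weight of a (2 - 2/k)-approximate minimum k-cut of G."""
    # Start from the connected components of G itself.
    components = [G.subgraph(c).copy() for c in nx.connected_components(G)]
    total = 0.0
    while len(components) < k:
        best = None  # (cut value, component index, partition)
        for i, H in enumerate(components):
            if H.number_of_nodes() < 2:
                continue  # singletons cannot be split further
            value, partition = nx.stoer_wagner(H)
            if best is None or value < best[0]:
                best = (value, i, partition)
        if best is None:
            raise ValueError("graph has fewer than k vertices")
        value, i, (side_a, side_b) = best
        total += value
        H = components.pop(i)
        for side in (side_a, side_b):
            S = H.subgraph(side).copy()
            # A side may itself be disconnected; keep its pieces separate.
            components.extend(S.subgraph(c).copy() for c in nx.connected_components(S))
    return total
```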
Truthful Facility Assignment with Resource Augmentation: An Exact Analysis of Serial Dictatorship
We study the truthful facility assignment problem, where a set of agents with
private most-preferred points on a metric space are assigned to facilities that
lie on the metric space, under capacity constraints on the facilities. The goal
is to produce such an assignment that minimizes the social cost, i.e., the
total distance between the most-preferred points of the agents and their
corresponding facilities in the assignment, under the constraint of
truthfulness, which ensures that agents do not misreport their most-preferred
points.
We propose a resource augmentation framework, where a truthful mechanism is
evaluated by its worst-case performance on an instance with enhanced facility
capacities against the optimal mechanism on the same instance with the original
capacities. We study a very well-known mechanism, Serial Dictatorship, and
provide an exact analysis of its performance. Although Serial Dictatorship is a
purely combinatorial mechanism, our analysis uses linear programming; a linear
program expresses its greedy nature as well as the structure of the input, and
finds the input instance that forces the mechanism to exhibit its worst-case
performance. Bounding the objective of the linear program using duality
arguments allows us to compute tight bounds on the approximation ratio. Among
other results, we prove that Serial Dictatorship has approximation ratio
g/(g - 2) when the capacities are multiplied by any integer g >= 3. Our
results suggest that even a limited augmentation of the resources can have
wondrous effects on the performance of the mechanism and in particular, the
approximation ratio goes to 1 as the augmentation factor becomes large. We
complement our results with bounds on the approximation ratio of Random Serial
Dictatorship, the randomized version of Serial Dictatorship, when there is no
resource augmentation.
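For concreteness, a minimal sketch of the Serial Dictatorship mechanism studied above, assuming a one-dimensional metric for illustration: agents are processed in a fixed order and each is assigned to the nearest facility with spare capacity. The function name and 1-D distance are assumptions, not the paper's formulation.

```python
# A minimal sketch of Serial Dictatorship on the real line (a 1-D metric
# is an assumption for illustration; the paper works on general metrics).
def serial_dictatorship(agent_points, facility_points, capacities):
    """Assign agents, in a fixed order, to the nearest facility with
    spare capacity; assumes sum(capacities) >= number of agents."""
    remaining = list(capacities)
    assignment, cost = [], 0.0
    for x in agent_points:  # the fixed "dictatorship" order
        j = min(
            (j for j in range(len(facility_points)) if remaining[j] > 0),
            key=lambda j: abs(x - facility_points[j]),
        )
        remaining[j] -= 1
        assignment.append(j)
        cost += abs(x - facility_points[j])  # social cost contribution
    return assignment, cost
```

The greedy, order-dependent structure visible here is exactly what the paper's linear program encodes when searching for a worst-case instance.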
Survivability in Time-varying Networks
Time-varying graphs are a useful model for networks with dynamic connectivity
such as vehicular networks, yet, despite their great modeling power, many
important features of time-varying graphs are still poorly understood. In this
paper, we study the survivability properties of time-varying networks against
unpredictable interruptions. We first show that the traditional definition of
survivability is not effective in time-varying networks, and propose a new
survivability framework. To evaluate the survivability of time-varying networks
under the new framework, we propose two metrics that are analogous to MaxFlow
and MinCut in static networks. We show that some fundamental
survivability-related results such as Menger's Theorem only conditionally hold
in time-varying networks. Then we analyze the complexity of computing the
proposed metrics and develop several approximation algorithms. Finally, we
conduct trace-driven simulations to demonstrate the application of our
survivability framework to the robust design of a real-world bus communication
network.
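As a hedged illustration of how a MaxFlow-like metric on a time-varying network can be evaluated, the sketch below expands the network into a static time-expanded graph, with one copy of each node per time step, and runs an ordinary max-flow. This is a textbook construction, not necessarily the one used in the paper; networkx is assumed.

```python
# A hedged sketch: evaluate a MaxFlow-like metric for a time-varying
# network via a static time-expanded graph (one node copy per time step).
import networkx as nx

def time_expanded_max_flow(contacts, source, sink, horizon, cap=1.0):
    """contacts: iterable of (u, v, t) links available at time step t;
    assumes horizon >= 2."""
    G = nx.DiGraph()
    nodes = {u for u, v, t in contacts} | {v for u, v, t in contacts}
    nodes |= {source, sink}
    # Waiting arcs: a node may hold data from one step to the next.
    for u in nodes:
        for t in range(horizon - 1):
            G.add_edge((u, t), (u, t + 1), capacity=float("inf"))
    # Contact arcs: a transmission along a link active at time t.
    for u, v, t in contacts:
        if 0 <= t < horizon - 1:
            G.add_edge((u, t), (v, t + 1), capacity=cap)
            G.add_edge((v, t), (u, t + 1), capacity=cap)
    value, _ = nx.maximum_flow(G, (source, 0), (sink, horizon - 1))
    return value
```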
Throughput Optimal On-Line Algorithms for Advanced Resource Reservation in Ultra High-Speed Networks
Advanced channel reservation is emerging as an important feature of ultra
high-speed networks requiring the transfer of large files. Applications include
scientific data transfers and database backup. In this paper, we present two
new, on-line algorithms for advanced reservation, called BatchAll and BatchLim,
that are guaranteed to achieve optimal throughput performance, based on
multi-commodity flow arguments. Both algorithms are shown to have
polynomial-time complexity and provable bounds on the maximum delay for
(1 + epsilon)-bandwidth-augmented networks. The BatchLim algorithm returns the
completion time of a connection immediately as a request is placed, but at the
expense of a slightly looser competitive ratio than that of BatchAll. We also
present a simple approach that limits the number of parallel paths used by the
algorithms while provably bounding the maximum reduction factor in the
transmission throughput. We show that, although the number of different paths
can be exponentially large, the actual number of paths needed to approximate
the flow is quite small and proportional to the number of edges in the network.
Simulations for a number of topologies show that, in practice, 3 to 5 parallel
paths are sufficient to achieve close to optimal performance. The performance
of the competitive algorithms is also compared to a greedy benchmark, both
through analysis and simulation.
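The path-limiting result rests on the classic fact that any s-t flow decomposes into at most |E| paths (plus cycles), since each extracted path saturates at least one arc. A minimal sketch of the greedy decomposition follows, under the assumption of a valid acyclic flow given as arc values; names are illustrative.

```python
# A minimal sketch of greedy flow-path decomposition, assuming a valid
# acyclic s-t flow given as positive values on arcs; each extracted path
# saturates at least one arc, so at most |E| paths are produced.
def decompose_flow(flow, s, t, eps=1e-12):
    """flow: dict mapping arc (u, v) -> positive flow value."""
    flow = {e: f for e, f in flow.items() if f > eps}
    succ = {}
    for (u, v) in flow:
        succ.setdefault(u, []).append(v)
    paths = []
    while any(u == s for (u, _) in flow):
        # Walk forward from s along arcs that still carry flow.
        path, u = [s], s
        while u != t:
            u = next(v for v in succ[u] if flow.get((u, v), 0) > eps)
            path.append(u)
        arcs = list(zip(path, path[1:]))
        bottleneck = min(flow[a] for a in arcs)
        for a in arcs:
            flow[a] -= bottleneck
            if flow[a] <= eps:
                del flow[a]
        paths.append((path, bottleneck))
    return paths
```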
Fat Polygonal Partitions with Applications to Visualization and Embeddings
Let T be a rooted and weighted tree, where the weight of any node is equal
to the sum of the weights of its children. The popular Treemap algorithm
visualizes such a tree as a hierarchical partition of a square into
rectangles, where the area of the rectangle corresponding to any node in T
is equal to the weight of that node. The aspect ratio of the rectangles in
such a rectangular partition necessarily depends on the weights and can
become arbitrarily high.
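For intuition, a minimal sketch of the classic slice-and-dice Treemap layout, whose aspect ratios can degrade exactly as noted above: each rectangle is divided among the children in proportion to their weights, alternating cut directions by depth. The dict-based tree format is an assumption for illustration.

```python
# A minimal sketch of the classic slice-and-dice Treemap layout. If the
# square has area equal to the root weight, each node's rectangle area
# equals its weight, but slices can become arbitrarily thin (high aspect
# ratio) for skewed weights.
def slice_and_dice(node, x, y, w, h, depth=0, out=None):
    """node: {'weight': float, 'children': [...]}; rectangle = (x, y, w, h)."""
    if out is None:
        out = []
    out.append((node, (x, y, w, h)))
    children = node.get("children", [])
    if children:
        total = sum(c["weight"] for c in children)
        offset = 0.0
        for c in children:
            frac = c["weight"] / total
            if depth % 2 == 0:  # even depth: vertical cuts split the width
                slice_and_dice(c, x + offset * w, y, frac * w, h, depth + 1, out)
            else:               # odd depth: horizontal cuts split the height
                slice_and_dice(c, x, y + offset * h, w, frac * h, depth + 1, out)
            offset += frac
    return out
```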
We introduce a new hierarchical partition scheme, called a polygonal
partition, which uses convex polygons rather than just rectangles. We present
two methods for constructing polygonal partitions, both having guarantees on
the worst-case aspect ratio of the constructed polygons; in particular, both
methods guarantee a bound on the aspect ratio that is independent of the
weights of the nodes.
We also consider rectangular partitions with slack, where the areas of the
rectangles may differ slightly from the weights of the corresponding nodes. We
show that this makes it possible to obtain partitions with constant aspect
ratio. This result generalizes to hyper-rectangular partitions in R^d. We
use these partitions with slack for embedding ultrametrics into
d-dimensional Euclidean space: we give a polylog(Delta)-approximation
algorithm for embedding n-point ultrametrics into R^d with minimum
distortion, where Delta denotes the spread of the metric, i.e., the ratio
between the largest and the smallest distance between two points. The
previously best-known approximation ratio for this problem was polynomial
in n. This is the first algorithm for embedding a non-trivial family of
weighted-graph metrics into a space of constant dimension that achieves
polylogarithmic approximation ratio.