Online unit clustering in higher dimensions
We revisit the online Unit Clustering and Unit Covering problems in higher
dimensions: Given a set of points in a metric space that arrive one by
one, Unit Clustering asks to partition the points into the minimum number of
clusters (subsets) of diameter at most one, while Unit Covering asks to cover
all points by the minimum number of balls of unit radius. In this paper, we
work in $\mathbb{R}^d$ using the $L_\infty$ norm.
We show that the competitive ratio of any online algorithm (deterministic or
randomized) for Unit Clustering must depend on the dimension $d$. We also give
a randomized online algorithm with competitive ratio $O(d^2)$ for Unit
Clustering of integer points (i.e., points in $\mathbb{Z}^d$, $d \in \mathbb{N}$, under the $L_\infty$ norm). We show that the competitive ratio of
any deterministic online algorithm for Unit Covering is at least $2^d$. This
ratio is the best possible, as it can be attained by a simple deterministic
algorithm that assigns points to a predefined set of unit cubes. We complement
these results with some additional lower bounds for related problems in higher
dimensions.
Comment: 15 pages, 4 figures. A preliminary version appeared in the
Proceedings of the 15th Workshop on Approximation and Online Algorithms (WAOA
2017).
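The simple deterministic algorithm attaining the $2^d$ bound for Unit Covering admits a short sketch (our own minimal illustration; the function name is ours). Under the $L_\infty$ norm a unit-radius ball is an axis-aligned cube of side 2, so the predefined cubes can be taken to be the cells of a fixed side-2 grid:

```python
import math

def grid_unit_cover(points):
    """Deterministic online Unit Covering under the L-infinity norm: a
    unit-radius ball is an axis-aligned cube of side 2, so we fix a grid
    of side-2 cubes in advance and assign each arriving point to the grid
    cube containing it.  Any single unit ball meets at most 2^d grid
    cells, which yields the 2^d competitive ratio.
    """
    cubes = set()  # grid cells opened so far; cell c covers [2c, 2c+2)^d
    for p in points:
        cubes.add(tuple(math.floor(x / 2) for x in p))
    return cubes

# Adversarial example in d = 2: four points huddled around a grid corner.
# One unit ball covers them all offline, but the grid opens 2^2 = 4 cubes.
pts = [(-0.1, -0.1), (-0.1, 0.1), (0.1, -0.1), (0.1, 0.1)]
print(len(grid_unit_cover(pts)))  # 4
```

The adversarial instance also shows why the $2^d$ ratio is tight for this grid strategy: a cluster of points around a grid corner spreads over all $2^d$ incident cells.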
The Complexity of Distributed Approximation of Packing and Covering Integer Linear Programs
In this paper, we present a low-diameter decomposition algorithm in the LOCAL
model of distributed computing that succeeds with probability $1 - 1/\mathrm{poly}\, n$.
Specifically, we show how to compute an $\left(\epsilon, O\!\left(\frac{\log n}{\epsilon}\right)\right)$ low-diameter decomposition in
$O\!\left(\frac{\log^3(1/\epsilon)\log n}{\epsilon}\right)$ rounds.
Further developing our techniques, we show new distributed algorithms for
approximating general packing and covering integer linear programs in the LOCAL
model. For packing problems, our algorithm finds a $(1-\epsilon)$-approximate
solution in $O\!\left(\frac{\log^3(1/\epsilon)\log n}{\epsilon}\right)$ rounds
with probability $1 - 1/\mathrm{poly}\, n$. For covering problems, our algorithm finds a
$(1+\epsilon)$-approximate solution in $O\!\left(\frac{\log^3(1/\epsilon)\log n}{\epsilon}\right)$ rounds with probability $1 - 1/\mathrm{poly}\, n$. These results improve upon the previous $\mathrm{poly}\log n$-round algorithm by Ghaffari, Kuhn, and Maus [STOC 2017],
which is based on network decompositions.
Our algorithms are near-optimal for many fundamental combinatorial graph
optimization problems in the LOCAL model, such as minimum vertex cover and
minimum dominating set, as their $(1\pm\epsilon)$-approximate solutions
require $\Omega\!\left(\frac{\log n}{\log\log n}\right)$ rounds to compute.
Comment: To appear in PODC 2023.
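For intuition on what a low-diameter decomposition does, here is the classic sequential ball-carving sketch (our own illustration, not the paper's distributed algorithm): grow a BFS ball around an uncovered node and stop after each layer with probability $\epsilon$, so radii are geometric and only a small fraction of edges is cut:

```python
import random

def ball_carving(adj, eps, rng):
    """Sequential ball-carving sketch of a low-diameter decomposition:
    repeatedly grow a BFS ball around an uncovered node, stopping after
    each layer with probability eps.  Each edge is cut with probability
    roughly eps, and ball radii concentrate around O(log n / eps).
    """
    cluster = {}
    next_id = 0
    for s in adj:
        if s in cluster:
            continue
        cluster[s] = next_id
        frontier = [s]
        while frontier and rng.random() > eps:  # geometric stopping radius
            nxt = []
            for v in frontier:
                for u in adj[v]:
                    if u not in cluster:
                        cluster[u] = next_id
                        nxt.append(u)
            frontier = nxt
        next_id += 1
    return cluster

# Toy instance (ours): a path on 100 nodes.
adj = {i: [j for j in (i - 1, i + 1) if 0 <= j < 100] for i in range(100)}
clusters = ball_carving(adj, 0.2, random.Random(0))
cut = sum(1 for v in adj for u in adj[v] if u > v and clusters[u] != clusters[v])
print(len(set(clusters.values())), cut)  # number of clusters, edges cut
```

The LOCAL-model challenge addressed by the paper is carving all balls in parallel within few rounds, rather than one after another as above.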
On the Complexity of Local Distributed Graph Problems
This paper is centered on the complexity of graph problems in the
well-studied LOCAL model of distributed computing, introduced by Linial [FOCS
'87]. It is widely known that for many of the classic distributed graph
problems (including maximal independent set (MIS) and $(\Delta+1)$-vertex
coloring), the randomized complexity is at most polylogarithmic in the size $n$
of the network, while the best deterministic complexity is typically
$2^{O(\sqrt{\log n})}$. Understanding and narrowing down this exponential gap
is considered to be one of the central long-standing open questions in the area
of distributed graph algorithms. We investigate the problem by introducing a
complexity-theoretic framework that allows us to shed some light on the role of
randomness in the LOCAL model. We define the SLOCAL model as a sequential
version of the LOCAL model. Our framework allows us to prove completeness
results with respect to the class of problems which can be solved efficiently
in the SLOCAL model, implying that if any of the complete problems can be
solved deterministically in $\mathrm{poly}\log n$ rounds in the LOCAL model, we can
deterministically solve all efficient SLOCAL-problems (including MIS and
$(\Delta+1)$-coloring) in $\mathrm{poly}\log n$ rounds in the LOCAL model. We show
that a rather rudimentary looking graph coloring problem is complete in the
above sense: Color the nodes of a graph with colors red and blue such that each
node of sufficiently large polylogarithmic degree has at least one neighbor of
each color. The problem admits a trivial zero-round randomized solution. The
result can be viewed as showing that the only obstacle to getting efficient
deterministic algorithms in the LOCAL model is an efficient algorithm to
approximately round fractional values into integer values.
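The trivial zero-round randomized solution to the complete coloring problem above is easy to verify in simulation (a sketch on a toy graph of our own choosing): each node flips a fair coin, and a node of degree $D$ then misses one of the two colors in its neighborhood with probability only $2^{1-D}$, so nodes of sufficiently large polylogarithmic degree all succeed with high probability by a union bound:

```python
import random

def random_two_coloring(adj):
    """The trivial zero-round randomized solution: every node picks red
    or blue independently and uniformly at random."""
    return {v: random.choice(("red", "blue")) for v in adj}

def bad_nodes(adj, color, min_degree):
    """Nodes of degree at least min_degree whose neighborhood misses a
    color; a node of degree D is bad with probability 2^(1 - D)."""
    return [v for v, nbrs in adj.items()
            if len(nbrs) >= min_degree
            and {color[u] for u in nbrs} != {"red", "blue"}]

# Toy instance (ours): a hub of degree 64 attached to 64 leaves.
adj = {0: list(range(1, 65)), **{i: [0] for i in range(1, 65)}}
random.seed(1)
color = random_two_coloring(adj)
print(bad_nodes(adj, color, min_degree=16))  # almost surely []
```

The completeness result says that *derandomizing* exactly this kind of trivially randomized step is the whole difficulty in the LOCAL model.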
Families of nested completely regular codes and distance-regular graphs
In this paper infinite families of linear binary nested completely regular
codes are constructed. They have covering radius $\rho$ equal to $3$ or $4$,
and are $1/2^i$-th parts, for $i \in \{1, \ldots, u\}$, of binary (respectively,
extended binary) Hamming codes of length $n = 2^m - 1$ (respectively, $2^m$), where
$m = 2u$. In the usual way, i.e., as coset graphs, infinite families of embedded
distance-regular coset graphs of diameter $D$ equal to $3$ or $4$ are
constructed. In some cases, the constructed codes are also completely
transitive codes and the corresponding coset graphs are distance-transitive.
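For intuition on the covering radius, it can be computed by brute force for small codes (our own illustration, not the paper's construction): the binary $[7,4]$ Hamming code is perfect and has covering radius $1$, whereas the nested codes constructed in the paper have covering radius $3$ or $4$:

```python
from itertools import product

def covering_radius(code, n):
    """Covering radius: the largest Hamming distance from any length-n
    binary vector to its nearest codeword (brute force, small n only)."""
    return max(min(sum(a != b for a, b in zip(v, c)) for c in code)
               for v in product((0, 1), repeat=n))

# Binary [7,4] Hamming code spanned by a standard generator matrix.
G = [(1, 0, 0, 0, 0, 1, 1),
     (0, 1, 0, 0, 1, 0, 1),
     (0, 0, 1, 0, 1, 1, 0),
     (0, 0, 0, 1, 1, 1, 1)]
code = [tuple(sum(m * g[j] for m, g in zip(msg, G)) % 2 for j in range(7))
        for msg in product((0, 1), repeat=4)]
print(covering_radius(code, 7))  # 1: the Hamming code is perfect
```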
Highly saturated packings and reduced coverings
We introduce and study certain notions which might serve as substitutes for
maximum density packings and minimum density coverings. A body is a compact
connected set which is the closure of its interior. A packing with
congruent replicas of a body $K$ is $n$-saturated if no $n-1$ members of it can
be replaced with $n$ replicas of $K$, and it is completely saturated if it is
$n$-saturated for each $n \ge 1$. Similarly, a covering with congruent
replicas of a body $K$ is $n$-reduced if no $n$ members of it can be replaced
by $n-1$ replicas of $K$ without uncovering a portion of the space, and it is
completely reduced if it is $n$-reduced for each $n \ge 1$. We prove that every
body $K$ in $d$-dimensional Euclidean or hyperbolic space admits both an
$n$-saturated packing and an $n$-reduced covering with replicas of $K$. Under
some assumptions on $K$ (somewhat weaker than convexity),
we prove the existence of completely saturated packings and completely reduced
coverings, but in general, the problem of existence of completely saturated
packings and completely reduced coverings remains unsolved. Also, we
investigate some problems related to the densities of $n$-saturated
packings and $n$-reduced coverings. Among other things, we prove that there
exists an upper bound for the density of a $(d+1)$-reduced covering of $\mathbb{E}^d$
with congruent balls, and we produce some density bounds for the
$n$-saturated packings and $n$-reduced coverings of the plane with congruent
circles.
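The $n = 1$ case of saturation is the familiar notion of a saturated packing: no single new replica fits. For unit circles in the plane this can be checked numerically on a bounded window (a rough sketch; the window size and grid resolution are our own choices):

```python
import itertools
import math

def is_saturated(centers, window=4.0, step=0.25):
    """Crude numerical test of saturation (the n = 1 case) for a packing
    of unit circles: sweep a grid of candidate centers over the window;
    the packing is saturated there if every candidate overlaps an
    existing circle, i.e. lies within distance 2 of some packed center.
    """
    k = int(window / step)
    for i, j in itertools.product(range(-k, k + 1), repeat=2):
        candidate = (i * step, j * step)
        if all(math.dist(candidate, c) >= 2.0 for c in centers):
            return False  # found room for one more unit circle
    return True

print(is_saturated([(0.0, 0.0)]))  # False: a lone circle leaves room
print(is_saturated([(x, y) for x in range(-6, 7, 2)
                    for y in range(-6, 7, 2)]))  # True on this window
```

Testing $n$-saturation for $n > 1$, or reducedness of coverings, is already much harder: one must search over which $n-1$ members to remove and where to place $n$ replacements.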
Estimation of intrinsic dimension via clustering
The problem of estimating the intrinsic dimension of a set of points in high-dimensional space is a critical issue for a wide range of disciplines, including genomics, finance, and networking. Current estimation techniques depend on either the ambient or the intrinsic dimension in terms of computational complexity, which may cause these methods to become intractable for large data sets. In this paper, we present a clustering-based methodology that exploits the inherent self-similarity of data to efficiently estimate the intrinsic dimension of a set of points. When the data satisfies a specified general clustering condition, we prove that the estimated dimension approaches the true Hausdorff dimension. Experiments show that the clustering-based approach allows for more efficient and accurate intrinsic dimension estimation compared with all prior techniques, even when the data does not conform to an obvious self-similarity structure. Finally, we present empirical results which show that clustering-based estimation allows for a natural partitioning of the data points that lie on separate manifolds of varying intrinsic dimension.
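The self-similarity idea behind such estimators can be illustrated with a simple two-scale box-counting sketch (our own, not the authors' estimator): the number of occupied boxes, a crude clustering of the points, grows like $r^{-d}$ as the box side $r$ shrinks, so comparing the counts at two scales reads off $d$:

```python
import math
import random

def box_count(points, r):
    """Number of occupied axis-aligned boxes of side r."""
    return len({tuple(math.floor(x / r) for x in p) for p in points})

def estimate_dimension(points, r1=0.1, r2=0.05):
    """Two-scale box-counting estimate: if N(r) ~ r^(-d), then
    d ~ log(N(r2) / N(r1)) / log(r1 / r2)."""
    n1 = box_count(points, r1)
    n2 = box_count(points, r2)
    return math.log(n2 / n1) / math.log(r1 / r2)

# A 1-dimensional curve embedded in 3-dimensional ambient space.
random.seed(0)
curve = [(t, math.sin(t), math.cos(t))
         for t in (random.uniform(0.0, 1.0) for _ in range(20000))]
print(round(estimate_dimension(curve), 2))  # near 1, far below the ambient 3
```

Note the cost above scales with the number of points and the chosen scales, not with the ambient dimension, which is the efficiency angle the abstract emphasizes.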
GraphCombEx: A Software Tool for Exploration of Combinatorial Optimisation Properties of Large Graphs
We present a prototype of a software tool for exploration of multiple
combinatorial optimisation problems in large real-world and synthetic complex
networks. Our tool, called GraphCombEx (an acronym of Graph Combinatorial
Explorer), provides a unified framework for scalable computation and
presentation of high-quality suboptimal solutions and bounds for a number of
widely studied combinatorial optimisation problems. Efficient representation
and applicability to large-scale graphs and complex networks are particularly
considered in its design. The problems currently supported include maximum
clique, graph colouring, maximum independent set, minimum vertex clique
covering, minimum dominating set, as well as the longest simple cycle problem.
Suboptimal solutions and intervals for optimal objective values are estimated
using scalable heuristics. The tool is designed with extensibility in mind,
with the view of further problems and both new fast and high-performance
heuristics to be added in the future. GraphCombEx has already been successfully
used as a support tool in a number of recent research studies using
combinatorial optimisation to analyse complex networks, indicating its promise
as a research software tool.
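The "intervals for optimal objective values" idea can be sketched in a few lines (our own toy illustration, unrelated to GraphCombEx's actual heuristics): a greedy clique gives a lower bound and a greedy colouring an upper bound on the chromatic number, since $\omega(G) \le \chi(G)$:

```python
def greedy_coloring(adj):
    """Greedy colouring in a fixed vertex order: an upper bound on the
    chromatic number, computable in near-linear time on large graphs."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        color[v] = next(c for c in range(len(adj)) if c not in used)
    return max(color.values()) + 1

def greedy_clique(adj):
    """Greedily grown clique: a lower bound on the maximum clique size
    and hence (since omega <= chi) on the chromatic number."""
    best = 0
    for v in adj:
        clique = [v]
        for u in adj[v]:
            if all(u in adj[w] for w in clique):
                clique.append(u)
        best = max(best, len(clique))
    return best

# A 5-cycle with one chord (hypothetical toy network).
adj = {0: {1, 2, 4}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4}, 4: {0, 3}}
print(greedy_clique(adj), greedy_coloring(adj))  # interval [3, 3] => chi = 3
```

When the two bounds meet, as here, the heuristic interval already certifies the optimum; on large networks the gap between cheap bounds is exactly what a tool like GraphCombEx visualises.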