On Strong Diameter Padded Decompositions
Given a weighted graph G=(V,E,w), a partition of V is Delta-bounded if the diameter of each cluster is bounded by Delta. A distribution over Delta-bounded partitions is a beta-padded decomposition if every ball of radius gamma Delta is contained in a single cluster with probability at least e^{-beta * gamma}. The weak diameter of a cluster C is measured w.r.t. distances in G, while the strong diameter is measured w.r.t. distances in the induced graph G[C]. The decomposition is weak/strong according to the diameter guarantee.
Previously, it was known that K_r-free graphs admit weak decompositions with padding parameter O(r), while for strong decompositions only an O(r^2) padding parameter was known. Furthermore, for a graph G whose induced shortest-path metric d_G has doubling dimension ddim, a weak O(ddim)-padded decomposition was constructed, and this bound is known to be tight. For the strong-diameter case, nothing was known.
We construct strong O(r)-padded decompositions for K_r-free graphs, matching the state of the art for weak decompositions. Similarly, for graphs with doubling dimension ddim we construct a strong O(ddim)-padded decomposition, which is also tight. We use this decomposition to construct an (O(ddim), O~(ddim))-sparse cover scheme for such graphs. Our new decompositions and covers have implications for approximating unique games, for the construction of light and sparse spanners, and for path-reporting distance oracles.
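Padded decompositions of this kind are typically built by randomized ball carving. As a minimal illustration of the generic exponential-shift scheme (not the paper's strong-diameter construction; the function name and the dict-of-dicts metric representation are assumptions for this sketch):

```python
import random

def ldd_exponential_shift(dist, beta):
    """Sketch of a low-diameter decomposition via exponential shifts
    (the classic Miller-Peng-Xu-style scheme, NOT the paper's algorithm):
    every vertex u draws a shift delta_u ~ Exp(beta), and each vertex v
    joins the "center" u minimizing dist[u][v] - delta_u.
    `dist` maps each vertex to a dict of distances to every vertex."""
    shifts = {u: random.expovariate(beta) for u in dist}
    return {v: min(dist, key=lambda u: dist[u][v] - shifts[u]) for v in dist}

# Example on the path metric over 5 points:
verts = range(5)
dist = {u: {v: abs(u - v) for v in verts} for u in verts}
clusters = ldd_exponential_shift(dist, beta=1.0)
```

Larger `beta` concentrates the shifts, producing smaller clusters but a weaker padding guarantee; the padding probability for a ball of radius gamma*Delta degrades exponentially in gamma, matching the e^{-beta*gamma} form of the definition above.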
Partitioning random graphs into monochromatic components
Erd\H{o}s, Gy\'arf\'as, and Pyber (1991) conjectured that every r-colored
complete graph can be partitioned into at most r monochromatic components;
this is a strengthening of a conjecture of Lov\'asz (1975) in which the
components are only required to form a cover. An important partial result of
Haxell and Kohayakawa (1995) shows that a partition into r monochromatic
components is possible for sufficiently large r-colored complete graphs.
We start by extending Haxell and Kohayakawa's result to graphs with large
minimum degree, then we provide some partial analogs of their result for random
graphs. In particular, we show that if p is above an explicit threshold, then
a.a.s. in every 2-coloring of G_{n,p} there exists a partition into two
monochromatic components, whereas if p is below a (smaller) explicit threshold,
then a.a.s. there exists an r-coloring of G_{n,p} such that there does not
exist a cover with a bounded number of components. Finally, we consider a
random graph version of a classic result of Gy\'arf\'as (1977) about large
monochromatic components in r-colored complete graphs: we show that above an
explicit threshold for p, a.a.s. in every r-coloring of G_{n,p} there exists a
monochromatic component of order at least (asymptotically) n/(r-1).

Comment: 27 pages, 2 figures. Appears in Electronic Journal of Combinatorics,
Volume 24, Issue 1 (2017), Paper #P1.1
Optimal covers with Hamilton cycles in random graphs
A packing of a graph G with Hamilton cycles is a set of edge-disjoint
Hamilton cycles in G. Such packings have been studied intensively and recent
results imply that a largest packing of Hamilton cycles in G_n,p a.a.s. has
size \lfloor delta(G_n,p) /2 \rfloor. Glebov, Krivelevich and Szab\'o recently
initiated research on the `dual' problem, where one asks for a set of Hamilton
cycles covering all edges of G. Our main result states that for log^{117}n / n
< p < 1-n^{-1/8}, a.a.s. the edges of G_n,p can be covered by \lceil
Delta(G_n,p)/2 \rceil Hamilton cycles. This is clearly optimal and improves an
approximate result of Glebov, Krivelevich and Szab\'o, which holds for p >
n^{-1+\eps}. Our proof is based on a result of Knox, K\"uhn and Osthus on
packing Hamilton cycles in pseudorandom graphs.

Comment: final version of paper (to appear in Combinatorica)
High-Dimensional Gaussian Graphical Model Selection: Walk Summability and Local Separation Criterion
We consider the problem of high-dimensional Gaussian graphical model
selection. We identify a set of graphs for which an efficient estimation
algorithm exists; the algorithm is based on thresholding of empirical
conditional covariances. Under a set of transparent conditions, we establish
structural consistency (or sparsistency) for the proposed algorithm, when the
number of samples n=omega(J_{min}^{-2} log p), where p is the number of
variables and J_{min} is the minimum (absolute) edge potential of the graphical
model. The sufficient conditions for sparsistency are based on the notion of
walk-summability of the model and the presence of sparse local vertex
separators in the underlying graph. We also derive novel non-asymptotic
necessary conditions on the number of samples required for sparsistency.
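The thresholding idea can be sketched concretely. The following is an illustrative reading of the abstract, not the paper's exact estimator; the function name and the parameters `eta` (maximum conditioning-set size) and `xi` (threshold) are hypothetical labels introduced here:

```python
import numpy as np
from itertools import combinations

def cond_cov_threshold(X, eta, xi):
    """Sketch of structure estimation by conditional-covariance
    thresholding: declare edge (i, j) present if the minimum empirical
    conditional covariance of (X_i, X_j), over all conditioning sets S
    with |S| <= eta, exceeds the threshold xi.  X is an (n, p) sample
    matrix; eta and xi are illustrative tuning parameters."""
    S_hat = np.cov(X, rowvar=False)
    p = S_hat.shape[0]
    edges = set()
    for i, j in combinations(range(p), 2):
        rest = [k for k in range(p) if k not in (i, j)]
        best = abs(S_hat[i, j])  # unconditioned covariance (S = empty)
        for size in range(1, eta + 1):
            for S in combinations(rest, size):
                S = list(S)
                # Schur complement: Cov(X_i, X_j | X_S)
                cc = S_hat[i, j] - S_hat[i, S] @ np.linalg.solve(
                    S_hat[np.ix_(S, S)], S_hat[S, j])
                best = min(best, abs(cc))
        if best > xi:
            edges.add((i, j))
    return edges
```

For walk-summable models with sparse local separators, small conditioning sets already drive the conditional covariance of non-neighbors toward zero, which is why a small `eta` suffices; e.g. on samples from a chain X1 -> X2 -> X3, conditioning on X2 alone separates X1 from X3.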
A More Reliable Greedy Heuristic for Maximum Matchings in Sparse Random Graphs
We propose a new greedy algorithm for the maximum cardinality matching
problem. We give experimental evidence that this algorithm is likely to find a
maximum matching in random graphs with constant expected degree c>0,
independent of the value of c. This is contrary to the behavior of commonly
used greedy matching heuristics, which are known to have some range of c where
they probably fail to compute a maximum matching.
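For context, here is a minimal sketch of one classic greedy heuristic of the kind the abstract compares against (the Karp-Sipser rule: match pendant vertices first, otherwise a random edge). This is not the authors' new algorithm:

```python
import random

def karp_sipser(adj):
    """Karp-Sipser greedy matching heuristic (a classic baseline, not the
    paper's algorithm).  `adj` maps each vertex of a simple graph to the
    set of its neighbors.  Returns a list of matched edges."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    matching = []
    while any(adj.values()):
        deg1 = [v for v, nbrs in adj.items() if len(nbrs) == 1]
        if deg1:
            v = deg1[0]                 # pendant vertex: forced choice
            u = next(iter(adj[v]))
        else:
            v = random.choice([x for x in adj if adj[x]])
            u = random.choice(sorted(adj[v]))  # random edge otherwise
        matching.append((v, u))
        for w in (u, v):                # delete both endpoints
            for x in adj[w]:
                adj[x].discard(w)
            adj[w] = set()
    return matching
```

On a path 0-1-2-3 the pendant rule alone finds the maximum matching {(0,1), (2,3)}; the failure ranges of c mentioned above concern the random-edge phase, where a bad choice can strand vertices.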
Low-Density Code-Domain NOMA: Better Be Regular
A closed-form analytical expression is derived for the limiting empirical
squared singular value density of a spreading (signature) matrix corresponding
to sparse low-density code-domain (LDCD) non-orthogonal multiple-access (NOMA)
with regular random user-resource allocation. The derivation relies on
associating the spreading matrix with the adjacency matrix of a large
semiregular bipartite graph. For a simple repetition-based sparse spreading
scheme, the result directly follows from a rigorous analysis of spectral
measures of infinite graphs. Turning to random (sparse) binary spreading, we
harness the cavity method from statistical physics, and show that the limiting
spectral density coincides in both cases. Next, we use this density to compute
the normalized input-output mutual information of the underlying vector channel
in the large-system limit. The latter may be interpreted as the achievable
total throughput per dimension with optimum processing in a corresponding
multiple-access channel setting or, alternatively, in a fully-symmetric
broadcast channel setting with full decoding capabilities at each receiver.
Surprisingly, the total throughput of regular LDCD-NOMA is found to be not only
superior to that achieved with irregular user-resource allocation, but also to
the total throughput of dense randomly-spread NOMA, for which optimum
processing is computationally intractable. In contrast, the superior
performance of regular LDCD-NOMA can be potentially achieved with a feasible
message-passing algorithm. This observation may advocate employing regular,
rather than irregular, LDCD-NOMA in 5G cellular physical layer design.

Comment: Accepted for publication in the IEEE International Symposium on
Information Theory (ISIT), June 201
Partitioning networks into cliques: a randomized heuristic approach
In the context of community detection in social networks, the term community can be interpreted in the strict sense that everybody within the community should know each other. We consider the corresponding community detection problem: we search for a partitioning of a network into the minimum number of non-overlapping cliques, such that the cliques cover all vertices. This problem is called the clique covering problem (CCP) and is one of the classical NP-hard problems. For CCP, we propose a randomized heuristic approach. To construct a high-quality solution to CCP, we present an iterated greedy (IG) algorithm. IG can also be combined with a heuristic that determines how far the algorithm is from the optimum in the worst case; randomized local search (RLS) for the maximum independent set problem was proposed to find such a bound. The experimental results of IG and the bounds obtained by RLS indicate that IG is a very suitable technique for solving CCP in real-world graphs. In addition, we summarize our basic rigorous results, which were developed for the analysis of IG and for understanding its behavior on several relevant graph classes.
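The greedy construction step that an IG-style heuristic repeats can be sketched as follows. This is an illustrative simplification under assumed data structures (adjacency sets keyed by vertex); the paper's actual IG algorithm additionally destroys and rebuilds parts of the solution across iterations with randomized perturbations:

```python
def greedy_clique_cover(adj):
    """Greedily partition the vertices of a simple graph into cliques:
    seed a clique at an uncovered vertex and absorb every uncovered
    vertex adjacent to all current members.  `adj` maps each vertex to
    its neighbor set.  A sketch of one IG construction pass, not the
    full iterated-greedy algorithm."""
    uncovered = set(adj)
    cliques = []
    while uncovered:
        v = min(uncovered)  # deterministic seed; IG would randomize this
        clique = {v}
        for u in sorted(uncovered - {v}):
            if clique <= adj[u]:  # u adjacent to every clique member
                clique.add(u)
        uncovered -= clique
        cliques.append(clique)
    return cliques

# Example: a triangle plus an isolated vertex needs two cliques.
g = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}, 3: set()}
cover = greedy_clique_cover(g)
```

Since each pass is greedy, the cover size depends on the seeding order; the IG loop exploits this by perturbing and reconstructing, keeping the best cover found, while the RLS independent-set bound certifies how far that cover can be from optimal.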