Query Complexity of Global Minimum Cut
In this work, we resolve the query complexity of the global minimum cut problem by designing a randomized algorithm for approximating the size of a minimum cut in a graph, where the graph can be accessed through local queries (Degree, Neighbor, and Adjacency queries).
Given ε ∈ (0,1), the algorithm with high probability outputs an estimate t̂ satisfying (1-ε) t ≤ t̂ ≤ (1+ε) t, where t is the size of a minimum cut in the graph. The expected number of local queries used by our algorithm is min{m+n, m/t}·poly(log n, 1/ε), where n and m are the number of vertices and edges in the graph, respectively. Eden and Rosenbaum [APPROX 2018] showed that Ω(m/t) local queries are required for approximating the size of a minimum cut in graphs, but no local-query-based algorithm was known. Our algorithmic result, coupled with their lower bound, resolves the query complexity of estimating the size of a minimum cut in graphs using local queries.
Building on the lower bound of Eden and Rosenbaum, we show that, for all t ∈ ℕ, Ω(m) local queries are required to decide whether the size of the minimum cut in the graph is t or t-2. We also show that, for any t ∈ ℕ, Ω(m) local queries are required to find all the minimum cut edges, even under the promise that the input graph has a minimum cut of size t. Both of our lower bounds hold against randomized algorithms, even when Random Edge queries are allowed in addition to local queries.
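To make the local query model concrete, here is a minimal sketch (the interface and names are my own illustration, not from the paper): an oracle that exposes a graph only through Degree, Neighbor, and Adjacency queries and counts how many are made, plus a cut evaluation that touches the graph exclusively through those queries.

```python
from collections import defaultdict

class LocalQueryOracle:
    """Adjacency-list graph exposed only through local queries.
    Interface names are illustrative, not taken from the paper."""
    def __init__(self, edges):
        self.adj = defaultdict(list)
        for u, v in edges:
            self.adj[u].append(v)
            self.adj[v].append(u)
        self.queries = 0  # total number of local queries issued

    def degree(self, v):       # Degree query
        self.queries += 1
        return len(self.adj[v])

    def neighbor(self, v, i):  # Neighbor query: i-th neighbor of v
        self.queries += 1
        return self.adj[v][i]

    def adjacent(self, u, v):  # Adjacency query
        self.queries += 1
        return v in self.adj[u]

def cut_size(oracle, side, vertices):
    """Count edges crossing the bipartition `side`, using only local queries."""
    crossing = 0
    for v in vertices:
        for i in range(oracle.degree(v)):
            if side[v] != side[oracle.neighbor(v, i)]:
                crossing += 1
    return crossing // 2  # each crossing edge is seen from both endpoints

# 4-cycle: the cut {0,1} vs {2,3} has size 2.
oracle = LocalQueryOracle([(0, 1), (1, 2), (2, 3), (3, 0)])
print(cut_size(oracle, {0: 0, 1: 0, 2: 1, 3: 1}, range(4)))  # 2
```

The counter makes the cost model explicit: any algorithm in this setting is charged one unit per Degree, Neighbor, or Adjacency call.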
A cut tree representation for pendant pairs
Two vertices v and w of a graph G are called a pendant pair if the maximum number of edge-disjoint paths in G between them is precisely min{d(v),d(w)}, where d denotes the degree function. The importance of pendant pairs stems from the fact that they are the key ingredient in one of the simplest and most widely used algorithms for the minimum cut problem today.
Mader showed in 1974 that every simple graph with minimum degree δ contains Ω(δ²) pendant pairs; this is the best bound known so far. We improve this result by showing that every simple graph G with minimum degree δ ≥ 5, or with edge-connectivity λ ≥ 4, or with vertex-connectivity κ ≥ 3 contains in fact Ω(δ·|V|) pendant pairs. We prove that this bound is tight from several perspectives, and that Ω(δ·|V|) pendant pairs can be computed efficiently, namely in linear time when a Gomory–Hu tree is given. Our method utilizes a new cut tree representation of graphs.
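To pin down the definition above, the sketch below (my own illustration, not the paper's method) tests whether two vertices form a pendant pair by comparing min{d(v),d(w)} with the maximum number of edge-disjoint paths between them, computed as a unit-capacity max flow via Edmonds–Karp:

```python
from collections import defaultdict, deque

def max_edge_disjoint_paths(edges, s, t):
    """Max number of edge-disjoint s-t paths = s-t max flow with unit
    capacities (Edmonds-Karp; fine for small illustrative graphs)."""
    cap = defaultdict(lambda: defaultdict(int))
    for u, v in edges:          # undirected edge: capacity 1 each way
        cap[u][v] += 1
        cap[v][u] += 1
    flow = 0
    while True:
        parent = {s: None}      # BFS for a shortest augmenting path
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in cap[u]:
                if v not in parent and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        v = t                   # augment one unit along the BFS path
        while parent[v] is not None:
            u = parent[v]
            cap[u][v] -= 1
            cap[v][u] += 1
            v = u
        flow += 1

def is_pendant_pair(edges, v, w):
    deg = defaultdict(int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return max_edge_disjoint_paths(edges, v, w) == min(deg[v], deg[w])

# In K4 every pair is pendant: 3 edge-disjoint paths = minimum degree 3.
k4 = [(a, b) for a in range(4) for b in range(a + 1, 4)]
print(is_pendant_pair(k4, 0, 1))  # True
```

This brute-force check runs a full max-flow computation per pair; the point of the abstract's linear-time result is precisely that many pendant pairs can be obtained far more cheaply once a Gomory–Hu tree is available.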
MAX CUT in Weighted Random Intersection Graphs and Discrepancy of Sparse Random Set Systems
Let V be a set of n vertices, let M be a set of m labels, and let R be an n×m matrix of independent Bernoulli random variables with success probability p. A random instance of the weighted random intersection graph model is constructed by drawing an edge between any two vertices whose corresponding rows of R have positive inner product, with that inner product (the number of shared labels) as the edge weight. In this paper we study the average case analysis of Weighted Max Cut, assuming the input is a weighted random intersection graph: we wish to find a partition of V into two sets so that the total weight of the edges having one endpoint in each set is maximized. We initially prove concentration of the weight of a maximum cut around its expected value, and then show that, when the number of labels is much smaller than the number of vertices, a random partition of the vertices achieves asymptotically optimal cut weight with high probability (whp). Furthermore, in the case where the number of labels is comparable to the number of vertices and the average degree is constant, we show that whp a majority-type algorithm outputs a cut with weight larger than the weight of a random cut by a multiplicative constant strictly larger than 1. Then, we highlight a connection between the computational problem of finding a weighted maximum cut in this model and the problem of finding a 2-coloring with minimum discrepancy for the set system whose incidence matrix is given by R. We exploit this connection by proposing a (weak) bipartization algorithm whose output, when it terminates, can be used to find a 2-coloring with minimum discrepancy for that set system. Finally, we prove that, whp, this 2-coloring corresponds to a bipartition with maximum cut weight in the input graph.
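A minimal sketch of the model as described above (the helper names and the small parameter values are my own, chosen for illustration): sample the n×m Bernoulli matrix, weight each vertex pair by its number of shared labels, and evaluate the weight of a uniformly random cut.

```python
import random
from itertools import combinations

def weighted_intersection_graph(n, m, p, rng):
    """R[v][l] = 1 iff vertex v picks label l; edge weight = shared labels."""
    R = [[1 if rng.random() < p else 0 for _ in range(m)] for _ in range(n)]
    weights = {}
    for u, v in combinations(range(n), 2):
        w = sum(R[u][l] & R[v][l] for l in range(m))
        if w > 0:                 # an edge exists only when the weight is positive
            weights[(u, v)] = w
    return weights

def cut_weight(weights, side):
    """Total weight of edges with one endpoint on each side of the cut."""
    return sum(w for (u, v), w in weights.items() if side[u] != side[v])

rng = random.Random(0)
G = weighted_intersection_graph(n=40, m=10, p=0.2, rng=rng)
side = [rng.random() < 0.5 for _ in range(40)]   # uniformly random partition
print(cut_weight(G, side) <= sum(G.values()))    # a cut never exceeds the total weight
```

Averaging `cut_weight` over many random partitions and comparing against the total edge weight gives a quick empirical feel for the regime, discussed above, in which a random cut is already near-optimal.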
On the Parameterized Complexity of Sparsest Cut and Small-set Expansion Problems
We study the NP-hard \textsc{k-Sparsest Cut} problem (SC) in which, given an undirected graph G = (V, E) and a parameter k, the objective is to partition the vertex set V into k subsets whose maximum edge expansion is minimized. Herein, the edge expansion of a subset S ⊆ V is defined as the sum of the weights of edges exiting S divided by the number of vertices in S. Another problem that has been investigated is the \textsc{k-Small-Set Expansion} problem (SSE), which aims to find a subset with minimum edge expansion subject to a restriction on the size of the subset. We extend previous studies on SC and SSE by inspecting their parameterized complexity. On the positive side, we present two FPT algorithms for both the SSE and 2SC problems: the first is parameterized by the treewidth of the input graph and uses exponential space, while the second is parameterized by the vertex cover number of the input graph and uses polynomial space. Moreover, we consider the unweighted version of the SC problem for fixed k and propose two FPT algorithms with parameters treewidth and vertex cover number of the input graph. We also propose a randomized FPT algorithm for SSE parameterized by k and the maximum degree of the input graph combined; this algorithm can be derandomized efficiently.
\noindent On the negative side, we first prove that for every fixed integer k ≥ 3, the SC problem is NP-hard for graphs of bounded vertex cover number. We also show that SC is W[1]-hard when parameterized by the treewidth of the input graph and the number k of components combined, using a reduction from \textsc{Unary Bin Packing}. Furthermore, we prove that SC remains NP-hard for graphs with maximum degree three and also for graphs with degeneracy two. Finally, we prove that the unweighted SSE is W[1]-hard for the parameter k.
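To make the definitions concrete, here is a brute-force sketch (my own illustration; exponential time, so only suitable for checking tiny instances): it computes the edge expansion of a subset as defined above, and the optimum of the 2-subset case by trying every bipartition.

```python
from itertools import combinations

def edge_expansion(edges, S, weights=None):
    """Weight of edges leaving S divided by |S| (unit weights by default)."""
    S = set(S)
    w = weights or {e: 1 for e in edges}
    out = sum(w[(u, v)] for (u, v) in edges if (u in S) != (v in S))
    return out / len(S)

def sparsest_cut_2(vertices, edges):
    """2-Sparsest Cut by exhaustive search: the bipartition minimizing
    the larger of the two sides' edge expansions."""
    vs = list(vertices)
    best = float("inf")
    for r in range(1, len(vs)):
        for side in combinations(vs, r):
            other = [v for v in vs if v not in side]
            best = min(best, max(edge_expansion(edges, side),
                                 edge_expansion(edges, other)))
    return best

# Path 0-1-2-3: splitting in the middle gives expansion 1/2 on each side.
path = [(0, 1), (1, 2), (2, 3)]
print(sparsest_cut_2(range(4), path))  # 0.5
```

The FPT results above replace this exhaustive enumeration with dynamic programming over a tree decomposition or a vertex cover; the brute force is useful only as a correctness reference on small graphs.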
Parameterized Complexity of Multi-Node Hubs
Hubs are high-degree nodes within a network. The examination of the emergence and centrality of hubs lies at the heart of many studies of complex networks such as telecommunication networks, biological networks, social networks and semantic networks. Furthermore, identifying and allocating hubs are routine tasks in applications. In this paper, we do not seek a hub that is a single node, but a hub that consists of k nodes. Formally, given a graph G=(V,E), we seek a set A ⊆ V of size k that induces a connected subgraph from which at least p edges emanate. Thus, we identify k nodes which can act as a unit (due to the connectivity constraint) that is a hub (due to the cut constraint). This problem, which we call Multi-Node Hub (MNH), can also be viewed as a variant of the classic Max Cut problem. While it is easy to see that MNH is W[1]-hard with respect to the parameter k, our main contribution is the first parameterized algorithm that shows that MNH is FPT with respect to the parameter p.
Despite recent breakthrough advances for cut problems like Multicut and Minimum Bisection, MNH is still very challenging. Not only does a connectivity constraint have to be handled on top of the involved machinery developed for these problems, but the fact that MNH is a maximization problem also seems to prevent the applicability of this machinery in the first place. To deal with the latter issue, we give non-trivial reduction rules that show how MNH can be preprocessed into a problem where it is necessary to delete a bounded-in-parameter number of vertices. Then, to handle the connectivity constraint, we use a novel application of the form of tree decomposition introduced by Cygan et al. [STOC 2014] to solve Minimum Bisection, where we demonstrate how connectivity constraints can be replaced by simpler size constraints. Our approach may be relevant to the design of algorithms for other cut problems of this nature.
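A brute-force reading of the MNH definition (illustrative only; the paper's contribution is an FPT algorithm, not this enumeration): search for a connected set A of size k from which at least p edges emanate.

```python
from collections import defaultdict
from itertools import combinations

def is_multi_node_hub(adj, A, p):
    """A must induce a connected subgraph with >= p edges leaving it."""
    A = set(A)
    # Check connectivity of the induced subgraph via DFS restricted to A.
    start = next(iter(A))
    seen, stack = {start}, [start]
    while stack:
        for w in adj[stack.pop()]:
            if w in A and w not in seen:
                seen.add(w)
                stack.append(w)
    if seen != A:
        return False
    # Count edges emanating from A (one endpoint in A, one outside).
    emanating = sum(1 for v in A for w in adj[v] if w not in A)
    return emanating >= p

def find_hub(adj, k, p):
    """Exhaustive search over all k-subsets of the vertex set."""
    for A in combinations(adj, k):
        if is_multi_node_hub(adj, A, p):
            return set(A)
    return None

# Star with center 0 and leaves 1..4: the center alone is a
# 1-node hub with 4 emanating edges.
adj = defaultdict(set)
for leaf in (1, 2, 3, 4):
    adj[0].add(leaf)
    adj[leaf].add(0)
print(find_hub(adj, 1, 4))  # {0}
```

On the same star, no 2-node set reaches 4 emanating edges: any pair of leaves is disconnected, and the center plus one leaf has only 3 edges leaving it, which matches the cut constraint's interaction with connectivity described above.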