
    Vertex Cover Kernelization Revisited: Upper and Lower Bounds for a Refined Parameter

    An important result in the study of polynomial-time preprocessing shows that there is an algorithm which, given an instance (G,k) of Vertex Cover, outputs an equivalent instance (G',k') in polynomial time with the guarantee that G' has at most 2k' vertices (and thus O((k')^2) edges) and k' <= k. Using the terminology of parameterized complexity, we say that k-Vertex Cover has a kernel with 2k vertices. There is complexity-theoretic evidence that both 2k vertices and Theta(k^2) edges are optimal for the kernel size. In this paper we consider the Vertex Cover problem with a different parameter, the size fvs(G) of a minimum feedback vertex set of G. This refined parameter is structurally smaller than the parameter k associated with the vertex cover number vc(G), since fvs(G) <= vc(G) and the difference can be arbitrarily large. We give a kernel for Vertex Cover with a number of vertices that is cubic in fvs(G): an instance (G,X,k) of Vertex Cover, where X is a feedback vertex set for G, can be transformed in polynomial time into an equivalent instance (G',X',k') such that |V(G')| <= 2k and |V(G')| = O(|X'|^3). A similar result holds when the feedback vertex set X is not given along with the input. In sharp contrast, we show that the Weighted Vertex Cover problem does not have a polynomial kernel when parameterized by the cardinality of a given vertex cover of the graph unless NP is in coNP/poly and the polynomial hierarchy collapses to the third level. Published in "Theory of Computing Systems" as an Open Access publication.
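
    The 2k-vertex kernel discussed above rests on the Nemhauser-Trotter LP-based reduction; a much simpler classical preprocessing for the same problem is the Buss kernel, built from two high-degree rules. The sketch below implements those rules and yields only the weaker O(k^2)-size guarantee; the adjacency-dict representation and function name are illustrative assumptions, not taken from the paper.

    def buss_kernel(adj, k):
        """adj: {vertex: set(neighbors)}, k: solution size budget.
        Returns (reduced adj, reduced k), or None for a no-instance."""
        adj = {v: set(ns) for v, ns in adj.items()}
        changed = True
        while changed and k >= 0:
            changed = False
            for v in list(adj):
                if not adj[v]:              # Rule 1: isolated vertices never help.
                    del adj[v]
                    changed = True
                elif len(adj[v]) > k:       # Rule 2: a vertex of degree > k must be
                    for u in adj[v]:        # in every vertex cover of size <= k.
                        adj[u].discard(v)
                    del adj[v]
                    k -= 1
                    changed = True
        edges = sum(len(ns) for ns in adj.values()) // 2
        if k < 0 or edges > k * k:          # max degree is now <= k, so more than
            return None                     # k^2 edges cannot be covered by k vertices.
        return adj, k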

    A survey of parameterized algorithms and the complexity of edge modification

    The survey is a comprehensive overview of the developing area of parameterized algorithms for graph modification problems. It describes the state of the art in kernelization, subexponential algorithms, and the parameterized complexity of graph modification. The main focus is on edge modification problems, where the task is to change some adjacencies in a graph so that it satisfies some required property. To facilitate further research, we list many open problems in the area.
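
    To make the problem class concrete, Cluster Editing is a canonical edge modification problem of the kind the survey covers: change at most k adjacencies so that every connected component becomes a clique. A graph is a cluster graph exactly when it contains no induced path on three vertices (P3), and locating such a path is the basis of the classical O(3^k)-branching algorithm. The helper below is a minimal sketch; the adjacency-dict representation is an illustrative assumption, not from the survey.

    from itertools import combinations

    def find_induced_p3(adj):
        """adj: {vertex: set(neighbors)}. Returns (u, v, w) with edges u-v and
        v-w but no edge u-w, or None if the graph is already a cluster graph."""
        for v, ns in adj.items():
            for u, w in combinations(ns, 2):
                if w not in adj[u]:        # u and w share the neighbor v but are
                    return u, v, w         # not adjacent: an induced P3.
        return None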

    Polynomial Kernels for Weighted Problems

    Kernelization is a formalization of efficient preprocessing for NP-hard problems using the framework of parameterized complexity. Among the open problems in kernelization, it has been asked many times whether there are deterministic polynomial kernelizations for Subset Sum and Knapsack when parameterized by the number n of items. We answer both questions affirmatively by using an algorithm for compressing numbers due to Frank and Tardos (Combinatorica 1987). This result was first used in the context of kernelization by Marx and Végh (ICALP 2013). We further illustrate its applicability by giving polynomial kernels also for weighted versions of several well-studied parameterized problems. Furthermore, when parameterized by the number of different item sizes, we obtain a polynomial kernelization for Subset Sum and an exponential kernelization for Knapsack. Finally, we also obtain kernelization results for polynomial integer programs.
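
    The Frank-Tardos compression underlying the paper's kernels is technical; as a toy illustration of preprocessing Subset Sum grouped by item size (explicitly not the paper's method), one sound rule is that a solution summing exactly to t uses at most t // s copies of an item of size s, so larger multiplicities can be discarded up front:

    from collections import Counter

    def cap_multiplicities(items, t):
        """items: list of positive integer sizes, t: target sum.
        Returns an equivalent (for exact target t) reduced item list."""
        counts = Counter(x for x in items if 0 < x <= t)  # sizes > t are useless
        reduced = []
        for size, count in counts.items():
            reduced.extend([size] * min(count, t // size))
        return reduced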

    Fixed-parameter algorithms for minimum-cost edge-connectivity augmentation

    We consider connectivity-augmentation problems in a setting where each potential new edge has a non-negative cost associated with it, and the task is to achieve a certain connectivity target with at most p new edges of minimum total cost. The main result is that the minimum-cost augmentation of edge-connectivity from k − 1 to k with at most p new edges is fixed-parameter tractable parameterized by p and admits a polynomial kernel. We also prove the fixed-parameter tractability of increasing edge-connectivity from 0 to 2 and increasing node-connectivity from 1 to 2.
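
    To pin down the problem statement (this is not the paper's FPT algorithm or kernel, which are far more refined), here is a naive brute-force baseline, assuming the networkx library:

    from itertools import combinations
    import networkx as nx

    def cheapest_augmentation(G, candidates, target, p):
        """G: nx.Graph; candidates: list of (u, v, cost) potential new edges;
        target: desired edge-connectivity; p: max number of new edges.
        Returns (cost, edge subset) of a cheapest feasible choice, or None."""
        best = None
        for r in range(p + 1):
            for subset in combinations(candidates, r):
                H = G.copy()
                H.add_edges_from((u, v) for u, v, _ in subset)
                if nx.edge_connectivity(H) >= target:
                    cost = sum(c for _, _, c in subset)
                    if best is None or cost < best[0]:
                        best = (cost, subset)
        return best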

    Path-Contractions, Edge Deletions and Connectivity Preservation

    We study several problems related to graph modification under connectivity constraints from the perspective of parameterized complexity: (Weighted) Biconnectivity Deletion, where we are tasked with deleting k edges while preserving biconnectivity in an undirected graph; Vertex-Deletion Preserving Strong Connectivity, where we want to maintain strong connectivity of a digraph while deleting exactly k vertices; and Path-Contraction Preserving Strong Connectivity, in which the operation of path contraction on arcs is used instead. The parameterized tractability of this last problem was posed as an open question by Bang-Jensen and Yeo [DAM 2008], and we answer it here in the negative: both variants of preserving strong connectivity are W[1]-hard. Preserving biconnectivity, on the other hand, turns out to be fixed-parameter tractable, and we provide a 2^{O(k log k)} n^{O(1)}-time algorithm that solves Weighted Biconnectivity Deletion. Further, we show that the unweighted case even admits a randomized polynomial kernel. All our results provide further interesting data points for the systematic study of connectivity-preservation constraints in the parameterized setting.
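
    Again only to make a problem definition concrete, the exponential sketch below checks feasibility for unweighted Biconnectivity Deletion by trying all k-edge subsets, assuming the networkx library; the paper's 2^{O(k log k)} n^{O(1)}-time algorithm avoids exactly this blow-up:

    from itertools import combinations
    import networkx as nx

    def can_delete_k_edges(G, k):
        """Returns a set of k edges whose removal keeps G biconnected, or None."""
        for subset in combinations(G.edges(), k):
            H = G.copy()
            H.remove_edges_from(subset)
            if nx.is_biconnected(H):
                return subset        # a witness set of k deletable edges
        return None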

    Balanced Crown Decomposition for Connectivity Constraints

    We introduce the balanced crown decomposition that captures the structure imposed on graphs by their connected induced subgraphs of a given size. Such subgraphs are a popular modeling tool in various application areas, where the non-local nature of the connectivity condition usually results in very challenging algorithmic tasks. The balanced crown decomposition is a combination of a crown decomposition and a balanced partition, which makes it applicable to graph editing as well as graph packing and partitioning problems. We illustrate this by deriving improved approximation algorithms and kernelizations for a variety of such problems. In particular, through this structure, we obtain the first constant-factor approximation for the Balanced Connected Partition (BCP) problem, where the task is to partition a vertex-weighted graph into k connected components of approximately equal weight. We derive a 3-approximation for the two most commonly used objectives: maximizing the weight of the lightest component and minimizing the weight of the heaviest component.
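
    The balanced crown decomposition itself is beyond a short sketch; as a toy illustration of the BCP objective only, here is a classic greedy on trees that cuts off a subtree whenever its weight reaches total/k. It is a heuristic that may produce more or fewer than k parts, not the paper's 3-approximation for general graphs, and all names are illustrative:

    def greedy_tree_bcp(tree, weight, root, k):
        """tree: {v: list(children)} rooted at root; weight: {v: weight}.
        Returns a list of vertex sets, each inducing a connected part."""
        threshold = sum(weight.values()) / k
        parts = []

        def dfs(v):
            # Returns the set of vertices still attached to v's component.
            comp = {v}
            for child in tree.get(v, []):
                sub = dfs(child)
                if sum(weight[u] for u in sub) >= threshold:
                    parts.append(sub)      # heavy enough: cut it off here
                else:
                    comp |= sub            # too light: merge into the parent
            return comp

        parts.append(dfs(root))            # whatever remains at the root
        return parts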

    Preprocessing under uncertainty

    In this work we study preprocessing for tractable problems when part of the input is unknown or uncertain. This comes up naturally if, e.g., the load of some machines or the congestion of some roads is not known far enough in advance, or if we have to regularly solve a problem over instances that are largely similar, e.g., daily airport scheduling with few charter flights. Unlike robust optimization, which also studies settings like this, our goal lies not in computing solutions that are (approximately) good for every instantiation. Rather, we seek to preprocess the known parts of the input, to speed up finding an optimal solution once the missing data is known. We present efficient algorithms that, given an instance with partially uncertain input, generate an instance of size polynomial in the amount of uncertain data that is equivalent for every instantiation of the unknown part. Concretely, we obtain such algorithms for Minimum Spanning Tree, Minimum Weight Matroid Basis, and Maximum Cardinality Bipartite Matching, where respectively the weight of edges, the weight of elements, and the availability of vertices is unknown for part of the input. Furthermore, we show that there are tractable problems, such as Small Connected Vertex Cover, for which one cannot hope to obtain similar results.
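
    One simple reduction in the spirit of the Minimum Spanning Tree result (an illustration, not the paper's full polynomial-size construction): if the endpoints of a known edge are joined by a path of strictly lighter known edges, that edge is the unique heaviest edge on a cycle in every instantiation of the uncertain weights, so it lies in no minimum spanning tree and can safely be deleted. Uncertain edges are always kept and need not be passed in.

    def prune_known_edges(known):
        """known: list of (u, v, w) edges with fixed weight.
        Returns the known edges that survive the cycle rule."""
        def connected(u, v, limit):
            # BFS over known edges of weight strictly below `limit`.
            adj = {}
            for a, b, w in known:
                if w < limit:
                    adj.setdefault(a, []).append(b)
                    adj.setdefault(b, []).append(a)
            seen, stack = {u}, [u]
            while stack:
                x = stack.pop()
                if x == v:
                    return True
                for y in adj.get(x, []):
                    if y not in seen:
                        seen.add(y)
                        stack.append(y)
            return False

        return [(u, v, w) for u, v, w in known if not connected(u, v, w)]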