
    FPT Approximation and Subexponential Algorithms for Covering Few or Many Edges

    We study the α-Fixed Cardinality Graph Partitioning (α-FCGP) problem, the generic local graph partitioning problem introduced by Bonnet et al. [Algorithmica 2015]. In this problem, we are given a graph G, two numbers k, p, and 0 ≤ α ≤ 1, and the question is whether there is a set S ⊆ V of size k with coverage cov_α(S) at least p (or at most p for the minimization version). The coverage function cov_α(·) counts edges with exactly one endpoint in S with weight α and edges with both endpoints in S with weight 1 - α. α-FCGP generalizes a number of fundamental graph problems such as Densest k-Subgraph, Max k-Vertex Cover, and Max (k,n-k)-Cut. A natural question in the study of α-FCGP is whether the algorithmic results known for its special cases, like Max k-Vertex Cover, could be extended to more general settings. One of the simple but powerful methods for obtaining parameterized approximation [Manurangsi, SOSA 2019] and subexponential algorithms [Fomin et al., IPL 2011] for Max k-Vertex Cover is based on greedy vertex degree orderings. The main insight of our work is that greedy vertex degree orderings can also be used to design fixed-parameter approximation schemes (FPT-AS) for every α > 0, as well as subexponential-time algorithms on apex-minor-free graphs for maximization with α > 1/3 and for minimization with α < 1/3.
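
    To make the objective concrete, here is a minimal sketch (ours, not code from the paper) of the coverage function cov_α as defined above; the graph is given as an edge list, and the parameter alpha plays the role of α:

        def cov_alpha(edges, S, alpha):
            """Coverage cov_alpha(S) of a vertex set S in a graph given as an edge list."""
            S = set(S)
            total = 0.0
            for u, v in edges:
                endpoints_in_S = (u in S) + (v in S)
                if endpoints_in_S == 1:      # cut edge: exactly one endpoint in S
                    total += alpha
                elif endpoints_in_S == 2:    # internal edge: both endpoints in S
                    total += 1.0 - alpha
            return total

        # A triangle 0-1-2 plus the pendant edge 2-3, with S = {0, 1}:
        edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
        print(cov_alpha(edges, [0, 1], 0.0))  # 1.0: one internal edge (0, 1)
        print(cov_alpha(edges, [0, 1], 1.0))  # 2.0: two cut edges (1, 2) and (2, 0)
        print(cov_alpha(edges, [0, 1], 0.5))  # 1.5: half of the 3 edges covered by S

    The special cases follow directly from the definition: α = 0 counts only the edges inside S (Densest k-Subgraph), α = 1 counts only the cut edges (Max (k,n-k)-Cut), and α = 1/2 gives half the number of edges covered by S (Max k-Vertex Cover).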

    Seventh Biennial Report: June 2003 - March 2005


    Hardness of Easy Problems: Basing Hardness on Popular Conjectures such as the Strong Exponential Time Hypothesis (Invited Talk)

    Algorithmic research strives to develop fast algorithms for fundamental problems. Despite its many successes, many problems still do not have very efficient algorithms. For years researchers have explained the hardness of key problems by proving NP-hardness, using polynomial-time reductions to base their hardness on the famous conjecture P != NP. For problems that already have polynomial-time algorithms, however, it does not seem that one can show any sort of hardness based on P != NP. Nevertheless, we would like to provide evidence that a problem A with a running time O(n^k) that has not been improved in decades also requires n^{k-o(1)} time, thus explaining the lack of progress on the problem. Such unconditional time lower bounds unfortunately seem very difficult to obtain. Recent work has concentrated on an approach mimicking NP-hardness: (1) select a few key problems that are conjectured to require T(n) time to solve, and (2) use special, fine-grained reductions to prove time lower bounds for many diverse problems in P based on the conjectured hardness of the key problems. In this abstract we outline the approach, give some examples of hardness results based on the Strong Exponential Time Hypothesis, and present an overview of some of the recent work on the topic.
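
    As a concrete illustration (our example; the abstract itself names no specific problem), a standard key problem in this program is Orthogonal Vectors (OV): given two sets of n Boolean vectors in d dimensions, decide whether some pair is orthogonal. The brute-force algorithm below runs in O(n^2 * d) time; under SETH, OV admits no O(n^(2-eps) * poly(d)) algorithm, and fine-grained reductions from OV transfer that n^(2-o(1)) lower bound to many problems in P:

        from itertools import product

        def has_orthogonal_pair(A, B):
            """Return True if some a in A and b in B satisfy sum(a_i * b_i) = 0."""
            return any(all(x * y == 0 for x, y in zip(a, b))
                       for a, b in product(A, B))

        A = [(1, 0, 1), (0, 1, 1)]
        B = [(0, 1, 0), (1, 1, 1)]
        print(has_orthogonal_pair(A, B))  # True: (1, 0, 1) is orthogonal to (0, 1, 0)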

    Counting Problems in Parameterized Complexity

    This survey is an invitation to parameterized counting problems for readers with a background in parameterized algorithms and complexity. After an introduction to the peculiarities of counting complexity, we survey the parameterized approach to counting problems, with a focus on two topics of recent interest: counting small patterns in large graphs, and counting perfect matchings and Hamiltonian cycles in well-structured graphs. While the survey presupposes familiarity with parameterized algorithms and complexity, we aim to explain all relevant notions from counting complexity in a self-contained way.
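
    As a small sketch of what "counting small patterns in large graphs" means (our illustration, not code from the survey), consider counting the k-cliques of an n-vertex graph. The brute force below runs in roughly O(n^k) time, polynomial for each fixed k; the parameterized question is whether an f(k) * n^{O(1)} algorithm exists, and for k-cliques this is believed impossible, since the problem is #W[1]-hard:

        from itertools import combinations

        def count_k_cliques(n, edges, k):
            """Count k-cliques in a graph on vertices 0..n-1 given by an edge list."""
            adj = {frozenset(e) for e in edges}
            return sum(1 for subset in combinations(range(n), k)
                       if all(frozenset(pair) in adj
                              for pair in combinations(subset, 2)))

        # A 4-cycle with chord (0, 2) contains two triangles: {0,1,2} and {0,2,3}.
        print(count_k_cliques(4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)], 3))  # 2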