
    On minimum $t$-claw deletion in split graphs

    For $t \geq 3$, the star $K_{1,t}$ is called a $t$-claw. In the minimum $t$-claw deletion problem (\texttt{Min-$t$-Claw-Del}), given a graph $G = (V, E)$, the task is to find a vertex set $S$ of minimum size such that $G[V \setminus S]$ is $t$-claw free. In a split graph, the vertex set is partitioned into two parts, one inducing a clique and the other an independent set. Every $t$-claw in a split graph has its center vertex in the clique part. This observation motivates the minimum one-sided bipartite $t$-claw deletion problem (\texttt{Min-$t$-OSBCD}): given a bipartite graph $G = (A \cup B, E)$, find a vertex set $S$ of minimum size such that $G[V \setminus S]$ has no $t$-claw whose center vertex lies in $A$. A primal-dual algorithm approximates \texttt{Min-$t$-OSBCD} within a factor of $t$. We prove that it is UGC-hard to approximate within any factor better than $t$. We also prove that it is approximable within a factor of 2 on dense bipartite graphs. Using these results on \texttt{Min-$t$-OSBCD}, we prove that \texttt{Min-$t$-Claw-Del} on split graphs is UGC-hard to approximate within any factor better than $t$. We also consider the complementary maximization problems and prove that they are APX-complete.
    Comment: 11 pages and 1 figure
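    The bipartite observation above does a lot of work: since $B$ is an independent set, a vertex $a \in A$ together with any $t$ of its neighbors already induces a $t$-claw, so a $t$-claw centered in $A$ exists exactly when some surviving vertex of $A$ has degree at least $t$. The sketch below (plain Python, hypothetical names) uses this to greedily delete vertex-disjoint $t$-claws wholesale; it is not the paper's primal-dual algorithm and only guarantees a factor of $t + 1$, one worse than the factor $t$ claimed above.

        # Illustrative sketch, not the paper's algorithm: a greedy
        # (t+1)-approximation for one-sided bipartite t-claw deletion.
        # adj maps each vertex to its neighbor set; A is the side whose
        # vertices may serve as claw centers.

        def greedy_osbcd(adj, A, t):
            """Delete vertex-disjoint t-claws until no vertex of A has t
            surviving neighbors. Any feasible solution must contain at
            least one vertex of each disjoint claw, so the output has
            size at most (t + 1) * OPT."""
            adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
            deleted = set()
            progress = True
            while progress:
                progress = False
                for a in A:
                    if a not in deleted and len(adj[a]) >= t:
                        # In a bipartite graph the t chosen neighbors are
                        # pairwise non-adjacent, so this set induces a t-claw.
                        claw = {a} | set(sorted(adj[a])[:t])
                        for v in claw:                  # remove the whole claw
                            for u in adj[v]:
                                adj[u].discard(v)
                            adj[v] = set()
                        deleted |= claw
                        progress = True
            return deleted

        # Tiny example: 'a' with three neighbors forms a 3-claw.
        A = {"a", "a2"}
        adj = {"a": {"b1", "b2", "b3"}, "a2": {"b1"},
               "b1": {"a", "a2"}, "b2": {"a"}, "b3": {"a"}}
        print(greedy_osbcd(adj, A, t=3))  # {'a', 'b1', 'b2', 'b3'}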

    Extremal results in sparse pseudorandom graphs

    Szemer\'edi's regularity lemma is a fundamental tool in extremal combinatorics. However, the original version is only helpful in studying dense graphs. In the 1990s, Kohayakawa and R\"odl proved an analogue of Szemer\'edi's regularity lemma for sparse graphs as part of a general program toward extending extremal results to sparse graphs. Many of the key applications of Szemer\'edi's regularity lemma use an associated counting lemma, and it remained a well-known open problem to prove a counting lemma in sparse graphs, which would allow these results to be extended to the sparse setting. The main advance of this paper lies in a new counting lemma, proved following the functional approach of Gowers, which complements the sparse regularity lemma of Kohayakawa and R\"odl, allowing us to count small graphs in regular subgraphs of a sufficiently pseudorandom graph. We use this to prove sparse extensions of several well-known combinatorial theorems, including the removal lemmas for graphs and groups, the Erd\H{o}s-Stone-Simonovits theorem and Ramsey's theorem. These results extend and improve upon a substantial body of previous work.
    Comment: 70 pages; accepted for publication in Adv. Math.
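    For readers who want the two definitions the abstract leans on, here is one common formulation from the sparse regularity literature (the paper's exact parameterization may differ):

        A graph $\Gamma$ is $(p, \beta)$-jumbled if for all vertex subsets $X, Y$,
        \[ \bigl| e(X, Y) - p\,|X|\,|Y| \bigr| \le \beta \sqrt{|X|\,|Y|}, \]
        i.e. edge counts between all pairs of sets track the density $p$.
        Sparse regularity (Kohayakawa--R\"odl scale): with densities
        $d(X, Y) = e(X, Y)/(|X|\,|Y|)$, a pair $(U, W)$ is $(\varepsilon, p)$-regular
        if for all $X \subseteq U$, $Y \subseteq W$ with $|X| \ge \varepsilon|U|$
        and $|Y| \ge \varepsilon|W|$,
        \[ \bigl| d(X, Y) - d(U, W) \bigr| \le \varepsilon p. \]
        The paper's counting lemma counts copies of a fixed small graph inside
        such regular pairs when the host graph $\Gamma$ is sufficiently jumbled.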

    Inapproximability of Maximum Edge Biclique, Maximum Balanced Biclique and Minimum k-Cut from the Small Set Expansion Hypothesis

    The Small Set Expansion Hypothesis (SSEH) is a conjecture which roughly states that it is NP-hard to distinguish between a graph with a small set of vertices whose expansion is almost zero and one in which all small sets of vertices have expansion almost one. In this work, we prove conditional inapproximability results for the following graph problems based on this hypothesis: - Maximum Edge Biclique (MEB): given a bipartite graph G, find a complete bipartite subgraph of G with the maximum number of edges. We show that, assuming SSEH and that NP != BPP, no polynomial time algorithm gives an n^{1 - epsilon}-approximation for MEB for any constant epsilon > 0. - Maximum Balanced Biclique (MBB): given a bipartite graph G, find a balanced complete bipartite subgraph of G with the maximum number of vertices. Similar to MEB, we prove n^{1 - epsilon} ratio inapproximability for MBB for every epsilon > 0, assuming SSEH and that NP != BPP. - Minimum k-Cut: given a weighted graph G, find a set of edges with minimum total weight whose removal splits the graph into k components. We prove that this problem is NP-hard to approximate to within a factor of (2 - epsilon) of the optimum for every epsilon > 0, assuming SSEH. The ratios in our results are essentially tight since trivial algorithms give an n-approximation to both MEB and MBB and 2-approximation algorithms are known for Minimum k-Cut [Saran and Vazirani, SIAM J. Comput., 1995]. Our first two results are proved by combining a technique developed by Raghavendra, Steurer and Tulsiani [Raghavendra et al., CCC, 2012] to avoid locality of gadget reductions with a generalization of Bansal and Khot's long code test [Bansal and Khot, FOCS, 2009], whereas our last result is shown via an elementary reduction.
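    As a concrete reading of the MEB objective above, here is a minimal brute-force sketch in Python (exponential in one side, for illustration only; the point of the result above is that no polynomial-time n^{1 - epsilon}-approximation exists under SSEH and NP != BPP). All names are hypothetical.

        from itertools import combinations

        def max_edge_biclique(A, B, adj):
            """Brute-force MEB. adj maps each vertex of A to its neighbor set
            in B. A biclique is fixed by its A-side S: its largest possible
            B-side is the common neighborhood of S, giving |S| * |common
            neighborhood| edges."""
            best, best_pair = 0, (set(), set())
            for k in range(1, len(A) + 1):
                for S in combinations(A, k):
                    common = set(B)
                    for a in S:          # intersect the neighborhoods of S
                        common &= adj[a]
                    if len(S) * len(common) > best:
                        best = len(S) * len(common)
                        best_pair = (set(S), common)
            return best, best_pair

        adj = {"a1": {"b1", "b2"}, "a2": {"b1", "b2"}, "a3": {"b2"}}
        print(max_edge_biclique({"a1", "a2", "a3"}, {"b1", "b2"}, adj))
        # -> (4, ({'a1', 'a2'}, {'b1', 'b2'}))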

    Computational Complexity And Algorithms For Dirty Data Evaluation And Repairing

    In this dissertation, we study the problem of evaluating and repairing dirty data in relational databases. Dirty data is typically inconsistent, inaccurate, incomplete, or stale. Existing methods describe consistency using integrity constraints, such as data dependencies; however, integrity constraints are good at detecting inconsistency, not at evaluating its degree, and they cannot guide data repairing. This dissertation first studies the computational complexity of, and algorithms for, database inconsistency evaluation. We define and use the minimum tuple deletion to evaluate database inconsistency, and we study the relationship between the size of the rule set and the computational complexity of this problem. We show that the minimum tuple deletion problem is NP-hard to approximate within a factor of 17/16, even for instances with three functional dependencies and four involved attributes. A near-optimal approximation algorithm for computing the minimum tuple deletion is proposed, with a ratio of 2 − 1/2r, where r is the number of given functional dependencies. To guide data repairing, this dissertation also investigates repairing driven by query feedback, formally studying two decision problems corresponding to deletion and insertion feedback: the functional-dependency-restricted deletion propagation and insertion propagation problems. A comprehensive analysis of both the combined and the data complexity is provided, covering different relational operators and feedback types. We identify the intractable and the tractable cases, picturing the complexity hierarchy of these problems, and give efficient algorithms for the tractable cases. Two improvements are proposed: one computes a minimum vertex cover of the conflict graph to improve the upper bound for the tuple deletion problem, and the other establishes a finer dichotomy for the deletion and insertion propagation problems in the absence of functional dependencies, considering data, combined, and parameterized complexity respectively.
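    A hedged sketch of the conflict-graph view behind the first improvement, under the standard observation that every functional dependency violation involves exactly two tuples: minimum tuple deletion then coincides with minimum vertex cover of the conflict graph, and a maximal matching already yields a factor-2 cover (the dissertation's sharper 2 − 1/2r ratio needs more than this). The names and the FD encoding below are hypothetical.

        def conflict_graph(tuples, fds):
            """Edges between indices of tuples that jointly violate some FD.
            Each FD is (lhs, rhs): tuples agreeing on lhs must agree on rhs."""
            edges = set()
            for i in range(len(tuples)):
                for j in range(i + 1, len(tuples)):
                    for lhs, rhs in fds:
                        agree = all(tuples[i][a] == tuples[j][a] for a in lhs)
                        if agree and any(tuples[i][a] != tuples[j][a] for a in rhs):
                            edges.add((i, j))
            return edges

        def vertex_cover_2approx(edges):
            """Greedy maximal matching: both endpoints of each matched edge
            go into the cover, which is at most twice the minimum cover.
            Deleting these tuples removes every FD violation."""
            cover = set()
            for u, v in edges:
                if u not in cover and v not in cover:
                    cover |= {u, v}
            return cover

        # Tuples over attributes (0: dept, 1: manager); FD: dept -> manager.
        rows = [("sales", "ann"), ("sales", "bob"), ("hr", "eve")]
        viol = conflict_graph(rows, [((0,), (1,))])
        print(viol, vertex_cover_2approx(viol))  # {(0, 1)} {0, 1}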