
    On the complexity of computing the $k$-restricted edge-connectivity of a graph

    The \emph{$k$-restricted edge-connectivity} of a graph $G$, denoted by $\lambda_k(G)$, is defined as the minimum size of an edge set whose removal leaves exactly two connected components, each containing at least $k$ vertices. This graph invariant, which can be seen as a generalization of a minimum edge-cut, has been extensively studied from a combinatorial point of view. However, very little is known about the complexity of computing $\lambda_k(G)$. Very recently, the parameterized complexity community introduced the notion of a \emph{good edge separation} of a graph, which turns out to be essentially the same as the $k$-restricted edge-connectivity. Motivated by the relevance of this invariant from both the combinatorial and the algorithmic points of view, in this article we initiate a systematic study of its computational complexity, with special emphasis on its parameterized complexity for several choices of the parameters. We provide a number of NP-hardness and W[1]-hardness results, as well as FPT algorithms. Comment: 16 pages, 4 figures
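
As a concrete illustration of the definition quoted above (not of the paper's algorithms, which are not reproduced here), the sketch below computes $\lambda_k(G)$ by brute force over vertex bipartitions for small graphs. The use of the `networkx` library, the function name, and the exponential-time approach are my own choices for illustration only.

```python
# A minimal brute-force sketch of lambda_k(G) for small graphs, following the
# definition in the abstract above; this is an illustrative assumption-laden
# sketch, not the paper's method.
from itertools import combinations
import networkx as nx

def restricted_edge_connectivity(G: nx.Graph, k: int):
    """Return lambda_k(G): the minimum number of edges whose removal leaves
    exactly two connected components, each with at least k >= 1 vertices.
    Returns None if no such edge set exists. Exponential time: tries all
    vertex subsets, so only suitable for very small graphs."""
    nodes = list(G.nodes)
    n = len(nodes)
    best = None
    # Every valid deletion corresponds to a bipartition (S, V \ S) with both
    # sides connected and of size >= k; the deleted edges are exactly the
    # edges crossing the cut.
    for size in range(k, n - k + 1):
        for subset in combinations(nodes, size):
            S = set(subset)
            T = set(nodes) - S
            if not nx.is_connected(G.subgraph(S)):
                continue
            if not nx.is_connected(G.subgraph(T)):
                continue
            cut = sum(1 for _ in nx.edge_boundary(G, S, T))
            if best is None or cut < best:
                best = cut
    return best

if __name__ == "__main__":
    # On the 6-cycle, removing any single edge leaves it connected, so two
    # edges are needed; both sides of the resulting cut have >= 2 vertices.
    C6 = nx.cycle_graph(6)
    print(restricted_edge_connectivity(C6, 1))  # 2
    print(restricted_edge_connectivity(C6, 2))  # 2
```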

    On the Complexity of $t$-Closeness Anonymization and Related Problems

    An important issue in releasing individual data is to protect the sensitive information from being leaked and maliciously utilized. Famous privacy-preserving principles that aim to ensure both data privacy and data integrity, such as $k$-anonymity and $l$-diversity, have been extensively studied both theoretically and empirically. Nonetheless, these widely adopted principles are still insufficient to prevent attribute disclosure if the attacker has partial knowledge about the overall sensitive data distribution. The $t$-closeness principle has been proposed to fix this, and it has the additional benefit of supporting numerical sensitive attributes. However, in contrast to $k$-anonymity and $l$-diversity, the theoretical aspects of $t$-closeness have not been well investigated. We initiate the first systematic theoretical study of the $t$-closeness principle under the commonly used attribute suppression model. We prove that for every constant $t$ with $0 \leq t < 1$, it is NP-hard to find an optimal $t$-closeness generalization of a given table. The proof consists of several reductions, each of which works for different values of $t$ and which together cover the full range. To complement this negative result, we also provide exact and fixed-parameter algorithms. Finally, we answer some open questions regarding the complexity of $k$-anonymity and $l$-diversity left in the literature. Comment: An extended abstract to appear in DASFAA 201
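
To make the $t$-closeness condition itself concrete, here is a small sketch that checks whether a partition of a table into equivalence classes satisfies it. The total variation distance used below is a simplifying assumption (the original $t$-closeness proposal uses the Earth Mover's Distance), and the function names and example data are hypothetical; the paper's attribute suppression model and hardness reductions are not implemented here.

```python
# Illustrative check of the t-closeness condition (not the NP-hard
# optimization studied in the paper). Distance measure: total variation,
# chosen here for simplicity; the original proposal uses Earth Mover's Distance.
from collections import Counter

def distribution(values):
    """Empirical distribution of a list of sensitive attribute values."""
    counts = Counter(values)
    total = len(values)
    return {v: c / total for v, c in counts.items()}

def variation_distance(p, q):
    """Total variation distance between two finite distributions."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(v, 0.0) - q.get(v, 0.0)) for v in support)

def satisfies_t_closeness(classes, t):
    """classes: list of lists of sensitive values, one list per equivalence
    class. Returns True if every class's distribution is within distance t
    of the distribution of the sensitive attribute over the whole table."""
    overall = distribution([v for cls in classes for v in cls])
    return all(variation_distance(distribution(cls), overall) <= t
               for cls in classes)

if __name__ == "__main__":
    # Two equivalence classes over a categorical sensitive attribute.
    # Overall: flu 1/2, cancer 1/2; each class deviates by 1/6 in total
    # variation distance.
    classes = [["flu", "flu", "cancer"], ["flu", "cancer", "cancer"]]
    print(satisfies_t_closeness(classes, 0.2))  # True
    print(satisfies_t_closeness(classes, 0.1))  # False
```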