
    The Parameterized Complexity of Degree Constrained Editing Problems

    This thesis examines degree constrained editing problems within the framework of parameterized complexity. A degree constrained editing problem takes as input a graph and a set of constraints and asks whether the graph can be altered in at most k editing steps such that the degrees of the remaining vertices are within the given constraints. Parameterized complexity gives a framework for examining problems that are traditionally considered intractable and developing efficient exact algorithms for them, or showing that it is unlikely that they have such algorithms, by introducing an additional component of the input, the parameter, which gives additional information about the structure of the problem. If the problem has an algorithm that is exponential in the parameter, but polynomial, with constant degree, in the size of the input, then it is considered fixed-parameter tractable. Parameterized complexity also provides an intractability framework for identifying problems that are unlikely to have such an algorithm. Degree constrained editing problems provide natural parameterizations in terms of the total cost k of vertex deletions, edge deletions and edge additions allowed, and the upper bound r on the degree of the vertices remaining after editing. We define a class of degree constrained editing problems, WDCE, which generalises several well-known problems, such as Degree r Deletion, Cubic Subgraph, r-Regular Subgraph, f-Factor and General Factor. We show that in general, if both k and r are part of the parameter, problems in the WDCE class are fixed-parameter tractable, and if parameterized by k or r alone, the problems are intractable in a parameterized sense. We further show cases of WDCE that have polynomial time kernelizations; in particular, when all the degree constraints are a single number and the editing operations include vertex deletion and edge deletion, we show that there is a kernel with O(kr(k + r)) vertices. If we allow vertex deletion and edge addition, we show that despite remaining fixed-parameter tractable when parameterized by k and r together, the problems are unlikely to have polynomial sized kernelizations, or polynomial time kernelizations of a certain form, under certain complexity theoretic assumptions. We also examine a more general case where, given an input graph, the question is whether with at most k deletions the graph can be made r-degenerate. We show that in this case the problems are intractable, even when r is a constant.
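
    As a concrete illustration of the kind of fixed-parameter tractability claimed above, the sketch below (not the thesis's algorithm) implements the classical bounded search tree for the vertex-deletion-only special case, Degree r Deletion: while some vertex still has degree greater than r, any valid solution must delete either that vertex or one of any r+1 of its neighbours, so branching on those r+2 choices and recursing with budget k-1 gives a running time of roughly (r+2)^k times a polynomial. The graph representation and function name are illustrative only.

    from itertools import islice

    def degree_r_deletion(adj, k, r):
        """Can at most k vertex deletions bring every remaining degree down to
        at most r? adj maps each vertex to its set of neighbours.
        Classical (r+2)^k bounded search tree; a sketch, not the thesis algorithm."""
        # Find a vertex whose current degree exceeds r.
        bad = next((v for v, nbrs in adj.items() if len(nbrs) > r), None)
        if bad is None:
            return True                  # all degrees already within the bound
        if k == 0:
            return False                 # a violation remains but no budget is left
        # Any solution deletes `bad` or one of some r+1 of its neighbours.
        for candidate in [bad] + list(islice(adj[bad], r + 1)):
            rest = {v: nbrs - {candidate} for v, nbrs in adj.items() if v != candidate}
            if degree_r_deletion(rest, k - 1, r):
                return True
        return False

    # A star with four leaves needs one deletion to reach maximum degree 2.
    star = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
    assert degree_r_deletion(star, k=1, r=2)
    assert not degree_r_deletion(star, k=0, r=2)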

    Relational Expressions for Data Transformation and Computation

    Separate programming models for data transformation (declarative) and computation (procedural) impact programmer ergonomics, code reusability and database efficiency. To eliminate the necessity for two models or paradigms, we propose a small but high-leverage innovation: the introduction of complete relations into the relational database. Complete relations, and the discipline of constraint programming which concerns them, are founded on the same algebra as relational databases. We claim that by synthesising the relational database of Codd and Date with the results of the constraint programming community, the relational model holistically offers programmers a single declarative paradigm for both data transformation and computation, reusable code with computations that are indifferent to what is input and what is output, and efficient applications with the query engine optimising and parallelising all levels of data transformation and computation.
    Comment: 12 pages, 4 tables. To be published in the proceedings of the Shepherding Track of the 2023 Australasian Database Conference, Melbourne (Nov 1-3).
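
    As a rough sketch of the idea of a complete relation (this is not the paper's formalism, syntax or query engine): once a relation such as addition is available like any stored relation, the same definition can be queried with any subset of its attributes bound, covering both lookup-style transformation and computation. The naive generate-and-test evaluator below is hypothetical and only practical over small finite domains.

    from itertools import product

    def query(relation, domains, **bound):
        """Naive generate-and-test evaluation of a relation given as a predicate
        over named attributes; any subset of the attributes may be bound."""
        names = list(domains)
        candidates = ([bound[n]] if n in bound else domains[n] for n in names)
        for values in product(*candidates):
            row = dict(zip(names, values))
            if relation(**row):
                yield row

    # The "complete relation" plus(a, b, c), holding whenever a + b == c.
    plus = lambda a, b, c: a + b == c
    doms = {"a": range(10), "b": range(10), "c": range(10)}

    print(list(query(plus, doms, a=2, b=3)))   # forward: [{'a': 2, 'b': 3, 'c': 5}]
    print(list(query(plus, doms, a=1, c=4)))   # inverted: solves for b, giving b = 3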

    On the Feasibility of Maintenance Algorithms in Dynamic Graphs

    Near ubiquitous mobile computing has led to intense interest in dynamic graph theory. This provides a new and challenging setting for algorithmics and complexity theory. For any graph-based problem, the rapid evolution of a (possibly disconnected) graph over time naturally leads to the important complexity question: is it better to calculate a new solution from scratch or to adapt the known solution on the prior graph to quickly provide a solution of guaranteed quality for the changed graph? In this paper, we demonstrate that the former is the best approach in some cases, but that there are cases where the latter is feasible. We prove that, under certain conditions, hard problems cannot even be approximated in any reasonable complexity bound; that is, even with a large amount of time, having a solution to a very similar graph does not help in computing a solution to the current graph. To achieve this, we formalize the idea as a maintenance algorithm. Using r-Regular Subgraph as the primary example, we show that W[1]-hardness for the parameterized approximation problem implies the non-existence of a maintenance algorithm for the given approximation ratio. Conversely, we show that Vertex Cover, which is fixed-parameter tractable, has a 2-approximate maintenance algorithm. The implications of NP-hardness and NPO-hardness are also explored.
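
    To make the notion of a maintenance algorithm concrete, here is a minimal sketch of the standard maximal-matching idea behind the 2-approximate Vertex Cover result mentioned above (it is not the paper's construction and handles edge insertions only): keep the endpoints of a greedily maintained maximal matching as the cover, and on each inserted edge add both endpoints only if the edge is not already covered, an O(1) update that preserves the factor-2 guarantee.

    def maintain_cover():
        """2-approximate Vertex Cover maintained under edge insertions via a
        maximal matching; endpoints of matched edges form the cover (a sketch)."""
        cover = set()

        def insert_edge(u, v):
            # If the new edge is not yet covered, match it and take both endpoints.
            if u not in cover and v not in cover:
                cover.add(u)
                cover.add(v)
            return cover

        return insert_edge

    insert = maintain_cover()
    for edge in [(1, 2), (2, 3), (4, 5), (3, 4)]:
        insert(*edge)
    print(insert(6, 1))   # cover stays {1, 2, 4, 5}; edge (6, 1) is already covered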

    On the Parameterised Complexity of Induced Multipartite Graph Parameters

    We introduce a family of graph parameters, called induced multipartite graph parameters, and study their computational complexity. First, we consider the following decision problem: an instance is an induced multipartite graph parameter p and a given graph G, and for natural numbers k ≥ 2 and ℓ, we must decide whether the maximum value of p over all induced k-partite subgraphs of G is at most ℓ. We prove that this problem is W[1]-hard. Next, we consider a variant of this problem, where we must decide whether the given graph G contains a sufficiently large induced k-partite subgraph H such that p(H) ≤ ℓ. We show that for certain parameters this problem is para-NP-hard, while for others it is fixed-parameter tractable.
    Comment: 9 pages, 0 figures.
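
    To illustrate the first decision problem on a small scale (purely as a brute-force sketch; the parameter p below is simply the edge count, and none of this reflects the paper's constructions): enumerate vertex subsets, keep those whose induced subgraph admits a proper k-colouring, and compare the maximum value of p against ℓ.

    from itertools import combinations, product

    def is_k_partite(vertices, edges, k):
        """Does the induced subgraph on `vertices` admit a proper k-colouring?"""
        vs = list(vertices)
        for colouring in product(range(k), repeat=len(vs)):
            col = dict(zip(vs, colouring))
            if all(col[u] != col[v] for u, v in edges):
                return True
        return False

    def max_p_over_induced_k_partite(V, E, k, p):
        """Maximum of p(H) over all induced k-partite subgraphs H of G = (V, E)."""
        best = None
        for size in range(1, len(V) + 1):
            for S in combinations(V, size):
                H_edges = [(u, v) for u, v in E if u in S and v in S]
                if is_k_partite(S, H_edges, k):
                    val = p(S, H_edges)
                    best = val if best is None else max(best, val)
        return best

    # Example parameter p: the number of edges of the induced subgraph.
    V = [1, 2, 3, 4]
    E = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]
    print(max_p_over_induced_k_partite(V, E, k=2, p=lambda S, H: len(H)))
    # Prints 2: the chord (1, 3) creates triangles, so no induced bipartite
    # subgraph keeps more than two of the five edges.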

    Hierarchical Clustering Using the Arithmetic-Harmonic Cut: Complexity and Experiments

    Clustering, particularly hierarchical clustering, is an important method for understanding and analysing data across a wide variety of knowledge domains, with notable utility in systems where the data can be classified in an evolutionary context. This paper introduces a new hierarchical clustering problem defined by a novel objective function we call the arithmetic-harmonic cut. We show that the problem of finding such a cut is NP-hard and APX-hard but is fixed-parameter tractable, which indicates that although the problem is unlikely to have a polynomial time algorithm (even for approximation), exact parameterized and local search based techniques may produce workable algorithms. To this end, we implement a memetic algorithm for the problem and demonstrate the effectiveness of the arithmetic-harmonic cut on a number of datasets, including a cancer type dataset and a coronavirus dataset. We show favorable performance compared to currently used hierarchical clustering techniques such as k-Means, Graclus and Normalized-Cut. The arithmetic-harmonic cut metric overcomes difficulties that other hierarchical methods have in representing both inter-cluster differences and intra-cluster similarities.
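
    For intuition, the sketch below evaluates one plausible reading of the arithmetic-harmonic cut objective, namely maximising the product of the sum of distances across the cut and the sum of reciprocals of distances within the two parts; treat that formula, and the exhaustive search over bipartitions, as illustrative assumptions rather than the paper's memetic algorithm.

    from itertools import combinations

    def ah_cut_value(points, dist, S):
        """Assumed arithmetic-harmonic cut objective for a bipartition (S, rest):
        (sum of distances across the cut) * (sum of reciprocal distances within)."""
        S = set(S)
        across = within = 0.0
        for u, v in combinations(points, 2):
            if (u in S) != (v in S):
                across += dist[u][v]
            else:
                within += 1.0 / dist[u][v]
        return across * within

    def best_ah_cut(points, dist):
        """Exhaustive search over bipartitions; only feasible for tiny inputs."""
        best = (float("-inf"), None)
        for size in range(1, len(points)):
            for S in combinations(points, size):
                best = max(best, (ah_cut_value(points, dist, S), S))
        return best

    # Two tight pairs far apart: the best cut separates {0, 1} from {2, 3}.
    d = {0: {1: 1, 2: 9, 3: 9}, 1: {0: 1, 2: 9, 3: 9},
         2: {0: 9, 1: 9, 3: 1}, 3: {0: 9, 1: 9, 2: 1}}
    print(best_ah_cut([0, 1, 2, 3], d))   # (72.0, (2, 3)): the natural split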

    A Kernelisation Approach for Multiple d-Hitting Set and Its Application in Optimal Multi-Drug Therapeutic Combinations

    Therapies consisting of a combination of agents are an attractive proposition, especially in the context of diseases such as cancer, which can manifest with a variety of tumor types in a single case. However, uncovering usable drug combinations is expensive both financially and temporally. By employing computational methods to identify candidate combinations with a greater likelihood of success, we can avoid these problems, even when the amount of data is prohibitively large. Hitting Set is a combinatorial problem that has useful applications across many fields; however, as it is NP-complete, it is traditionally considered hard to solve exactly. We introduce a more general version of the problem, (α,β,d)-Hitting Set, which allows more precise control over how and what the hitting set targets. Employing the framework of Parameterized Complexity, we show that despite being NP-complete, the (α,β,d)-Hitting Set problem is fixed-parameter tractable with a kernel of size O(αdk^d) when we parameterize by the size k of the hitting set and the maximum α over the minimum numbers of hits required, taking the maximum degree d of the target sets as a constant. We demonstrate the application of this problem to multiple drug selection for cancer therapy, showing the flexibility of the problem in tailoring such drug sets. The fixed-parameter tractability result indicates that for low values of the parameters the problem can be solved quickly using exact methods. We also demonstrate that the problem is indeed practical, with computation times on the order of 5 seconds, compared to previous Hitting Set applications using the same dataset, which exhibited times on the order of 1 day, even with relatively relaxed notions of what constitutes a low value for the parameters. Furthermore, the existence of a kernelization for (α,β,d)-Hitting Set indicates that the problem is readily scalable to large datasets.
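
    As a sketch of why parameterizing by the solution size k helps (the classical bounded search tree for ordinary d-Hitting Set, not the paper's (α,β,d) kernelization): while some target set is unhit, one of its at most d elements must enter the hitting set, giving at most d^k branches; requiring α hits per set works the same way with a correspondingly deeper tree.

    def hitting_set(sets, k):
        """Can at most k elements hit every set in `sets` (each of size <= d)?
        Classical bounded search tree with at most d^k nodes; a sketch, not the
        paper's (α,β,d)-Hitting Set kernelization."""
        if not sets:
            return True                  # every target set is already hit
        if k == 0:
            return False                 # an unhit set remains but no budget is left
        first = sets[0]
        for e in first:                  # some element of `first` must be chosen
            remaining = [s for s in sets if e not in s]
            if hitting_set(remaining, k - 1):
                return True
        return False

    # Drug-combination flavour: each set lists drugs effective against one tumour type.
    targets = [{"A", "B"}, {"B", "C"}, {"C", "D"}, {"A", "D"}]
    print(hitting_set(targets, 2))   # True: {"B", "D"} (or {"A", "C"}) hits all four
    print(hitting_set(targets, 1))   # False: no single drug hits every tumour type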

    Effects of antiplatelet therapy on stroke risk by brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases: subgroup analyses of the RESTART randomised, open-label trial

    Background: Findings from the RESTART trial suggest that starting antiplatelet therapy might reduce the risk of recurrent symptomatic intracerebral haemorrhage compared with avoiding antiplatelet therapy. Brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases (such as cerebral microbleeds) are associated with greater risks of recurrent intracerebral haemorrhage. We did subgroup analyses of the RESTART trial to explore whether these brain imaging features modify the effects of antiplatelet therapy.

    The Beaker phenomenon and the genomic transformation of northwest Europe

    From around 2750 to 2500 BC, Bell Beaker pottery became widespread across western and central Europe, before it disappeared between 2200 and 1800 BC. The forces that propelled its expansion are a matter of long-standing debate, and there is support for both cultural diffusion and migration having a role in this process. Here we present genome-wide data from 400 Neolithic, Copper Age and Bronze Age Europeans, including 226 individuals associated with Beaker-complex artefacts. We detected limited genetic affinity between Beaker-complex-associated individuals from Iberia and central Europe, and thus exclude migration as an important mechanism of spread between these two regions. However, migration had a key role in the further dissemination of the Beaker complex. We document this phenomenon most clearly in Britain, where the spread of the Beaker complex introduced high levels of steppe-related ancestry and was associated with the replacement of approximately 90% of Britain’s gene pool within a few hundred years, continuing the east-to-west expansion that had brought steppe-related ancestry into central and northern Europe over the previous centuries.