On the Complexity of the Single Individual SNP Haplotyping Problem
We present several new results pertaining to haplotyping. These results
concern the combinatorial problem of reconstructing haplotypes from incomplete
and/or imperfectly sequenced haplotype fragments. We consider the complexity of
the problems Minimum Error Correction (MEC) and Longest Haplotype
Reconstruction (LHR) for different restrictions on the input data.
Specifically, we look at the gapless case, where every row of the input
corresponds to a gapless haplotype-fragment, and the 1-gap case, where at most
one gap per fragment is allowed. We prove that MEC is APX-hard in the 1-gap
case and still NP-hard in the gapless case. In addition, we question earlier
claims that MEC is NP-hard even when the input matrix is restricted to being
completely binary. Concerning LHR, we show that this problem is NP-hard and
APX-hard in the 1-gap case (and thus also in the general case), but is
polynomial time solvable in the gapless case.

Comment: 26 pages. Related to the WABI2005 paper, "On the Complexity of Several Haplotyping Problems", but with more/different results. This paper has been submitted to the IEEE/ACM Transactions on Computational Biology and Bioinformatics and is awaiting a decision on acceptance. It differs from the mid-August version of this paper in that here we prove that 1-gap LHR is APX-hard. (In the earlier version of the paper we could prove only that it was NP-hard.)
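The Minimum Error Correction objective described above can be made concrete with a small sketch. Given a fragment matrix (one row per sequenced fragment, entries '0'/'1', with '-' marking a gap or hole), MEC asks for the minimum number of entry flips so that the rows can be split into two groups, each consistent with a single haplotype. The sketch below brute-forces every bipartition, so it is only feasible for tiny instances; the hardness results in the abstract explain why no efficient exact method is expected in general. All function names here are illustrative, not from the paper.

```python
from itertools import product

def mec_cost(fragments, assignment):
    """Corrections needed so each of the two groups of fragments
    agrees with a single haplotype ('-' marks a gap/hole)."""
    cost = 0
    ncols = len(fragments[0])
    for side in (0, 1):
        rows = [f for f, a in zip(fragments, assignment) if a == side]
        for j in range(ncols):
            zeros = sum(1 for r in rows if r[j] == '0')
            ones = sum(1 for r in rows if r[j] == '1')
            cost += min(zeros, ones)  # flip the minority entries in this column
    return cost

def solve_mec(fragments):
    """Brute-force MEC: try every bipartition of the fragment rows."""
    n = len(fragments)
    return min(mec_cost(fragments, bits)
               for bits in product((0, 1), repeat=n))
```

For example, `solve_mec(["0011", "0-11", "1100", "110-"])` returns 0, since the first two and last two fragments already form two internally consistent haplotypes.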
Generalized Network Dismantling
Finding the set of nodes which, when removed or (de)activated, can stop the spread of (dis)information, contain an epidemic, or disrupt the functioning of a corrupt or criminal organization is still one of the key challenges in network science. In this paper, we introduce the generalized network dismantling
problem, which aims to find the set of nodes that, when removed from a network,
results in a network fragmentation into subcritical network components at
minimum cost. For unit costs, our formulation becomes equivalent to the
standard network dismantling problem. Our non-unit cost generalization allows
for the inclusion of topological cost functions related to node centrality and
non-topological features such as the price, protection level or even social
value of a node. In order to solve this optimization problem, we propose a
method, which is based on the spectral properties of a novel node-weighted
Laplacian operator. The proposed method is applicable to large-scale networks
with millions of nodes. It outperforms current state-of-the-art methods and
opens new directions in understanding the vulnerability and robustness of
complex systems.

Comment: 6 pages, 5 figures
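The general flavour of a spectral, cost-aware dismantling step can be sketched as follows. This is not the paper's method: the paper defines its own node-weighted Laplacian operator, whereas the sketch below simply scales the standard Laplacian by assumed per-node removal costs, takes the Fiedler (second-smallest) eigenvector to propose a cut, and nominates the cheaper endpoint of each cut-crossing edge for removal.

```python
import numpy as np

def spectral_dismantling_cut(adj, cost):
    """Illustrative spectral cut on a cost-scaled Laplacian.
    adj: symmetric 0/1 adjacency matrix; cost: per-node removal cost.
    The scaling W^(-1/2) L W^(-1/2) is an assumption for illustration,
    not the operator defined in the paper."""
    deg = adj.sum(axis=1)
    lap = np.diag(deg) - adj                       # standard graph Laplacian
    w = 1.0 / np.sqrt(np.asarray(cost, dtype=float))
    wl = np.diag(w) @ lap @ np.diag(w)             # symmetric cost-scaled Laplacian
    vals, vecs = np.linalg.eigh(wl)
    fiedler = vecs[:, 1]                           # 2nd-smallest eigenvalue's eigenvector
    side = fiedler >= 0                            # proposed bipartition
    remove = set()
    n = len(deg)
    for i in range(n):
        for j in range(i + 1, n):
            if adj[i, j] and side[i] != side[j]:   # edge crossing the cut
                remove.add(i if cost[i] <= cost[j] else j)
    return sorted(remove)
```

On a "barbell" of two triangles joined by a single bridge edge, the Fiedler vector separates the two triangles and the sketch nominates a bridge endpoint, whose removal fragments the network.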
Efficient, Superstabilizing Decentralised Optimisation for Dynamic Task Allocation Environments
Decentralised optimisation is a key issue for multi-agent systems, and while many solution techniques have been developed, few provide support for dynamic environments that change over time, such as disaster management. Given this, in this paper we present Bounded Fast Max Sum (BFMS): a novel, dynamic, superstabilizing algorithm which provides a bounded approximate solution to certain classes of distributed constraint optimisation problems. We achieve this by eliminating dependencies in the constraint functions, according to how much impact they have on the overall solution value. In more detail, we propose iGHS, which computes a maximum spanning tree on subsections of the constraint graph, in order to reduce communication and computation overheads. Given this, we empirically evaluate BFMS, showing that it reduces the communication and computation done by Bounded Max Sum by up to 99%, while obtaining 60-88% of the optimal utility.
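The maximum-spanning-tree step mentioned above can be illustrated centrally. iGHS in the paper is a distributed procedure; as a stand-in, the sketch below runs Kruskal's algorithm on negated weights, keeping the constraint-graph dependencies with the largest impact and dropping the rest. The weight values and the centralised formulation are assumptions for illustration only.

```python
def maximum_spanning_tree(n, edges):
    """Kruskal's algorithm on negated weights: retain the highest-impact
    dependencies so the remaining graph is cycle-free. Centralised
    illustration only; the paper's iGHS computes the tree distributedly.
    edges: list of (u, v, weight) with nodes numbered 0..n-1."""
    parent = list(range(n))

    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for u, v, w in sorted(edges, key=lambda e: -e[2]):  # heaviest first
        ru, rv = find(u), find(v)
        if ru != rv:                  # adding this edge creates no cycle
            parent[ru] = rv
            tree.append((u, v, w))
    return tree
```

For instance, on a four-node graph with edges weighted 5, 4, 3, 2, 1, the tree keeps the 5-, 4-, and 2-weight edges (total impact 11), and the eliminated edges correspond to the dependencies BFMS drops to bound its approximation.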