
    Constructing computer virus phylogenies

    There has been much recent algorithmic work on the problem of reconstructing the evolutionary history of biological species. Computer virus specialists are interested in finding the evolutionary history of computer viruses: a virus is often written using code fragments from one or more other viruses, which are its immediate ancestors. A phylogeny for a collection of computer viruses is a directed acyclic graph whose nodes are the viruses and whose edges map ancestors to descendants and satisfy the property that each code fragment is "invented" only once. To provide a simple explanation for the data, we consider the problem of constructing such a phylogeny with a minimum number of edges. In general this optimization problem is NP-complete; some associated approximation problems are also hard, but others are easy. When tree solutions exist, they can be constructed and randomly sampled in polynomial time.
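
    As a concrete illustration of the defining property, here is a minimal Python sketch (with hypothetical names and a simplified representation, not the paper's actual data structures) that checks whether a candidate DAG is a valid phylogeny, i.e., whether every code fragment has exactly one "inventor": a virus that carries the fragment while none of its immediate ancestors do.

        # Sketch: verify the "each fragment is invented only once" property of a
        # candidate virus phylogeny. Names and representation are illustrative
        # assumptions, not the paper's actual data structures.
        from collections import defaultdict

        def inventors(fragments_of, parents_of):
            """For each fragment, the viruses that carry it but have no parent carrying it."""
            carriers = defaultdict(set)
            for v, frags in fragments_of.items():
                for f in frags:
                    carriers[f].add(v)
            inv = defaultdict(list)
            for f, vs in carriers.items():
                for v in vs:
                    if not any(f in fragments_of[p] for p in parents_of.get(v, [])):
                        inv[f].append(v)
            return inv

        def is_valid_phylogeny(fragments_of, parents_of):
            """True iff every fragment has exactly one inventor in the DAG."""
            return all(len(vs) == 1 for vs in inventors(fragments_of, parents_of).values())

        # Tiny usage example: B and C both inherit fragment "x" from A.
        fragments_of = {"A": {"x"}, "B": {"x", "y"}, "C": {"x", "z"}}
        parents_of = {"B": ["A"], "C": ["A"]}
        print(is_valid_phylogeny(fragments_of, parents_of))  # True: "x" invented once, by A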

    Probing a Set of Trajectories to Maximize Captured Information

    We study a trajectory analysis problem we call the Trajectory Capture Problem (TCP), in which, for a given input set T of trajectories in the plane and an integer k ≥ 2, we seek to compute a set of k points ("portals") to maximize the total weight of all subtrajectories of T between pairs of portals. This problem naturally arises in trajectory analysis and summarization. We show that the TCP is NP-hard (even in very special cases) and give some first approximation results. Our main focus is on attacking the TCP with practical algorithm-engineering approaches, including integer linear programming (to solve instances to provable optimality) and local search methods. We study the integrality gap arising from such approaches. We analyze our methods on different classes of data, including benchmark instances that we generate. Our goal is to understand the best performing heuristics, based on both solution time and solution quality. We demonstrate that we are able to compute provably optimal solutions for real-world instances.
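
    The integer-linear-programming approach lends itself to a compact model. The sketch below (using the PuLP library; the candidate portal set, the portal pairs connected by weighted subtrajectories, and the weights are assumed precomputed, and the paper's actual formulation may differ) selects at most k portals to maximize the captured weight.

        # Sketch of an ILP for the Trajectory Capture Problem, in the spirit of
        # the exact approach described above. Inputs are assumed precomputed.
        import pulp

        def solve_tcp(candidates, subtrajectories, k):
            """candidates: list of portal ids; subtrajectories: list of (p, q, weight)."""
            prob = pulp.LpProblem("TCP", pulp.LpMaximize)
            x = {p: pulp.LpVariable(f"x_{p}", cat="Binary") for p in candidates}
            y = {i: pulp.LpVariable(f"y_{i}", cat="Binary")
                 for i in range(len(subtrajectories))}
            # Objective: total weight of captured subtrajectories.
            prob += pulp.lpSum(w * y[i] for i, (_, _, w) in enumerate(subtrajectories))
            # A subtrajectory is captured only if both of its portals are chosen.
            for i, (p, q, _) in enumerate(subtrajectories):
                prob += y[i] <= x[p]
                prob += y[i] <= x[q]
            # Budget of k portals.
            prob += pulp.lpSum(x.values()) <= k
            prob.solve()
            return [p for p in candidates if x[p].value() == 1], pulp.value(prob.objective)

        # Usage: three candidate portals, k = 2.
        portals, total = solve_tcp(["a", "b", "c"],
                                   [("a", "b", 5.0), ("b", "c", 3.0), ("a", "c", 1.0)], 2)
        print(portals, total)  # ['a', 'b'] 5.0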

    Tolerating the Community Detection Resolution Limit with Edge Weighting

    Communities of vertices within a giant network such as the World-Wide Web are likely to be vastly smaller than the network itself. However, Fortunato and Barthélemy have proved that modularity maximization algorithms for community detection may fail to resolve communities with fewer than √(L/2) edges, where L is the number of edges in the entire network. This resolution limit leads modularity maximization algorithms to have notoriously poor accuracy on many real networks. Fortunato and Barthélemy's argument extends to networks with weighted edges as well, and we derive this corollary. We conclude that weighted modularity algorithms may fail to resolve communities with less than √(Wε/2) total edge weight, where W is the total edge weight in the network and ε is the maximum weight of an inter-community edge. If ε is small, then small communities can be resolved. Given a weighted or unweighted network, we describe how to derive new edge weights that achieve a low ε, modify the "CNM" community detection algorithm to maximize weighted modularity, and show that the resulting algorithm has greatly improved accuracy. In experiments with an emerging standard community detection benchmark, we find that our simple CNM variant is competitive with the most accurate community detection methods yet proposed.
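
    A minimal Python sketch of the quantities in this argument, under the assumption that a graph is given as a list of (u, v, weight) edges: the weighted modularity being maximized, and the √(Wε/2) resolution bound.

        # Sketch: weighted modularity and the weighted resolution-limit bound.
        # The edge-list representation is an illustrative assumption.
        from collections import defaultdict

        def weighted_modularity(edges, community_of):
            """Q = sum over communities c of [W_c/W - (S_c/(2W))^2], where W is the
            total edge weight, W_c the intra-community weight of c, and S_c the
            total weighted degree (strength) of c's vertices."""
            W = sum(w for _, _, w in edges)
            strength = defaultdict(float)
            for u, v, w in edges:
                strength[u] += w
                strength[v] += w
            intra = sum(w for u, v, w in edges if community_of[u] == community_of[v])
            comm_strength = defaultdict(float)
            for node, s in strength.items():
                comm_strength[community_of[node]] += s
            return intra / W - sum(s * s for s in comm_strength.values()) / (4 * W * W)

        def resolution_limit(edges, community_of):
            """Communities lighter than sqrt(W * eps / 2) may go unresolved, where
            eps is the maximum weight of an inter-community edge."""
            W = sum(w for _, _, w in edges)
            eps = max((w for u, v, w in edges if community_of[u] != community_of[v]),
                      default=0.0)
            return (W * eps / 2) ** 0.5

        # Usage: two triangles joined by one light inter-community edge.
        edges = [("a", "b", 1.0), ("b", "c", 1.0), ("a", "c", 1.0),
                 ("d", "e", 1.0), ("e", "f", 1.0), ("d", "f", 1.0), ("c", "d", 0.1)]
        comm = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1, "f": 1}
        print(weighted_modularity(edges, comm))  # ~0.484
        print(resolution_limit(edges, comm))     # ~0.552: small eps, small bound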

    Implications of Electronics Constraints for Solid-State Quantum Error Correction and Quantum Circuit Failure Probability

    In this paper we present the impact of classical electronics constraints on a solid-state quantum dot logical qubit architecture. Constraints due to routing density, bandwidth allocation, signal timing, and thermally aware placement of classical supporting electronics significantly affect the quantum error correction circuit's error rate. We analyze one level of a quantum error correction circuit using nine data qubits in a Bacon-Shor code configured as a quantum memory. A hypothetical silicon double quantum dot quantum bit (qubit) is used as the fundamental element. A pessimistic estimate of the error probability of the quantum circuit is calculated from the total number of gates and the idle time, using a provably optimal schedule for the circuit operations obtained with an integer programming methodology. The micro-architecture analysis provides insight into the different ways the electronics impact circuit performance (e.g., extra idle time in the schedule), which can significantly limit the ultimate performance of any quantum circuit and is therefore a critical foundation for any future larger-scale architecture analysis.
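
    As a rough illustration of such a pessimistic estimate (the functional form, rates, and counts here are illustrative assumptions, not the paper's model), one can bound the circuit failure probability by assuming every gate and every idle step can fail independently, so the circuit succeeds only if nothing fails.

        # Sketch: pessimistic circuit failure estimate from gate count and idle
        # time. All parameter values below are illustrative assumptions.
        def pessimistic_failure_probability(n_gates, n_idle_steps, p_gate, p_idle):
            """Upper-bound-style estimate: 1 - P(no gate error) * P(no idle error)."""
            return 1.0 - (1.0 - p_gate) ** n_gates * (1.0 - p_idle) ** n_idle_steps

        # Example: a schedule with 100 gates and 400 idle steps; extra idle time
        # forced by the electronics directly inflates the estimate.
        print(pessimistic_failure_probability(100, 400, 1e-4, 1e-5))  # ~0.014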