
    An ant colony algorithm for the sequential testing problem under precedence constraints

    We consider the problem of minimum-cost sequential testing of a series (parallel) system under precedence constraints, which can be modeled as a nonlinear integer program. We develop and implement an ant colony algorithm for the problem. We demonstrate the performance of this algorithm on a special type of instance for which optimal solutions can be found in polynomial time. In addition, we compare the performance of the algorithm with a special branch-and-bound algorithm on general instances. The ant colony algorithm is shown to be particularly effective for larger instances of the problem.
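    The abstract gives no construction details, so the sketch below is only a minimal, illustrative ant colony optimization loop for this kind of sequencing problem. The expected-cost model (a series system is inspected until the first failing component), the heuristic desirability eta, and all parameter names are assumptions, not the authors' implementation.

    ```python
    import random

    def expected_cost(seq, cost, p_work):
        """Expected inspection cost of a series system tested in order `seq`:
        test i is reached only if every earlier component passed."""
        total, reach_prob = 0.0, 1.0
        for i in seq:
            total += reach_prob * cost[i]
            reach_prob *= p_work[i]
        return total

    def ant_colony(cost, p_work, preds, n_ants=20, n_iters=200,
                   alpha=1.0, beta=2.0, rho=0.1, q=1.0):
        """Hypothetical ACO sketch: ants extend a sequence one component at a
        time, choosing only among components whose predecessors are placed."""
        n, START = len(cost), len(cost)
        tau = [[1.0] * n for _ in range(n + 1)]       # pheromone tau[prev][next]
        eta = [(1.0 - p_work[j]) / cost[j] for j in range(n)]  # assumed heuristic
        best_seq, best_cost = None, float("inf")
        for _ in range(n_iters):
            for _ in range(n_ants):
                placed, seq, prev = set(), [], START
                while len(seq) < n:
                    eligible = [j for j in range(n)
                                if j not in placed and preds[j] <= placed]
                    weights = [tau[prev][j] ** alpha * eta[j] ** beta
                               for j in eligible]
                    j = random.choices(eligible, weights=weights)[0]
                    seq.append(j); placed.add(j); prev = j
                c = expected_cost(seq, cost, p_work)
                if c < best_cost:
                    best_seq, best_cost = seq[:], c
            for row in tau:                            # evaporation
                for j in range(n):
                    row[j] *= 1.0 - rho
            prev = START                               # reinforce best-so-far
            for j in best_seq:
                tau[prev][j] += q / best_cost
                prev = j
        return best_seq, best_cost

    # Three tests; precedence: test 2 may only run after test 0.
    cost, p_work, preds = [1.0, 3.0, 2.0], [0.9, 0.5, 0.7], [set(), set(), {0}]
    print(ant_colony(cost, p_work, preds))
    ```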

    On critical service recovery after massive network failures

    This paper addresses the problem of efficiently restoring sufficient resources in a communications network to support the demand of mission-critical services after a large-scale disruption. We give a formulation of the problem as a mixed integer linear program (MILP) and show that it is NP-hard. We propose a polynomial-time heuristic, called Iterative Split and Prune (ISP), that recursively decomposes the original problem into smaller problems until it determines the set of network components to be restored. ISP's decisions are guided by a new notion of demand-based centrality of nodes. We performed extensive simulations, varying the topologies, the demand intensity, the number of critical services, and the disruption model. Compared with several greedy approaches, ISP performs better in terms of the total cost of repaired components and does not result in any demand loss. It performs very close to the optimum when the demand is low with respect to the supply network capacities, thanks to the algorithm's ability to maximize the sharing of repaired resources.
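    The abstract names a "demand-based centrality of nodes" without defining it. One plausible reading, sketched below as an assumption rather than the paper's definition, is a demand-weighted variant of betweenness: a node scores the volume of every demand pair for which it lies on at least one shortest path in the (damaged, undirected) network.

    ```python
    from collections import deque

    def bfs_dist(adj, src):
        """Hop distances from `src` in an unweighted graph (adjacency dict)."""
        dist, q = {src: 0}, deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    def demand_centrality(adj, demands):
        """Assumed demand-based centrality: node v earns the volume of a demand
        (s, t, vol) whenever v lies on some shortest s-t path, i.e. whenever
        dist(s, v) + dist(v, t) == dist(s, t)."""
        score = {v: 0.0 for v in adj}
        for s, t, vol in demands:
            ds, dt = bfs_dist(adj, s), bfs_dist(adj, t)
            if t not in ds:
                continue                  # pair disconnected after the failure
            for v in adj:
                if v in ds and v in dt and ds[v] + dt[v] == ds[t]:
                    score[v] += vol
        return score

    # Path graph a-b-c with one unit of demand between its endpoints:
    adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
    print(demand_centrality(adj, [("a", "c", 1.0)]))  # a, b, c each score 1.0
    ```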

    Network recovery after massive failures

    This paper addresses the problem of efficiently restoring sufficient resources in a communications network to support the demand of mission-critical services after a large-scale disruption. We give a formulation of the problem as an MILP and show that it is NP-hard. We propose a polynomial-time heuristic, called Iterative Split and Prune (ISP), that recursively decomposes the original problem into smaller problems until it determines the set of network components to be restored. We performed extensive simulations, varying the topologies, the demand intensity, the number of critical services, and the disruption model. Compared to several greedy approaches, ISP performs better in terms of the number of repaired components and does not result in any demand loss. It performs very close to the optimum when the demand is low with respect to the supply network capacities, thanks to the algorithm's ability to maximize the sharing of repaired resources.

    Fast Algorithms for Constructing Maximum Entropy Summary Trees

    Karloff and Shirley recently proposed summary trees as a new way to visualize large rooted trees (EuroVis 2013) and gave algorithms for generating a maximum-entropy k-node summary tree of an input n-node rooted tree. However, their algorithm for generating optimal summary trees was only pseudo-polynomial (and worked only for integral weights); the authors left open the existence of a polynomial-time algorithm. In addition, the authors provided an additive approximation algorithm and a greedy heuristic, both working on real weights. This paper shows how to construct maximum-entropy k-node summary trees in time O(k^2 n + n log n) for real weights (indeed, as small as the time bound for the greedy heuristic given previously); how to speed up the approximation algorithm so that it runs in time O(n + (k^4/eps) log(k/eps)); and how to speed up the greedy algorithm so as to run in time O(kn + n log n). Altogether, these results make summary trees a much more practical tool than before.
    Comment: 17 pages, 4 figures. Extended version of a paper appearing in ICALP 2014.
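    The quantity being maximized is the Shannon entropy of the weight distribution that a summary tree induces over its k nodes, H = -Σ (w_i/W) log2(w_i/W), where w_i is the total weight collapsed into summary node i and W is the total weight. A minimal sketch of just this objective (the cluster weights in the example are made up for illustration):

    ```python
    import math

    def summary_entropy(cluster_weights):
        """Shannon entropy (in bits) of the weight distribution over the k
        summary nodes; a maximum-entropy summary tree maximizes this value."""
        total = sum(cluster_weights)
        h = 0.0
        for w in cluster_weights:
            if w > 0:
                p = w / total
                h -= p * math.log2(p)
        return h

    # E.g. a 10-unit tree collapsed into k = 3 summary nodes of weight 4, 3, 3:
    print(summary_entropy([4, 3, 3]))  # ~1.571 bits (uniform bound: log2(3) ~ 1.585)
    ```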

    Entity-Linking via Graph-Distance Minimization

    Entity-linking is a natural-language-processing task that consists in identifying the entities mentioned in a piece of text and linking each to an appropriate item in some knowledge base; when the knowledge base is Wikipedia, the problem is known as wikification (in this case, items are Wikipedia articles). One instance of entity-linking can be formalized as an optimization problem on the underlying concept graph, where the quantity to be optimized is the average distance between chosen items. Inspired by this application, we define a new graph problem which is a natural variant of the Maximum Capacity Representative Set. We prove that our problem is NP-hard for general graphs; nonetheless, under some restrictive assumptions, it turns out to be solvable in linear time. For the general case, we propose two heuristics: one tries to enforce the above assumptions, and another is based on the notion of hitting distance; we show experimentally how these approaches perform with respect to some baselines on a real-world dataset.
    Comment: In Proceedings GRAPHITE 2014, arXiv:1407.7671. The second and third authors were supported by the EU-FET grant NADINE (GA 288956).
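    The objective (pick one knowledge-base item per mention so that the chosen items are close to each other in the concept graph) can be made concrete with a tiny brute-force sketch. This is only a reference implementation of the objective on toy instances, not one of the paper's heuristics, which exist precisely because the general problem is NP-hard; the example graph and names are invented for illustration.

    ```python
    from collections import deque
    from itertools import product

    def bfs_dist(adj, src):
        """Hop distances from `src` in an unweighted concept graph."""
        dist, q = {src: 0}, deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    def best_assignment(adj, candidates):
        """Choose one candidate item per mention minimizing the average pairwise
        distance between chosen items (exhaustive; toy instances only)."""
        best, best_avg = None, float("inf")
        for choice in product(*candidates):
            dist_from = {c: bfs_dist(adj, c) for c in set(choice)}
            total, pairs, feasible = 0, 0, True
            for i in range(len(choice)):
                for j in range(i + 1, len(choice)):
                    d = dist_from[choice[i]]
                    if choice[j] not in d:
                        feasible = False   # some pair is disconnected
                        break
                    total += d[choice[j]]
                    pairs += 1
                if not feasible:
                    break
            if feasible and pairs and total / pairs < best_avg:
                best, best_avg = choice, total / pairs
        return best, best_avg

    # Two mentions, each with candidate meanings, on a made-up concept graph:
    adj = {"jaguar_cat": ["felidae"], "felidae": ["jaguar_cat", "leopard"],
           "jaguar_car": ["ford"], "ford": ["jaguar_car"], "leopard": ["felidae"]}
    print(best_assignment(adj, [["jaguar_cat", "jaguar_car"], ["leopard"]]))
    # -> (('jaguar_cat', 'leopard'), 2.0)
    ```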