
    Spanning trees short or small

    We study the problem of finding small trees. Classical network design problems are considered with the additional constraint that only a specified number k of nodes are required to be connected in the solution. A prototypical example is the k-MST problem, in which we require a tree of minimum weight spanning at least k nodes in an edge-weighted graph. We show that the k-MST problem is NP-hard even for points in the Euclidean plane. We provide approximation algorithms with performance ratio 2√k for the general edge-weighted case and O(k^{1/4}) for the case of points in the plane. Polynomial-time exact solutions are also presented for the class of decomposable graphs, which includes trees, series-parallel graphs, and bounded-bandwidth graphs, and for points on the boundary of a convex region in the Euclidean plane. We also investigate the problem of finding short trees, and more generally, that of finding networks with minimum diameter. A simple technique is used to provide a polynomial-time solution for finding k-trees of minimum diameter. We identify easy and hard problems arising in finding short networks using a framework due to T. C. Hu. Comment: 27 pages.
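    To make the k-MST objective concrete, here is a brute-force sketch (illustrative only, not one of the approximation algorithms from the abstract): with nonnegative weights, an optimal tree spanning at least k nodes can be pruned to exactly k, so it suffices to try every k-node subset and take the cheapest spanning tree of the induced subgraph. The edge encoding and function names are assumptions made for this example.

        from itertools import combinations

        def mst_weight(nodes, edges):
            """Kruskal's algorithm on the subgraph induced by `nodes`.
            Returns the MST weight, or None if that subgraph is disconnected."""
            parent = {v: v for v in nodes}
            def find(v):
                while parent[v] != v:
                    parent[v] = parent[parent[v]]
                    v = parent[v]
                return v
            total, used = 0, 0
            for w, u, v in sorted((w, u, v) for (u, v), w in edges.items()
                                  if u in nodes and v in nodes):
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
                    total += w
                    used += 1
            return total if used == len(nodes) - 1 else None

        def k_mst_brute_force(vertices, edges, k):
            """Minimum weight of a tree spanning exactly k nodes (exponential time)."""
            best = None
            for subset in combinations(vertices, k):
                w = mst_weight(set(subset), edges)
                if w is not None and (best is None or w < best):
                    best = w
            return best

        # Example: edges given as {(u, v): weight}.
        edges = {("a", "b"): 1, ("b", "c"): 2, ("c", "d"): 5, ("a", "c"): 4}
        print(k_mst_brute_force("abcd", edges, 3))  # -> 3, the tree on {a, b, c}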

    Bicriteria Network Design Problems

    We study a general class of bicriteria network design problems. A generic problem in this class is as follows: given an undirected graph and two minimization objectives (under different cost functions), with a budget specified on the first, find a subgraph from a given subgraph-class that minimizes the second objective subject to the budget on the first. We consider three different criteria: the total edge cost, the diameter, and the maximum degree of the network. Here, we present the first polynomial-time approximation algorithms for a large class of bicriteria network design problems for the above-mentioned criteria. The following general types of results are presented. First, we develop a framework for bicriteria problems and their approximations. Second, when the two criteria are the same (note that the cost functions continue to be different), we present a "black box" parametric search technique. This black box takes as input an (approximation) algorithm for the unicriterion situation and generates an approximation algorithm for the bicriteria case with only a constant factor loss in the performance guarantee. Third, when the two criteria are the diameter and the total edge cost, we use a cluster-based approach to devise approximation algorithms; the solutions output violate both criteria by a logarithmic factor. Finally, for the class of treewidth-bounded graphs, we provide pseudopolynomial-time algorithms for a number of bicriteria problems using dynamic programming. We show how these pseudopolynomial-time algorithms can be converted to fully polynomial-time approximation schemes using a scaling technique. Comment: 24 pages, 1 figure.
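    A minimal sketch of the parametric-search idea for the same-criterion case, under assumptions of my own: the unicriterion black box is a plain MST routine, the two cost functions c1 and c2 live on the edges, and we binary-search a trade-off parameter λ applied to the combined weight c1 + λ·c2 until the budget on c1 is met. No approximation guarantee is claimed for this particular stopping rule; it only illustrates the mechanics.

        def mst(edges, n, weight):
            """Kruskal's algorithm. `edges` is a list of (u, v, c1, c2) tuples
            over nodes 0..n-1; `weight` maps an edge to its combined cost."""
            parent = list(range(n))
            def find(v):
                while parent[v] != v:
                    parent[v] = parent[parent[v]]
                    v = parent[v]
                return v
            tree = []
            for e in sorted(edges, key=weight):
                ru, rv = find(e[0]), find(e[1])
                if ru != rv:
                    parent[ru] = rv
                    tree.append(e)
            return tree

        def bicriteria_mst(edges, n, budget, iters=60):
            """Return the best tree found whose total c1 stays within `budget`,
            while steering the combined weight toward minimizing total c2."""
            lo, hi, best = 0.0, 1e9, None
            for _ in range(iters):
                lam = (lo + hi) / 2
                tree = mst(edges, n, lambda e: e[2] + lam * e[3])
                if sum(e[2] for e in tree) <= budget:
                    best = tree      # feasible: weight c2 more heavily
                    lo = lam
                else:
                    hi = lam         # infeasible: back off toward pure c1
            return best

        edges = [(0, 1, 1, 5), (1, 2, 1, 5), (0, 2, 3, 1), (2, 3, 2, 2)]
        tree = bicriteria_mst(edges, 4, budget=6)
        print(tree, sum(e[3] for e in tree))  # spanning tree with total c2 = 8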

    Analysis of Performance of Dynamic Multicast Routing Algorithms

    In this paper, three new dynamic multicast routing algorithms based on the greedy tree technique are proposed: Source Optimised Tree, Topology Based Tree, and Minimum Diameter Tree. A simulation analysis is presented showing various performance aspects of the algorithms, in which a comparison is made with the greedy and core-based tree techniques. The effects of the tree source location on dynamic membership change are also examined. The simulations demonstrate that the Source Optimised Tree algorithm achieves a significant improvement in terms of delay and link usage when compared to the Core Based Tree and greedy algorithms.
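    For context, the greedy tree technique that these algorithms build on can be sketched as follows (a standard rendering with assumed data structures, not the paper's simulation code): when a node joins the multicast group, it is connected to the existing tree along its shortest path to the nearest on-tree node.

        import heapq

        def greedy_join(adj, tree_nodes, new_member):
            """Dijkstra from `new_member`, stopping at the first on-tree node;
            returns the connecting path. `adj` maps node -> [(neighbor, weight)]."""
            dist, prev = {new_member: 0}, {}
            heap = [(0, new_member)]
            while heap:
                d, u = heapq.heappop(heap)
                if u in tree_nodes:
                    path = [u]          # walk predecessors back to the new member
                    while path[-1] != new_member:
                        path.append(prev[path[-1]])
                    return path[::-1]
                if d > dist.get(u, float("inf")):
                    continue
                for v, w in adj[u]:
                    if d + w < dist.get(v, float("inf")):
                        dist[v], prev[v] = d + w, u
                        heapq.heappush(heap, (d + w, v))
            return None  # the new member cannot reach the tree

        # Example: the tree currently spans {a, b}; node c joins via d.
        adj = {"a": [("b", 1), ("c", 4)], "b": [("a", 1), ("d", 1)],
               "c": [("a", 4), ("d", 1)], "d": [("b", 1), ("c", 1)]}
        print(greedy_join(adj, {"a", "b"}, "c"))  # -> ['c', 'd', 'b']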

    On the equivalence, containment, and covering problems for the regular and context-free languages

    We consider the complexity of the equivalence and containment problems for regular expressions and context-free grammars, concentrating on the relationship between complexity and various language properties. Finiteness and boundedness of languages are shown to play important roles in the complexity of these problems. An encoding into grammars of Turing machine computations exponential in the size of the grammar is used to prove several exponential lower bounds. These lower bounds include exponential time for testing equivalence of grammars generating finite sets, and exponential space for testing equivalence of non-self-embedding grammars. Several problems which might be complex because of this encoding are shown to simplify for linear grammars. Other problems considered include grammatical covering and structural equivalence for right-linear, linear, and arbitrary grammars.
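    As a concrete counterpoint to these hardness results, equivalence is easy at the bottom of the hierarchy: two DFAs are equivalent iff no reachable pair of states in their product automaton disagrees on acceptance. The sketch below is standard textbook material under an assumed DFA encoding, not a construction from the paper.

        from collections import deque

        def dfa_equivalent(dfa1, dfa2, alphabet):
            """BFS over the product automaton. Each DFA is a triple
            (start, accepting_states, delta) with delta[(state, symbol)] -> state."""
            (s1, acc1, d1), (s2, acc2, d2) = dfa1, dfa2
            seen = {(s1, s2)}
            queue = deque(seen)
            while queue:
                p, q = queue.popleft()
                if (p in acc1) != (q in acc2):
                    return False  # some string reaches an accept/reject mismatch
                for a in alphabet:
                    nxt = (d1[(p, a)], d2[(q, a)])
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return True

        # Example: two DFAs that both accept strings with an even number of 1s,
        # the second with a redundant extra state.
        a = (0, {0}, {(0, "0"): 0, (0, "1"): 1, (1, "0"): 1, (1, "1"): 0})
        b = (0, {0, 2}, {(0, "0"): 2, (0, "1"): 1, (1, "0"): 1, (1, "1"): 2,
                         (2, "0"): 0, (2, "1"): 1})
        print(dfa_equivalent(a, b, "01"))  # -> True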

    Worst case and probabilistic analysis of the 2-Opt algorithm for the TSP

    2-Opt is probably the most basic local search heuristic for the TSP. This heuristic achieves amazingly good results on “real world” Euclidean instances both with respect to running time and approximation ratio. There are numerous experimental studies on the performance of 2-Opt. However, the theoretical knowledge about this heuristic is still very limited. Not even its worst-case running time on 2-dimensional Euclidean instances was known so far. We clarify this issue by presenting, for every p ∈ N, a family of L_p instances on which 2-Opt can take an exponential number of steps. Previous probabilistic analyses were restricted to instances in which n points are placed uniformly at random in the unit square [0,1]^2, where it was shown that the expected number of steps is bounded by Õ(n^10) for Euclidean instances. We consider a more advanced model of probabilistic instances in which the points can be placed independently according to general distributions on [0,1]^d, for an arbitrary d ≥ 2. In particular, we allow different distributions for different points. We study the expected number of local improvements in terms of the number n of points and the maximal density ϕ of the probability distributions. We show an upper bound on the expected length of any 2-Opt improvement path of Õ(n^{4+1/3} · ϕ^{8/3}). When starting with an initial tour computed by an insertion heuristic, the upper bound on the expected number of steps improves even to Õ(n^{4+1/3−1/d} · ϕ^{8/3}). If the distances are measured according to the Manhattan metric, then the expected number of steps is bounded by Õ(n^{4−1/d} · ϕ). In addition, we prove an upper bound of O(ϕ^{1/d}) on the expected approximation factor with respect to all L_p metrics. Let us remark that our probabilistic analysis covers as special cases the uniform input model with ϕ = 1 and a smoothed analysis with Gaussian perturbations of standard deviation σ, with ϕ ∼ 1/σ^d.
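    Since the paper analyzes the heuristic's improvement steps, a minimal sketch of plain 2-Opt itself may help (the standard textbook version under the Euclidean metric; the starting tour and tolerance are my choices): repeatedly replace two tour edges by two cheaper ones, i.e. reverse a segment of the tour, until no improving exchange remains.

        import math

        def tour_length(tour, pts):
            return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                       for i in range(len(tour)))

        def two_opt(pts):
            """Run improving 2-exchanges from the identity tour to a local optimum."""
            n = len(pts)
            tour = list(range(n))
            improved = True
            while improved:
                improved = False
                for i in range(n - 1):
                    for j in range(i + 2, n):
                        if i == 0 and j == n - 1:
                            continue  # would remove and re-add the same two edges
                        a, b = tour[i], tour[i + 1]
                        c, d = tour[j], tour[(j + 1) % n]
                        delta = (math.dist(pts[a], pts[c]) + math.dist(pts[b], pts[d])
                                 - math.dist(pts[a], pts[b]) - math.dist(pts[c], pts[d]))
                        if delta < -1e-12:   # strictly improving exchange
                            tour[i + 1:j + 1] = tour[i + 1:j + 1][::-1]
                            improved = True
            return tour

        pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
        t = two_opt(pts)
        print(t, tour_length(t, pts))  # -> a crossing-free tour of length 4.0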

    Bayes and health care research.

    Bayes’ rule shows how one might rationally change one’s beliefs in the light of evidence. It is the foundation of a statistical method called Bayesianism. In health care research, Bayesianism has its advocates but the dominant statistical method is frequentism. There are at least two important philosophical differences between these methods. First, Bayesianism takes a subjectivist view of probability (i.e. that probability scores are statements of subjective belief, not objective fact) whilst frequentism takes an objectivist view. Second, Bayesianism is explicitly inductive (i.e. it shows how we may induce views about the world based on partial data from it) whereas frequentism is at least compatible with non-inductive views of scientific method, particularly the critical realism of Popper. Popper and others detail significant problems with induction. Frequentism’s apparent ability to avoid these, plus its ability to give a seemingly more scientific and objective take on probability, lies behind its philosophical appeal to health care researchers. However, there are also significant problems with frequentism, particularly its inability to assign probability scores to single events. Popper thus proposed an alternative objectivist view of probability, called propensity theory, which he allies to a theory of corroboration; but this too has significant problems, in particular, it may not successfully avoid induction. If this is so then Bayesianism might be philosophically the strongest of the statistical approaches. The article sets out a number of its philosophical and methodological attractions. Finally, it outlines a way in which critical realism and Bayesianism might work together.
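    Bayes’ rule itself is simple enough to state in a few lines. A small worked example (the numbers are invented for illustration): a diagnostic test with 90% sensitivity and 95% specificity, applied to a condition with 2% prevalence, yields a surprisingly modest posterior probability of disease given a positive result.

        # Bayes' rule: P(D | +) = P(+ | D) * P(D) / P(+)
        prevalence = 0.02    # P(D): prior probability of disease
        sensitivity = 0.90   # P(+ | D)
        specificity = 0.95   # P(- | no D), so P(+ | no D) = 0.05

        p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
        posterior = sensitivity * prevalence / p_positive
        print(f"P(disease | positive test) = {posterior:.3f}")  # -> 0.269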