
    Average-case analysis of perfect sorting by reversals (Journal Version)

    Perfect sorting by reversals, a problem originating in computational genomics, is the process of sorting a signed permutation to either the identity or the reversed identity permutation by a sequence of reversals that do not break any common interval. Bérard et al. (2007) use strong interval trees to describe an algorithm for sorting signed permutations by reversals, and combinatorial properties of this family of trees are essential to the analysis of the algorithm. Here, we use the expected values of certain tree parameters to prove that the average run-time of the algorithm is at worst polynomial and, moreover, that for sufficiently long permutations the algorithm runs in polynomial time with probability one. Furthermore, our analysis of the subclass of commuting scenarios yields precise results on the average length of a reversal and the average number of reversals. Comment: a preliminary version of this work appeared in the proceedings of Combinatorial Pattern Matching (CPM) 2009, see arXiv:0901.2847; journal version in Discrete Mathematics, Algorithms and Applications, vol. 3(3), 2011.
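
    To make the notion of a signed reversal concrete, here is a minimal Python sketch (illustrative only; it is not Bérard et al.'s strong-interval-tree algorithm, and the function names are mine): a reversal of a segment reverses the order of its elements and flips their signs, and a permutation is sorted once it equals the identity or the reversed identity.

```python
# Minimal illustration of signed reversals (not the paper's algorithm).
def apply_reversal(perm, i, j):
    """Return a copy of the signed permutation with segment [i, j] reversed."""
    segment = [-x for x in reversed(perm[i:j + 1])]
    return perm[:i] + segment + perm[j + 1:]

def is_sorted(perm):
    """Sorted means the identity +1..+n or the reversed identity -n..-1."""
    n = len(perm)
    return perm == list(range(1, n + 1)) or perm == list(range(-n, 0))

if __name__ == "__main__":
    p = [3, -1, 2]
    p = apply_reversal(p, 0, 1)   # [1, -3, 2]
    p = apply_reversal(p, 1, 2)   # [1, -2, 3]
    p = apply_reversal(p, 1, 1)   # [1, 2, 3]
    print(p, is_sorted(p))        # [1, 2, 3] True
```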

    Average-case analysis via incompressibility


    Heuristic average-case analysis of the backtrack resolution of random 3-Satisfiability instances

    An analysis of the average-case complexity of solving random 3-Satisfiability (3-SAT) instances with backtrack algorithms is presented. We first interpret previous rigorous works in a unifying framework based on the statistical-physics notions of dynamical trajectories, phase diagram and growth process. It is argued that, under the action of the Davis–Putnam–Logemann–Loveland (DPLL) algorithm, 3-SAT instances are turned into 2+p-SAT instances whose characteristic parameters (ratio α of clauses per variable, fraction p of 3-clauses) can be followed during the operation and define resolution trajectories. Depending on the location of these trajectories in the phase diagram of the 2+p-SAT model, easy (polynomial) or hard (exponential) resolutions are generated. Three regimes are identified, depending on the ratio α of the 3-SAT instance to be solved. Lower sat phase: for small ratios, DPLL almost surely finds a solution in a time growing linearly with the number N of variables. Upper sat phase: for intermediate ratios, instances are almost surely satisfiable, but finding a solution requires exponential time (of order 2^(Nω) with ω > 0) with high probability. Unsat phase: for large ratios, there is almost always no solution, and proofs of refutation are exponential. An analysis of the growth of the search tree in both the upper sat and unsat regimes is presented and allows us to estimate ω as a function of α. This analysis is based on an exact relationship between the average size of the search tree and the powers of the evolution operator encoding the elementary steps of the search heuristic. Comment: to appear in Theoretical Computer Science.
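
    As a concrete reference point, the sketch below shows a generic DPLL backtracking procedure in Python (an illustration under my own naming, not the specific branching heuristic analysed in the paper): assigning a literal shrinks the 3-clauses in which it appears negated into 2-clauses, which is exactly how the mixture of 2- and 3-clauses tracked by the 2+p-SAT trajectories arises during the search.

```python
# Minimal DPLL sketch: unit propagation plus branching with backtracking.
def simplify(clauses, lit):
    """Assign `lit` true: drop satisfied clauses, shrink the others."""
    out = []
    for c in clauses:
        if lit in c:
            continue                 # clause satisfied, drop it
        if -lit in c:
            c = c - {-lit}           # clause shortened (e.g. 3-clause -> 2-clause)
            if not c:
                return None          # empty clause: contradiction
        out.append(c)
    return out

def dpll(clauses):
    """Return True iff the CNF formula (list of frozensets of ints) is satisfiable."""
    while True:                      # unit propagation
        units = [next(iter(c)) for c in clauses if len(c) == 1]
        if not units:
            break
        clauses = simplify(clauses, units[0])
        if clauses is None:
            return False
    if not clauses:
        return True
    lit = next(iter(clauses[0]))     # branch on a literal, backtrack on failure
    for choice in (lit, -lit):
        reduced = simplify(clauses, choice)
        if reduced is not None and dpll(reduced):
            return True
    return False

if __name__ == "__main__":
    formula = [frozenset(c) for c in ({1, 2, 3}, {-1, 2}, {-2, 3}, {-3, -1})]
    print(dpll(formula))             # True
```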

    Average Case Analysis of the Classical Algorithm for Markov Decision Processes with Büchi Objectives

    We consider Markov decision processes (MDPs) with ω-regular specifications given as parity objectives, and the problem of computing the set of almost-sure winning vertices, from which the objective can be ensured with probability 1. The algorithms for computing the almost-sure winning set for parity objectives iteratively use the solution of the almost-sure winning set for Büchi objectives (a special case of parity objectives). We study, for the first time, the average-case complexity of the classical algorithm for computing almost-sure winning vertices for MDPs with Büchi objectives. Our contributions are as follows. First, we show that for MDPs with constant out-degree the expected number of iterations is at most logarithmic and the average-case running time is linear (as compared to the worst-case linear number of iterations and quadratic time complexity). Second, we show that for general MDPs the expected number of iterations is constant and the average-case running time is linear (again as compared to the worst-case linear number of iterations and quadratic time complexity). Finally, we show that, given all graphs are equally likely, the probability that the classical algorithm requires more than a constant number of iterations is exponentially small.
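
    For orientation, the sketch below implements one common formulation of the classical iterative algorithm for almost-sure Büchi winning in an MDP (the graph representation and names are my assumptions, not taken from the paper): each iteration computes the vertices that can still reach the Büchi set, then removes the random attractor of those that cannot; the worst-case linear number of such iterations is what gives the quadratic bound that the paper improves on average.

```python
# Sketch of the classical iterative algorithm for almost-sure Buchi winning in an MDP.
# Player vertices choose a successor; every successor of a probabilistic vertex
# has positive probability.
def almost_sure_buchi(vertices, edges, is_player, buchi):
    """Return the set of vertices from which the Buchi set can be visited
    infinitely often with probability 1."""
    W = set(vertices)
    while True:
        # Backward reachability: vertices of W with a path (inside W) to the Buchi set.
        preds = {v: set() for v in W}
        for u in W:
            for v in edges.get(u, ()):
                if v in W:
                    preds[v].add(u)
        can_reach = set(buchi & W)
        frontier = list(can_reach)
        while frontier:
            v = frontier.pop()
            for u in preds[v]:
                if u not in can_reach:
                    can_reach.add(u)
                    frontier.append(u)
        U = W - can_reach            # cannot reach the Buchi set: losing
        if not U:
            return W
        # Random attractor of U: a probabilistic vertex with an edge into the
        # attractor falls in with positive probability; a player vertex whose
        # remaining edges all lead into the attractor cannot avoid it.
        A = set(U)
        changed = True
        while changed:
            changed = False
            for u in W - A:
                succs = [v for v in edges.get(u, ()) if v in W]
                into_a = [v for v in succs if v in A]
                trapped = all(v in A for v in succs) if is_player[u] else bool(into_a)
                if trapped:
                    A.add(u)
                    changed = True
        W -= A                       # shrink and iterate on the sub-MDP

if __name__ == "__main__":
    # Player vertices 0 and 1 can cycle through the Buchi vertex 1 forever;
    # probabilistic vertex 2 may fall into the sink 3, so it is not almost-sure winning.
    edges = {0: [1], 1: [0], 2: [1, 3], 3: [3]}
    is_player = {0: True, 1: True, 2: False, 3: True}
    print(sorted(almost_sure_buchi({0, 1, 2, 3}, edges, is_player, {1})))  # [0, 1]
```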