
    The analysis of approximate quickselect and related problems

    Approximate Quickselect, a simple modification of the well-known Quickselect algorithm for selection, can be used to efficiently find an element whose rank k lies in a given range [i..j], out of n given elements. We study basic cost measures of Approximate Quickselect by computing exact and asymptotic results for the expected number of passes, comparisons and data moves during the execution of this algorithm. The key element in the analysis of Approximate Quickselect is a trivariate recurrence that we solve in full generality. The general solution of the recurrence proves very useful, as it allows us to tackle several related problems besides the analysis that originally motivated us. In particular, we have been able to carry out a precise analysis of the expected number of moves of the ith element when selecting the jth smallest element with standard Quickselect, for which we give both exact and asymptotic results. Moreover, we can apply our general results to obtain exact and asymptotic results for several parameters in binary search trees, namely the expected number of common ancestors of the nodes with ranks i and j, the expected size of the subtree rooted at the least common ancestor of the nodes with ranks i and j, and the expected distance between the nodes of ranks i and j.
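
    The selection idea can be sketched as follows; this is a minimal, hypothetical illustration rather than the authors' exact formulation: partition around a random pivot as in standard Quickselect, but stop as soon as the pivot's rank falls inside the target range [i..j]. The function name and interface below are assumptions made for illustration.

```python
import random

def approx_quickselect(a, i, j):
    """Return an element of list a whose rank (0-based) lies in [i..j].

    Minimal sketch of the idea: partition around a random pivot and stop
    early once the pivot's rank lands inside the target range; otherwise
    continue into the side that still contains [i..j].
    """
    lo, hi = 0, len(a) - 1
    while lo < hi:
        p = random.randint(lo, hi)
        a[p], a[hi] = a[hi], a[p]              # move pivot to the end
        pivot, store = a[hi], lo
        for k in range(lo, hi):                # Lomuto partition
            if a[k] < pivot:
                a[k], a[store] = a[store], a[k]
                store += 1
        a[store], a[hi] = a[hi], a[store]      # pivot now has rank 'store'
        if i <= store <= j:                    # pivot's rank is acceptable
            return a[store]
        elif store < i:                        # target range lies to the right
            lo = store + 1
        else:                                  # target range lies to the left
            hi = store - 1
    return a[lo]
```

    For example, `approx_quickselect(list(data), 10, 20)` returns some element whose rank in data lies between 10 and 20; intuitively, widening the range [i..j] lets the procedure stop earlier, which is what the analysis of passes, comparisons and moves quantifies.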

    Fast Deterministic Selection

    The Median of Medians (also known as BFPRT) algorithm, although a landmark theoretical achievement, is seldom used in practice because it and its variants are slower than simple approaches based on sampling. The main contribution of this paper is a fast linear-time deterministic selection algorithm, QuickselectAdaptive, based on a refined definition of MedianOfMedians. The algorithm's performance brings deterministic selection, along with its desirable properties of reproducible runs, predictable run times, and immunity to pathological inputs, into the range of practicality. We demonstrate results on independent and identically distributed random inputs and on normally distributed inputs. Measurements show that QuickselectAdaptive is faster than state-of-the-art baselines. Comment: Pre-publication draft.
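
    The paper's QuickselectAdaptive and its refined MedianOfMedians are not reproduced here; for context, below is a minimal sketch of the classic median-of-medians (BFPRT) selection scheme that such algorithms refine. The function name is an assumption for illustration.

```python
def select(a, k):
    """Return the k-th smallest element (0-based) of list a.

    Classic median-of-medians sketch: the pivot is the median of the
    medians of groups of five, which guarantees a constant-fraction
    split and hence deterministic worst-case linear time.
    """
    if len(a) <= 5:
        return sorted(a)[k]
    # Median of each group of five, then recurse to find their median.
    medians = [sorted(a[g:g + 5])[len(a[g:g + 5]) // 2]
               for g in range(0, len(a), 5)]
    pivot = select(medians, len(medians) // 2)
    lows = [x for x in a if x < pivot]
    highs = [x for x in a if x > pivot]
    equal = len(a) - len(lows) - len(highs)    # elements equal to the pivot
    if k < len(lows):
        return select(lows, k)
    elif k < len(lows) + equal:
        return pivot
    else:
        return select(highs, k - len(lows) - equal)
```

    The paper's contribution lies in redefining the pivot computation so that this deterministic scheme becomes competitive in practice; the sketch above only shows the textbook version.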

    Automated Tail Bound Analysis for Probabilistic Recurrence Relations

    Probabilistic recurrence relations (PRRs) are a standard formalism for describing the runtime of a randomized algorithm. Given a PRR and a time limit $\kappa$, we consider the classical concept of tail probability $\Pr[T \ge \kappa]$, i.e., the probability that the randomized runtime $T$ of the PRR exceeds the time limit $\kappa$. Our focus is the formal analysis of tail bounds, which aims at finding a tight asymptotic upper bound $u \ge \Pr[T \ge \kappa]$ in the time limit $\kappa$. The classical and most well-known approach to this problem is the cookbook method by Karp (JACM 1994), while other approaches are mostly limited to deriving tail bounds of specific PRRs via involved custom analysis. In this work, we propose a novel approach for deriving exponentially decreasing tail bounds (a common type of tail bound) for PRRs whose preprocessing time and random passed sizes observe discrete or (piecewise) uniform distributions and whose recursive call is either a single procedure call or a divide-and-conquer. We first establish a theoretical approach via Markov's inequality, and then instantiate it with a template-based algorithmic approach via a refined treatment of exponentiation. Experimental evaluation shows that our algorithmic approach derives tail bounds that (i) are asymptotically tighter than Karp's method, (ii) match the best-known manually derived asymptotic tail bound for QuickSelect, and (iii) are only slightly worse (by a $\log\log n$ factor) than the manually proven optimal asymptotic tail bound for QuickSort. Moreover, our algorithmic approach handles all examples (including realistic PRRs such as QuickSort, QuickSelect, DiameterComputation, etc.) in less than 0.1 seconds, showing that it is efficient in practice. Comment: 46 pages, 15 figures.
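
    As a concrete, hedged illustration (not taken from the paper) of the objects involved, the LaTeX fragment below writes down a simplified PRR for QuickSelect and the tail-bound question asked of it; the symbols $Z_n$ and $u(\kappa)$ are notational assumptions.

```latex
% Illustrative only: a simplified PRR for QuickSelect on n elements.
% T(n) is the randomized runtime; Z_n is the (random) size of the
% sub-array that the single recursive call descends into.
\[
  T(n) \;=\; (n - 1) \;+\; T(Z_n), \qquad T(0) = T(1) = 0,
  \qquad Z_n \in \{0, 1, \dots, n-1\}.
\]
% The tail-bound problem: given a time limit \kappa, find a tight
% (ideally exponentially decreasing) upper bound u(\kappa) with
\[
  \Pr\bigl[\, T(n) \ge \kappa \,\bigr] \;\le\; u(\kappa).
\]
```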

    On the contraction method with degenerate limit equation

    A class of random recursive sequences $(Y_n)$ with slowly varying variances, as arising for parameters of random trees or recursive algorithms, leads after normalization to degenerate limit equations of the form $X \stackrel{L}{=} X$. For nondegenerate limit equations the contraction method is a main tool to establish convergence of the scaled sequence to the "unique" solution of the limit equation. In this paper we develop an extension of the contraction method which allows us to derive limit theorems for parameters of algorithms and data structures with degenerate limit equation. In particular, we establish some new tools and a general convergence scheme, which transfers information on mean and variance into a central limit law (with normal limit). We also obtain a convergence rate result. For the proof we use self-decomposability properties of the limit normal distribution, which allow us to mimic the recursive sequence by an accompanying sequence in normal variables. Comment: Published by the Institute of Mathematical Statistics (http://www.imstat.org) in the Annals of Probability (http://www.imstat.org/aop/) at http://dx.doi.org/10.1214/00911790400000017
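
    A hedged illustration (not from the paper) of how such a degenerate equation arises: under the usual contraction-method normalization, the limit fixed-point equation loses its contraction property when the variance is slowly varying.

```latex
% Illustrative only.  For a distributional recurrence Y_n =_d Y_{I_n} + b_n,
% set X_n = (Y_n - E[Y_n]) / \sqrt{\operatorname{Var}(Y_n)}.  The usual
% contraction-method limit equation has the shape
\[
  X \;\stackrel{L}{=}\; A\,X + B .
\]
% With slowly varying variances the coefficient A tends to 1 and B to 0,
% so the limit equation collapses to the uninformative fixed point
\[
  X \;\stackrel{L}{=}\; X ,
\]
% and the standard contraction argument no longer identifies the limit.
```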

    Efficient Source Finding for Radio Interferometric Images

    Object detection in astronomical images, generically referred to as source finding, is often performed before the object characterisation stage in astrophysical processing workflows. In radio astronomy, source finding has historically been performed by bespoke off-line systems; however, modern data acquisition systems, as well as those proposed for upcoming observatories such as the Square Kilometre Array (SKA), will make this approach unfeasible. One area where a change of approach is particularly necessary is in the design of fast imaging systems for transient studies. This paper presents a number of advances in accelerating and automating source finding in such systems. Comment: submitted to Astronomy & Computing