
    The Folklore of Sorting Algorithms

    The objective of this paper is to review the folklore knowledge found in research devoted to the synthesis, optimization, and effectiveness of various sorting algorithms. We examine sorting algorithms along folklore lines and try to discover the tradeoffs between folklore and theorems. Finally, the folklore knowledge on complexity values of the sorting algorithms is considered, verified, and subsequently converged into theorems.

    Average-Case Complexity of Shellsort

    We prove a general lower bound on the average-case complexity of Shellsort: the average number of data movements (and comparisons) made by a $p$-pass Shellsort for any increment sequence is $\Omega(p n^{1 + 1/p})$ for all $p \leq \log n$. Using similar arguments, we analyze the average-case complexity of several other sorting algorithms.
    Comment: 11 pages. Submitted to ICALP'99
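
    As a concrete reference point for the bound above, here is a minimal Python sketch (ours, not the paper's) of a $p$-pass Shellsort that counts the data movements the lower bound refers to; the halving increment sequence is one illustrative choice, since the $\Omega(p n^{1 + 1/p})$ bound holds for any increments.

    import random

    # Minimal sketch (ours, not the paper's) of a p-pass Shellsort that counts
    # the data movements the lower bound above refers to. The halving gaps give
    # roughly log2(n) passes; the bound holds for ANY increment sequence.
    def shellsort(a, increments=None):
        """Sort `a` in place; return the number of inner-loop data movements."""
        n = len(a)
        if increments is None:
            increments, h = [], n // 2
            while h >= 1:               # Shell's original gaps: n/2, n/4, ..., 1
                increments.append(h)
                h //= 2
        moves = 0
        for h in increments:            # one pass per increment
            for i in range(h, n):       # insertion sort on each h-spaced chain
                x, j = a[i], i
                while j >= h and a[j - h] > x:
                    a[j] = a[j - h]     # shift: one data movement
                    j -= h
                    moves += 1
                a[j] = x
        return moves

    if __name__ == "__main__":
        data = random.sample(range(10000), 1000)
        moves = shellsort(data)
        assert data == sorted(data)
        print(f"{moves} data movements over {len(data)} elements")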

    Large Alphabets and Incompressibility

    We briefly survey some concepts related to empirical entropy -- normal numbers, de Bruijn sequences and Markov processes -- and investigate how well it approximates Kolmogorov complexity. Our results suggest $\ell$th-order empirical entropy stops being a reasonable complexity metric for almost all strings of length $m$ over alphabets of size $n$ about when $n^\ell$ surpasses $m$.
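
    For concreteness, the following Python sketch (ours) computes $\ell$th-order empirical entropy under the standard definition $H_\ell(s) = \frac{1}{|s|} \sum_{w} |s_w| \, H_0(s_w)$, where $s_w$ collects the characters following each occurrence of the context $w$; it illustrates the metric itself, not this paper's proofs.

    from collections import Counter, defaultdict
    from math import log2

    # Sketch of l-th order empirical entropy, using the standard definition
    # H_l(s) = (1/|s|) * sum over contexts w of |s_w| * H_0(s_w), where s_w
    # collects the characters that follow each occurrence of the context w.
    def h0(counts, total):
        """Zeroth-order empirical entropy of a frequency table, in bits/char."""
        return sum((c / total) * log2(total / c) for c in counts.values() if c)

    def empirical_entropy(s, l=0):
        """H_l(s): bits per character under an order-l context model."""
        if l == 0:
            return h0(Counter(s), len(s))
        followers = defaultdict(Counter)
        for i in range(len(s) - l):
            followers[s[i:i + l]][s[i + l]] += 1    # context -> next character
        total_bits = 0.0
        for counts in followers.values():
            m = sum(counts.values())
            total_bits += m * h0(counts, m)
        return total_bits / len(s)

    if __name__ == "__main__":
        s = "abracadabra" * 20
        for l in range(4):
            print(f"H_{l} = {empirical_entropy(s, l):.4f} bits/char")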

    New Algorithms and Lower Bounds for Sequential-Access Data Compression

    This thesis concerns sequential-access data compression, i.e., compression by algorithms that read the input one or more times from beginning to end. In one chapter we consider adaptive prefix coding, for which we must read the input character by character, outputting each character's self-delimiting codeword before reading the next one. We show how to encode and decode each character in constant worst-case time while producing an encoding whose length is worst-case optimal. In another chapter we consider one-pass compression with memory bounded in terms of the alphabet size and context length, and prove a nearly tight tradeoff between the amount of memory we can use and the quality of the compression we can achieve. In a third chapter we consider compression in the read/write streams model, which allows us a number of passes and an amount of memory that are both polylogarithmic in the size of the input. We first show how to achieve universal compression using only one pass over one stream. We then show that one stream is not sufficient for achieving good grammar-based compression. Finally, we show that two streams are necessary and sufficient for achieving entropy-only bounds.
    Comment: draft of PhD thesis
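
    The adaptive prefix-coding model above can be made concrete with a toy Python sketch: move-to-front indices written as self-delimiting Elias gamma codewords, so each character is encoded before the next is read. This is an illustration of the model under our own simplifying choices, not the thesis's constant-time, worst-case-optimal algorithm.

    # Toy instance (ours) of the adaptive prefix-coding model described above:
    # each character is emitted as a self-delimiting codeword computed from the
    # characters seen so far, before the next character is read. Move-to-front
    # indices are written as Elias gamma codes; this is NOT the thesis's
    # constant-time, worst-case-optimal scheme, only an illustration.
    def gamma(k):
        """Elias gamma code of an integer k >= 1 (self-delimiting)."""
        b = bin(k)[2:]
        return "0" * (len(b) - 1) + b

    def gamma_decode(bits, pos):
        """Decode one gamma codeword at `pos`; return (value, next position)."""
        z = 0
        while bits[pos + z] == "0":
            z += 1
        return int(bits[pos + z:pos + 2 * z + 1], 2), pos + 2 * z + 1

    def encode(text, alphabet):
        table, out = list(alphabet), []
        for ch in text:
            i = table.index(ch)
            out.append(gamma(i + 1))            # emit before reading further
            table.insert(0, table.pop(i))       # adapt: move-to-front
        return "".join(out)

    def decode(bits, n, alphabet):
        table, pos, out = list(alphabet), 0, []
        for _ in range(n):
            v, pos = gamma_decode(bits, pos)
            ch = table.pop(v - 1)               # mirror the encoder's update
            table.insert(0, ch)
            out.append(ch)
        return "".join(out)

    if __name__ == "__main__":
        alphabet = "abcdefghijklmnopqrstuvwxyz "
        msg = "the quick brown fox jumps over the lazy dog"
        bits = encode(msg, alphabet)
        assert decode(bits, len(msg), alphabet) == msg
        print(f"{len(bits)} encoded bits vs {8 * len(msg)} raw bits")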

    Lower Bounds on Quantum Query Complexity

    Shor's and Grover's famous quantum algorithms for factoring and searching show that quantum computers can solve certain computational problems significantly faster than any classical computer. We discuss here what quantum computers cannot do, and specifically how to prove limits on their computational power. We cover the main known techniques for proving lower bounds, and exemplify and compare the methods.
    Comment: survey, 23 pages