Dual-Pivot Quicksort: Optimality, Analysis and Zeros of Associated Lattice Paths
We present an average case analysis of a variant of dual-pivot quicksort. We
show that the used algorithmic partitioning strategy is optimal, i.e., it
minimizes the expected number of key comparisons. For the analysis, we
calculate the expected number of comparisons exactly as well as asymptotically,
in particular, we provide exact expressions for the linear, logarithmic, and
constant terms.
An essential step is the analysis of zeros of lattice paths in a certain
probability model. Along the way a combinatorial identity is proven.
Comment: This article supersedes arXiv:1602.0403
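The dual-pivot partitioning discussed above can be illustrated with a minimal sketch. This is a generic Yaroslavskiy-style dual-pivot partition, not the specific comparison-optimal strategy analyzed in the paper; all names are illustrative:

```python
def dual_pivot_quicksort(a, lo=0, hi=None):
    """Generic dual-pivot quicksort sketch: two pivots p <= q split the
    current range into three segments (< p, between p and q, > q)."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    if a[lo] > a[hi]:                 # ensure p <= q
        a[lo], a[hi] = a[hi], a[lo]
    p, q = a[lo], a[hi]
    lt, gt, i = lo + 1, hi - 1, lo + 1
    while i <= gt:
        if a[i] < p:                  # element belongs to the left segment
            a[i], a[lt] = a[lt], a[i]
            lt += 1
        elif a[i] > q:                # element belongs to the right segment
            while a[gt] > q and i < gt:
                gt -= 1
            a[i], a[gt] = a[gt], a[i]
            gt -= 1
            if a[i] < p:
                a[i], a[lt] = a[lt], a[i]
                lt += 1
        i += 1
    lt -= 1
    gt += 1
    a[lo], a[lt] = a[lt], a[lo]       # move the pivots into final position
    a[hi], a[gt] = a[gt], a[hi]
    dual_pivot_quicksort(a, lo, lt - 1)
    dual_pivot_quicksort(a, lt + 1, gt - 1)
    dual_pivot_quicksort(a, gt + 1, hi)
    return a
```

The expected comparison count analyzed in the paper depends on exactly which pivot each element is compared to first; variants of the loop above differ in that choice.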
How Good Is Multi-Pivot Quicksort?
Multi-Pivot Quicksort refers to variants of classical quicksort where, in the
partitioning step, several pivots are used to split the input into segments.
For many years, multi-pivot quicksort was regarded as impractical, but in 2009
a 2-pivot approach by Yaroslavskiy, Bentley, and Bloch was chosen as the
standard sorting algorithm in Sun's Java 7. In 2014 at ALENEX, Kushagra et al.
introduced an even faster algorithm that uses three pivots. This paper studies
what possible advantages multi-pivot quicksort might offer in general. The
contributions are as follows: Natural comparison-optimal algorithms for
multi-pivot quicksort are devised and analyzed. The analysis shows that the
benefits of using multiple pivots with respect to the average comparison count
are marginal, and these strategies are inferior to simpler strategies such as
the well-known median-of-k approach. A substantial part of the partitioning
cost is caused by rearranging elements. A rigorous analysis of an algorithm for
rearranging elements in the partitioning step is carried out, observing mainly
how often array cells are accessed during partitioning. The algorithm behaves
best if 3 to 5 pivots are used. Experiments show that this translates into good
cache behavior and is closest to predicting observed running times of
multi-pivot quicksort algorithms. Finally, it is studied how choosing pivots
from a sample affects sorting cost. The study is theoretical in the sense that
although the findings motivate design recommendations for multi-pivot quicksort
algorithms that lead to running time improvements over known algorithms in an
experimental setting, these improvements are small.
Comment: Submitted to a journal, v2: Fixed statement of Gibbs' inequality, v3:
Revised version, especially improving on the experiments in Section
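The study of choosing pivots from a sample can be sketched minimally: draw a small sample, sort it, and take roughly equidistant order statistics as the pivots. This is a generic illustration of pivot sampling (function names and the specific positions are illustrative, not the paper's analyzed strategy):

```python
import random

def pivots_from_sample(a, k=3, sample_size=7, rng=random):
    """Generic pivot-sampling sketch for k-pivot quicksort: sort a small
    random sample and pick k roughly equidistant order statistics.
    Illustrative only; not the specific scheme analyzed in the paper."""
    sample = sorted(rng.sample(a, min(sample_size, len(a))))
    m = len(sample)
    # position (i + 1) * (m + 1) // (k + 1) picks the i-th of k equidistant
    # order statistics; for m = 7, k = 3 this yields sample[1], [3], [5]
    return [sample[(i + 1) * (m + 1) // (k + 1) - 1] for i in range(k)]
```

Larger samples give pivots closer to the true quantiles, shrinking the expected segment imbalance at the cost of the sampling work itself; this trade-off is what makes the running-time improvements small in practice.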
Algorithms Seminar, 2001-2002
These seminar notes constitute the proceedings of a seminar devoted to the analysis of algorithms and related topics. The subjects covered include combinatorics, symbolic computation, asymptotic analysis, number theory, as well as the analysis of algorithms, data structures, and network protocols
A functional limit theorem for the profile of search trees
We study the profile X_{n,k} of random search trees including binary search
trees and m-ary search trees. Our main result is a functional limit theorem
of the normalized profile for k = ⌊α log n⌋ in a certain range of α. A central feature of the proof is the
use of the contraction method to prove convergence in distribution of certain
random analytic functions in a complex domain. This is based on a general
theorem concerning the contraction method for random variables in an
infinite-dimensional Hilbert space. As part of the proof, we show that the
Zolotarev metric is complete for a Hilbert space.
Comment: Published in the Annals of Applied Probability
(http://www.imstat.org/aap/) at http://dx.doi.org/10.1214/07-AAP457 by the
Institute of Mathematical Statistics (http://www.imstat.org)
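The profile studied above is the sequence counting how many nodes sit at each depth of the tree. A minimal sketch of computing the profile of a binary search tree built from a random permutation (the data layout here is illustrative):

```python
import random
from collections import defaultdict

def bst_profile(keys):
    """Insert keys into an unbalanced binary search tree and return its
    profile: profile[d] = number of nodes at depth d (root at depth 0)."""
    root = None
    profile = defaultdict(int)
    for key in keys:
        if root is None:
            root = [key, None, None]       # [key, left, right]
            profile[0] += 1
            continue
        depth, node = 0, root
        while True:
            side = 1 if key < node[0] else 2   # duplicates go right
            if node[side] is None:
                node[side] = [key, None, None]
                profile[depth + 1] += 1
                break
            node = node[side]
            depth += 1
    return dict(profile)

# For a uniformly random permutation, the interesting levels are those with
# k ≈ α log n, which is the regime covered by the functional limit theorem.
random.seed(0)
profile = bst_profile(random.sample(range(10_000), 500))
```

Plotting such profiles for growing n shows the bell-shaped level counts whose fluctuations the limit theorem describes.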
28th Annual Symposium on Combinatorial Pattern Matching : CPM 2017, July 4-6, 2017, Warsaw, Poland
Peer reviewed
LIPIcs, Volume 261, ICALP 2023, Complete Volume
LIPIcs, Volume 274, ESA 2023, Complete Volume