3 research outputs found

    OPTIMAL ADAPTIVE ALGORITHMS FOR FINDING THE NEAREST AND FARTHEST POINT ON A PARAMETRIC BLACK-BOX CURVE

    No full text

    Instance Optimal Geometric Algorithms

    Full text link
    We prove the existence of an algorithm A for computing 2-d or 3-d convex hulls that is optimal for every point set in the following sense: for every sequence σ of n points and for every algorithm A′ in a certain class 𝒜, the running time of A on input σ is at most a constant factor times the maximum running time of A′ on the worst possible permutation of σ for A′. We establish a stronger property: for every sequence σ of points and every algorithm A′, the running time of A on σ is at most a constant factor times the average running time of A′ over all permutations of σ. We call algorithms satisfying these properties instance-optimal in the order-oblivious and random-order setting. Such instance-optimal algorithms simultaneously subsume output-sensitive algorithms and distribution-dependent average-case algorithms, as well as all algorithms that do not take advantage of the order of the input or that assume the input is given in a random order. The class 𝒜 under consideration consists of all algorithms in a decision tree model where the tests involve only multilinear functions with a constant number of arguments. To establish an instance-specific lower bound, we deviate from traditional Ben-Or-style proofs and adopt a new adversary argument. For 2-d convex hulls, we prove that a version of the well-known algorithm by Kirkpatrick and Seidel (1986) or Chan, Snoeyink, and Yap (1995) already attains this lower bound. For 3-d convex hulls, we propose a new algorithm. We further obtain instance-optimal results for a few other standard problems in computational geometry. Our framework also reveals a connection to distribution-sensitive data structures and yields new results as a byproduct, for example, on on-line orthogonal range searching in 2-d and on-line halfspace range reporting in 2-d and 3-d.
    Comment: 28 pages in fullpage style
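    For intuition, the 2-d algorithm of Kirkpatrick and Seidel computes the upper hull by "marriage before conquest": find the hull edge (the bridge) crossing the median vertical line, prune every point beneath it, and recurse on the two sides. The sketch below is not code from the paper; it uses a brute-force bridge search instead of the linear-time one, and it assumes points in general position with distinct x-coordinates.

    ```python
    def cross(o, a, b):
        """Twice the signed area of triangle (o, a, b); > 0 means b is left of o->a."""
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def bridge(pts, xm):
        """Brute-force search for the upper-hull edge crossing the vertical line x = xm."""
        for p in pts:
            for q in pts:
                if p[0] < xm < q[0] and all(cross(p, q, r) <= 0 for r in pts):
                    return p, q
        raise ValueError("no bridge found (degenerate input)")

    def upper_hull(pts):
        """Marriage-before-conquest upper hull: bridge over the median, prune, recurse."""
        pts = sorted(set(pts))
        if len(pts) <= 2:
            return pts
        xs = [p[0] for p in pts]
        xm = (xs[len(xs) // 2 - 1] + xs[len(xs) // 2]) / 2  # between the two median x's
        p, q = bridge(pts, xm)
        # points with x strictly between p and q lie below the bridge and are pruned
        left = [r for r in pts if r[0] <= p[0]]
        right = [r for r in pts if r[0] >= q[0]]
        return upper_hull(left) + upper_hull(right)

    print(upper_hull([(0, 0), (1, 3), (2, 2), (3, 4), (4, 0)]))
    # → [(0, 0), (1, 3), (3, 4), (4, 0)]
    ```

    The instance sensitivity comes from the pruning step: on easy point sets, where most points sit far below the hull, large fractions of the input are discarded immediately; the brute-force bridge here is only for clarity.
    
    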

    Instance-Optimality in the Noisy Value- and Comparison-Model --- Accept, Accept, Strong Accept: Which Papers get in?

    Full text link
    Motivated by crowdsourced computation, peer-grading, and recommendation systems, Braverman, Mao and Weinberg [STOC'16] studied the query and round complexity of fundamental problems such as finding the maximum (max), finding all elements above a certain value (threshold-v), or computing the top-k elements (Top-k) in a noisy environment. For example, consider the task of selecting papers for a conference. This task is challenging due to the crowdsourced nature of peer reviews: the results of reviews are noisy and it is necessary to parallelize the review process as much as possible. We study the noisy value model and the noisy comparison model. In the noisy value model, a reviewer is asked to evaluate a single element: "What is the value of paper i?" (e.g., accept). In the noisy comparison model (introduced in the seminal work of Feige, Peleg, Raghavan and Upfal [SICOMP'94]), a reviewer is asked to make a pairwise comparison: "Is paper i better than paper j?" In this paper, we show optimal worst-case query complexity for the max, threshold-v, and Top-k problems. For max and Top-k, we obtain optimal worst-case upper and lower bounds on the round vs. query complexity in both models. For threshold-v, we obtain optimal query complexity and nearly-optimal round complexity (where k is the size of the output) for both models. We then go beyond the worst case and address the question of the importance of knowledge of the instance by providing, for a large range of parameters, instance-optimal algorithms with respect to the query complexity. Furthermore, we show that the value model is strictly easier than the comparison model.
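    To make the noisy comparison model concrete (this sketch is not from the paper): each pairwise query is answered incorrectly with some probability p < 1/2, and a standard remedy is to repeat a comparison O(log(n/δ)) times and take the majority vote, after which even a naive sequential scan finds the maximum with failure probability at most δ. The oracle below is a simulation, and the repetition count uses a generic Chernoff-style bound rather than the paper's tuned constants.

    ```python
    import math
    import random

    def noisy_compare(values, i, j, p, rng):
        """Simulated noisy oracle: reports whether values[i] > values[j], lying w.p. p."""
        truth = values[i] > values[j]
        return truth if rng.random() >= p else not truth

    def robust_compare(values, i, j, p, reps, rng):
        """Boost the oracle by majority vote over `reps` independent queries."""
        votes = sum(noisy_compare(values, i, j, p, rng) for _ in range(reps))
        return 2 * votes > reps

    def noisy_max(values, p=0.3, delta=1e-3, rng=None):
        """Sequential max-finding with noisy comparisons: O(n log(n/delta)) queries."""
        rng = rng or random.Random()
        n = len(values)
        # Hoeffding-style repetition count so each majority vote errs w.p. <= delta/n
        reps = math.ceil(math.log(n / delta) / (2 * (0.5 - p) ** 2))
        champ = 0
        for i in range(1, n):
            if robust_compare(values, i, champ, p, reps, rng):
                champ = i
        return champ

    values = [12, 47, 3, 88, 51, 9, 70, 25]
    print(noisy_max(values, rng=random.Random(0)))  # index of the maximum, w.h.p.
    ```

    Feige, Peleg, Raghavan and Upfal show that Θ(n log(1/δ)) queries suffice for max, better than this sketch's n log(n/δ) union bound, and the paper above additionally optimizes the number of rounds; the scan here is purely sequential and makes no attempt at parallelism.
    
    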