OPTIMAL ADAPTIVE ALGORITHMS FOR FINDING THE NEAREST AND FARTHEST POINT ON A PARAMETRIC BLACK-BOX CURVE
Instance Optimal Geometric Algorithms
We prove the existence of an algorithm $A$ for computing 2-d or 3-d convex
hulls that is optimal for every point set in the following sense: for every
sequence $\sigma$ of $n$ points and for every algorithm $A'$ in a certain class
$\mathcal{A}$, the running time of $A$ on input $\sigma$ is at most a constant
factor times the maximum running time of $A'$ on the worst possible permutation
of $\sigma$ for $A'$. We establish a stronger property: for every sequence
$\sigma$ of points and every algorithm $A'$, the running time of $A$ on
$\sigma$ is at most a constant factor times the average running time of $A'$
over all permutations of $\sigma$. We call algorithms satisfying these
properties instance-optimal in the order-oblivious and random-order setting.
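Written out formally (with $T_A(\cdot)$ denoting running time and $S_n$ the permutations of the input sequence, notation assumed here for illustration), the two guarantees are:

```latex
\begin{align*}
  \text{order-oblivious:} \quad
    & T_A(\sigma) \le c \cdot \max_{\pi \in S_n} T_{A'}(\pi(\sigma)), \\
  \text{random-order (stronger):} \quad
    & T_A(\sigma) \le c \cdot \frac{1}{n!} \sum_{\pi \in S_n} T_{A'}(\pi(\sigma)).
\end{align*}
```

Since the average over permutations is at most the maximum, the random-order guarantee implies the order-oblivious one.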
Such instance-optimal algorithms simultaneously subsume output-sensitive
algorithms and distribution-dependent average-case algorithms, and all
algorithms that do not take advantage of the order of the input or that assume
the input is given in a random order. The class $\mathcal{A}$ under
consideration consists of all algorithms in a decision tree model where the
tests involve only multilinear functions with a constant number of arguments.
To establish an instance-specific lower bound, we deviate from traditional
Ben-Or-style proofs and adopt a new adversary argument. For 2-d convex hulls,
we prove that a version of the well-known algorithm by Kirkpatrick and Seidel
(1986) or Chan, Snoeyink, and Yap (1995) already attains this lower bound. For
3-d convex hulls, we propose a new algorithm. We further obtain
instance-optimal results for a few other standard problems in computational
geometry. Our framework also reveals a connection to distribution-sensitive data
structures and yields new results as a byproduct, for example, on on-line
orthogonal range searching in 2-d and on-line halfspace range reporting in 2-d
and 3-d.
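To make the 2-d upper bound concrete, here is a hedged sketch of the marriage-before-conquest idea behind the Kirkpatrick-Seidel algorithm, restricted to the upper hull. All names are illustrative; the bridge step is brute-forced here for clarity, whereas Kirkpatrick and Seidel find it in linear time via prune-and-search, and the instance-optimal variant adds further partitioning.

```python
def upper_hull(points):
    """Upper convex hull, left to right, via marriage-before-conquest.
    Sketch only: assumes distinct x-coordinates."""
    pts = sorted(set(points))
    return _upper_hull(pts)

def _upper_hull(pts):
    if len(pts) <= 2:
        return pts
    xm = pts[(len(pts) - 1) // 2][0]  # median x-coordinate
    # Find the "bridge": the upper-hull edge crossing x = xm.
    # Brute force, O(n^2), for clarity; the real algorithm does this
    # in O(n) with prune-and-search linear programming.
    best = None
    for p in pts:
        if p[0] > xm:
            continue
        for q in pts:
            if q[0] <= xm:
                continue
            # height of the line through p and q at x = xm
            y = p[1] + (q[1] - p[1]) * (xm - p[0]) / (q[0] - p[0])
            if best is None or y > best[0]:
                best = (y, p, q)
    _, p, q = best
    # Points strictly between p and q cannot be upper-hull vertices:
    # recurse on the two sides of the bridge and concatenate.
    left = [r for r in pts if r[0] <= p[0]]
    right = [r for r in pts if r[0] >= q[0]]
    return _upper_hull(left) + _upper_hull(right)
```

The key property giving output-sensitivity (and, with more work, instance-optimality) is that every point under the bridge is discarded before recursing.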
Instance-Optimality in the Noisy Value-and Comparison-Model --- Accept, Accept, Strong Accept: Which Papers get in?
Motivated by crowdsourced computation, peer-grading, and recommendation
systems, Braverman, Mao and Weinberg [STOC'16] studied the \emph{query} and
\emph{round} complexity of fundamental problems such as finding the maximum
(\textsc{max}), finding all elements above a certain value $v$
(\textsc{threshold-}$v$), or computing the top $k$ elements (\textsc{Top}-$k$)
in a noisy environment.
For example, consider the task of selecting papers for a conference. This
task is challenging due to the crowdsourcing nature of peer reviews: the results
of reviews are noisy and it is necessary to parallelize the review process as
much as possible. We study the noisy value model and the noisy comparison
model: In the \emph{noisy value model}, a reviewer is asked to evaluate a
single element: "What is the value of paper $i$?" (\eg accept). In the
\emph{noisy comparison model} (introduced in the seminal work of Feige, Peleg,
Raghavan and Upfal [SICOMP'94]) a reviewer is asked to do a pairwise
comparison: "Is paper $i$ better than paper $j$?"
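The core difficulty in both models is amplifying unreliable answers. As a toy sketch of the noisy comparison model (the oracle, names, and constants here are illustrative assumptions, not the paper's algorithm), repeating each query and taking a majority vote drives the per-comparison error down exponentially, which already makes a naive scan for \textsc{max} succeed with high probability:

```python
import math
import random

def noisy_less(i, j, values, p=1/3):
    """Simulated noisy oracle: answers "values[i] < values[j]", but the
    answer is flipped with a fixed probability p < 1/2 (an assumption)."""
    truth = values[i] < values[j]
    return truth if random.random() > p else not truth

def robust_less(i, j, values, reps):
    """Boost one noisy comparison by majority vote over `reps`
    independent queries; the error decays exponentially in reps."""
    votes = sum(noisy_less(i, j, values) for _ in range(reps))
    return 2 * votes > reps

def noisy_max(values, delta=0.01):
    """Return the index of the maximum with probability >= 1 - delta,
    via a sequential scan with boosted comparisons.  This is a toy
    O(n log(n/delta))-query illustration of the model, not the
    (instance-)optimal algorithm analyzed in the paper."""
    n = len(values)
    reps = max(1, math.ceil(24 * math.log(max(2, n) / delta)))
    best = 0
    for i in range(1, n):
        if robust_less(best, i, values, reps):
            best = i
    return best
```

Note that this naive boosting spends a logarithmic factor per comparison and is fully sequential; the query/round trade-offs studied in the paper are precisely about doing better than this.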
In this paper, we show optimal worst-case query complexity for the
\textsc{max}, \textsc{threshold-}$v$, and \textsc{Top}-$k$ problems. For
\textsc{max} and \textsc{Top}-$k$, we obtain optimal worst-case upper and lower
bounds on the round vs.\ query complexity in both models. For
\textsc{threshold-}$v$, we obtain optimal query complexity and nearly-optimal
round complexity (where $k$ is the size of the output) for both models.
We then go beyond the worst-case and address the question of the importance
of knowledge of the instance by providing, for a large range of parameters,
instance-optimal algorithms with respect to the query complexity. Furthermore,
we show that the value model is strictly easier than the comparison model.