We provide simple but surprisingly useful direct product theorems for proving
lower bounds on online algorithms with a limited amount of advice about the
future. As a consequence, we are able to translate decades of research on
randomized online algorithms to the advice complexity model. Doing so improves
significantly on the previous best advice complexity lower bounds for many
online problems, or provides the first known lower bounds. For example, if n
is the number of requests, we show that:
(1) A paging algorithm needs Ω(n) bits of advice to achieve a
competitive ratio better than H_k = Ω(log k), where k is the cache
size and H_k is the k-th harmonic number. Previously, it was only known that Ω(n) bits of advice were
necessary to achieve a constant competitive ratio smaller than 5/4.
(2) Every O(n^{1-ε})-competitive vertex coloring algorithm must
use Ω(n log n) bits of advice. Previously, it was only known that
Ω(n log n) bits of advice were necessary to be optimal.
For certain online problems, including metrical task systems (MTS), k-server, paging, list
update, and dynamic binary search trees, our results imply that
randomization and sublinear advice are equally powerful (if the underlying
metric space or node set is finite). This means that several long-standing open
questions regarding randomized online algorithms can be equivalently stated as
questions regarding online algorithms with sublinear advice. For example, we
show that there exists a deterministic O(logk)-competitive k-server
algorithm with advice complexity o(n) if and only if there exists a
randomized O(logk)-competitive k-server algorithm without advice.
Technically, our main direct product theorem is obtained by extending an
information-theoretic lower bound technique due to Emek, Fraigniaud, Korman,
and Rosén [ICALP'09].