Time-Space Tradeoffs for the Memory Game
A single-player game of Memory is played with n distinct pairs of cards,
with the cards in each pair bearing identical pictures. The cards are laid
face-down. A move consists of revealing two cards, chosen adaptively. If these
cards match, i.e., they bear the same picture, they are removed from play;
otherwise, they are turned back to face down. The object of the game is to
clear all cards while minimizing the number of moves. Past works have
thoroughly studied the expected number of moves required, assuming optimal play
by a player that has perfect memory. In this work, we study the Memory game
in a space-bounded setting.
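As a concrete baseline for the perfect-memory setting the prior work assumes, the sketch below (our own illustration, not a strategy from the paper) simulates the game with a simple remember-everything player and counts its moves.

```python
import random

def play_memory(n, seed=0):
    """Simulate single-player Memory with n pairs and a perfect-memory
    player, using a simple remember-everything strategy (an illustration,
    not the optimal strategy analyzed in prior work).

    Returns the number of moves; each move reveals two cards.
    """
    rng = random.Random(seed)
    deck = list(range(n)) * 2            # picture printed on each card
    rng.shuffle(deck)
    fresh = iter(rng.sample(range(2 * n), 2 * n))  # never-revealed positions
    known = {}                           # picture -> one remembered position
    ready = []                           # pictures with both positions known
    moves = cleared = 0
    while cleared < n:
        moves += 1
        if ready:                        # clear a fully remembered pair
            ready.pop()
            cleared += 1
            continue
        a = next(fresh)                  # reveal a fresh card
        if deck[a] in known:             # its mate is remembered: match now
            del known[deck[a]]
            cleared += 1
            continue
        b = next(fresh)                  # reveal a second fresh card
        if deck[b] == deck[a]:           # lucky immediate match
            cleared += 1
        elif deck[b] in known:           # completes a remembered pair
            del known[deck[b]]
            ready.append(deck[b])
            known[deck[a]] = a
        else:                            # remember both for later
            known[deck[a]] = a
            known[deck[b]] = b
    return moves
```

Since every non-clearing move reveals two of the 2n cards, this strategy always finishes within 2n moves, at the cost of remembering up to n card positions.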
We prove two time-space tradeoff lower bounds on algorithms (strategies for
the player) that clear all cards in T moves while using at most S bits of
memory. First, in a simple model where the pictures on the cards may only be
compared for equality, we prove that S · T = Ω(n² log n). This is tight:
it is easy to achieve S · T = O(n² log n) essentially everywhere on this
tradeoff curve. Second, in a more general model that allows arbitrary
computations, we prove that S · T = Ω(n²). We prove this latter tradeoff
by modeling strategies as branching programs and extending a classic counting
argument of Borodin and Cook with a novel probabilistic argument. We conjecture
that the stronger S · T = Ω(n² log n) tradeoff in fact holds even in
this general model.
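To illustrate the low-memory end of such a tradeoff curve, here is a folklore brute-force strategy (our own addition, not taken from the paper): the player stores only two position counters, i.e., O(log n) bits, and finds each card's mate by scanning, using O(n²) moves.

```python
def play_memory_low_memory(deck):
    """Clear a Memory deck while the player remembers only two position
    counters (O(log n) bits of state), at the cost of O(n^2) moves.
    A folklore brute-force strategy shown for illustration; the tight
    strategies on the tradeoff curve are those in the paper itself.

    deck[p] is the picture on the card at position p. The `removed`
    array only simulates cards physically leaving the table; the player
    does not need to store it. Returns the number of two-card moves.
    """
    total = len(deck)
    removed = [False] * total           # physical state of the table
    moves = 0
    for i in range(total):              # find the mate of card i by scanning
        if removed[i]:
            continue
        for j in range(i + 1, total):
            if removed[j]:
                continue
            moves += 1                  # one move: reveal cards i and j
            if deck[i] == deck[j]:
                removed[i] = removed[j] = True
                break
    return moves
```

For example, the four-card deck `[0, 1, 0, 1]` is cleared in three moves by this strategy, versus at most two moves for a perfect-memory player who has already seen all four cards.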
Noise-adaptive Margin-based Active Learning and Lower Bounds under Tsybakov Noise Condition
We present a simple noise-robust margin-based active learning algorithm to
find homogeneous (passing through the origin) linear separators and analyze its error
convergence when labels are corrupted by noise. We show that when the imposed
noise satisfies the Tsybakov low noise condition (Mammen, Tsybakov, and others
1999; Tsybakov 2004), the algorithm is able to adapt to the unknown level of
noise and achieves the optimal statistical rate up to poly-logarithmic factors. We also
derive lower bounds for margin-based active learning algorithms under Tsybakov
noise conditions (TNC) for the membership query synthesis scenario (Angluin
1988). Our result implies lower bounds for the stream-based selective sampling
scenario (Cohn 1990) under TNC for some fairly simple data distributions. Quite
surprisingly, we show that the sample complexity cannot be improved even if the
underlying data distribution is as simple as the uniform distribution on the
unit ball. Our proof involves the construction of a well-separated hypothesis
set on the d-dimensional unit ball along with carefully designed label
distributions for the Tsybakov noise condition. Our analysis might provide
insights for other forms of lower bounds as well.
Comment: 16 pages, 2 figures. An abridged version to appear in the Thirtieth
AAAI Conference on Artificial Intelligence (AAAI), held in Phoenix, AZ, USA in 2016.
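The margin-based querying idea above can be sketched in a few lines. The following is a simplified illustration of the algorithm family the abstract studies: the sampling schedule, the halving margin, and the averaging update are our own stand-ins, not the paper's actual procedure or constants.

```python
import math
import random

def margin_based_active_learner(oracle, d, rounds=5, m=200, seed=0):
    """Sketch of a margin-based active learner for homogeneous linear
    separators. oracle(x) returns a (possibly noisy) label in {-1, +1}.
    Labels are queried only for points near the current decision
    boundary, and the query band shrinks each round.
    """
    rng = random.Random(seed)

    def sample_sphere():                 # uniform point on the unit sphere
        v = [rng.gauss(0.0, 1.0) for _ in range(d)]
        s = math.sqrt(sum(c * c for c in v))
        return [c / s for c in v]

    w = [1.0] + [0.0] * (d - 1)          # arbitrary initial hypothesis
    margin = 1.0
    for _ in range(rounds):
        xs, ys = [], []
        while len(xs) < m:               # query labels only inside the
            x = sample_sphere()          # margin band around w
            if abs(sum(wi * xi for wi, xi in zip(w, x))) <= margin:
                xs.append(x)
                ys.append(oracle(x))
        # additive update toward the label-weighted mean of the queried
        # points (a crude stand-in for the constrained ERM step)
        upd = [sum(y * x[j] for x, y in zip(xs, ys)) / m for j in range(d)]
        w = [wi + ui for wi, ui in zip(w, upd)]
        s = math.sqrt(sum(c * c for c in w))
        w = [c / s for c in w]
        margin /= 2.0                    # shrink the query region
    return w
```

With a noiseless oracle that labels by the sign of, say, the second coordinate, the hypothesis rotates toward that separator; a Tsybakov-style noisy oracle can be plugged in by flipping labels with probability that grows near the boundary.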