We consider online prediction of a binary sequence with expert advice. For
this setting, we devise label-efficient forecasting algorithms, which use a
selective sampling scheme that enables collecting far fewer labels than
standard procedures, while still retaining optimal worst-case regret
guarantees. These algorithms are based on exponentially weighted forecasters,
suitable for settings with and without a perfect expert. For a scenario where
one expert is strictly better than the others in expectation, we show that the
label complexity of the label-efficient forecaster scales roughly as the square
root of the number of rounds. Finally, we present numerical experiments showing
that the normalized regret of the label-efficient forecaster can asymptotically
match known minimax rates for pool-based active learning, suggesting that it
can optimally adapt to benign settings.
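As a rough illustration of the selective-sampling idea, the sketch below implements an exponentially weighted forecaster that queries each label with an uncertainty-dependent probability and, when a label is observed, updates the expert weights with importance-weighted losses. The margin-based query rule, the step size eta, and the helper get_label are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def label_efficient_ewa(expert_preds, get_label, eta=0.5, rng=None):
    """Sketch of a label-efficient exponentially weighted forecaster.

    expert_preds: (T, N) array of expert predictions in [0, 1]
    get_label:    callable t -> y_t in {0, 1}, invoked only on queried rounds
    eta:          learning rate (illustrative choice)
    Returns the sequence of forecasts and the number of labels queried.
    """
    rng = np.random.default_rng() if rng is None else rng
    T, N = expert_preds.shape
    log_w = np.zeros(N)                  # log-weights of the experts
    preds = np.empty(T)
    n_queries = 0
    for t in range(T):
        w = np.exp(log_w - log_w.max())  # normalize in log-space for stability
        p = w @ expert_preds[t] / w.sum()
        preds[t] = p
        # Hypothetical margin-based rule: query more often when the
        # weighted forecast is close to 1/2 (i.e., the experts disagree).
        q = min(1.0, 2.0 * min(p, 1.0 - p) + 1.0 / np.sqrt(t + 1))
        if rng.random() < q:
            n_queries += 1
            y = get_label(t)
            # Dividing by the query probability keeps the loss
            # estimates unbiased, as in standard label-efficient updates.
            losses = np.abs(expert_preds[t] - y) / q
            log_w -= eta * losses
    return preds, n_queries
```

The importance weighting (dividing observed losses by the query probability) is the standard device that lets the forecaster retain regret guarantees while observing only a fraction of the labels; the specific query probability above is only one plausible instantiation.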