Learning Reserve Prices in Second-Price Auctions
This paper proves the tight sample complexity of Second-Price Auction with
Anonymous Reserve, up to a logarithmic factor, for all value distribution
families that have been considered in the literature. Compared to Myerson
Auction, whose sample complexity was settled only recently (Guo, Huang and
Zhang, STOC 2019), Anonymous Reserve requires far fewer samples to learn.
We follow a framework similar to the Guo-Huang-Zhang work, but replace their
information-theoretic argument with a direct proof.
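As a rough illustration of the learning problem (not of the paper's proof technique), a minimal sketch of empirical revenue maximization: pick the anonymous reserve that maximizes average revenue over sampled value profiles. The two-bidder uniform distribution and the grid search are illustrative assumptions.

```python
import random

def revenue_with_reserve(values, r):
    """Revenue of a second-price auction with anonymous reserve r on one
    value profile: the highest bidder wins and pays max(second bid, r);
    there is no sale if even the highest value is below r."""
    top, second = sorted(values, reverse=True)[:2]
    if top < r:
        return 0.0
    return max(second, r)

def learn_reserve(sample_profiles, grid):
    """Choose the reserve on `grid` with the highest empirical revenue."""
    return max(grid, key=lambda r: sum(revenue_with_reserve(v, r)
                                       for v in sample_profiles)
                                   / len(sample_profiles))

# Illustrative data: two bidders with i.i.d. Uniform[0, 1] values.
random.seed(0)
samples = [[random.random(), random.random()] for _ in range(2000)]
grid = [i / 100 for i in range(101)]
r_hat = learn_reserve(samples, grid)
```

With enough samples, `r_hat` should land near the true optimal reserve (1/2 for this uniform instance); the sample-complexity question is how many samples suffice for a given accuracy.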
Sample Complexity of Forecast Aggregation
We consider a Bayesian forecast aggregation model where experts, after
observing private signals about an unknown binary event, report their posterior
beliefs about the event to a principal, who then aggregates the reports into a
single prediction for the event. The signals of the experts and the outcome of
the event follow a joint distribution that is unknown to the principal, but the
principal has access to i.i.d. "samples" from the distribution, where each
sample is a tuple of the experts' reports (not signals) and the realization of
the event. Using these samples, the principal aims to find an
ε-approximately optimal aggregator, where optimality is measured in
terms of the expected squared distance between the aggregated prediction and
the realization of the event. We show that the sample complexity of this
problem is at least Ω̃(m^{n-2} / ε) for arbitrary
discrete distributions, where m is the size of each expert's signal space.
This sample complexity grows exponentially in the number of experts n. But,
if the experts' signals are independent conditioned on the realization of the
event, then the sample complexity is significantly reduced, to
Õ(1 / ε²), which does not depend on n. Our results can be generalized
to non-binary events. The proof of our results uses a reduction from the
distribution learning problem and reveals the fact that forecast aggregation is
almost as difficult as distribution learning.
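The aggregation model above can be sketched in a few lines: draw sample tuples of (reports, outcome) and score an aggregator by its empirical squared loss. The averaging aggregator and the symmetric conditionally-i.i.d. signal model below are illustrative assumptions, not the paper's construction.

```python
import random

def squared_loss(aggregator, samples):
    """Empirical mean squared distance between the aggregated
    prediction and the realized binary outcome."""
    return sum((aggregator(reports) - y) ** 2
               for reports, y in samples) / len(samples)

def average_aggregator(reports):
    # Simple averaging: a common (generally suboptimal) aggregator.
    return sum(reports) / len(reports)

# Hypothetical model: event y ~ Bernoulli(1/2); each of n experts sees a
# signal that equals y with probability `acc`, independently given y, and
# reports the exact posterior P(y = 1 | signal) for this symmetric model.
random.seed(1)
def draw_sample(n=3, acc=0.8):
    y = random.random() < 0.5
    reports = []
    for _ in range(n):
        s = y if random.random() < acc else (not y)
        reports.append(acc if s else 1 - acc)  # posterior given the signal
    return reports, 1.0 if y else 0.0

samples = [draw_sample() for _ in range(5000)]
loss = squared_loss(average_aggregator, samples)
```

The learning problem studied in the abstract is to approach the loss of the *optimal* aggregator, using only such samples of reports and outcomes, without knowledge of the underlying joint distribution.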