Smoothness for Simultaneous Composition of Mechanisms with Admission
We study social welfare of learning outcomes in mechanisms with admission. In
our repeated game there are bidders and mechanisms, and in each round
each mechanism is available for each bidder only with a certain probability.
Our scenario is an elementary case of simple mechanism design with incomplete
information, where availabilities are bidder types. It captures natural
applications in online markets with limited supply and can be used to model
access to unreliable channels in wireless networks.
If mechanisms satisfy a smoothness guarantee, existing results show that
learning outcomes recover a significant fraction of the optimal social welfare.
These approaches, however, have serious drawbacks in terms of plausibility and
computational complexity. Also, the guarantees apply only when availabilities
are stochastically independent among bidders.
In contrast, we propose an alternative approach where each bidder uses a
single no-regret learning algorithm and applies it in all rounds. This results
in what we call availability-oblivious coarse correlated equilibria. It
exponentially decreases the learning burden, simplifies implementation (e.g.,
as a method for channel access in wireless devices), and thereby addresses some
of the concerns about Bayes-Nash equilibria and learning outcomes in Bayesian
settings. Our main results are general composition theorems for smooth
mechanisms when valuation functions of bidders are lattice-submodular. They
rely on an interesting connection to the notion of correlation gap of
submodular functions over product lattices.

Comment: Full version of WINE 2016 paper
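The "single no-regret learning algorithm" each bidder applies in all rounds can be illustrated with the classic Hedge (multiplicative-weights) method, a standard no-regret learner; the abstract does not name a specific algorithm, and the action set, payoffs, and parameters below are illustrative assumptions, not from the paper.

```python
import math
import random

def hedge(num_actions, payoff_fn, rounds, eta=0.1):
    """Hedge (multiplicative weights): a standard no-regret learner.

    Each round, sample an action from the current weights, observe the
    full payoff vector, and reweight exponentially. The regret against
    the best fixed action grows as O(sqrt(T log K)).
    """
    weights = [1.0] * num_actions
    total_payoff = 0.0
    for t in range(rounds):
        total = sum(weights)
        probs = [w / total for w in weights]
        action = random.choices(range(num_actions), weights=probs)[0]
        payoffs = payoff_fn(t)  # payoff of every action this round (assumed observable)
        total_payoff += payoffs[action]
        # Exponential reweighting toward higher-payoff actions.
        weights = [w * math.exp(eta * u) for w, u in zip(weights, payoffs)]
    return total_payoff, weights

# Toy run (illustrative): action 1 pays more every round, so its weight
# should come to dominate.
random.seed(0)
total, w = hedge(2, lambda t: [0.2, 0.8], rounds=500)
print(w[1] > w[0])  # True
```

The point of the availability-oblivious approach is that a bidder runs one such learner over all rounds, rather than a separate learner per availability profile, which is what yields the exponential reduction in learning burden.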
Universality of Bayesian mixture predictors
The problem is that of sequential probability forecasting for finite-valued
time series. The data is generated by an unknown probability distribution over
the space of all one-way infinite sequences. It is known that this measure
belongs to a given set C, but the latter is completely arbitrary (uncountably
infinite, without any structure given). The performance is measured with
asymptotic average log loss. In this work it is shown that the minimax
asymptotic performance is always attainable, and it is attained by a convex
combination of countably many measures from the set C (a Bayesian mixture).
This was previously only known for the case when the best achievable asymptotic
error is 0. This also contrasts with previous results showing that, in the
non-realizable case, all Bayesian mixtures may be suboptimal even though some
predictor achieves the optimal performance.
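To make the object in question concrete, here is a minimal sketch of a Bayesian mixture predictor over a countable family of measures, scored by cumulative log loss. The specific family (i.i.d. Bernoulli measures) and the uniform prior weights are assumptions for the example, not from the paper, which allows C to be arbitrary.

```python
import math

class BayesMixture:
    """Bayesian mixture over a countable family of binary-sequence measures.

    For illustration, each component measure is i.i.d. Bernoulli(p). The
    predictive probability of the next bit is the posterior-weighted
    average of the components' predictions.
    """
    def __init__(self, ps, prior):
        self.ps = ps              # Bernoulli parameters of the components
        self.post = list(prior)   # unnormalized posterior weights (start at prior)

    def predict_one(self):
        """Mixture probability that the next bit is 1."""
        z = sum(self.post)
        return sum(w * p for w, p in zip(self.post, self.ps)) / z

    def update(self, bit):
        """Bayes update of the posterior after observing one bit."""
        self.post = [w * (p if bit else 1 - p)
                     for w, p in zip(self.post, self.ps)]

def avg_log_loss(mix, sequence):
    """Average log loss of the mixture's sequential predictions."""
    loss = 0.0
    for bit in sequence:
        q = mix.predict_one()
        loss -= math.log(q if bit else 1 - q)
        mix.update(bit)
    return loss / len(sequence)

# Data with empirical frequency 0.8: the posterior concentrates on the
# p = 0.8 component, so the average log loss approaches the source entropy.
seq = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1] * 20
mix = BayesMixture(ps=[0.2, 0.5, 0.8], prior=[1 / 3, 1 / 3, 1 / 3])
print(avg_log_loss(mix, seq))
```

The paper's result concerns exactly this kind of countable convex combination: it shows the minimax asymptotic log-loss performance over an arbitrary set C is always attained by some such mixture.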