Access to Population-Level Signaling as a Source of Inequality
We identify and explore differential access to population-level signaling
(also known as information design) as a source of unequal access to
opportunity. A population-level signaler has potentially noisy observations of
a binary type for each member of a population and, based on this, produces a
signal about each member. A decision-maker infers types from signals and
accepts those individuals whose type is high in expectation. We assume the
signaler of the disadvantaged population reveals her observations to the
decision-maker, whereas the signaler of the advantaged population forms signals
strategically. We study the expected utility of the populations as measured by
the fraction of accepted members, as well as the false positive rates (FPR) and
false negative rates (FNR).
We first show the intuitive results that for a fixed environment, the
advantaged population has higher expected utility, higher FPR, and lower FNR,
than the disadvantaged one (despite having identical population quality), and
that more accurate observations improve the expected utility of the advantaged
population while harming that of the disadvantaged one. We next explore the
introduction of a publicly observable signal, such as a test score, as a
potential intervention. Our main finding is that this natural intervention,
intended to reduce the inequality between the populations' utilities, may
actually exacerbate it in settings where observations and test scores are
noisy.
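As an illustration of the first set of results, the gap between a revealing and a strategic signaler can be computed in closed form in a simple model. The following sketch is ours, not the paper's: types are Bernoulli(p), observations are correct with probability q, the decision-maker accepts when the posterior clears 1/2, and the strategic signaler mixes some 0-observations into the "accept" recommendation, persuasion-style, up to the threshold. All parameter values are invented for the example.

```python
# Illustrative sketch (not the paper's model verbatim): two populations with
# identical quality p and observation accuracy q. One signaler reveals its
# observation; the other signals strategically. The decision-maker accepts
# whenever P(type = 1 | signal) >= 1/2.

def reveal_metrics(p, q):
    """Signaler reports the raw observation; accept iff observation = 1.
    (Assumes p, q are such that only the 1-observation clears the threshold.)"""
    utility = p * q + (1 - p) * (1 - q)   # P(observation = 1) = acceptance rate
    fpr = 1 - q                           # P(obs = 1 | type = 0)
    fnr = 1 - q                           # P(obs = 0 | type = 1)
    return utility, fpr, fnr

def strategic_metrics(p, q):
    """Signaler recommends 'accept' on every 1-observation and, with the
    largest probability x that keeps the posterior at 1/2, on 0-observations."""
    num = p * q - (1 - p) * (1 - q)           # slack in the posterior at x = 0
    den = (1 - p) * q - p * (1 - q)           # posterior cost of mixing in 0s
    x = 1.0 if den <= 0 else min(1.0, max(0.0, num / den))
    acc_hi = p * q + x * p * (1 - q)          # P(accept and type = 1)
    acc_lo = (1 - p) * (1 - q) + x * (1 - p) * q   # P(accept and type = 0)
    utility = acc_hi + acc_lo
    fpr = acc_lo / (1 - p)
    fnr = 1 - acc_hi / p
    return utility, fpr, fnr

p, q = 0.3, 0.8                               # hypothetical parameters
u_r, fpr_r, fnr_r = reveal_metrics(p, q)
u_s, fpr_s, fnr_s = strategic_metrics(p, q)
print(u_r, u_s)
```

On these numbers the strategic population is accepted more often (0.504 vs 0.38) with a higher FPR and a lower FNR, matching the qualitative claims above.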
Constrained Signaling in Auction Design
We consider the problem of an auctioneer who faces the task of selling a good
(drawn from a known distribution) to a set of buyers, when the auctioneer does
not have the capacity to describe to the buyers the exact identity of the good
that he is selling. Instead, he must come up with a constrained signaling
scheme: a (non-injective) mapping from goods to signals that satisfies the
constraints of his setting. For example, the auctioneer may be able to
communicate only a bounded length message for each good, or he might be legally
constrained in how he can advertise the item being sold. Each candidate
signaling scheme induces an incomplete-information game among the buyers, and
the goal of the auctioneer is to choose the signaling scheme and accompanying
auction format that optimizes welfare. In this paper, we use techniques from
submodular function maximization and no-regret learning to give algorithms for
computing constrained signaling schemes for a variety of constrained signaling
problems.
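The abstract mentions submodular function maximization as one of the tools. A standard subroutine in that toolbox, which such signaling-scheme computations can build on, is the greedy (1 - 1/e)-approximation for maximizing a monotone submodular set function under a cardinality constraint. The sketch below uses a toy coverage objective of our own invention, not the paper's welfare function.

```python
# Standard greedy for monotone submodular maximization under a cardinality
# constraint: repeatedly add the element with the largest marginal gain.
# This is a generic sketch, not the paper's algorithm.

def greedy_submodular(ground, f, k):
    """Pick up to k elements of `ground`, maximizing f greedily."""
    chosen = []
    for _ in range(k):
        best, best_gain = None, 0.0
        for e in ground:
            if e in chosen:
                continue
            gain = f(chosen + [e]) - f(chosen)
            if gain > best_gain:
                best, best_gain = e, gain
        if best is None:          # no element improves the objective
            break
        chosen.append(best)
    return chosen

# Toy objective: each candidate signal "cell" covers a set of buyer segments.
cells = {"A": {1, 2, 3}, "B": {3, 4}, "C": {5}, "D": {1, 5}}
def coverage(S):
    return len(set().union(*(cells[c] for c in S)))

picked = greedy_submodular(list(cells), coverage, 2)
print(picked)  # ['A', 'B']
```

Coverage functions like this are monotone and submodular, so the greedy choice carries the classic (1 - 1/e) guarantee; here it happens to find an optimal 2-element solution.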
Reducing Inefficiency in Carbon Auctions with Imperfect Competition
We study auctions for carbon licenses, a policy tool used to control the social cost of pollution. Each identical license grants the right to produce a unit of pollution. Each buyer (i.e., firm that pollutes during the manufacturing process) enjoys a decreasing marginal value for licenses, but society suffers an increasing marginal cost for each license distributed. The seller (i.e., the government) can choose a number of licenses to put up for auction, and wishes to maximize the societal welfare: the total economic value of the buyers minus the social cost. Motivated by emission license markets deployed in practice, we focus on uniform price auctions with a price floor and/or price ceiling. The seller has distributional information about the market, and their goal is to tune the auction parameters to maximize expected welfare. The target benchmark is the maximum expected welfare achievable by any such auction under truth-telling behavior. Unfortunately, the uniform price auction is not truthful, and strategic behavior can significantly reduce (even below zero) the welfare of a given auction configuration.
We describe a subclass of "safe-price" auctions for which the welfare at any Bayes-Nash equilibrium will approximate the welfare under truth-telling behavior. We then show that the better of a safe-price auction, or a truthful auction that allocates licenses to only a single buyer, will approximate the target benchmark. In particular, we show how to choose a number of licenses and a price floor so that the worst-case welfare, at any equilibrium, is a constant approximation to the best achievable welfare under truth-telling after excluding the welfare contribution of a single buyer.
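The basic market mechanics above can be sketched concretely. The following toy clearing rule is ours, not the paper's: licenses are identical, bids are the buyers' marginal values flattened into unit bids, the seller sets a supply and a price floor, and welfare is total allocated value minus the cumulative social cost. It evaluates a configuration under truth-telling only; the strategic behavior the paper analyzes is not modeled here, and all numbers are invented.

```python
# Hedged illustration: a uniform-price clearing rule with a price floor,
# evaluated under truth-telling unit bids. Not the paper's mechanism analysis.

def clear_auction(unit_bids, supply, price_floor):
    """Sell up to `supply` licenses at one uniform price.
    Bids below the floor are rejected; price = max(floor, first losing bid)."""
    bids = sorted(unit_bids, reverse=True)
    eligible = [b for b in bids if b >= price_floor]
    winners = eligible[:supply]
    losing = eligible[supply:supply + 1]          # first rejected eligible bid
    price = max(price_floor, losing[0] if losing else price_floor)
    return winners, price

def welfare(winners, marginal_cost):
    """Total buyer value minus the cumulative social cost of the units sold."""
    return sum(winners) - sum(marginal_cost(k) for k in range(1, len(winners) + 1))

bids = [9, 7, 6, 4, 3, 1]      # buyers' marginal values, flattened to unit bids
cost = lambda k: 2 * k         # increasing marginal social cost (hypothetical)
winners, price = clear_auction(bids, supply=3, price_floor=5)
print(winners, price, welfare(winners, cost))
```

With these numbers the floor of 5 screens out the low-value units, three licenses sell at price 5, and welfare is 22 - 12 = 10; tuning `supply` and `price_floor` against a distribution over bids is the seller's optimization problem described above.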
Greedy Algorithms for Steiner Forest
In the Steiner Forest problem, we are given terminal pairs {s_i, t_i},
and need to find the cheapest subgraph which connects each of the terminal
pairs together. In 1991, Agrawal, Klein, and Ravi, and Goemans and Williamson
gave primal-dual constant-factor approximation algorithms for this problem;
until now, the only constant-factor approximations known have been via linear
programming relaxations.
We consider the following greedy algorithm: Given terminal pairs in a metric
space, call a terminal "active" if its distance to its partner is non-zero.
Pick the two closest active terminals (say s and t), set the distance
between them to zero, and buy a path connecting them. Recompute the metric, and
repeat. Our main result is that this algorithm is a constant-factor
repeat. Our main result is that this algorithm is a constant-factor
approximation.
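The greedy loop above translates almost line for line into code. This is an illustrative sketch on an explicit metric given as a distance matrix (the paper's setting is more general); "buying a path" is charged at the current distance between the two merged terminals, and the metric is re-closed after each contraction.

```python
# Direct sketch of the greedy described above, on an explicit finite metric.

def greedy_steiner_forest(dist, pairs):
    """dist: symmetric n x n metric (closed under shortest paths);
    pairs: list of (s, t) terminal index pairs. Returns greedy's total cost."""
    n = len(dist)
    d = [row[:] for row in dist]
    total = 0.0
    while True:
        # a terminal is active while its distance to its partner is non-zero
        act = [v for s, t in pairs if d[s][t] > 0 for v in (s, t)]
        if not act:
            return total
        # pick the two closest active terminals (not necessarily partners)
        u, v = min(((a, b) for a in act for b in act if a != b),
                   key=lambda e: d[e[0]][e[1]])
        total += d[u][v]             # buy a path connecting them
        d[u][v] = d[v][u] = 0.0      # set their distance to zero
        # recompute the metric: shortest-path closure after the contraction
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    if d[i][k] + d[k][j] < d[i][j]:
                        d[i][j] = d[j][i] = d[i][k] + d[k][j]

# toy instance: four terminals on a line at positions 0, 1, 3, 6
pos = [0, 1, 3, 6]
dist = [[abs(a - b) for b in pos] for a in pos]
pairs = [(0, 1), (2, 3)]
total = greedy_steiner_forest(dist, pairs)
print(total)  # 1 + 3 = 4.0
```

On the toy instance the greedy first merges the closest pair (cost 1), then the remaining one (cost 3); the constant-factor guarantee of the main result bounds how far this total can drift from optimal in general.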
We also use this algorithm to give new, simpler constructions of cost-sharing
schemes for Steiner forest. In particular, the first "group-strict" cost-shares
for this problem imply a very simple combinatorial sampling-based algorithm
for stochastic Steiner forest.
- …