On the Efficiency of the Proportional Allocation Mechanism for Divisible Resources
We study the efficiency of the proportional allocation mechanism, which is
widely used to allocate divisible resources. Each agent submits a bid for each
divisible resource and receives a fraction of it proportional to her bid. We
quantify the inefficiency of Nash equilibria by studying the Price of Anarchy
(PoA) of the induced game under complete and incomplete information. When
agents' valuations are concave, we show that Bayesian Nash equilibria can
be arbitrarily inefficient, in contrast to the well-known 4/3 bound for pure
equilibria. Next, we upper-bound the PoA over Bayesian equilibria by 2 when
agents' valuations are subadditive, generalizing and strengthening previous
bounds for lattice-submodular valuations. Furthermore, we show that this bound
is tight and cannot be improved by any simple or scale-free mechanism. We then
turn to settings with budget constraints and show an improved upper bound
on the PoA over coarse-correlated equilibria. Finally, we prove that the PoA is
exactly 2 for pure equilibria in the polyhedral environment.
Comment: To appear in SAGT 201
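As a concrete illustration of the mechanism being analysed, here is a minimal sketch in Python, assuming a single divisible resource of unit supply and quasi-linear utilities; the function names are ours, not the paper's:

```python
def proportional_allocation(bids):
    """Each agent receives a fraction of the (unit-supply) resource
    proportional to her bid; with zero total bids nothing is allocated."""
    total = sum(bids)
    if total == 0:
        return [0.0] * len(bids)
    return [b / total for b in bids]

def utility(valuation, fraction, bid):
    """Quasi-linear utility: value of the received fraction minus the
    payment, which under this mechanism is the bid itself."""
    return valuation(fraction) - bid

# Two agents bidding 2 and 1 split the resource 2/3 : 1/3.
alloc = proportional_allocation([2.0, 1.0])
```

The PoA results in the abstract compare the welfare attained at equilibrium bid profiles of this game against the welfare-maximising fractional allocation.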
Almost Optimal Streaming Algorithms for Coverage Problems
Maximum coverage and minimum set cover problems --collectively called
coverage problems-- have been studied extensively in streaming models. However,
previous work not only achieves sub-optimal approximation factors and space
complexities, but also studies a restricted set-arrival model, which makes an
explicit or implicit assumption of oracle access to the sets, ignoring the
complexity of reading and storing a whole set at once. In this paper, we
address the above shortcomings and present algorithms with improved
approximation factors and improved space complexity, and we prove that our
results are almost tight. Moreover, unlike most previous work, our results hold
in a more general edge-arrival model. More specifically, we present (almost)
optimal approximation algorithms for the maximum coverage and minimum set cover
problems in the streaming model with an (almost) optimal space complexity of
$\tilde{O}(n)$ for $n$ sets, i.e., the space is {\em independent of the size of
the sets or the size of the ground set of elements}. These results not only
improve over the best known algorithms for the set-arrival model, but are also
the first such algorithms for the more powerful {\em edge-arrival} model. In
order to achieve the above results, we introduce a new general sketching
technique for coverage functions: this sketching scheme can be applied to
convert an $\alpha$-approximation algorithm for a coverage problem into a
$(1-\epsilon)\alpha$-approximation algorithm for the same problem in the
streaming or RAM models. We show the significance of our sketching technique by
ruling out the possibility of solving coverage problems via accessing (as a
black box) a $(1 \pm \epsilon)$-approximate oracle (e.g., a sketch function)
that estimates the coverage function on any subfamily of the sets.
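The abstract does not spell out its sketching scheme; as background, the following sketch merely illustrates what a coverage function is and the classic greedy rule that yields the $(1-1/e)$-approximation typically plugged into such frameworks (all names here are ours):

```python
def coverage(sets, chosen):
    """Coverage function: number of distinct elements covered by the
    subfamily of sets indexed by `chosen`."""
    covered = set()
    for i in chosen:
        covered |= sets[i]
    return len(covered)

def greedy_max_coverage(sets, k):
    """Classic (1 - 1/e)-approximate greedy for maximum coverage:
    repeatedly pick the set with the largest marginal coverage."""
    chosen, covered = [], set()
    for _ in range(k):
        best = max(range(len(sets)), key=lambda i: len(sets[i] - covered))
        chosen.append(best)
        covered |= sets[best]
    return chosen, len(covered)

# With k = 2, greedy picks the 4-element set first, then the 3-element one.
sets = [{1, 2, 3}, {3, 4}, {4, 5, 6, 7}]
```

The paper's oracle lower bound concerns exactly this kind of function: it shows a $(1\pm\epsilon)$-approximate black box for `coverage` is not enough to solve the optimization problems.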
Incidence Geometries and the Pass Complexity of Semi-Streaming Set Cover
Set cover, over a universe of size $n$, may be modelled as a data-streaming
problem, where the sets that comprise the instance are to be read one by
one. A semi-streaming algorithm is allowed only $O(n \cdot \mathrm{poly}(\log n))$
space to process this stream. For each $p \geq 1$, we give a very
simple deterministic algorithm that makes $p$ passes over the input stream and
returns an appropriately certified $O(n^{1/(p+1)})$-approximation to the
optimum set cover. More importantly, we proceed to show that this approximation
factor is essentially tight, by showing that a factor substantially better than
$n^{1/(p+1)}$ is unachievable for a $p$-pass semi-streaming
algorithm, even allowing randomisation. In particular, this implies that
achieving an $O(\log n)$-approximation requires $\Omega(\log n / \log\log n)$
passes, which is tight up to the $\log\log n$ factor. These results extend to a
relaxation of the set cover problem where we are allowed to leave an $\varepsilon$
fraction of the universe uncovered: the tight bounds on the best
approximation factor achievable in $p$ passes turn out to be
$\Theta(\min\{n^{1/(p+1)}, \varepsilon^{-1/p}\})$. Our lower bounds are based
on a construction of a family of high-rank incidence geometries, which may be
thought of as vast generalisations of affine planes. This construction, based
on algebraic techniques, appears flexible enough to find other applications and
is therefore interesting in its own right.
Comment: 20 pages
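A minimal sketch of the multi-pass thresholding idea common to streaming set-cover algorithms: each pass admits any set whose marginal coverage clears a threshold, and the thresholds decrease geometrically down to 1 so the final pass completes the cover. The schedule below is illustrative only, not the paper's exact one:

```python
def multipass_set_cover(stream_sets, universe, passes):
    """Multi-pass thresholding sketch for streaming set cover.
    In pass j we admit any set whose marginal coverage is at least
    tau_j = n^((passes-1-j)/passes); the last pass uses tau = 1, so any
    coverable instance ends fully covered. Illustrative schedule only."""
    n = len(universe)
    uncovered = set(universe)
    solution = []
    for j in range(passes):
        tau = max(1, int(n ** ((passes - 1 - j) / passes)))
        for idx, s in enumerate(stream_sets):  # one pass over the stream
            if len(s & uncovered) >= tau:
                solution.append(idx)
                uncovered -= s
    return solution, uncovered
```

Only the set of uncovered elements and the chosen indices are stored between passes, which is what keeps the memory footprint in the semi-streaming regime.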
On the Complexity of Computing an Equilibrium in Combinatorial Auctions
We study combinatorial auctions where each item is sold separately but
simultaneously via a second-price auction. We ask whether it is possible to
efficiently compute in this game a pure Nash equilibrium with social welfare
close to the optimal one.
We show that when the valuations of the bidders are submodular, in many
interesting settings (e.g., a constant number of bidders, budget-additive
bidders) computing an equilibrium with good welfare is essentially as easy as
computing an allocation with good welfare while ignoring incentive issues
entirely. On the other hand, for subadditive valuations, we show that computing
an equilibrium requires exponential communication. Finally, for XOS (a.k.a.
fractionally subadditive) valuations, we show that if there exists an efficient
algorithm that finds an equilibrium, it must use techniques that are very
different from our current ones.
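A minimal model of the game under study, assuming additive valuations for simplicity (a special case of the submodular class the abstract considers); with additive bidders, bidding one's per-item values is a pure Nash equilibrium, since each item is an independent second-price auction:

```python
def outcome(bids):
    """Simultaneous second-price auctions: each item goes to its highest
    bidder (lowest index wins ties), who pays the second-highest bid."""
    n, m = len(bids), len(bids[0])
    winners, prices = [], []
    for j in range(m):
        col = [bids[i][j] for i in range(n)]
        winners.append(max(range(n), key=lambda i: col[i]))
        prices.append(sorted(col)[-2] if n > 1 else 0)
    return winners, prices

def welfare(bids, valuations):
    """Social welfare under additive valuations: each winner's value for
    the items she receives, summed over items."""
    winners, _ = outcome(bids)
    return sum(valuations[w][j] for j, w in enumerate(winners))
```

For the richer submodular/XOS/subadditive classes in the abstract, a bidder's value for a bundle is no longer a sum over items, which is precisely what makes equilibrium computation nontrivial there.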
Towards Tight Bounds for the Streaming Set Cover Problem
We consider the classic Set Cover problem in the data stream model. For $n$
elements and $m$ sets, and any constant $\delta > 0$, we give a
$O(1/\delta)$-pass algorithm with a strongly sub-linear $\tilde{O}(mn^{\delta})$
space and a logarithmic approximation factor. This yields a significant
improvement over the earlier algorithm of Demaine et al. [DIMV14], which uses
an exponentially larger number of passes. We complement this result by showing
that the tradeoff between the number of passes and space exhibited by our
algorithm is tight, at least when the approximation factor is equal to $1$.
Specifically, we show that any algorithm that computes set cover exactly using
$O(1/\delta)$ passes must use $\tilde{\Omega}(mn^{\delta})$ space in the regime
of $m = O(n)$.
Furthermore, we consider the problem in the geometric setting where the
elements are points in $\mathbb{R}^2$ and the sets are either discs,
axis-parallel rectangles, or fat triangles in the plane, and show that our
algorithm (with a slight modification) uses the optimal $\tilde{O}(n)$ space to
find a logarithmic approximation in $O(1/\delta)$ passes.
Finally, we show that any randomized one-pass algorithm that distinguishes
between covers of size 2 and 3 must use a linear (i.e., $\Omega(mn)$) amount of
space. This is the first result showing that a randomized, approximate
algorithm cannot achieve a space bound that is sublinear in the input size.
This indicates that using multiple passes might be necessary in order to
achieve sub-linear space bounds for this problem while guaranteeing small
approximation factors.
Comment: A preliminary version of this paper is to appear in PODS 201
Combinatorial Auctions Do Need Modest Interaction
We study the necessity of interaction for obtaining efficient allocations in
subadditive combinatorial auctions. This problem was originally introduced by
Dobzinski, Nisan, and Oren (STOC'14) as the following simple market scenario:
$m$ items are to be allocated among $n$ bidders in a distributed setting where
bidders' valuations are private and hence communication is needed to obtain an
efficient allocation. The communication happens in rounds: in each round, each
bidder, simultaneously with the others, broadcasts a message to all parties
involved, and the central planner computes an allocation solely based on the
communicated messages. Dobzinski et al. showed that no non-interactive
($1$-round) protocol with polynomial communication (in the number of items and
bidders) can achieve an approximation ratio better than $\Omega(m^{1/4})$,
while for any $r \geq 1$, there exist $r$-round protocols that achieve an
$O(r \cdot m^{1/(r+1)})$ approximation with polynomial
communication; in particular, $O(\log m)$ rounds of interaction suffice to
obtain an (almost) efficient allocation.
A natural question at this point is to identify the "right" level of
interaction (i.e., number of rounds) necessary to obtain an efficient
allocation. In this paper, we resolve this question by providing an almost
tight round-approximation tradeoff for this problem: we show that for any
$r \geq 1$, any $r$-round protocol that uses polynomial communication can only
approximate the social welfare up to a factor of $m^{\Omega(1/r)}$. This in
particular implies that $\Omega(\log m / \log\log m)$ rounds of interaction are
necessary for obtaining any efficient allocation in these markets. Our work
builds on the recent multi-party round-elimination technique of Alon, Nisan,
Raz, and Weinstein (FOCS'15) and settles an open question posed by Dobzinski
et al. and Alon et al.
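To make the communication model concrete, here is a toy simulation of an $r$-round protocol in the spirit described: in each round every bidder simultaneously broadcasts a message (here, her favourite remaining item and its value), and the planner allocates based only on the broadcasts. This naive protocol is purely illustrative and carries no approximation guarantee:

```python
def round_protocol(valuations, rounds):
    """Toy r-round simultaneous-broadcast protocol. Each round, every
    bidder announces her highest-valued unallocated item; the planner
    awards each requested item to its highest-value requester.
    Returns owner[j] = winning bidder of item j (or None)."""
    n, m = len(valuations), len(valuations[0])
    owner = [None] * m
    for _ in range(rounds):
        requests = {}  # item -> best requester so far this round
        for i in range(n):
            remaining = [j for j in range(m) if owner[j] is None]
            if not remaining:
                return owner
            j = max(remaining, key=lambda j: valuations[i][j])
            if j not in requests or valuations[i][j] > valuations[requests[j]][j]:
                requests[j] = i
        for j, i in requests.items():  # planner acts on broadcasts only
            owner[j] = i
    return owner
```

Note how a single round can leave items unallocated when bidders' favourites collide, while a second round resolves the collision: a small instance of why interaction helps.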
Inapproximability of Truthful Mechanisms via Generalizations of the VC Dimension
Algorithmic mechanism design (AMD) studies the delicate interplay between
computational efficiency, truthfulness, and optimality. We focus on AMD's
paradigmatic problem: combinatorial auctions. We present a new generalization
of the VC dimension to multivalued collections of functions, which encompasses
the classical VC dimension, Natarajan dimension, and Steele dimension. We
present a corresponding generalization of the Sauer-Shelah Lemma and harness
this VC machinery to establish inapproximability results for deterministic
truthful mechanisms. Our results essentially unify all inapproximability
results for deterministic truthful mechanisms for combinatorial auctions to
date and establish new separation gaps between truthful and non-truthful
algorithms.
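For readers unfamiliar with the classical notion being generalised, here is a brute-force computation of the VC dimension of a finite class of 0/1-valued functions (all names ours):

```python
from itertools import combinations

def shatters(subset, functions):
    """A subset of points is shattered if the class realises all
    2^|subset| possible 0/1 labelings on it."""
    patterns = {tuple(f(x) for x in subset) for f in functions}
    return len(patterns) == 2 ** len(subset)

def vc_dimension(points, functions):
    """Largest d such that some d-subset of `points` is shattered."""
    d = 0
    for size in range(1, len(points) + 1):
        if any(shatters(s, functions) for s in combinations(points, size)):
            d = size
    return d

# Monotone threshold classifiers x -> [x >= t] on the line: VC dimension 1,
# since no pair can receive the labeling (1, 0) with the points in order.
thresholds = [lambda x, t=t: int(x >= t) for t in (0, 1.5, 2.5, 4)]
```

The Natarajan and Steele dimensions mentioned in the abstract adapt this shattering idea to functions with more than two output values, which is the setting the paper's multivalued generalization unifies.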