
    On the Efficiency of the Proportional Allocation Mechanism for Divisible Resources

    We study the efficiency of the proportional allocation mechanism, which is widely used to allocate divisible resources. Each agent submits a bid for each divisible resource and receives a fraction of it proportional to her bid. We quantify the inefficiency of Nash equilibria by studying the Price of Anarchy (PoA) of the induced game under complete and incomplete information. When agents' valuations are concave, we show that Bayesian Nash equilibria can be arbitrarily inefficient, in contrast to the well-known 4/3 bound for pure equilibria. Next, we upper-bound the PoA over Bayesian equilibria by 2 when agents' valuations are subadditive, generalizing and strengthening previous bounds for lattice-submodular valuations. Furthermore, we show that this bound is tight and cannot be improved by any simple or scale-free mechanism. We then turn to settings with budget constraints and show an improved upper bound on the PoA over coarse-correlated equilibria. Finally, we prove that the PoA is exactly 2 for pure equilibria in the polyhedral environment.
    Comment: To appear in SAGT 201
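
    The mechanism itself is simple to state in code. Below is a minimal sketch (not taken from the paper) of proportional allocation with quasi-linear utilities; the valuation function passed to utility is a hypothetical placeholder.

        # Proportional allocation: agent i bids b[i][j] on resource j and
        # receives the fraction b[i][j] / sum_k b[k][j] of that resource.

        def proportional_allocation(bids):
            """bids[i][j] = bid of agent i on resource j; returns fractions x[i][j]."""
            n_agents, n_resources = len(bids), len(bids[0])
            alloc = [[0.0] * n_resources for _ in range(n_agents)]
            for j in range(n_resources):
                total = sum(bids[i][j] for i in range(n_agents))
                for i in range(n_agents):
                    alloc[i][j] = bids[i][j] / total if total > 0 else 0.0
            return alloc

        def utility(i, bids, valuation):
            """Quasi-linear utility: value of the received fractions minus the total bid."""
            return valuation(proportional_allocation(bids)[i]) - sum(bids[i])

        # Two agents, one resource: agent 0 bids 2, agent 1 bids 1, so agent 0
        # receives 2/3 of the resource.
        print(proportional_allocation([[2.0], [1.0]]))  # [[0.666...], [0.333...]]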

    Almost Optimal Streaming Algorithms for Coverage Problems

    Maximum coverage and minimum set cover problems (collectively called coverage problems) have been studied extensively in streaming models. However, previous research not only achieves sub-optimal approximation factors and space complexities, but also studies a restricted set-arrival model that makes an explicit or implicit assumption of oracle access to the sets, ignoring the cost of reading and storing a whole set at once. In this paper, we address these shortcomings and present algorithms with improved approximation factors and space complexities, and we prove that our results are almost tight. Moreover, unlike most previous work, our results hold in a more general edge-arrival model. More specifically, we present (almost) optimal approximation algorithms for the maximum coverage and minimum set cover problems in the streaming model with an (almost) optimal space complexity of $\tilde{O}(n)$; that is, the space is independent of the size of the sets and the size of the ground set of elements. These results not only improve over the best known algorithms for the set-arrival model, but are also the first such algorithms for the more powerful edge-arrival model. In order to achieve these results, we introduce a new general sketching technique for coverage functions: this sketching scheme can be applied to convert an $\alpha$-approximation algorithm for a coverage problem into a $(1-\epsilon)\alpha$-approximation algorithm for the same problem in the streaming or RAM models. We show the significance of our sketching technique by ruling out the possibility of solving coverage problems via black-box access to a $(1 \pm \epsilon)$-approximate oracle (e.g., a sketch function) that estimates the coverage function on any subfamily of the sets.
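
    To make the distinction between arrival models concrete, here is a hedged sketch with a hypothetical stream: in the set-arrival model each set appears in full, whereas in the edge-arrival model the stream consists of individual (set, element) incidences, so a set's contents may be scattered arbitrarily across the stream.

        # Edge-arrival streaming for coverage: the stream yields single
        # incidences such as ("S1", 2); no set is ever available "at once".

        def coverage_from_edge_stream(edge_stream, chosen):
            """Exact coverage of the chosen sets, computed from (set, element) pairs.

            Storing every element seen costs Omega(stream length) space; the
            paper's sketching technique estimates this in ~O(n) space instead."""
            covered = set()
            for set_id, element in edge_stream:
                if set_id in chosen:
                    covered.add(element)
            return len(covered)

        stream = [("S1", 1), ("S2", 2), ("S1", 2), ("S2", 3), ("S1", 3)]
        print(coverage_from_edge_stream(stream, {"S1"}))  # 3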

    Incidence Geometries and the Pass Complexity of Semi-Streaming Set Cover

    Set cover, over a universe of size $n$, may be modelled as a data-streaming problem, where the $m$ sets that comprise the instance are to be read one by one. A semi-streaming algorithm is allowed only $O(n\,\mathrm{poly}\{\log n, \log m\})$ space to process this stream. For each $p \ge 1$, we give a very simple deterministic algorithm that makes $p$ passes over the input stream and returns an appropriately certified $(p+1)n^{1/(p+1)}$-approximation to the optimum set cover. More importantly, we proceed to show that this approximation factor is essentially tight, by showing that a factor better than $0.99\,n^{1/(p+1)}/(p+1)^2$ is unachievable for a $p$-pass semi-streaming algorithm, even allowing randomisation. In particular, this implies that achieving a $\Theta(\log n)$-approximation requires $\Omega(\log n/\log\log n)$ passes, which is tight up to the $\log\log n$ factor. These results extend to a relaxation of the set cover problem where we are allowed to leave an $\varepsilon$ fraction of the universe uncovered: the tight bounds on the best approximation factor achievable in $p$ passes turn out to be $\Theta_p(\min\{n^{1/(p+1)}, \varepsilon^{-1/p}\})$. Our lower bounds are based on a construction of a family of high-rank incidence geometries, which may be thought of as vast generalisations of affine planes. This construction, based on algebraic techniques, appears flexible enough to find other applications and is therefore interesting in its own right.
    Comment: 20 page
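
    The abstract does not spell the algorithm out; the following is one natural threshold-descending greedy consistent with the stated guarantee, offered purely as an assumption-laden sketch ($p$ threshold passes plus a cleanup sweep; the paper's exact pass accounting may differ). Each scan of sets stands in for one pass over the stream.

        # Assumed sketch: pass j admits any arriving set that covers at least
        # n^{(p+1-j)/(p+1)} still-uncovered elements; a final sweep covers each
        # leftover element with an arbitrary set containing it.

        def multipass_set_cover(sets, universe, p):
            n = len(universe)
            uncovered = set(universe)
            solution = []
            for j in range(1, p + 1):               # pass j over the stream
                threshold = n ** ((p + 1 - j) / (p + 1))
                for name, s in sets:
                    if len(s & uncovered) >= threshold:
                        solution.append(name)
                        uncovered -= s
            for name, s in sets:                    # cleanup sweep
                if s & uncovered:
                    solution.append(name)
                    uncovered -= s
            return solution

        sets = [("A", {1, 2, 3, 4}), ("B", {4, 5, 6}), ("C", {6, 7}), ("D", {7, 8})]
        print(multipass_set_cover(sets, set(range(1, 9)), p=2))  # ['A', 'B', 'D']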

    On the Complexity of Computing an Equilibrium in Combinatorial Auctions

    We study combinatorial auctions where each item is sold separately but simultaneously via a second-price auction. We ask whether it is possible to efficiently compute a pure Nash equilibrium of this game with social welfare close to optimal. We show that when the bidders' valuations are submodular, in many interesting settings (e.g., a constant number of bidders, budget-additive bidders) computing an equilibrium with good welfare is essentially as easy as computing an allocation with good welfare while ignoring incentive issues altogether. On the other hand, for subadditive valuations, we show that computing an equilibrium requires exponential communication. Finally, for XOS (a.k.a. fractionally subadditive) valuations, we show that if there exists an efficient algorithm that finds an equilibrium, it must use techniques very different from our current ones.
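
    The game under study is easy to state precisely. The sketch below computes the outcome of the simultaneous second-price auctions for a given bid profile; breaking ties in favour of the lowest bidder index is an assumption, not the paper's convention.

        # Simultaneous second-price auctions: each item goes to its highest
        # bidder, who pays the second-highest bid on that item.

        def second_price_outcome(bids):
            """bids[i][j] = bid of bidder i on item j; returns (allocation, payments)."""
            n, m = len(bids), len(bids[0])
            allocation, payments = [], [0.0] * n
            for j in range(m):
                column = [bids[i][j] for i in range(n)]
                winner = max(range(n), key=lambda i: (column[i], -i))  # lowest index wins ties
                allocation.append(winner)
                payments[winner] += max((column[i] for i in range(n) if i != winner),
                                        default=0.0)
            return allocation, payments

        print(second_price_outcome([[5.0, 1.0], [3.0, 4.0]]))  # ([0, 1], [3.0, 1.0])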

    Towards Tight Bounds for the Streaming Set Cover Problem

    We consider the classic Set Cover problem in the data stream model. For $n$ elements and $m$ sets ($m \geq n$) we give an $O(1/\delta)$-pass algorithm with a strongly sub-linear $\tilde{O}(mn^{\delta})$ space bound and a logarithmic approximation factor. This yields a significant improvement over the earlier algorithm of Demaine et al. [DIMV14], which uses an exponentially larger number of passes. We complement this result by showing that the tradeoff between the number of passes and the space exhibited by our algorithm is tight, at least when the approximation factor is equal to $1$. Specifically, we show that any algorithm that computes set cover exactly using $(\frac{1}{2\delta}-1)$ passes must use $\tilde{\Omega}(mn^{\delta})$ space in the regime of $m = O(n)$. Furthermore, we consider the problem in the geometric setting where the elements are points in $\mathbb{R}^2$ and the sets are either discs, axis-parallel rectangles, or fat triangles in the plane, and show that our algorithm (with a slight modification) uses the optimal $\tilde{O}(n)$ space to find a logarithmic approximation in $O(1/\delta)$ passes. Finally, we show that any randomized one-pass algorithm that distinguishes between covers of size 2 and 3 must use a linear (i.e., $\Omega(mn)$) amount of space. This is the first result showing that a randomized, approximate algorithm cannot achieve a space bound sublinear in the input size. It indicates that using multiple passes may be necessary in order to achieve sub-linear space bounds for this problem while guaranteeing a small approximation factor.
    Comment: A preliminary version of this paper is to appear in PODS 201
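
    For context, the logarithmic factor here matches the classical offline greedy algorithm, sketched below; the point of the paper's algorithm is to achieve that factor in few passes and sub-linear space rather than with random access to the input.

        # Classical greedy set cover: repeatedly pick the set covering the most
        # still-uncovered elements, giving a (ln n) + 1 approximation.

        def greedy_set_cover(sets, universe):
            uncovered = set(universe)
            solution = []
            while uncovered:
                name, best = max(sets, key=lambda ns: len(ns[1] & uncovered))
                if not best & uncovered:
                    raise ValueError("universe is not coverable by the given sets")
                solution.append(name)
                uncovered -= best
            return solution

        sets = [("A", {1, 2, 3}), ("B", {2, 4}), ("C", {3, 4, 5})]
        print(greedy_set_cover(sets, {1, 2, 3, 4, 5}))  # ['A', 'C']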

    Combinatorial Auctions Do Need Modest Interaction

    We study the necessity of interaction for obtaining efficient allocations in subadditive combinatorial auctions. This problem was originally introduced by Dobzinski, Nisan, and Oren (STOC'14) as the following simple market scenario: $m$ items are to be allocated among $n$ bidders in a distributed setting where bidders' valuations are private, and hence communication is needed to obtain an efficient allocation. The communication happens in rounds: in each round, each bidder, simultaneously with the others, broadcasts a message to all parties involved, and the central planner computes an allocation solely based on the communicated messages. Dobzinski et al. showed that no non-interactive ($1$-round) protocol with polynomial communication (in the number of items and bidders) can achieve an approximation ratio better than $\Omega(m^{1/4})$, while for any $r \geq 1$, there exist $r$-round protocols that achieve an $\widetilde{O}(r \cdot m^{1/(r+1)})$ approximation with polynomial communication; in particular, $O(\log m)$ rounds of interaction suffice to obtain an (almost) efficient allocation. A natural question at this point is to identify the "right" level of interaction (i.e., number of rounds) necessary to obtain an efficient allocation. In this paper, we resolve this question by providing an almost tight round-approximation tradeoff: we show that for any $r \geq 1$, any $r$-round protocol that uses polynomial communication can only approximate the social welfare up to a factor of $\Omega(\frac{1}{r} \cdot m^{1/(2r+1)})$. This in particular implies that $\Omega(\frac{\log m}{\log\log m})$ rounds of interaction are necessary for obtaining any efficient allocation in these markets. Our work builds on the recent multi-party round-elimination technique of Alon, Nisan, Raz, and Weinstein (FOCS'15) and settles an open question posed by Dobzinski et al. and Alon et al.
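
    The communication model itself is easy to render schematically. In the skeleton below, bidder_message and plan_allocation are hypothetical stand-ins for a concrete protocol's message and allocation rules, and the toy instantiation uses additive valuations for brevity (the paper concerns subadditive ones).

        # r-round simultaneous protocol: in every round each bidder broadcasts
        # a message based on her private valuation and the public transcript;
        # the planner allocates from the transcript alone.

        def run_protocol(valuations, r, bidder_message, plan_allocation):
            transcript = []
            for round_no in range(r):
                # simultaneity: messages in a round depend only on earlier rounds
                messages = [bidder_message(i, v, transcript, round_no)
                            for i, v in enumerate(valuations)]
                transcript.append(messages)
            return plan_allocation(transcript)   # planner never sees valuations

        # Toy 1-round protocol: each bidder names her favourite item; the
        # planner reports those picks (ignoring conflicts).
        fav = lambda i, v, t, rnd: max(range(len(v)), key=lambda j: v[j])
        print(run_protocol([[3, 1], [0, 7]], 1, fav, lambda t: t[-1]))  # [0, 1]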

    Inapproximability of Truthful Mechanisms via Generalizations of the VC Dimension

    Algorithmic mechanism design (AMD) studies the delicate interplay between computational efficiency, truthfulness, and optimality. We focus on AMD's paradigmatic problem: combinatorial auctions. We present a new generalization of the VC dimension to multivalued collections of functions, which encompasses the classical VC dimension, the Natarajan dimension, and the Steele dimension. We present a corresponding generalization of the Sauer-Shelah Lemma and harness this VC machinery to establish inapproximability results for deterministic truthful mechanisms. Our results essentially unify all inapproximability results for deterministic truthful mechanisms for combinatorial auctions to date and establish new separation gaps between truthful and non-truthful algorithms.
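
    To ground the generalization, here is a brute-force check of the classical VC dimension, the base notion the new multivalued dimension encompasses; it is a definition-level sketch, unrelated to the paper's proofs.

        # VC dimension by brute force: the size of the largest A such that the
        # family shatters A, i.e. {S & A : S in family} is the full power set of A.

        from itertools import combinations

        def traces(family, subset):
            return {frozenset(s & subset) for s in family}

        def power_set(subset):
            return {frozenset(c) for k in range(len(subset) + 1)
                    for c in combinations(subset, k)}

        def vc_dimension(family, ground_set):
            dim = 0
            for k in range(1, len(ground_set) + 1):
                if any(traces(family, set(a)) == power_set(set(a))
                       for a in combinations(ground_set, k)):
                    dim = k
            return dim

        family = [set(), {1}, {2}, {1, 2}]
        print(vc_dimension(family, {1, 2, 3}))  # 2: the family shatters {1, 2}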