
    On Revenue Monotonicity in Combinatorial Auctions

    Along with substantial progress made recently in designing near-optimal mechanisms for multi-item auctions, interesting structural questions have also been raised and studied. In particular, is it true that the seller can always extract more revenue from a market where the buyers value the items higher than in another market? In this paper we obtain such a revenue monotonicity result in a general setting. Precisely, consider the revenue-maximizing combinatorial auction for $m$ items and $n$ buyers in the Bayesian setting, specified by a valuation function $v$ and a set $F$ of $nm$ independent item-type distributions. Let $REV(v, F)$ denote the maximum revenue achievable under $F$ by any incentive compatible mechanism. Intuitively, one would expect that $REV(v, G) \geq REV(v, F)$ if distribution $G$ stochastically dominates $F$. Surprisingly, Hart and Reny (2012) showed that this is not always true, even in the simple case where $v$ is additive. A natural question arises: are these deviations contained within bounds? To what extent may the monotonicity intuition still be valid? We present an approximate monotonicity theorem for the class of fractionally subadditive (XOS) valuation functions $v$, showing that $REV(v, G) \geq c\,REV(v, F)$ if $G$ stochastically dominates $F$ under $v$, where $c > 0$ is a universal constant. Previously, approximate monotonicity was known only for the case $n = 1$: Babaioff et al. (2014) for the class of additive valuations, and Rubinstein and Weinberg (2015) for all subadditive valuation functions. Comment: 10 pages
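    To make $REV(v, F)$ concrete, here is a minimal sketch for the simplest setting: one additive buyer and two independent items with finite supports. It grid-searches deterministic item-plus-bundle price menus, which only lower-bounds the optimal revenue; the distributions F and G below are illustrative choices of ours and do not reproduce Hart and Reny's counterexample.

    ```python
    import itertools

    def menu_revenue(p1, p2, pb, dists):
        """Expected revenue of posted item prices p1, p2 and bundle price
        pb for one additive buyer who picks a utility-maximizing option."""
        total = 0.0
        for (v1, q1), (v2, q2) in itertools.product(*dists):
            options = [(0.0, 0.0),                     # buy nothing
                       (v1 - p1, p1),                  # item 1 alone
                       (v2 - p2, p2),                  # item 2 alone
                       (v1 + v2 - p1 - p2, p1 + p2),   # both items, itemized
                       (v1 + v2 - pb, pb)]             # the bundle
            _, payment = max(options)                  # best (utility, price)
            total += q1 * q2 * payment
        return total

    def best_menu_revenue(dists):
        """Grid-search candidate prices built from the value supports."""
        support = sorted({v for d in dists for v, _ in d})
        sums = sorted({a + b for a in support for b in support})
        return max(menu_revenue(p1, p2, pb, dists)
                   for p1 in support for p2 in support for pb in sums)

    # F: each item's value is 1 or 2 with equal probability; G shifts
    # probability mass upward, so G stochastically dominates F item-wise.
    F = [[(1, 0.5), (2, 0.5)], [(1, 0.5), (2, 0.5)]]
    G = [[(1, 0.25), (2, 0.75)], [(1, 0.25), (2, 0.75)]]
    print(best_menu_revenue(F), best_menu_revenue(G))
    ```

    Because the search is restricted to deterministic menus, this quantity is only a proxy for the true $REV$, whose optimal mechanism may require randomization.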

    Maximizing System Throughput Using Cooperative Sensing in Multi-Channel Cognitive Radio Networks

    In Cognitive Radio Networks (CRNs), unlicensed secondary users (SUs) are allowed to access the licensed spectrum when it is not currently being used by primary users (PUs). In this paper, we study the throughput maximization problem for a multi-channel CRN where each SU can sense only a limited number of channels. We show that this problem is strongly NP-hard, and propose an approximation algorithm with a factor of at least $1/2\mu$, where $\mu \in [1,2]$ is a system parameter reflecting the sensing capability of SUs across channels and their sensing budgets. This performance guarantee is achieved by exploiting a nice structural property of the objective function and constructing a particular matching. Our numerical results demonstrate the advantage of our algorithm over both a random and a greedy sensing assignment algorithm.
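    For reference, here is a minimal sketch of the greedy sensing-assignment baseline mentioned above, under a simplifying assumption of ours that each (SU, channel) pair contributes an independent additive gain; the paper's actual objective couples the SUs cooperatively sensing a channel, which is what its matching-based algorithm handles.

    ```python
    def greedy_sensing_assignment(gain, budget):
        """Greedy baseline: repeatedly pick the (SU, channel) pair with
        the largest remaining gain, while SU i senses at most budget[i]
        channels. gain[i][j] is an assumed standalone gain of SU i
        sensing channel j (an illustrative simplification)."""
        pairs = sorted(((g, su, ch)
                        for su, row in enumerate(gain)
                        for ch, g in enumerate(row)),
                       reverse=True)
        used = [0] * len(gain)
        assignment = []
        for g, su, ch in pairs:
            if g > 0 and used[su] < budget[su]:
                assignment.append((su, ch))
                used[su] += 1
        return assignment

    # Two SUs, three channels; SU 0 may sense two channels, SU 1 just one.
    gains = [[0.9, 0.4, 0.1],
             [0.8, 0.7, 0.2]]
    print(greedy_sensing_assignment(gains, budget=[2, 1]))
    ```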

    Single Parameter Combinatorial Auctions with Partially Public Valuations

    We consider the problem of designing truthful auctions when the bidders' valuations have a public and a private component. In particular, we consider combinatorial auctions where the valuation of an agent $i$ for a set $S$ of items can be expressed as $v_i f(S)$, where $v_i$ is a private single parameter of the agent and the function $f$ is publicly known. Our motivation for studying this problem is two-fold: (a) such valuation functions arise naturally in the case of ad-slots in broadcast media such as television and radio; for an ad shown in a set $S$ of ad-slots, $f(S)$ is, say, the number of unique viewers reached by the ad, and $v_i$ is the valuation per unique viewer. (b) From a theoretical point of view, this factorization of the valuation function simplifies the bidding language and renders the combinatorial auction more amenable to better approximation factors. We present a general technique, based on maximal-in-range mechanisms, that converts any $\alpha$-approximation non-truthful algorithm ($\alpha \leq 1$) for this problem into $\Omega(\frac{\alpha}{\log n})$- and $\Omega(\alpha)$-approximate truthful mechanisms which run in polynomial time and quasi-polynomial time, respectively.
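    A minimal sketch of the maximal-in-range idea behind the technique, under toy assumptions of ours (an explicit two-allocation range and exact optimization over it; the paper's construction chooses its range far more carefully): restricting welfare maximization to a fixed range and charging VCG payments within that range yields a truthful mechanism.

    ```python
    def mir_auction(bids, f, allocation_range):
        """Maximal-in-range sketch for valuations v_i * f(S): maximize
        welfare only over a FIXED range of allocations and charge VCG
        payments within that same range, which makes truthful bidding a
        dominant strategy. Each allocation is a tuple of disjoint slot
        bundles, one per agent."""
        def welfare(alloc, b):
            return sum(bi * f(S) for bi, S in zip(b, alloc))

        best = max(allocation_range, key=lambda a: welfare(a, bids))
        payments = []
        for i in range(len(bids)):
            others = bids[:i] + [0.0] + bids[i + 1:]
            alt = max(allocation_range, key=lambda a: welfare(a, others))
            # Externality: others' best welfare without agent i, minus
            # others' welfare under the allocation actually chosen.
            payments.append(welfare(alt, others) - welfare(best, others))
        return best, payments

    # Toy instance: each slot carries a hypothetical set of viewers and
    # f(S) counts the unique viewers across the slots in S.
    f = lambda S: len(set().union(*S)) if S else 0
    s1, s2 = {1, 2, 3}, {3, 4}                 # two ad-slots
    rng = [((s1,), (s2,)), ((s2,), (s1,))]     # the fixed range
    print(mir_auction([2.0, 1.0], f, rng))
    ```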

    Enabling Privacy-preserving Auctions in Big Data

    We study how to enable auctions in the big data context, to solve many upcoming data-based decision problems in the near future. We consider the characteristics of big data including, but not limited to, velocity, volume, variety, and veracity, and we believe any future auction mechanism design should take the following factors into consideration: 1) generality (variety); 2) efficiency and scalability (velocity and volume); 3) truthfulness and verifiability (veracity). In this paper, we propose a privacy-preserving construction for auction mechanism design in the big data setting, which prevents adversaries from learning any information beyond what is implied by the valid output of the auction. More specifically, we consider one of the most general forms of auction (to deal with the variety); we greatly improve efficiency and scalability by approximating the NP-hard problems and avoiding designs based on garbled circuits (to deal with the velocity and volume); and we prevent stakeholders from lying to each other for their own benefit (to deal with the veracity). We achieve these by introducing a novel privacy-preserving winner determination algorithm and a novel payment mechanism. Additionally, we employ a blind signature scheme as a building block to let bidders verify the authenticity of their payments as reported by the auctioneer. The comparison with peer work shows that we improve the asymptotic overhead of peer works from exponential to linear growth, and from linear to logarithmic growth, which greatly improves scalability.
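    To illustrate the blind-signature building block, here is a textbook RSA blinding demo with toy parameters of our choosing; the abstract does not specify the paper's concrete scheme, and real deployments use large keys and sign padded message hashes.

    ```python
    import random
    from math import gcd

    # Textbook RSA blind signature (toy ~40-bit modulus for illustration;
    # real systems use >= 2048-bit keys and proper padding).
    p, q, e = 1000003, 1000033, 65537
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))     # signer's private exponent

    def blind(msg):
        """Bidder blinds msg so the signer learns nothing about it."""
        while True:
            r = random.randrange(2, n)
            if gcd(r, n) == 1:
                return (msg * pow(r, e, n)) % n, r

    def sign(blinded):                    # run by the auctioneer (signer)
        return pow(blinded, d, n)

    def unblind(sig, r):                  # run by the bidder
        return (sig * pow(r, -1, n)) % n

    msg = 424242                          # e.g., an encoded payment record
    blinded, r = blind(msg)
    s = unblind(sign(blinded), r)
    assert pow(s, e, n) == msg            # anyone can check authenticity
    print("signature verifies:", pow(s, e, n) == msg)
    ```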

    Bayesian Incentive Compatibility via Fractional Assignments

    Very recently, Hartline and Lucier studied single-parameter mechanism design problems in the Bayesian setting. They proposed a black-box reduction that converts Bayesian approximation algorithms into Bayesian incentive compatible (BIC) mechanisms while preserving social welfare. It remains a major open question whether one can find a similar reduction in the more important multi-parameter setting. In this paper, we give a positive answer to this question when the prior distribution has finite and small support. We propose a black-box reduction for designing BIC multi-parameter mechanisms. The reduction converts any algorithm into an $\epsilon$-BIC mechanism with only a marginal loss in social welfare. As a result, for combinatorial auctions with subadditive agents we get an $\epsilon$-BIC mechanism that achieves a constant-factor approximation. Comment: 22 pages, 1 figure
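    For context, one standard way to formalize the $\epsilon$-BIC guarantee, with $M$ the mechanism's (allocation, payment) map and $u_i$ agent $i$'s utility; the paper may use a slightly different normalization:

    ```latex
    % eps-BIC: for every agent i and every pair of types t_i, t_i' in the
    % (finite) support, truthful reporting is an eps-best response in
    % expectation over the other agents' types:
    \[
    \mathbb{E}_{t_{-i}}\big[\, u_i\big(t_i;\, M(t_i, t_{-i})\big) \,\big]
    \;\ge\;
    \mathbb{E}_{t_{-i}}\big[\, u_i\big(t_i;\, M(t_i', t_{-i})\big) \,\big]
    \;-\; \epsilon .
    \]
    ```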

    Prophet Secretary for Combinatorial Auctions and Matroids

    The secretary and prophet inequality problems are central to the field of stopping theory. Recently, there has been a lot of work generalizing these models to multiple items because of their applications in mechanism design. The most important of these generalizations are to matroids and to combinatorial auctions (which extend bipartite matching). Kleinberg and Weinberg \cite{KW-STOC12} and Feldman et al. \cite{feldman2015combinatorial} show that for an adversarial arrival order of the random variables, the optimal prophet inequalities give a $1/2$-approximation. For many settings, however, it is conceivable that the arrival order is chosen uniformly at random, akin to the secretary problem. For such a random arrival model, we improve upon the $1/2$-approximation and obtain $(1-1/e)$-approximation prophet inequalities for both matroids and combinatorial auctions. This also improves upon the results of Yan \cite{yan2011mechanism} and Esfandiari et al. \cite{esfandiari2015prophet}, who worked in the special cases where the arrival order can be fully controlled or where there is only a single item. Our techniques are threshold based: we convert our discrete problem into a continuous setting and then give a generic template for dynamically adjusting these thresholds to lower bound the expected total welfare. Comment: Preliminary version appeared in SODA 2018. This version improves the write-up of the fixed-threshold algorithm.
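    To illustrate the threshold-based flavor of these results, here is a minimal single-item sketch using the classic fixed threshold $T = \mathbb{E}[\max]/2$ (our simplification; the paper's contribution is dynamically adjusted thresholds under random arrival order):

    ```python
    import random

    def fixed_threshold_ratio(dists, trials=100_000):
        """Single-item prophet sketch: accept the first value meeting the
        fixed threshold T = E[max]/2. The classic analysis guarantees at
        least half of E[max] even for adversarial order; dynamic thresholds
        under random order can push this toward 1 - 1/e."""
        # Estimate E[max] on one batch of samples, evaluate on a fresh one.
        est = sum(max(d() for d in dists) for _ in range(trials)) / trials
        T = est / 2
        algo = prophet = 0.0
        for _ in range(trials):
            vals = [d() for d in dists]
            prophet += max(vals)
            algo += next((v for v in vals if v >= T), 0.0)
        return algo / prophet             # empirical competitive ratio

    dists = [random.random for _ in range(5)]   # five i.i.d. U[0,1] arrivals
    print(fixed_threshold_ratio(dists))         # typically well above 0.5
    ```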