
    Fast Iterative Combinatorial Auctions via Bayesian Learning

    Iterative combinatorial auctions (CAs) are often used in multi-billion dollar domains like spectrum auctions, and speed of convergence is one of the crucial factors behind the choice of a specific design for practical applications. To achieve fast convergence, current CAs require careful tuning of the price update rule to balance convergence speed and allocative efficiency. Brero and Lahaie (2018) recently introduced a Bayesian iterative auction design for settings with single-minded bidders. The Bayesian approach allowed them to incorporate prior knowledge into the price update algorithm, reducing the number of rounds to convergence with minimal parameter tuning. In this paper, we generalize their work to settings with no restrictions on bidder valuations. We introduce a new Bayesian CA design for this general setting which uses Monte Carlo Expectation Maximization to update prices at each round of the auction. We evaluate our approach via simulations on CATS instances. Our results show that our Bayesian CA outperforms even a highly optimized benchmark in terms of clearing percentage and convergence speed.
    Comment: 9 pages, 2 figures, AAAI-1
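    The abstract describes the price-update idea only at a high level. The Python sketch below illustrates the general mechanics of a belief-driven iterative price update; it is not the paper's Monte Carlo EM procedure, and the Gaussian-style prior, the sign-based step, the decay factor, and the uniform bidder values are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def price_update(mu, sigma, demand, supply, lr=0.5):
    """One belief-driven price step: move each item's price belief toward
    clearing based on excess demand, then shrink uncertainty. A simplified
    stand-in for the Monte Carlo EM update described in the abstract."""
    excess = demand - supply
    mu = mu + lr * sigma * np.sign(excess)   # raise over-demanded, cut under-demanded
    sigma = sigma * 0.9                      # beliefs sharpen round by round
    return mu, sigma

# Toy run: 3 items in unit supply, 5 bidders with uniform values, and a
# naive demand model (a bidder demands an item iff its price is below value).
values = rng.uniform(0, 10, size=(5, 3))
mu, sigma = np.full(3, 5.0), np.full(3, 2.0)   # prior over clearing prices
for rnd in range(1, 21):
    demand = (values > mu).sum(axis=0)
    if np.all(demand <= 1):                    # at most one bidder per item
        break
    mu, sigma = price_update(mu, sigma, demand, np.ones(3))
print(f"prices after round {rnd}: {np.round(mu, 2)}")
```

    Early rounds take large steps because the prior is diffuse; as beliefs sharpen, the steps shrink. That is the intuition behind converging in fewer rounds with less manual tuning of the update rule.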

    EX ANTE NON-MARKET VALUATION FOR NOVEL PRODUCT: LITERATURE REVIEW

    This paper provides a critical review of the literature on non-market valuation methods to estimate the welfare impact of novel products; it is the first study to assess both observed data-based and perception-based methods as non-market valuation methods. Observed data-based methods include budgets, regression, mathematical programming, and simulation. Perception-based methods include the contingent valuation method, choice-based conjoint analysis, and experimental methods. Findings imply that the preferred observed data-based method to estimate the ex ante economic impact of a new technology on the welfare of the farm household is a combination of simulation and mathematical programming. The preferred perception-based method for estimating the ex ante impact of a novel product on the welfare of an economic agent is represented by experimental methods. Findings also imply that observed data-based methods, and more specifically mathematical programming, are more popular for estimating the ex ante farm-level economic impact of a new technology. On the other hand, perception-based methods are more popular for estimating the economic impact of a novel product for consumers.
    Staff working papers, Dept. of Agricultural Economics, Internet publications, Purdue University

    Third-Party Data Providers Ruin Simple Mechanisms

    Motivated by the growing prominence of third-party data providers in online marketplaces, this paper studies the impact of the presence of third-party data providers on mechanism design. When no data provider is present, it has been shown that simple mechanisms are "good enough": they can achieve a constant fraction of the revenue of optimal mechanisms. The results in this paper demonstrate that this is no longer true in the presence of a third-party data provider who can provide the bidder with a signal that is correlated with the item type. Specifically, even with a single seller, a single bidder, and a single item of uncertain type for sale, the strategies of pricing each item-type separately (the analog of item pricing for multi-item auctions) and bundling all item-types under a single price (the analog of grand bundling) can both simultaneously be a logarithmic factor worse than the optimal revenue. Further, in the presence of a data provider, item-type partitioning mechanisms (a more general class of mechanisms that divide item-types into disjoint groups and offer a price for each group) still cannot achieve within a $\log \log$ factor of the optimal revenue. Thus, our results highlight that the presence of a data provider forces the use of more complicated mechanisms in order to achieve a constant fraction of the optimal revenue.
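    As a concrete, entirely hypothetical illustration of the two pricing strategies the abstract compares, the Monte Carlo sketch below prices a single item whose type is revealed to the buyer by a perfect third-party signal. It only shows that per-type pricing and grand bundling can earn different revenues on a given instance; it does not reproduce the paper's logarithmic lower bounds.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000

# Hypothetical instance: the item's type is A or B with equal probability;
# the buyer values type A as U[0,1] and type B as U[0,4], and a third-party
# data provider reveals the realized type to the buyer before purchase.
t = rng.integers(0, 2, N)                                # 0 = A, 1 = B
v = np.where(t == 0, rng.uniform(0, 1, N), rng.uniform(0, 4, N))

def revenue_bundle(p):
    """One price for the item regardless of its type (grand-bundling analog)."""
    return p * np.mean(v >= p)

def revenue_per_type(pA, pB):
    """A separate price for each item type (item-pricing analog)."""
    p = np.where(t == 0, pA, pB)
    return np.mean(np.where(v >= p, p, 0.0))

grid = np.linspace(0.05, 4.0, 80)
best_bundle = max(revenue_bundle(p) for p in grid)
best_split = max(revenue_per_type(pA, pB)
                 for pA in np.linspace(0.05, 1.0, 20) for pB in grid)
print(f"grand bundling: {best_bundle:.3f}, per-type pricing: {best_split:.3f}")
# On this instance per-type pricing earns about 0.625 vs about 0.5 for a
# single bundle price; the paper shows both can be far from optimal.
```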

    Computer-aided verification in mechanism design

    In mechanism design, the gold standard solution concepts are dominant strategy incentive compatibility and Bayesian incentive compatibility. These solution concepts relieve the (possibly unsophisticated) bidders from the need to engage in complicated strategizing. While incentive properties are simple to state, their proofs are specific to the mechanism and can be quite complex. This raises two concerns. From a practical perspective, checking a complex proof can be a tedious process, often requiring experts knowledgeable in mechanism design. Furthermore, from a modeling perspective, if unsophisticated agents are unconvinced of incentive properties, they may strategize in unpredictable ways. To address both concerns, we explore techniques from computer-aided verification to construct formal proofs of incentive properties. Because formal proofs can be automatically checked, agents do not need to manually check the properties, or even understand the proof. To demonstrate, we present the verification of a sophisticated mechanism: the generic reduction from Bayesian incentive compatible mechanism design to algorithm design given by Hartline, Kleinberg, and Malekian. This mechanism presents new challenges for formal verification, including essential use of randomness from both the execution of the mechanism and from the prior type distributions. As an immediate consequence, our work also formalizes Bayesian incentive compatibility for the entire family of mechanisms derived via this reduction. Finally, as an intermediate step in our formalization, we provide the first formal verification of incentive compatibility for the celebrated Vickrey-Clarke-Groves mechanism.
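    Machine-checked proofs are beyond the scope of a short example, but the spirit of verifying incentive compatibility can be conveyed by exhaustively checking deviations over a finite bid grid. The Python sketch below does this for the single-item Vickrey auction; the values and bid grid are hypothetical, and brute force obviously does not replace the formal proofs the paper constructs.

```python
def vickrey(bids):
    """Second-price auction: highest bid wins and pays the best losing bid."""
    winner = max(range(len(bids)), key=lambda i: bids[i])
    payment = max(b for i, b in enumerate(bids) if i != winner)
    return winner, payment

def check_truthful(values, bid_space):
    """Exhaustively verify that no bidder gains by deviating from its true
    value on a finite bid grid -- a brute-force stand-in for the formal,
    machine-checked incentive proofs discussed in the abstract."""
    for i, v in enumerate(values):
        bids = list(values)
        w, p = vickrey(bids)
        truthful_utility = (v - p) if w == i else 0.0
        for deviation in bid_space:
            bids[i] = deviation
            w2, p2 = vickrey(bids)
            utility = (v - p2) if w2 == i else 0.0
            assert utility <= truthful_utility + 1e-9, (i, deviation)
    print("no profitable deviation found on this instance")

check_truthful([3.0, 5.0, 1.0], bid_space=[x / 2 for x in range(21)])
```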

    Quadratic Core-Selecting Payment Rules for Combinatorial Auctions

    We report on the use of a quadratic programming technique in recent and upcoming spectrum auctions in Europe. Specifically, we compute a unique point in the core that minimizes the sum of squared deviations from a reference point, for example, from the Vickrey-Clarke-Groves payments. Analyzing the Karush-Kuhn-Tucker conditions, we demonstrate that the resulting payments can be decomposed into a series of economically meaningful and equitable penalties. Furthermore, we discuss the benefits of this combinatorial auction, explore the use of alternative reserve pricing approaches in this context, and indicate the results of several hundred computational runs using CATS data.
    Keywords: auctions, spectrum auctions, market design, package auction, clock auction, combinatorial auction
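    A minimal sketch of the quadratic rule on the standard local-local-global example (all numbers hypothetical): compute the VCG reference point, then find the nearest point in squared deviation that satisfies the single core constraint binding in this instance. A real spectrum auction would generate core constraints for every potentially blocking coalition rather than the one hard-coded here.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical local-local-global instance: two local bidders want items A
# and B for 5 and 7; a global bidder wants the bundle {A, B} for 8.
b_local = np.array([5.0, 7.0])
b_global = 8.0                     # the locals win: 5 + 7 > 8

# VCG reference point: each local pays the displacement it causes the
# global bidder, given the other local's bid.
vcg = np.maximum(0.0, b_global - b_local[::-1])     # [1.0, 3.0]

# Quadratic rule: the core point nearest to VCG in squared deviation.
# The binding core constraint here: winners jointly pay at least the losing
# bundle bid, so the global bidder cannot form a blocking coalition.
res = minimize(
    lambda p: np.sum((p - vcg) ** 2),
    x0=vcg,
    method="SLSQP",
    bounds=[(0.0, b) for b in b_local],
    constraints=[{"type": "ineq", "fun": lambda p: p.sum() - b_global}],
)
print("VCG:", vcg, "-> quadratic core payments:", np.round(res.x, 3))  # [3. 5.]
```

    On this instance the shortfall of the binding constraint is split equally across the two winners, consistent with the equitable flavor of the penalty decomposition the abstract mentions.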

    Algorithms as Mechanisms: The Price of Anarchy of Relax-and-Round

    Many algorithms that are originally designed without explicitly considering incentive properties are later combined with simple pricing rules and used as mechanisms. The resulting mechanisms are often natural and simple to understand. But how good are these algorithms as mechanisms? Truthful reporting of valuations is typically not a dominant strategy (certainly not with a pay-your-bid, first-price rule, but it is likely not a good strategy even with a critical-value, second-price-style rule either). Our goal is to show that a wide class of approximation algorithms yields mechanisms with low Price of Anarchy in this way. The seminal result of Lucier and Borodin [SODA 2010] shows that combining a greedy algorithm that is an $\alpha$-approximation algorithm with a pay-your-bid payment rule yields a mechanism whose Price of Anarchy is $O(\alpha)$. In this paper we significantly extend the class of algorithms for which such a result is available by showing that this close connection between approximation ratio on the one hand and Price of Anarchy on the other also holds for the design principle of relaxation and rounding, provided that the relaxation is smooth and the rounding is oblivious. We demonstrate the far-reaching consequences of our result by showing its implications for sparse packing integer programs, such as multi-unit auctions and generalized matching, for the maximum traveling salesman problem, for combinatorial auctions, and for single-source unsplittable flow problems. In all these problems our approach leads to novel, simple, near-optimal mechanisms whose Price of Anarchy either matches or beats the performance guarantees of known mechanisms.
    Comment: Extended abstract appeared in Proc. of 16th ACM Conference on Economics and Computation (EC'15)
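    To make the "algorithm plus pay-your-bid" recipe concrete, here is a sketch of a well-known greedy approximation for single-minded combinatorial auctions (the bid-per-sqrt-bundle-size ranking of Lehmann, O'Callaghan, and Shoham) paired with a first-price rule. This is the style of mechanism covered by the Lucier and Borodin result cited in the abstract, not the relax-and-round framework itself; the demo instance is invented.

```python
import math

def greedy_pay_your_bid(bids):
    """Greedy allocation for single-minded bidders, ranked by bid per
    sqrt(bundle size), paired with a pay-your-bid (first-price) rule."""
    order = sorted(bids, key=lambda b: b[2] / math.sqrt(len(b[1])), reverse=True)
    allocated, winners = set(), []
    for bidder, bundle, bid in order:
        if allocated.isdisjoint(bundle):     # award the bundle if still available
            allocated |= bundle
            winners.append((bidder, sorted(bundle), bid))  # pays its own bid
    return winners

demo = [("b1", frozenset({"A", "B"}), 10.0),
        ("b2", frozenset({"A"}), 6.0),
        ("b3", frozenset({"B"}), 6.0)]
print(greedy_pay_your_bid(demo))   # greedy awards b1 (value 10); optimum is 12
```

    Truthful bidding is not a dominant strategy under this payment rule; the smoothness-based analysis described in the abstract instead bounds the welfare loss at equilibrium in terms of the algorithm's approximation ratio.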

    Designing Coalition-Proof Reverse Auctions over Continuous Goods

    This paper investigates reverse auctions that involve continuous values of different types of goods, general nonconvex constraints, and second stage costs. We seek to design the payment rules and conditions under which coalitions of participants cannot influence the auction outcome in order to obtain higher collective utility. Under the incentive-compatible Vickrey-Clarke-Groves mechanism, we show that coalition-proof outcomes are achieved if the submitted bids are convex and the constraint sets are of a polymatroid-type. These conditions, however, do not capture the complexity of the general class of reverse auctions under consideration. By relaxing the property of incentive-compatibility, we investigate further payment rules that are coalition-proof without any extra conditions on the submitted bids and the constraint sets. Since calculating the payments directly for these mechanisms is computationally difficult for auctions involving many participants, we present two computationally efficient methods. Our results are verified with several case studies based on electricity market data.
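    For the simple convex special case, VCG payments in a reverse auction can be computed directly by re-solving the procurement problem without each winner. The sketch below uses a merit-order dispatch of a single continuous good with hypothetical numbers; the paper's setting with nonconvex constraints and second-stage costs is substantially harder, which is why it develops specialized payment computations.

```python
import numpy as np

def dispatch_cost(prices, caps, demand, exclude=None):
    """Least-cost dispatch of a continuous good: fill demand from the
    cheapest offers first (a merit-order stand-in for the paper's more
    general nonconvex setting). Returns the allocation and total cost."""
    order = np.argsort(prices)
    alloc, rem = np.zeros(len(prices)), demand
    for i in order:
        if exclude is not None and i == exclude:
            continue
        q = min(caps[i], rem)
        alloc[i], rem = q, rem - q
        if rem <= 0:
            break
    assert rem <= 1e-9, "demand infeasible"
    return alloc, float(alloc @ prices)

# Toy instance: 3 suppliers offering at 1, 2, and 5 per unit; buy 10 units.
prices, caps, D = np.array([1.0, 2.0, 5.0]), np.array([6.0, 6.0, 10.0]), 10.0
alloc, total = dispatch_cost(prices, caps, D)
for i in range(3):
    _, cost_without_i = dispatch_cost(prices, caps, D, exclude=i)
    others_cost = total - alloc[i] * prices[i]
    vcg_pay = cost_without_i - others_cost      # i's externality = VCG payment
    print(f"supplier {i}: sells {alloc[i]:g} units, VCG payment {vcg_pay:g}")
```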