
    On Nash Dynamics of Matching Market Equilibria

    In this paper, we study the Nash dynamics that arise from the strategic interplay of n buyers in a matching market set up by a seller, the market maker. Taking the standard market equilibrium approach, upon receiving submitted bid vectors from the buyers, the market maker decides on a price vector that clears the market in such a way that each buyer is allocated an item he desires most (a market equilibrium solution). Such equilibrium outcomes are not unique, and the market maker chooses one (maxeq) that optimizes its own objective: revenue maximization. The buyers in turn adjust their bids in their own best interest in order to obtain higher utilities in the next round's market equilibrium solution. This is an (n+1)-person game in which buyers place strategic bids to gain the most from the market maker's equilibrium mechanism. The buyers' incentives in choosing their bids, together with the market maker's use of the maxeq mechanism, induce Nash dynamics in the market. We characterize Nash equilibria of the dynamics in terms of the relationship between maxeq and mineq (the minimum-revenue equilibrium), and develop convergence results for the Nash dynamics from the maxeq policy to a mineq solution, resulting in an outcome equivalent to that of the truthful VCG mechanism. Our results imply revenue equivalence between maxeq and mineq, and explain, in a deterministic and dynamic setting, why short-term revenue maximization is a poor long-run strategy.
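    To make the maxeq/mineq contrast concrete, here is a brute-force sketch for a tiny unit-demand market: it enumerates integer price vectors and assignments, keeps the market-clearing ones, and reports the maximum- and minimum-revenue equilibria. The values, the price grid, and the clearing conditions (quasi-linear utilities, positively priced items must sell) are illustrative assumptions, not the paper's model details.

```python
# Minimal sketch, assuming quasi-linear utilities and the usual
# market-clearing conditions: every buyer receives an item maximizing
# values[i][j] - prices[j], and every positively priced item is sold.
from itertools import product, permutations

values = [[9, 5],   # buyer 0's value for items 0, 1
          [8, 4]]   # buyer 1's value for items 0, 1
n_items = 2

def is_equilibrium(prices, assignment):
    # assignment[i] = item given to buyer i (a permutation here)
    sold = set(assignment)
    if any(p > 0 and j not in sold for j, p in enumerate(prices)):
        return False  # a positively priced item was left unsold
    for i, j in enumerate(assignment):
        u = values[i][j] - prices[j]
        if u < 0 or any(values[i][k] - prices[k] > u for k in range(n_items)):
            return False  # buyer i prefers another item (or opting out)
    return True

equilibria = []
for prices in product(range(10), repeat=n_items):
    for assignment in permutations(range(n_items)):
        if is_equilibrium(prices, assignment):
            equilibria.append((sum(prices), prices, assignment))

print("maxeq:", max(equilibria))  # revenue-maximizing equilibrium
print("mineq:", min(equilibria))  # minimum-revenue equilibrium; here it
                                  # matches the VCG payments (4 and 0)
```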

    f(R) gravity theories in the Palatini Formalism constrained from strong lensing

    f(R) gravity, capable of driving the late-time acceleration of the universe, is emerging as a promising alternative to dark energy. Various f(R) gravity models have been intensively tested against probes of the expansion history, including type Ia supernovae (SNIa), the cosmic microwave background (CMB) and baryon acoustic oscillations (BAO). In this paper we propose to use the statistical lens sample from the Sloan Digital Sky Survey Quasar Lens Search Data Release 3 (SQLS DR3) to constrain f(R) gravity models. This sample can probe the expansion history up to z ∼ 2.2, higher than what is probed by current SNIa and BAO data. We adopt a typical parameterization of the form f(R) = R − αH₀²(−R/H₀²)^β with α and β constants. For β = 0 (ΛCDM), we obtain the best-fit value α = −4.193, with a 95% confidence interval of [−4.633, −3.754]. This best-fit value of α corresponds to the matter density parameter Ωₘ₀ = 0.301, consistent with constraints from other probes. Allowing β to be free, the best-fit parameters are (α, β) = (−3.777, 0.06195), which give Ωₘ₀ = 0.285 and the deceleration parameter q₀ = −0.544. At the 95% confidence level, α and β are constrained to [−4.67, −2.89] and [−0.078, 0.202] respectively. Clearly, given the currently limited sample size, we can only constrain β to an accuracy of Δβ ∼ 0.1 and thus cannot distinguish between ΛCDM and f(R) gravity with high significance; indeed, the former lies within the 68% confidence contour. We expect that the extension of the SQLS DR3 lens sample to SDSS DR5 and SDSS-II will make the constraints on the model more stringent.
    Comment: 10 pages, 7 figures. Accepted for publication in MNRAS.
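    As a quick sanity check on the quoted numbers: for β = 0 the parameterization reduces to f(R) = R − αH₀², an effective ΛCDM model, and the quoted values are reproduced by the identification Ω_Λ = −α/6 in a spatially flat universe. That identification is our assumption here (following the paper's Palatini sign conventions), but it recovers the abstract's figures:

```python
# Sketch: for beta = 0 the model is effectively LambdaCDM,
# f(R) = R - alpha*H0^2, and we assume Omega_Lambda = -alpha/6
# (flat universe, the paper's sign conventions).
alpha = -4.193
Omega_L = -alpha / 6            # effective dark-energy density ~0.699
Omega_m0 = 1 - Omega_L          # flatness: Omega_m0 + Omega_Lambda = 1
q0 = 0.5 * Omega_m0 - Omega_L   # LambdaCDM deceleration parameter
print(Omega_m0, q0)             # ~0.301, matching the abstract; q0 ~ -0.55,
                                # close to the -0.544 quoted for the free-beta fit
```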

    On Revenue Maximization with Sharp Multi-Unit Demands

    We consider markets consisting of a set of indivisible items and buyers that have sharp multi-unit demand. This means that each buyer i wants a specific number dᵢ of items; a bundle of size less than dᵢ has no value, while a bundle of size greater than dᵢ is worth no more than the most valued dᵢ items (valuations being additive). We consider the objective of setting prices and allocations in order to maximize the total revenue of the market maker. The pricing problem with sharp multi-unit demand buyers has a number of properties that the unit-demand model does not possess, and is an important question in algorithmic pricing. We consider the problem of computing a revenue-maximizing solution for two solution concepts: competitive equilibrium and envy-free pricing. For unrestricted valuations these problems are NP-complete; we therefore focus on a realistic special case of "correlated values", where each buyer i has a valuation vᵢqⱼ for item j, with vᵢ and qⱼ positive quantities associated with buyer i and item j respectively. We present a polynomial-time algorithm to solve the revenue-maximizing competitive equilibrium problem. For envy-free pricing, if the demand of each buyer is bounded by a constant, a revenue-maximizing solution can be found efficiently; the general demand case is shown to be NP-hard.
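    The sharp-demand valuation is simple to state in code. Below is a minimal sketch under the abstract's correlated-values assumption; the function names are our own:

```python
# Sharp multi-unit demand with correlated values: buyer i values
# item j at v_i * q_j, and only bundles of size >= d_i have value.
def bundle_value(v_i, d_i, bundle_qualities):
    """Value of a bundle for a buyer with per-unit value v_i and sharp
    demand d_i; bundle_qualities lists q_j for the items in the bundle."""
    if len(bundle_qualities) < d_i:
        return 0.0  # fewer than d_i items: worthless
    top = sorted(bundle_qualities, reverse=True)[:d_i]
    return v_i * sum(top)  # items beyond the best d_i add nothing

print(bundle_value(2.0, 2, [0.9]))            # 0.0 (too few items)
print(bundle_value(2.0, 2, [0.9, 0.5, 0.3]))  # 2.8 = 2*(0.9 + 0.5)
```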

    Pricing Ad Slots with Consecutive Multi-unit Demand

    We consider the optimal pricing problem for a model of the rich media advertisement market, as well as other related applications. In this market there are multiple buyers (advertisers) and items (slots) arranged in a line, such as a banner on a website. Each buyer desires a particular number of consecutive slots and has a per-unit-quality value vᵢ (dependent on the ad only), while each slot j has a quality qⱼ (dependent on the position only, such as the click-through rate in position auctions). Hence, the valuation of buyer i for slot j is vᵢqⱼ. We want to decide the allocations and the prices in order to maximize the total revenue of the market maker. A key difference from the traditional position auction is the advertiser's requirement of a fixed number of consecutive slots, which may be needed to display a large rich media ad. We study three major pricing mechanisms: the Bayesian pricing model, the maximum-revenue market equilibrium model, and an envy-free solution model. Under the Bayesian model, we design a polynomial-time computable truthful mechanism that is optimal in revenue. For the market equilibrium paradigm, we give a polynomial-time algorithm that computes the maximum-revenue market equilibrium solution. In the envy-free setting, an optimal solution is presented when all buyers demand the same number of consecutive slots. We conduct a simulation that compares the revenues of the above schemes, with convincing results.
    Comment: 27 pages.
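    To see the role of the consecutiveness constraint, the brute-force sketch below maximizes total valuation over placements of disjoint consecutive blocks. It is exponential-time and purely illustrative of the allocation structure; the paper's mechanisms are polynomial-time and optimize revenue, not welfare.

```python
# Brute-force welfare maximization with consecutive multi-unit demands.
# Each buyer i needs d[i] consecutive slots and values a block at
# v[i] * (sum of slot qualities in the block). Tiny instances only.
def best_welfare(v, d, q, i=0, used=frozenset()):
    if i == len(v):
        return 0.0
    best = best_welfare(v, d, q, i + 1, used)  # option: skip buyer i
    for start in range(len(q) - d[i] + 1):
        block = range(start, start + d[i])
        if used.isdisjoint(block):             # blocks must not overlap
            val = v[i] * sum(q[j] for j in block)
            best = max(best,
                       val + best_welfare(v, d, q, i + 1, used | set(block)))
    return best

q = [1.0, 0.6, 0.3, 0.1]                    # slot qualities (e.g. CTRs)
print(best_welfare([3.0, 2.0], [2, 1], q))  # 5.4: buyer 0 takes slots 0-1
```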

    Are Equivariant Equilibrium Approximators Beneficial?

    Recently, remarkable progress has been made in approximating Nash equilibrium (NE), correlated equilibrium (CE), and coarse correlated equilibrium (CCE) through function approximation, training a neural network to predict equilibria from game representations. Equivariant architectures are widely adopted in designing such equilibrium approximators for normal-form games. In this paper, we theoretically characterize the benefits and limitations of equivariant equilibrium approximators. On the benefit side, we show that they enjoy better generalizability than general approximators and can achieve better approximation when the payoff distribution is permutation-invariant. On the limitation side, we discuss their drawbacks in terms of equilibrium selection and social welfare. Together, our results help to understand the role of equivariance in equilibrium approximators.
    Comment: To appear in ICML 2023.
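    As a concrete picture of what "equivariant" means here, the sketch below builds a toy approximator that maps a payoff matrix to a mixed strategy by scoring each action's payoff row with a shared function; by construction, permuting the actions permutes the output the same way. The architecture and names are illustrative assumptions, not the paper's models.

```python
import numpy as np

# Toy permutation-equivariant "approximator": score each action (row of
# the payoff matrix) with the same shared function, then softmax. Since
# every row is processed identically, permuting input rows permutes the
# output strategy in the same way.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))  # shared weights applied to every row

def approximator(payoff):                     # payoff: (n_actions, 4)
    scores = np.tanh(payoff @ W).sum(axis=1)  # one score per action
    e = np.exp(scores - scores.max())
    return e / e.sum()                        # mixed strategy over actions

payoff = rng.normal(size=(3, 4))
perm = np.array([2, 0, 1])
lhs = approximator(payoff[perm])   # permute actions, then approximate
rhs = approximator(payoff)[perm]   # approximate, then permute
print(np.allclose(lhs, rhs))       # True: the map is equivariant
```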

    Smoothed and Average-Case Approximation Ratios of Mechanisms: Beyond the Worst-Case Analysis

    The approximation ratio has become one of the dominant measures in mechanism design problems. In light of the smoothed analysis of algorithms, we define the smoothed approximation ratio to compare the performance of the optimal mechanism and a truthful mechanism when the inputs are subject to random perturbations of worst-case inputs, and define the average-case approximation ratio to compare the performance of these two mechanisms when the inputs follow a distribution. For the one-sided matching problem, Filos-Ratsikas et al. [2014] show that, among all truthful mechanisms, random priority achieves the tight approximation ratio bound of Θ(√n). We prove that, despite this worst-case bound, random priority has a constant smoothed approximation ratio. This is, to the best of our knowledge, the first work that asymptotically differentiates the smoothed approximation ratio from the worst-case approximation ratio for mechanism design problems. For the average case, we show that the approximation ratio can be improved to 1 + e. These results partially explain why random priority has been successfully used in practice, even though in the worst case the optimal social welfare is Θ(√n) times what random priority achieves. They also pave the way for further studies of smoothed and average-case analysis of approximate mechanism design problems, beyond worst-case analysis.
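    A Monte Carlo sketch of the quantity being bounded: the empirical ratio between the optimal social welfare and the welfare of random priority (random serial dictatorship) on random one-sided matching instances. The uniform value distribution and instance size are our assumptions for illustration; the paper works with normalized valuations.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Estimate the average-case welfare ratio OPT / RP for one-sided
# matching: random priority vs. the welfare-optimal matching.
rng = np.random.default_rng(1)
n, trials, ratios = 20, 200, []
for _ in range(trials):
    V = rng.random((n, n))   # V[i, j]: agent i's value for item j (assumed U[0,1])
    rows, cols = linear_sum_assignment(V, maximize=True)
    opt = V[rows, cols].sum()                 # optimal social welfare
    remaining, rp = list(range(n)), 0.0
    for agent in rng.permutation(n):          # agents pick in random order
        j = max(remaining, key=lambda k: V[agent, k])
        rp += V[agent, j]                     # agent takes favorite remaining item
        remaining.remove(j)
    ratios.append(opt / rp)
print(np.mean(ratios))  # empirically a small constant, not Theta(sqrt(n))
```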

    On the complexity of price equilibria

    We prove complexity, approximability, and inapproximability results for the problem of finding an exchange equilibrium in markets with indivisible (integer) goods, most notably a polynomial-time algorithm that approximates the market equilibrium arbitrarily closely when the number of goods is bounded and the utilities are linear. We also show a communication complexity lower bound in a model appropriate for markets. Our result implies that the ideal informational economy of a market with divisible goods and unique optimal allocations is unattainable in general.
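    The bounded-goods case admits a simple picture: with a fixed number of goods one can search a price grid and, at each price vector, compute every trader's demand (a small knapsack under linear utilities and indivisibility) and measure how far the market is from clearing. The sketch below is a naive grid search in that spirit, an illustration under our own toy setup rather than the paper's algorithm.

```python
from itertools import combinations, product

# Naive sketch: approximate exchange equilibrium for indivisible goods
# with linear utilities, via grid search over prices (few goods only).
utils = [[4, 1], [1, 3]]   # utils[t][g]: trader t's utility for good g
endow = [[1, 0], [0, 1]]   # endow[t][g]: trader t's endowment of good g
goods = range(2)

def demand(t, prices):
    """Best affordable bundle for trader t at the given prices."""
    budget = sum(endow[t][g] * prices[g] for g in goods)
    best, best_u = (), -1.0
    for r in range(len(goods) + 1):
        for bundle in combinations(goods, r):   # brute-force knapsack
            cost = sum(prices[g] for g in bundle)
            u = sum(utils[t][g] for g in bundle)
            if cost <= budget and u > best_u:
                best, best_u = bundle, u
    return best

best_p, best_gap = None, float("inf")
for prices in product([0.5 + 0.25 * k for k in range(8)], repeat=2):
    bought = [0, 0]
    for t in range(len(utils)):
        for g in demand(t, prices):
            bought[g] += 1
    # total demand vs. total supply, summed over goods
    gap = sum(abs(bought[g] - sum(e[g] for e in endow)) for g in goods)
    if gap < best_gap:
        best_p, best_gap = prices, gap
print(best_p, best_gap)  # price vector with the smallest clearing gap
```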