The Combinatorial World (of Auctions) According to GARP
Revealed preference techniques are used to test whether a data set is
compatible with rational behaviour. They are also incorporated as constraints
in mechanism design to encourage truthful behaviour in applications such as
combinatorial auctions. In the auction setting, we present an efficient
combinatorial algorithm to find a virtual valuation function with the optimal
(additive) rationality guarantee. Moreover, we show that there exists such a
valuation function that both is individually rational and is minimum (that is,
it is component-wise dominated by any other individually rational, virtual
valuation function that approximately fits the data). Similarly, given upper
bound constraints on the valuation function, we show how to fit the maximum
virtual valuation function with the optimal additive rationality guarantee. In
practice, revealed preference bidding constraints are very demanding. We
explain how approximate rationality can be used to create relaxed revealed
preference constraints in an auction. We then show how combinatorial methods
can be used to implement these relaxed constraints. Worst/best-case welfare
guarantees that result from the use of such mechanisms can be quantified via
the minimum/maximum virtual valuation function.
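As background for the revealed-preference machinery this abstract builds on, here is a minimal sketch of the classical GARP consistency test in Python. This is the textbook test, not the paper's virtual-valuation algorithm, and the list-of-lists data layout is my own assumption:

```python
def satisfies_garp(prices, bundles):
    """Textbook GARP test: observation i reveals x_i preferred to x_j
    when x_j was affordable at prices p_i (p_i . x_i >= p_i . x_j).
    GARP fails if x_i is transitively revealed preferred to x_j while
    x_i was strictly cheaper than x_j at prices p_j."""
    n = len(prices)
    dot = lambda p, x: sum(a * b for a, b in zip(p, x))
    # cost[i][j] = p_i . x_j, expenditure on bundle j at observation i's prices
    cost = [[dot(prices[i], bundles[j]) for j in range(n)] for i in range(n)]
    # direct revealed preference relation
    R = [[cost[i][i] >= cost[i][j] for j in range(n)] for i in range(n)]
    # transitive closure via Floyd-Warshall
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    # violation: x_i revealed preferred to x_j, yet p_j . x_j > p_j . x_i
    return not any(R[i][j] and cost[j][j] > cost[j][i]
                   for i in range(n) for j in range(n))
```

Two observations that each buy the good priced lower for them are consistent, while the reversed prices produce the classic two-point violation.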
The Attack-and-Defense Group Contests: Best-shot versus Weakest-link
This study analyzes a group contest in which one group (defenders) follows a weakest-link impact function whereas the other group (attackers) follows a best-shot impact function. We fully characterize the Nash and coalition-proof equilibria and show that with symmetric valuation the coalition-proof equilibrium is unique up to the permutation of the identity of the active player in the attacker group. With asymmetric valuation it is always an equilibrium for one of the highest-valuation players to be active; it may also be the case that the highest-valuation players in the attacker group free-ride completely on a group member with a lower valuation. However, in any equilibrium, only one player in the attacker group is active, whereas all the players in the defender group are active and exert the same effort. We also characterize the Nash and coalition-proof equilibria for the case in which one group follows either a best-shot or a weakest-link impact function while the other group follows an additive impact function.
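The impact functions contrasted in the abstract are easy to state concretely. The ratio-form (Tullock) contest success function below is my own illustrative assumption, since the abstract does not specify one:

```python
def best_shot(efforts):
    """Attacker-group impact: only the maximal individual effort counts."""
    return max(efforts)

def weakest_link(efforts):
    """Defender-group impact: the minimal individual effort counts."""
    return min(efforts)

def contest_success(attack_impact, defense_impact):
    """Illustrative ratio-form (Tullock) success probability for the
    attackers; ties at zero impact go to the defenders by convention."""
    total = attack_impact + defense_impact
    return attack_impact / total if total > 0 else 0.0

# The equilibrium structure described in the abstract: one active
# attacker (the rest free-ride), all defenders active at equal effort.
attackers = [0.0, 0.0, 0.7]
defenders = [0.4, 0.4, 0.4]
```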
Competitive market for multiple firms and economic crisis
The origin of economic crises is a key problem for economics. We present a
model of long-run competitive markets to show that the multiplicity of
behaviors in an economic system emerges, over a long time scale, as
statistical regularities (perfectly competitive markets obey Bose-Einstein
statistics and purely monopolistic-competitive markets obey Boltzmann
statistics), and how interaction among firms influences the evolution of
competitive markets. It
has been widely accepted that perfect competition is most efficient. Our study
shows that the perfectly competitive system, as an extreme case of competitive
markets, is most efficient but not stable, and gives rise to economic crises as
society reaches full employment. In the economic crisis revealed by our model,
many firms condense (collapse) into the lowest supply level (zero supply,
namely bankruptcy status), in analogy to Bose-Einstein condensation. This
curious phenomenon arises because perfect competition (homogeneous
competitions) equals symmetric (indistinguishable) investment direction, a fact
abhorred by nature. Therefore, we urge the promotion of monopolistic
competition (heterogeneous competitions) rather than perfect competition. To
provide early warning of economic crises, we introduce a resolving index of
investment, which approaches zero in the run-up to an economic crisis. On the
other hand, our model discloses, as a profound conclusion, that the
technological level for a long-run social or economic system is proportional to
the freedom (disorder) of this system; in other words, technology equals the
entropy of the system. As an application of this new concept, we give a possible
answer to the Needham question: "Why was it that despite the immense
achievements of traditional China it had been in Europe and not in China that
the scientific and industrial revolutions occurred?"
Comment: 17 pages; 3 figures
Testing Consumer Rationality using Perfect Graphs and Oriented Discs
Given a consumer data-set, the axioms of revealed preference proffer a binary
test for rational behaviour. A natural (non-binary) measure of the degree of
rationality exhibited by the consumer is the minimum number of data points
whose removal induces a rationalisable data-set. We study the computational
complexity of the resultant consumer rationality problem in this paper. This
problem is, in the worst case, equivalent (in terms of approximation) to the
directed feedback vertex set problem. Our main result is to obtain an exact
threshold on the number of commodities that separates easy cases and hard
cases. Specifically, for two-commodity markets the consumer rationality problem
is polynomial time solvable; we prove this via a reduction to the vertex cover
problem on perfect graphs. For three-commodity markets, however, the problem is
NP-complete; we prove this using a reduction from planar 3-SAT that is based
upon oriented-disc drawings.
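To make the problem statement concrete, here is a brute-force sketch of the consumer rationality measure: the smallest number of removed observations that leaves a GARP-consistent data set. It enumerates subsets and is therefore exponential; the abstract's point is precisely that polynomial-time algorithms exist for two commodities but not, unless P = NP, for three:

```python
from itertools import combinations

def _garp_ok(prices, bundles):
    """Standard GARP consistency check (transitive closure, strict violation)."""
    n = len(prices)
    dot = lambda p, x: sum(a * b for a, b in zip(p, x))
    cost = [[dot(prices[i], bundles[j]) for j in range(n)] for i in range(n)]
    R = [[cost[i][i] >= cost[i][j] for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    return not any(R[i][j] and cost[j][j] > cost[j][i]
                   for i in range(n) for j in range(n))

def rationality_index(prices, bundles):
    """Minimum number of data points whose removal leaves a
    rationalisable (GARP-consistent) data set.  Brute force over
    subsets, for illustration only."""
    n = len(prices)
    for k in range(n + 1):
        for keep in combinations(range(n), n - k):
            if _garp_ok([prices[i] for i in keep],
                        [bundles[i] for i in keep]):
                return k
```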
Competitive Cross-Subsidization
Cross-subsidization arises naturally when firms with different comparative advantages compete for consumers with diverse shopping patterns. Firms then face a form of co-opetition, being substitutes for one-stop shoppers and complements for multi-stop shoppers. Competition for one-stop shoppers then drives total prices down to cost, but firms subsidize weak products with the profit made on strong products. While firms and consumers would benefit from cooperation limiting cross-subsidization (e.g., through price caps), banning below-cost pricing instead increases firms’ profits at the expense of one-stop shoppers; this calls for a cautious use of below-cost pricing regulations in competitive markets.
Nonparametric instrumental regression with non-convex constraints
This paper considers the nonparametric regression model with an additive
error that is dependent on the explanatory variables. As is common in empirical
studies in epidemiology and economics, it also supposes that valid instrumental
variables are observed. A classical example in microeconomics considers the
consumer demand function as a function of the price of goods and the income,
both variables often considered as endogenous. In this framework, the economic
theory also imposes shape restrictions on the demand function, like
integrability conditions. Motivated by this illustration in microeconomics, we
study an estimator of a nonparametric constrained regression function using
instrumental variables by means of Tikhonov regularization. We derive rates of
convergence for the regularized model both in a deterministic and stochastic
setting under the assumption that the true regression function satisfies a
projected source condition including, because of the non-convexity of the
imposed constraints, an additional smallness condition
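For intuition about the regularization step, here is a sketch of plain Tikhonov-regularized least squares on a finite-dimensional discretization of a linear operator K: it solves the normal equations (K'K + alpha*I)x = K'y. It deliberately omits the paper's instrumental-variable operator and the projection onto the non-convex constraint set, and the function name is my own:

```python
def tikhonov_solve(K, y, alpha):
    """Tikhonov-regularized least squares for a discretized operator K:
    minimize ||K x - y||^2 + alpha * ||x||^2, via the normal equations
    (K^T K + alpha I) x = K^T y, solved by Gaussian elimination."""
    m, n = len(K), len(K[0])
    # A = K^T K + alpha I, b = K^T y
    A = [[sum(K[r][i] * K[r][j] for r in range(m)) + (alpha if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    b = [sum(K[r][i] * y[r] for r in range(m)) for i in range(n)]
    # forward elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # back-substitution
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x
```

With alpha = 0 this reduces to ordinary least squares; larger alpha shrinks the solution toward zero, which is the stabilizing effect exploited by the convergence-rate analysis.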
The Fairness Challenge in Computer Networks
In this paper, the concept of fairness in computer networks is investigated. We motivate the need to examine fairness issues by providing example future application scenarios where fairness support is needed in order to experience sufficient service quality. Fairness definitions from political science and their application to computer networks are described, and a state-of-the-art overview of research activities in fairness is given, from issues such as queue management and TCP-friendliness to fairness in layered multi-rate multicast scenarios. We contribute to the ongoing research activities by defining the fairness challenge, with the purpose of helping direct future investigations to white spots on the map of research in fairness.
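One widely used quantitative fairness measure of the kind surveyed here is Jain's fairness index; picking this particular index is my own choice of example, as the paper discusses several definitions:

```python
def jain_fairness(allocations):
    """Jain's fairness index: (sum x)^2 / (n * sum x^2).
    Equals 1.0 for a perfectly equal allocation and tends toward 1/n
    when a single user captures the whole resource."""
    n = len(allocations)
    total = sum(allocations)
    return total * total / (n * sum(a * a for a in allocations))
```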
Sequential pivotal mechanisms for public project problems
It is well-known that for several natural decision problems no budget
balanced Groves mechanisms exist. This has motivated recent research on
designing variants of feasible Groves mechanisms (termed as `redistribution of
VCG (Vickrey-Clarke-Groves) payments') that generate reduced deficit. With this
in mind, we study sequential mechanisms and consider optimal strategies that
could reduce the deficit resulting under the simultaneous mechanism. We show
that such strategies exist for the sequential pivotal mechanism of the
well-known public project problem. We also exhibit an optimal strategy with the
property that a maximal social welfare is generated when each player follows
it. Finally, we show that these strategies can be achieved by an implementation
in Nash equilibrium.
Comment: 19 pages. The version without the appendix will appear in the Proc.
2nd International Symposium on Algorithmic Game Theory, 200
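As a baseline for the sequential variants studied above, here is a sketch of the simultaneous pivotal (Clarke/VCG) mechanism for a public project with equal cost shares (the equal-share formulation is an assumption for illustration). Each agent's tax is the externality its report imposes on the others; summing the taxes exhibits the budget imbalance that redistribution mechanisms aim to reduce:

```python
def pivotal_mechanism(values, cost):
    """Simultaneous pivotal (Clarke) mechanism for a public project with
    equal cost shares: build when reported values cover the cost; each
    agent's tax is the harm its report causes the other agents (paid on
    top of its cost share when the project is built)."""
    n = len(values)
    share = cost / n
    build = sum(values) >= cost
    taxes = []
    for i in range(n):
        # net benefits of the other agents under an equal cost share
        others = [v - share for j, v in enumerate(values) if j != i]
        # others' welfare under the best decision ignoring i's report
        best_without_i = max(sum(others), 0.0)
        # others' welfare under the actual decision
        actual = sum(others) if build else 0.0
        taxes.append(best_without_i - actual)
    return build, taxes
```

For reports (30, 30, 0) and cost 60, the project is built and the two high-value agents each pay a tax of 10, while the zero-value agent pays nothing.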