Paradoxes of Fair Division
Two or more players are required to divide up a set of indivisible items that they can rank from best to worst. They may, as well, be able to indicate preferences over subsets, or packages, of items. The main criteria used to assess the fairness of a division are efficiency (Pareto-optimality) and envy-freeness. Other criteria are also suggested, including a Rawlsian criterion that the worst-off player be made as well off as possible and a scoring procedure, based on the Borda count, that helps to render allocations as equal as possible. Eight paradoxes, all of which involve unexpected conflicts among the criteria, are described and classified into three categories, reflecting (1) incompatibilities between efficiency and envy-freeness, (2) the failure of a unique efficient and envy-free division to satisfy other criteria, and (3) the desirability, on occasion, of dividing up items unequally. While troublesome, the paradoxes also indicate opportunities for achieving fair division, which will depend on the fairness criteria one deems important and the trade-offs one considers acceptable.
Keywords: FAIR DIVISION; ALLOCATION OF INDIVISIBLE ITEMS; ENVY-FREENESS; PARETO-OPTIMALITY; RAWLSIAN JUSTICE; BORDA COUNT.
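The Borda-based scoring procedure mentioned above can be sketched numerically. The players, items, and allocation below are invented for illustration, and envy-freeness is checked against Borda scores as a proxy for preferences over bundles, a simplifying assumption rather than the paper's exact procedure:

```python
# Borda scores: with m items, a player's best item scores m-1, worst scores 0.
def borda_scores(ranking):
    m = len(ranking)
    return {item: m - 1 - pos for pos, item in enumerate(ranking)}

# Hypothetical rankings of four items by two players (best to worst).
rankings = {
    "Ann": ["a", "b", "c", "d"],
    "Ben": ["b", "a", "d", "c"],
}

def bundle_score(player, bundle):
    s = borda_scores(rankings[player])
    return sum(s[i] for i in bundle)

# An allocation is envy-free (under the Borda proxy) if each player scores
# their own bundle at least as high as every other player's bundle.
def envy_free(alloc):
    return all(
        bundle_score(p, alloc[p]) >= bundle_score(p, alloc[q])
        for p in alloc for q in alloc if p != q
    )

alloc = {"Ann": {"a", "c"}, "Ben": {"b", "d"}}
print(bundle_score("Ann", alloc["Ann"]))  # 3 + 1 = 4
print(envy_free(alloc))                   # True
```

Here each player gets their top item plus a lower-ranked one, so both score 4 for their own bundle and only 2 for the other's.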
Fair Division of Indivisible Items
This paper analyzes criteria of fair division of a set of indivisible items among people whose revealed preferences are limited to rankings of the items and for whom no side payments are allowed. The criteria include refinements of Pareto optimality and envy-freeness as well as dominance-freeness, evenness of shares, and two criteria based on equally-spaced surrogate utilities, referred to as maxsum and equimax. Maxsum maximizes a measure of aggregate utility or welfare, whereas equimax lexicographically maximizes persons' utilities from smallest to largest. The paper analyzes conflicts among the criteria, along with possibilities and pitfalls of achieving fair division in a variety of circumstances.
Keywords: FAIR DIVISION; ALLOCATION OF INDIVISIBLE ITEMS; PARETO OPTIMALITY; ENVY-FREENESS; LEXICOGRAPHIC MAXIMUM
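A minimal sketch of the maxsum and equimax (leximin) criteria under equally-spaced surrogate utilities; the rankings below are invented, enumeration is brute force, and ties are broken arbitrarily:

```python
from itertools import product

# Equally-spaced surrogate utilities derived from rankings: with m items,
# a player's k-th ranked item is worth m - k. Player names are hypothetical.
rankings = {
    "P1": ["w", "x", "y", "z"],
    "P2": ["x", "w", "z", "y"],
}
items = rankings["P1"]
util = {p: {it: len(r) - 1 - r.index(it) for it in r} for p, r in rankings.items()}

def utilities(assignment):  # assignment: item -> player
    totals = {p: 0 for p in rankings}
    for it, p in assignment.items():
        totals[p] += util[p][it]
    return totals

# Enumerate all assignments of the four items to the two players.
allocs = [dict(zip(items, owners)) for owners in product(rankings, repeat=len(items))]

# Maxsum: maximize aggregate utility.
maxsum = max(allocs, key=lambda a: sum(utilities(a).values()))
# Equimax: lexicographically maximize utilities from smallest to largest.
equimax = max(allocs, key=lambda a: sorted(utilities(a).values()))

print(sorted(utilities(maxsum).values()))   # [4, 4]
print(sorted(utilities(equimax).values()))  # [4, 4]
```

With these rankings the two criteria happen to agree; the paper's point is precisely that they need not.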
Sets Uniquely Determined by Projections on Axes I. Continuous Case
This paper studies sets S in R^n which are uniquely reconstructible from their hyperplane integral projections P_i(x_i; S) = ∫⋯∫ χ_S(x_1, …, x_n) dx_1 ⋯ dx_{i−1} dx_{i+1} ⋯ dx_n onto the n coordinate axes of R^n. It is shown that any additive set S = {x = (x_1, …, x_n) : Σ_{i=1}^n f_i(x_i) ≥ 0}, where each f_i(x_i) is a bounded measurable function, is uniquely reconstructible. In particular, balls are uniquely reconstructible. It is shown that in R^2 all uniquely reconstructible sets are additive. For n ≥ 3, Kemperman has shown that there are uniquely reconstructible sets in R^n of bounded measure that are not additive. It is also noted, for n ≥ 3, that neither the property of being additive nor that of being a set of uniqueness is closed under monotone pointwise limits.
A necessary condition for S to be a set of uniqueness is that S contain no bad configuration. A bad configuration consists of two finite sets of points, T_1 in Int(S) and T_2 in Int(S^c), where S^c = R^n − S, such that T_1 and T_2 have the same number of points in any hyperplane x_i = c, for 1 ≤ i ≤ n and all c ∈ R. We show that this necessary condition is sufficient for uniqueness for open sets S in R^2.
The results show that prior information about a density f in R^2 to be reconstructed in tomography (namely, that f is known to take only the values 0 and 1) can sometimes reduce the problem of reconstructing f to knowing only two projections of f. Thus even meager prior information can, in principle, be of enormous value in tomography.
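A discrete analogue illustrates the role of bad configurations: a 0–1 image's axis projections are its row and column sums, and a 2×2 "switching component" gives two different images with identical projections, hence non-uniqueness:

```python
# Discrete analogue: a 0-1 image's two axis projections are its row and
# column sums.
def projections(img):
    rows = [sum(r) for r in img]
    cols = [sum(c) for c in zip(*img)]
    return rows, cols

# A "switching component" (the discrete bad configuration): two distinct
# images sharing both projections, so neither is uniquely reconstructible.
A = [[1, 0],
     [0, 1]]
B = [[0, 1],
     [1, 0]]
print(projections(A))                    # ([1, 1], [1, 1])
print(projections(A) == projections(B))  # True
```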
Downside risk in reservoir management
Downside risk, which refers to deviations below a threshold, is often important in
water management decisions, especially in areas with large and skewed variations in
precipitation patterns. In this paper, we present a model for a reservoir manager who
is downside risk averse and who performs a dynamic allocation of irrigation water,
taking into account the negative effects of droughts on farm profits and different
environmental constraints. We analyse the water stock, flows and agricultural profits
for alternative environmental restrictions and thresholds for irrigation levels and find
that stricter environmental constraints increase total water supply and carryover
stock, while higher penalty thresholds lead to their overall decrease. Furthermore,
increasing penalty thresholds leads to a higher emphasis on avoiding shortages, at the
expense of lower average profits.
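One common way to formalize downside risk, used here only as an illustrative sketch (the paper's exact functional form may differ), is a lower partial moment that penalizes profit shortfalls below a threshold:

```python
# A downside-risk term penalizes only shortfalls below a threshold:
# risk = E[max(threshold - profit, 0)**order], a lower partial moment.
def downside_risk(profits, threshold, order=2):
    n = len(profits)
    return sum(max(threshold - x, 0.0) ** order for x in profits) / n

# Hypothetical equally likely farm-profit scenarios in a drought-prone area.
profits = [120.0, 95.0, 60.0, 30.0]
print(downside_risk(profits, threshold=80.0))
# Only the 60 and 30 scenarios fall below the threshold:
# ((80-60)**2 + (80-30)**2) / 4 = (400 + 2500) / 4 = 725.0
```

Gains above the threshold contribute nothing, which is what distinguishes downside risk from symmetric measures such as variance.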
Under stochastic dominance Choquet-expected utility and anticipated utility are identical
The aim of this paper is to convince the reader that Choquet-expected utility, as initiated by Schmeidler (1982, 1989) for decision making under uncertainty, when formulated for decision making under risk naturally leads to anticipated utility, as initiated by Quiggin and Yaari. Thus the two generalizations of expected utility are in fact one.
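The identification can be checked numerically for a discrete lottery: taking the capacity v(A) = w(P(A)) for a probability weighting function w, the Choquet integral of utility reduces to the rank-dependent (anticipated utility) formula. The weighting function, utility, and lottery below are invented for illustration:

```python
# Under risk, the capacity v(A) = w(P(A)) turns the Choquet integral of u(X)
# into anticipated (rank-dependent) utility.
def w(p):            # weighting function (assumed): w(0) = 0, w(1) = 1
    return p ** 2

def u(x):            # linear utility, chosen for simplicity
    return x

outcomes = [(10.0, 0.5), (20.0, 0.3), (30.0, 0.2)]  # (value, probability)

# Sort outcomes from worst to best; weight each by the difference of w
# applied to successive decumulative (tail) probabilities.
def anticipated_utility(outcomes):
    total, tail = 0.0, 1.0        # tail = P(X >= current outcome)
    for x, p in sorted(outcomes):
        total += u(x) * (w(tail) - w(tail - p))
        tail -= p
    return total

print(anticipated_utility(outcomes))  # ~12.9
```

With w(p) = p this collapses to ordinary expected utility, which is one way to see that both models generalize the same baseline.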
Processing second-order stochastic dominance models using cutting-plane representations
This is the post-print version of the article. Copyright @ 2011 Springer-Verlag.
Second-order stochastic dominance (SSD) is widely recognised as an important decision criterion in portfolio selection. Unfortunately, stochastic dominance models are known to be very demanding from a computational point of view. In this paper we consider two classes of models which use SSD as a choice criterion. The first, proposed by Dentcheva and Ruszczyński (J Bank Finance 30:433–451, 2006), uses an SSD constraint, which can be expressed as integrated chance constraints (ICCs). The second, proposed by Roman et al. (Math Program, Ser B 108:541–569, 2006), uses SSD through a multi-objective formulation with CVaR objectives. Cutting-plane representations and algorithms were proposed by Klein Haneveld and Van der Vlerk (Comput Manage Sci 3:245–269, 2006) for ICCs, and by Künzi-Bay and Mayer (Comput Manage Sci 3:3–27, 2006) for CVaR minimization. These concepts are used to propose representations and solution methods for the above class of SSD-based models. We describe a cutting-plane based solution algorithm and outline implementation details. A computational study is presented, which demonstrates the effectiveness and the scale-up properties of the solution algorithm, as applied to the SSD model of Roman et al. (Math Program, Ser B 108:541–569, 2006).
This study was funded by OTKA, Hungarian National Fund for Scientific Research, project 47340; by Mobile Innovation Centre, Budapest University of Technology, project 2.2; by Optirisk Systems, Uxbridge, UK; and by BRIEF (Brunel University Research Innovation and Enterprise Fund).
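The SSD relation itself can be tested on discrete scenarios via expected shortfalls E[(t − X)+], which are the quantities the ICC cutting planes bound; this is a sketch with invented data, not the paper's algorithm:

```python
# X dominates Y in second-order stochastic dominance (SSD) iff, for every
# target t, E[(t - X)+] <= E[(t - Y)+]: smaller expected shortfall everywhere.
def expected_shortfall(sample, t):
    return sum(max(t - x, 0.0) for x in sample) / len(sample)

def ssd_dominates(x, y):
    # For equally likely discrete scenarios it suffices to check the
    # realized values as targets.
    targets = sorted(set(x) | set(y))
    return all(expected_shortfall(x, t) <= expected_shortfall(y, t)
               for t in targets)

# Equally likely return scenarios (hypothetical); both have mean 2.5,
# but X is less spread out.
X = [1.0, 2.0, 3.0, 4.0]
Y = [0.0, 2.0, 3.0, 5.0]
print(ssd_dominates(X, Y))  # True
print(ssd_dominates(Y, X))  # False
```

A cutting-plane method adds these shortfall inequalities one violated target at a time instead of enumerating them all upfront.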
First-Digit Law in Nonextensive Statistics
Nonextensive statistics, characterized by a nonextensive parameter q, is a promising and practically useful generalization of Boltzmann statistics for describing power-law behaviors in physical and social observations. We here explore the unevenness of the first-digit distribution of nonextensive statistics analytically and numerically. We find that the first-digit distribution follows Benford's law and fluctuates slightly in a periodic manner with respect to the logarithm of the temperature. The fluctuation decreases as q increases, and the result converges exactly to Benford's law as q approaches 2. The relevant regularities between nonextensive statistics and Benford's law are also presented and discussed.
Comment: 11 pages, 3 figures, published in Phys. Rev.
Mean-risk models using two risk measures: A multi-objective approach
This paper proposes a model for portfolio optimisation, in which distributions are characterised and compared on the basis of three statistics: the expected value, the variance and the CVaR at a specified confidence level. The problem is multi-objective and transformed into a single objective problem in which variance is minimised while constraints are imposed on the expected value and CVaR. In the case of discrete random variables, the problem is a quadratic program. The mean-variance (mean-CVaR) efficient solutions that are not dominated with respect to CVaR (variance) are particular efficient solutions of the proposed model. In addition, the model has efficient solutions that are discarded by both mean-variance and mean-CVaR models, although they may improve the return distribution. The model is tested on real data drawn from the FTSE 100 index. An analysis of the return distribution of the chosen portfolios is presented.
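A sketch of the sample CVaR statistic used as the third criterion, under the convention that CVaR at level α denotes the mean of the worst (1 − α) fraction of returns (conventions differ; the data are invented):

```python
# Sample CVaR at level alpha: the average of the worst (1 - alpha) fraction
# of equally likely return scenarios.
def cvar(returns, alpha):
    k = max(1, int(round((1 - alpha) * len(returns))))  # number of tail scenarios
    tail = sorted(returns)[:k]                          # worst outcomes
    return sum(tail) / k

# Hypothetical equally likely portfolio returns (%).
returns = [2.1, -3.5, 0.4, 1.2, -5.0, 0.8, 1.9, -1.1, 0.3, 2.5]
print(cvar(returns, alpha=0.8))  # mean of the 2 worst: (-5.0 - 3.5) / 2 = -4.25
```

In the paper's model this statistic enters as a constraint alongside the expected value while variance is minimised.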
Testing the bounds on quantum probabilities
Bounds on quantum probabilities and expectation values are derived for
experimental setups associated with Bell-type inequalities. In analogy to the
classical bounds, the quantum limits are experimentally testable and therefore
serve as criteria for the validity of quantum mechanics.
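For the CHSH setup the classical bound is 2 while the quantum (Tsirelson) bound is 2√2; a quick check using the singlet-state correlator E(a, b) = −cos(a − b) and a standard choice of measurement angles:

```python
import math

# Singlet-state correlator for spin measurements along angles a and b.
def E(a, b):
    return -math.cos(a - b)

# CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
a, ap = 0.0, math.pi / 2
b, bp = math.pi / 4, -math.pi / 4
S = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)

print(abs(S))            # ~2.828, saturating the Tsirelson bound 2*sqrt(2)
print(2 * math.sqrt(2))  # quantum bound; the classical (LHV) bound is 2
```

Any local hidden-variable model keeps |S| ≤ 2, so values between 2 and 2√2 are both a violation of the classical bound and a test of the quantum one.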
Statistical mechanics of voting
Decision procedures aggregating the preferences of multiple agents can
produce cycles and hence outcomes which have been described heuristically as
`chaotic'. We make this description precise by constructing an explicit
dynamical system from the agents' preferences and a voting rule. The dynamics
form a one dimensional statistical mechanics model; this suggests the use of
the topological entropy to quantify the complexity of the system. We formulate
natural political/social questions about the expected complexity of a voting
rule and degree of cohesion/diversity among agents in terms of random matrix
models---ensembles of statistical mechanics models---and compute quantitative
answers in some representative cases.
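The cycles at the root of this "chaotic" behavior can be seen in the classic Condorcet profile, where pairwise majority voting is intransitive:

```python
from itertools import permutations

# Three voters' rankings (best to worst): the classic Condorcet profile.
profile = [["A", "B", "C"], ["B", "C", "A"], ["C", "A", "B"]]

# x beats y if a strict majority of voters rank x above y.
def beats(x, y):
    wins = sum(r.index(x) < r.index(y) for r in profile)
    return wins > len(profile) / 2

edges = {(x, y) for x, y in permutations("ABC", 2) if beats(x, y)}
print(sorted(edges))  # [('A', 'B'), ('B', 'C'), ('C', 'A')]: a majority cycle
```

The majority relation A > B > C > A has no top element, which is exactly the kind of cycling the dynamical-system construction makes precise.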