Fair Division of Indivisible Items
This paper analyzes criteria of fair division of a set of indivisible items among people whose revealed preferences are limited to rankings of the items and for whom no side payments are allowed. The criteria include refinements of Pareto optimality and envy-freeness as well as dominance-freeness, evenness of shares, and two criteria based on equally-spaced surrogate utilities, referred to as maxsum and equimax. Maxsum maximizes a measure of aggregate utility or welfare, whereas equimax lexicographically maximizes persons' utilities from smallest to largest. The paper analyzes conflicts among the criteria along with the possibilities and pitfalls of achieving fair division in a variety of circumstances.
Keywords: FAIR DIVISION; ALLOCATION OF INDIVISIBLE ITEMS; PARETO OPTIMALITY; ENVY-FREENESS; LEXICOGRAPHIC MAXIMUM
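The two surrogate-utility criteria can be illustrated with a brute-force sketch. Everything below (the two-player setup, the item names, the Borda-like equally-spaced scoring, and the exhaustive search) is an illustrative assumption, not the paper's own procedure:

```python
from itertools import product

def surrogate_utilities(ranking):
    # Equally-spaced surrogate utilities: with m items, the best-ranked
    # item scores m, the worst scores 1.
    m = len(ranking)
    return {item: m - pos for pos, item in enumerate(ranking)}

def allocations(items, n_players):
    # Every way to assign each indivisible item to exactly one player.
    for assign in product(range(n_players), repeat=len(items)):
        bundles = [[] for _ in range(n_players)]
        for item, p in zip(items, assign):
            bundles[p].append(item)
        yield bundles

def evaluate(bundles, utils):
    # Each player's surrogate utility for their own bundle.
    return [sum(utils[p][item] for item in bundle)
            for p, bundle in enumerate(bundles)]

def maxsum(items, utils):
    # Maximize aggregate welfare (the sum of surrogate utilities).
    return max(allocations(items, len(utils)),
               key=lambda b: sum(evaluate(b, utils)))

def equimax(items, utils):
    # Lexicographically maximize utilities from smallest to largest:
    # Python's list comparison on the sorted utility vector is leximin.
    return max(allocations(items, len(utils)),
               key=lambda b: sorted(evaluate(b, utils)))

# Two players with hypothetical rankings of four items.
rankings = [list("ABCD"), list("BADC")]
utils = [surrogate_utilities(r) for r in rankings]
items = list("ABCD")
print(maxsum(items, utils))   # each player ends up with utility 6
print(equimax(items, utils))  # here the two criteria happen to agree
```

With these rankings every item has a unique highest valuer, so maxsum and equimax coincide; the paper's point is precisely that they need not.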
Paradoxes of Fair Division
Two or more players are required to divide up a set of indivisible items that they can rank from best to worst. They may, as well, be able to indicate preferences over subsets, or packages, of items. The main criteria used to assess the fairness of a division are efficiency (Pareto-optimality) and envy-freeness. Other criteria are also suggested, including a Rawlsian criterion that the worst-off player be made as well off as possible and a scoring procedure, based on the Borda count, that helps to render allocations as equal as possible. Eight paradoxes, all of which involve unexpected conflicts among the criteria, are described and classified into three categories, reflecting (1) incompatibilities between efficiency and envy-freeness, (2) the failure of a unique efficient and envy-free division to satisfy other criteria, and (3) the desirability, on occasion, of dividing up items unequally. While troublesome, the paradoxes also indicate opportunities for achieving fair division, which will depend on the fairness criteria one deems important and the trade-offs one considers acceptable.
Keywords: FAIR DIVISION; ALLOCATION OF INDIVISIBLE ITEMS; ENVY-FREENESS; PARETO-OPTIMALITY; RAWLSIAN JUSTICE; BORDA COUNT.
The size, concentration, and growth of biodiversity-conservation nonprofits
Nonprofit organizations play a critical role in efforts to conserve biodiversity. Their success in this regard will be determined in part by how effectively individual nonprofits and the sector as a whole are structured. One of the most fundamental questions about an organization's structure is how large it should be, with the logical counterpart being how concentrated the whole sector should be. We review empirical patterns in the size, concentration, and growth of over 1700 biodiversity-conservation nonprofits registered for tax purposes in the United States within the context of relevant economic theory. Conservation-nonprofit sizes vary by six to seven orders of magnitude and are positively skewed. Larger nonprofits access more revenue streams and hold more of their assets in land and buildings than smaller or midsized nonprofits do. The size of conservation nonprofits varies with the ecological focus of the organization, but the growth rates of nonprofits do not.
Cognitive constraints, contraction consistency, and the satisficing criterion
© 2007, Elsevier. Licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International licence: http://creativecommons.org/licenses/by-nc-nd/4.0
Neural correlates of early deliberate emotion regulation: Young children's responses to interpersonal scaffolding.
Deliberate emotion regulation, the ability to willfully modulate emotional experiences, is shaped through interpersonal scaffolding and forecasts later functioning in multiple domains. However, nascent deliberate emotion regulation in early childhood is poorly understood due to a paucity of studies that simulate interpersonal scaffolding of this skill and measure its occurrence in multiple modalities. Our goal was to identify neural and behavioral components of early deliberate emotion regulation to identify patterns of competent and deficient responses. A novel probe was developed to assess deliberate emotion regulation in young children. Sixty children (age 4-6 years) were randomly assigned to deliberate emotion regulation or control conditions. Children completed a frustration task while lateral prefrontal cortex (LPFC) activation was recorded via functional near-infrared spectroscopy (fNIRS). Facial expressions were video recorded and children self-rated their emotions. Parents rated their child's temperamental emotion regulation. Deliberate emotion regulation interpersonal scaffolding predicted a significant increase in frustration-related LPFC activation not seen in controls. Better temperamental emotion regulation predicted larger LPFC activation increases post-scaffolding among children who engaged in deliberate emotion regulation interpersonal scaffolding. A capacity to increase LPFC activation in response to interpersonal scaffolding may be a crucial neural correlate of early deliberate emotion regulation.
Processing second-order stochastic dominance models using cutting-plane representations
This is the post-print version of the article. The official published version can be accessed from the links below. Copyright © 2011 Springer-Verlag.
Second-order stochastic dominance (SSD) is widely recognised as an important decision criterion in portfolio selection. Unfortunately, stochastic dominance models are known to be very demanding from a computational point of view. In this paper we consider two classes of models which use SSD as a choice criterion. The first, proposed by Dentcheva and Ruszczyński (J Bank Finance 30:433–451, 2006), uses an SSD constraint, which can be expressed as integrated chance constraints (ICCs). The second, proposed by Roman et al. (Math Program, Ser B 108:541–569, 2006), uses SSD through a multi-objective formulation with CVaR objectives. Cutting-plane representations and algorithms were proposed by Klein Haneveld and Van der Vlerk (Comput Manage Sci 3:245–269, 2006) for ICCs, and by Künzi-Bay and Mayer (Comput Manage Sci 3:3–27, 2006) for CVaR minimization. These concepts are taken into consideration to propose representations and solution methods for the above class of SSD-based models. We describe a cutting-plane based solution algorithm and outline implementation details. A computational study is presented, which demonstrates the effectiveness and the scale-up properties of the solution algorithm, as applied to the SSD model of Roman et al. (Math Program, Ser B 108:541–569, 2006).
This study was funded by OTKA, Hungarian National Fund for Scientific Research, project 47340; by Mobile Innovation Centre, Budapest University of Technology, project 2.2; by Optirisk Systems, Uxbridge, UK; and by BRIEF (Brunel University Research Innovation and Enterprise Fund).
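For discrete, equally probable scenarios, the SSD relation behind the first class of models can be checked directly via the integrated-shortfall characterization that underlies ICCs: X dominates Y by SSD iff E[(t − X)+] ≤ E[(t − Y)+] at every threshold t. A minimal sketch (the scenario returns are invented for illustration; this is a feasibility check, not the paper's cutting-plane algorithm):

```python
def ssd_dominates(x, y):
    """True if return distribution x second-order stochastically dominates y.
    Scenarios are assumed equally probable; it suffices to test thresholds
    at the realized outcomes of both distributions."""
    thresholds = sorted(set(x) | set(y))

    def expected_shortfall(sample, t):
        # E[max(t - R, 0)]: average shortfall below target t.
        return sum(max(t - r, 0.0) for r in sample) / len(sample)

    return all(expected_shortfall(x, t) <= expected_shortfall(y, t)
               for t in thresholds)

# Hypothetical scenario returns for two portfolios.
x = [0.02, 0.03, 0.05, 0.06]
y = [0.00, 0.03, 0.04, 0.06]
print(ssd_dominates(x, y))  # x has smaller shortfall at every threshold
print(ssd_dominates(y, x))
```

The cutting-plane machinery in the paper exists precisely because enforcing this family of shortfall inequalities inside an optimization model, rather than checking it for two fixed portfolios, is the computationally demanding part.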
Management of a capital stock by Strotz's naive planner
http://dx.doi.org/10.1016/j.jedc.2007.09.01
Testing the bounds on quantum probabilities
Bounds on quantum probabilities and expectation values are derived for experimental setups associated with Bell-type inequalities. In analogy to the classical bounds, the quantum limits are experimentally testable and therefore serve as criteria for the validity of quantum mechanics.
Comment: 9 pages, RevTeX
Recognizing Members of the Tournament Equilibrium Set is NP-hard
A recurring theme in the mathematical social sciences is how to select the "most desirable" elements given a binary dominance relation on a set of alternatives. Schwartz's tournament equilibrium set (TEQ) ranks among the most intriguing, but also among the most enigmatic, tournament solutions that have been proposed so far in this context. Due to its unwieldy recursive definition, little is known about TEQ. In particular, its monotonicity remains an open problem to date. Yet, if TEQ were to satisfy monotonicity, it would be a very attractive tournament solution concept refining both the Banks set and Dutta's minimal covering set. We show that the problem of deciding whether a given alternative is contained in TEQ is NP-hard.
Comment: 9 pages, 3 figures
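The recursive definition can be transcribed directly for tiny tournaments, which also makes the "unwieldy" part concrete: the subproblem for each alternative is itself a full TEQ computation, and (consistent with the NP-hardness result) no efficient method is known. This sketch takes the top cycle of the TEQ relation to be the maximal elements of its transitive closure, and the example tournaments are illustrative:

```python
def teq(alts, beats):
    """Schwartz's tournament equilibrium set, computed naively from its
    recursive definition (exponential time; tiny tournaments only).
    `beats` is a set of (winner, loser) pairs forming a tournament."""
    if len(alts) <= 1:
        return set(alts)
    # TEQ relation: b -> a  iff  b is in TEQ of the subtournament
    # restricted to the dominators of a.
    relation = set()
    for a in alts:
        dominators = {b for b in alts if (b, a) in beats}
        sub = {(x, y) for (x, y) in beats
               if x in dominators and y in dominators}
        for b in teq(dominators, sub):
            relation.add((b, a))
    # Transitive closure of the TEQ relation by fixed-point iteration.
    reach = {a: {a} for a in alts}
    changed = True
    while changed:
        changed = False
        for (x, y) in relation:
            new = reach[y] - reach[x]
            if new:
                reach[x] |= new
                changed = True
    # Top cycle: alternatives that can reach back everything reaching them.
    return {a for a in alts
            if all(b in reach[a] for b in alts if a in reach[b])}

cycle = {('a', 'b'), ('b', 'c'), ('c', 'a')}
print(teq({'a', 'b', 'c'}, cycle))  # a 3-cycle: all alternatives survive
```

On a transitive tournament the recursion collapses and TEQ returns only the Condorcet winner, as one would expect of a tournament solution.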
Behavioral implications of shortlisting procedures
We consider two-stage “shortlisting procedures” in which the menu of alternatives is first pruned by some process or criterion and then a binary relation is maximized. Given a particular first-stage process, our main result supplies a necessary and sufficient condition for choice data to be consistent with a procedure in the designated class. This result applies to any class of procedures with a certain lattice structure, including the cases of “consideration filters,” “satisficing with salience effects,” and “rational shortlist methods.” The theory avoids background assumptions made for mathematical convenience; in this and other respects following Richter’s classical analysis of preference-maximizing choice in the absence of shortlisting.
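A two-stage procedure of the "rational shortlist methods" flavour can be sketched as follows; the two binary rationales below are invented for illustration and are not from the paper:

```python
def maximal(menu, relation):
    # Elements of the menu not dominated by any other menu element,
    # where (y, x) in relation means "y eliminates x".
    return [x for x in menu
            if not any((y, x) in relation for y in menu if y != x)]

def shortlist_choice(menu, first, second):
    """Two-stage shortlisting: prune the menu to the maximal elements of
    the first rationale, then maximize the second on the shortlist."""
    return maximal(maximal(menu, first), second)

# Hypothetical rationales: `first` might compare on price, `second` on quality.
first = {('a', 'c')}    # a eliminates c at the shortlisting stage
second = {('b', 'a')}   # b beats a among the shortlisted alternatives
print(shortlist_choice(['a', 'b', 'c'], first, second))  # ['b']
```

Note that dropping c from the menu leaves the choice unchanged here, but in general such procedures can violate contraction consistency, which is exactly why characterizing them from choice data is nontrivial.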
