25 research outputs found

    Inequality and income gaps

    This paper discusses inequality orderings based explicitly on the closing up of income gaps, demonstrating the links between these and other orderings, the classes of functions that preserve the orderings, and applications showing their usefulness in the comparison of economic policies.
    Keywords: inequality, income distribution

    Elitism and Stochastic Dominance

    Stochastic dominance has typically been used with a special emphasis on risk and inequality reduction, something captured by the concavity of the utility function in the expected utility model. We claim that the applicability of the stochastic dominance approach goes far beyond risk and inequality measurement, provided suitable adaptations are made. We apply the stochastic dominance approach to the measurement of elitism, which may be considered the opposite of egalitarianism. While the usual stochastic dominance quasi-orderings attach more value to more equal and more efficient distributions, our criteria ensure that the more unequal and the more efficient the distribution, the higher it is ranked. Two instances are provided by (i) comparisons of scientific performance across institutions such as universities or departments, and (ii) comparisons of affluence, as opposed to poverty, between countries.
    Keywords: decumulative distribution functions, stochastic dominance, regressive transfers, elitism, scientific performance, affluence
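    The paper's precise elitism quasi-orderings are not reproduced in this listing. As a rough illustration of the building block it works with, the sketch below (Python, with hypothetical data and function names) compares empirical decumulative distribution functions: a distribution whose curve lies weakly above another's everywhere has more mass at high values, which is the direction an elitism criterion rewards.

```python
import numpy as np

def decumulative(sample, grid):
    """Empirical decumulative (survival) function: the share of
    observations strictly above each grid point."""
    sample = np.asarray(sample, dtype=float)
    return np.array([(sample > t).mean() for t in grid])

def decumulative_dominates(x, y, num_points=200):
    """True if x's decumulative curve lies weakly above y's on a
    common grid, i.e. x has more mass at high values everywhere.
    A first-order sketch only, not the paper's full criteria."""
    grid = np.linspace(min(np.min(x), np.min(y)),
                       max(np.max(x), np.max(y)), num_points)
    return bool(np.all(decumulative(x, grid) >= decumulative(y, grid)))

# Hypothetical citation counts for two departments:
dept_a = [0, 1, 2, 5, 40, 80]      # a few star performers
dept_b = [10, 12, 15, 18, 20, 22]  # uniformly solid, no stars
# The curves cross, so the pair is incomparable under this quasi-ordering:
print(decumulative_dominates(dept_a, dept_b),
      decumulative_dominates(dept_b, dept_a))  # False False
```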

    Sequential Comparisons of Generalized Lorenz Curves for Different Demographics

    Jenkins and Lambert (1993) and Chambaz and Maurin (1998) proposed extensions of the sequential generalized Lorenz dominance (SGL) criterion introduced by Atkinson and Bourguignon (1987); the extended version of SGL makes it possible to compare distributions across different demographics. However, the tests that check extended SGL “are not expressible in terms of generalized Lorenz curves” (Lambert 2001, p. 79). In this paper, we show that the dominance condition can be easily checked by sequential comparisons of a modified version of the generalized Lorenz curve. We apply this procedure to compare the income distributions of Italian households, using data from the Survey on Household Income and Wealth (SHIW, Bank of Italy) for 2006 and 2012.
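    The paper's modified curve for demographic groups is not reproduced here, but the underlying object is standard: the generalized Lorenz ordinate at population share k/n is the cumulative mean of the k smallest incomes, and dominance requires one curve to lie weakly above the other everywhere. A minimal Python sketch with made-up incomes:

```python
import numpy as np

def generalized_lorenz(incomes):
    """Generalized Lorenz ordinates GL(k/n) = (1/n) * sum of the k
    smallest incomes, for k = 1..n."""
    x = np.sort(np.asarray(incomes, dtype=float))
    return np.cumsum(x) / len(x)

def gl_dominates(a, b):
    """True if a's generalized Lorenz curve lies weakly above b's
    (equal sample sizes assumed; otherwise interpolate quantiles)."""
    return bool(np.all(generalized_lorenz(a) >= generalized_lorenz(b)))

# Illustrative household incomes only (not SHIW data):
incomes_2006 = [8, 12, 15, 20, 30]
incomes_2012 = [9, 13, 15, 21, 32]
print(gl_dominates(incomes_2012, incomes_2006))  # True
```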

    Empirical Issues in Lifetime Poverty Measurement

    This paper demonstrates the implications of adopting an approach to measuring poverty that takes into account the lifetime experience of individuals rather than simply taking a static or cross-sectional perspective. Our approach follows the theoretical innovations in Hoy and Zheng (2008), which address various aspects of the specific pattern of any poverty spells experienced by an individual, as well as a possible retrospective assessment an individual might make of his life experience as a whole. For an individual, our perspective on lifetime poverty is influenced both by the snapshot poverty of each period and by the poverty level of permanent lifetime consumption; it is also influenced by how poverty spells are distributed over the lifetime. Using PSID data for the US, we demonstrate empirically the power of alternative axioms concerning how lifetime poverty should be measured when making pairwise comparisons of individual lifetime profiles of consumption (income) experiences. We also demonstrate the importance of taking a lifetime view of poverty when comparing poverty between groups, by using the classic FGT ‘snapshot’ poverty index in conjunction with period weighting functions that explicitly reflect concerns about the pattern of poverty spells over individuals’ lifetimes.
    Keywords: lifetime poverty, snapshot poverty, chronic poverty, early poverty, poverty measurement
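    The FGT family referred to above has the standard closed form P_alpha = (1/n) * sum over the poor of ((z - x_i)/z)^alpha. The lifetime aggregate below is only a stylized stand-in for the Hoy-Zheng construction, with hypothetical period weights:

```python
import numpy as np

def fgt(incomes, z, alpha):
    """Foster-Greer-Thorbecke index: alpha=0 gives the headcount ratio,
    alpha=1 the poverty gap, alpha=2 poverty severity."""
    x = np.asarray(incomes, dtype=float)
    gaps = np.zeros_like(x)
    poor = x < z
    gaps[poor] = ((z - x[poor]) / z) ** alpha
    return float(gaps.mean())

def weighted_lifetime_poverty(profile, z, alpha, weights):
    """Stylized lifetime measure: a weighted average of per-period
    snapshot poverty, where the weights encode concern about the
    pattern of spells (e.g. heavier weights on early-life periods).
    An illustration only, not the Hoy-Zheng (2008) measure itself."""
    per_period = np.array([fgt([c], z, alpha) for c in profile])
    w = np.asarray(weights, dtype=float)
    return float(per_period @ (w / w.sum()))

# Hypothetical consumption profile over five periods, poverty line z = 10:
profile = [6, 8, 12, 14, 9]
early_weights = [5, 4, 3, 2, 1]  # penalize early-life poverty more
print(weighted_lifetime_poverty(profile, z=10, alpha=1, weights=early_weights))
```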

    Fairness and Direct Democracy

    The median voter model (direct democracy) has wide applicability, but it is based on selfish voters, i.e., voters who derive utility solely from their own payoff. The recent literature has pointed to fairness and concern for others as basic human motives that explain a range of economic phenomena. We examine the implications of introducing fair voters who have a preference for fairness as in Fehr and Schmidt (1999). Within a simple general equilibrium model, we demonstrate the existence of a Condorcet winner for fair voters using the single crossing property of voters’ preferences. In a fair voter model, unlike a selfish voter model, poverty can lead to increased redistribution. Mean-preserving spreads of income increase equilibrium redistribution. Greater fairness leads to greater redistribution. The introduction of selfish voters in an economy where the median voter is fair can have a large impact on the redistributive outcome. An empirical exercise using OECD data illustrates the potential importance of fairness in explaining redistribution.
    Keywords: redistribution, other-regarding preferences, single crossing property, income inequality, American Exceptionalism
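    The Fehr and Schmidt (1999) preferences invoked above have a well-known functional form: own payoff, penalized by disadvantageous inequality (weight alpha) and advantageous inequality (weight beta). A minimal sketch with hypothetical payoffs and parameter values:

```python
def fehr_schmidt_utility(i, payoffs, alpha, beta):
    """Fehr-Schmidt inequity-averse utility for player i:
    U_i = x_i - alpha/(n-1) * sum_j max(x_j - x_i, 0)
              - beta/(n-1)  * sum_j max(x_i - x_j, 0)."""
    x, n = payoffs, len(payoffs)
    envy = sum(max(x[j] - x[i], 0) for j in range(n) if j != i)
    guilt = sum(max(x[i] - x[j], 0) for j in range(n) if j != i)
    return x[i] - (alpha * envy + beta * guilt) / (n - 1)

# A middle-income voter evaluating a hypothetical income vector:
print(fehr_schmidt_utility(1, [10, 6, 2], alpha=0.8, beta=0.4))  # 3.6
```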

    A Robust Multi-Dimensional Poverty Profile for Uganda

    In this paper we compute a multi-dimensional poverty index (MPI) for Uganda following the approach proposed by Alkire and Foster (2007). Using household survey data, we show how the incidence of multi-dimensional poverty has fallen in recent years, and we use the decomposability features of the index to explain the drivers of the reduction in multi-dimensional poverty. We also compare the results for Uganda with other countries for which the MPI has been computed, and we note some caveats in such a comparison. The robustness of our estimates is tested in a stochastic dominance framework and using statistical inference. Notably, we extend the one-dimensional analysis of stochastic dominance to take household size into account as a second dimension, which is particularly important because some of the MPI indicators are sensitive to the number of household members. By exploiting a unique subsample of the integrated household survey programme in Uganda, which has not previously been analysed, we are also able to match the data set used for the MPI with the data used to compute conventional estimates of monetary poverty. This enables a more robust assessment of the complementarities of the two types of poverty measures than has previously been possible.
    Keywords: multidimensional poverty, counting approach, Uganda, household size, robustness analysis, international comparisons
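    The Alkire-Foster counting approach behind the MPI reduces to a short computation: weight each household's binary deprivation indicators, censor the scores of households below the cutoff k, and average. A sketch with hypothetical deprivation data (the paper's actual indicators, weights, and cutoff are not reproduced):

```python
import numpy as np

def adjusted_headcount_m0(deprivations, weights, k):
    """Alkire-Foster adjusted headcount ratio M0: the mean of the
    censored weighted deprivation scores. M0 factors into incidence
    (share of poor) times intensity (their average score)."""
    d = np.asarray(deprivations, dtype=float)  # households x indicators (0/1)
    w = np.asarray(weights, dtype=float)
    scores = d @ (w / w.sum())                 # weighted deprivation share
    return float(np.where(scores >= k, scores, 0.0).mean())

# Three hypothetical households, four equally weighted indicators, k = 1/3:
dep = [[1, 1, 0, 0],
       [0, 0, 0, 1],
       [1, 1, 1, 0]]
print(adjusted_headcount_m0(dep, [1, 1, 1, 1], k=1/3))  # ~0.417
```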

    Equity by the Numbers: Measuring Poverty, Inequality, and Injustice

    Can we measure inequity? Can we arrive at a number or numbers capturing the extent to which a given society is equitable or inequitable? Sometimes such questions are answered with a “no”: equity is a qualitative, non-numerical consideration. This Article offers a different perspective. The difficulty with equity measurement is not the impossibility of quantification, but the overabundance of possible metrics. There currently exist at least four families of equity-measurement frameworks, used by scholars and, to some extent, governments: inequality metrics (such as the Gini coefficient), poverty metrics, social-gradient metrics (such as the concentration index), and equity-regarding social welfare functions. Inequality metrics look at the population-wide distribution of income or some other valuable good. Poverty metrics focus on the extent to which individuals fall below an income threshold, or are deprived in other ways. Social-gradient metrics quantify the correlation between goods (in particular, good health) and social status. Equity-regarding social welfare functions seek to translate each individual’s bundle of attributes into a “utility” number, measuring her well-being, and then rank situations so as to give extra weight to improvements in the utility of those at a low utility level. So which of these is the most attractive equity metric? How shall we choose among them? This Article tries to make progress on these questions. I show how all four families share a deeper unity: all satisfy the “Pigou-Dalton” principle in some form, favoring a non-leaky transfer of some valuable “currency” from those with more to those with less. The metrics differ in the specific version of the Pigou-Dalton principle that they embody: how they identify the valuable “currency,” and whether they restrict the scope of that principle. Thus, in choosing among equity metrics, we should be guided by the lodestar question: what are the grounds for specifying the Pigou-Dalton principle in this way rather than that? This question, in turn, helps us see how the diversity of current equity metrics flows from intellectual contestation about the nature and measurement of well-being, and about the conditions under which individuals are responsible for being badly off.
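    To make the Pigou-Dalton test concrete: a metric satisfying the principle must not report more inequity after a rank-preserving, non-leaky transfer from a richer person to a poorer one. A minimal Python check using the Gini coefficient and made-up incomes:

```python
import numpy as np

def gini(incomes):
    """Gini coefficient via the mean absolute difference:
    G = E|x_i - x_j| / (2 * mean)."""
    x = np.asarray(incomes, dtype=float)
    mad = np.abs(x[:, None] - x[None, :]).mean()
    return float(mad / (2 * x.mean()))

# Transfer 5 units from the richest to the poorest person, preserving ranks:
before = [10, 20, 30, 40, 100]
after = [15, 20, 30, 40, 95]
print(gini(before) > gini(after))  # True: measured inequality falls
```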

    Assortment and Pricing Optimisation under Non-Conventional Customer Choice Models

    Nowadays, extensive research is being done in the area of revenue management, with applications across industries. At the centre of this area lies the assortment problem, which amounts to finding a subset of products to offer in order to maximise revenue, provided that customers follow a certain model of choice. Most studied models satisfy the following property: whenever the offered set is enlarged, the probability of selecting a specific product decreases. This property is called regularity in the literature. However, customer behaviour often violates this condition, as in the decoy effect, where adding extra options can have a positive effect for some products, whose probabilities of being selected increase relative to other products (e.g., including a medium-size popcorn slightly cheaper than the large one, with the purpose of making the latter more attractive by comparison). We study two models of customer choice in which regularity violations can be accommodated (hence the non-conventionality) and show that the assortment optimisation problem can still be solved in polynomial time.

    First, we analyse the Sequential Multinomial Logit (SML). Under the SML model, products are partitioned into two levels, to capture differences in attractiveness, brand awareness, and/or visibility of the products in the market. When a consumer is presented with an assortment of products, she first considers products in the first level; if none of them is purchased, products in the second level are considered. This model is a special case of the Perception-Adjusted Luce Model (PALM) recently proposed by Echenique et al. (2018). It can explain many behavioural phenomena, such as the attraction, compromise, and similarity effects and choice overload, which cannot be explained by the Multinomial Logit (MNL) model or any discrete choice model based on random utility. We show that the concept of revenue-ordered assortment sets, which contain an optimal assortment under the MNL model, can be generalised to the SML model. More precisely, we show that all optimal assortments under the SML are revenue-ordered by level, a natural generalisation of revenue-ordered assortments that contains at most a quadratic number of assortments. As a corollary, assortment optimisation under the SML is polynomial-time solvable.

    Second, we study the Two-Stage Luce model (2SLM), a discrete choice model introduced by Echenique and Saito (2018) that generalises the standard multinomial logit model (MNL). The 2SLM satisfies neither the Independence of Irrelevant Alternatives (IIA) property nor regularity; to model customer behaviour, each product has an intrinsic utility, and the model uses a dominance relation between products. Given a proposed assortment S, consumers first discard all dominated products in S before applying an MNL model to the remaining products. As a result, the model can capture behaviour that cannot be replicated by any discrete choice model based on random utilities. We show that the assortment problem under the 2SLM is solvable in polynomial time. Moreover, we prove that the capacitated assortment optimisation problem is NP-hard and present polynomial-time algorithms for the cases where (1) the dominance relation is attractiveness-correlated and (2) its transitive reduction is a forest. The proofs exploit a strong connection between assortments under the 2SLM and independent sets in comparability graphs.

    The third and final contribution is an in-depth study of the pricing problem under the 2SLM. We first note that changes in prices should be reflected in the dominance relation if the differences between the resulting attractiveness values are large enough. This is formalised by solving the joint assortment and pricing problem under the Threshold Luce model, where one product dominates another if the ratio between their attractiveness values is greater than a fixed threshold. In this setting, we show that this problem can be solved in polynomial time.
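    To illustrate the revenue-ordered property the thesis generalises: under the plain MNL model, an optimal assortment is always among the n "top-k by price" sets, so a linear scan suffices. A sketch with hypothetical prices and attraction weights (the SML and 2SLM algorithms themselves are not reproduced here):

```python
def mnl_revenue(assortment, prices, weights):
    """Expected revenue under MNL with outside-option weight 1:
    R(S) = sum_{i in S} r_i * v_i / (1 + sum_{j in S} v_j)."""
    denom = 1.0 + sum(weights[i] for i in assortment)
    return sum(prices[i] * weights[i] for i in assortment) / denom

def best_revenue_ordered(prices, weights):
    """Scan the revenue-ordered assortments (top-k products by price);
    under plain MNL one of them is optimal."""
    order = sorted(range(len(prices)), key=lambda i: -prices[i])
    candidates = (order[:k] for k in range(1, len(order) + 1))
    return max(candidates, key=lambda s: mnl_revenue(s, prices, weights))

# Hypothetical products:
prices = [10.0, 8.0, 6.0, 4.0]
weights = [0.5, 1.0, 1.5, 2.0]
best = best_revenue_ordered(prices, weights)
print(best, mnl_revenue(best, prices, weights))  # [0, 1, 2] 5.5
```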