
    Using a Microeconometric Model of Household Labour Supply to Design Optimal Income Taxes

    The purpose of this paper is to present an exercise where we identify optimal income tax rules according to various social welfare criteria, keeping the total net tax revenue fixed. Empirical applications of optimal taxation theory have typically adopted analytical expressions for the optimal taxes and then imputed numerical values to their parameters by using "calibration" procedures or previous econometric estimates. Besides the restrictiveness of the assumptions needed to obtain analytical solutions to the optimal taxation problem, a shortcoming of that procedure is the possible inconsistency between the theoretical assumptions and the assumptions implicit in the empirical evidence. In this paper we follow a different procedure, based on a computational approach to the optimal taxation problem. To this end, we estimate a microeconomic model with 78 parameters that capture heterogeneity in consumption-leisure preferences for singles and couples, as well as in job opportunities across individuals, based on detailed Norwegian household data for 1994. For any given tax rule, the estimated model can be used to simulate the labour supply choices made by single individuals and couples. Those choices are therefore generated by preferences and opportunities that vary across the decision units. We then identify optimal tax rules – within a class of 9-parameter piecewise linear rules – by iteratively running the model until a given social welfare function attains its maximum under the constraint of keeping constant the total net tax revenue. The parameters to be determined are an exemption level, four marginal tax rates, three "kink points" and a lump sum transfer that can be positive (benefit) or negative (tax). We explore a variety of social welfare functions with differing degrees of inequality aversion. All the social welfare functions imply monotonically increasing marginal tax rates. 
When compared with the current (1994) tax system, the optimal rules imply a lower average tax rate. Moreover, all the optimal rules imply – relative to the current rule – lower marginal rates on low and/or average income levels and higher marginal rates on relatively high income levels. These results are partially at odds with the tax reforms that took place in many countries during the last decades. While those reforms embodied the idea of lowering average tax rates, they were typically implemented by reducing the top marginal rates. Our results instead suggest lowering average tax rates by reducing marginal rates on low and average income levels and increasing marginal rates on very high income levels.
Keywords: labour supply; optimal taxation; random utility model; microsimulation
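    The 9-parameter rule described above can be sketched as a simple function; the search then varies these parameters, re-simulates labour supply, and keeps the rule that maximises social welfare at constant net revenue. A minimal sketch, with purely illustrative parameter values (not the paper's estimates):

```python
def piecewise_tax(gross, exemption, rates, kinks, lump_sum):
    """Net tax under a 9-parameter piecewise-linear rule: an exemption
    level, four marginal rates, three kink points, and a lump-sum
    transfer (positive = benefit, negative = tax)."""
    taxable = max(gross - exemption, 0.0)
    brackets = [0.0] + list(kinks) + [float("inf")]
    tax = 0.0
    for rate, lo, hi in zip(rates, brackets, brackets[1:]):
        if taxable <= lo:
            break
        tax += rate * (min(taxable, hi) - lo)
    return tax - lump_sum  # the lump-sum transfer reduces net tax

# Illustrative parameters: an exemption of 50k, monotonically increasing
# marginal rates, kinks at 100k/200k/400k, and a 5k lump-sum benefit.
params = dict(exemption=50_000,
              rates=[0.20, 0.30, 0.40, 0.50],
              kinks=[100_000, 200_000, 400_000],
              lump_sum=5_000)
print(piecewise_tax(150_000, **params))  # 0.20 * 100k taxable - 5k, i.e. about 15000
```

    In the paper, rules of this form are searched iteratively: each candidate parameter vector is fed to the estimated microeconometric model, labour supply is re-simulated, and the social welfare function is evaluated under the constraint of constant total net tax revenue.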

    Effects of rotation scheme on fishing behaviour with price discrimination and limited durability: Theory and evidence.

    This paper examines how a rotation arrangement between two groups of fishers with different institutional arrangements affects fishing behaviour and economic outcomes in a particular economic environment characterised by price discrimination and product durability. In one group, fishers cooperate and maximise the extraction of rents, while members of the second group behave non-cooperatively. Applying a model of alternating duopoly, we show that the cooperating group behaves like a price-discriminating monopolist and tends to uphold prices. When the two groups rotate fishing days, the cooperating group tends to produce more, which deters the non-cooperating group from unprofitable demand pre-emption.

    HypTrails: A Bayesian Approach for Comparing Hypotheses About Human Trails on the Web

    When users interact with the Web today, they leave sequential digital trails on a massive scale. Examples of such human trails include Web navigation, sequences of online restaurant reviews, or online music playlists. Understanding the factors that drive the production of these trails can be useful for, e.g., improving underlying network structures, predicting user clicks or enhancing recommendations. In this work, we present a general approach called HypTrails for comparing a set of hypotheses about human trails on the Web, where hypotheses represent beliefs about transitions between states. Our approach utilizes Markov chain models with Bayesian inference. The main idea is to incorporate hypotheses as informative Dirichlet priors and to leverage the sensitivity of Bayes factors to the prior for comparing hypotheses with each other. For eliciting Dirichlet priors from hypotheses, we present an adaptation of the so-called (trial) roulette method. We demonstrate the general mechanics and applicability of HypTrails by performing experiments with (i) synthetic trails for which we control the mechanisms that have produced them and (ii) empirical trails stemming from different domains, including website navigation, business reviews and online music playlists. Our work expands the repertoire of methods available for studying human trails on the Web.
    Comment: Published in the proceedings of WWW'1
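    The core quantity HypTrails compares is the Bayesian evidence (marginal likelihood) of the observed transition counts under Dirichlet priors that encode each hypothesis; by Dirichlet-multinomial conjugacy this has a closed form. A minimal sketch with made-up counts and pseudo-count matrices (the hypothesis names and numbers are illustrative, not from the paper):

```python
from math import lgamma

def log_evidence(counts, alphas):
    """Log marginal likelihood of observed transition counts under row-wise
    Dirichlet priors (Dirichlet-multinomial conjugacy). counts[i][j] is the
    number of observed transitions i -> j; alphas encodes a hypothesis as
    pseudo-counts, e.g. elicited via the (trial) roulette method."""
    total = 0.0
    for n_row, a_row in zip(counts, alphas):
        total += lgamma(sum(a_row)) - lgamma(sum(a_row) + sum(n_row))
        total += sum(lgamma(a + n) - lgamma(a) for a, n in zip(a_row, n_row))
    return total

# Toy 2-state trails whose transitions mostly stay in the same state.
counts = [[8, 2], [1, 9]]
# Two hypotheses with equal prior strength (pseudo-count mass 6 per row):
h_self = [[5, 1], [1, 5]]      # "users tend to repeat the current state"
h_uniform = [[3, 3], [3, 3]]   # "all transitions are equally likely"
# The self-transition hypothesis fits these trails better:
print(log_evidence(counts, h_self) > log_evidence(counts, h_uniform))  # True
```

    This compares hypotheses at a single prior strength; the method described above additionally sweeps the total pseudo-count mass to exploit the sensitivity of Bayes factors to the prior.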

    Updating beliefs with incomplete observations

    Get PDF
    Currently, there is renewed interest in the problem, raised by Shafer in 1985, of updating probabilities when observations are incomplete. This is a fundamental problem in general, and of particular interest for Bayesian networks. Recently, Grünwald and Halpern have shown that commonly used updating strategies fail in this case, except under very special assumptions. In this paper we propose a new method for updating probabilities with incomplete observations. Our approach is deliberately conservative: we make no assumptions about the so-called incompleteness mechanism that associates complete with incomplete observations. We model our ignorance about this mechanism by a vacuous lower prevision, a tool from the theory of imprecise probabilities, and we use only coherence arguments to turn prior into posterior probabilities. In general, this new approach to updating produces lower and upper posterior probabilities and expectations, as well as partially determinate decisions. This is a logical consequence of the existing ignorance about the incompleteness mechanism. We apply the new approach to the problem of classification of new evidence in probabilistic expert systems, where it leads to a new, so-called conservative updating rule. In the special case of Bayesian networks constructed using expert knowledge, we provide an exact algorithm for classification based on our updating rule, which has linear-time complexity for a class of networks wider than polytrees. This result is then extended to the more general framework of credal networks, where computations are often much harder than with Bayesian nets. Using an example, we show that our rule appears to provide a solid basis for reliable updating with incomplete observations, when no strong assumptions about the incompleteness mechanism are justified.
    Comment: Replaced with extended version
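    The effect of remaining ignorant about the incompleteness mechanism can be illustrated numerically: for a fixed prior and a fixed incomplete observation, each candidate mechanism yields a different Bayesian posterior, and the conservative answer is the resulting interval. A toy sketch, in which a finite grid of mechanisms stands in for the vacuous lower prevision the paper actually uses:

```python
from itertools import product

# A three-valued variable X with a uniform prior; we receive only the
# incomplete observation "X is 1 or 2". The unknown incompleteness
# mechanism is parameterised by lam_x = P(report "1 or 2" | X = x);
# X = 3 is incompatible with the report. The grid below is purely
# illustrative -- the paper reasons over all mechanisms at once rather
# than enumerating a finite set.
prior = {1: 1 / 3, 2: 1 / 3, 3: 1 / 3}
grid = [0.1 * k for k in range(1, 11)]  # candidate values for lam_1, lam_2

posteriors = []
for lam1, lam2 in product(grid, grid):
    evidence = prior[1] * lam1 + prior[2] * lam2
    posteriors.append(prior[1] * lam1 / evidence)  # P(X = 1 | report, mechanism)

lower, upper = min(posteriors), max(posteriors)
print(lower, upper)  # roughly 1/11 and 10/11: a wide interval, not a point
```

    A naive updating rule that silently assumes one particular mechanism would report a single number from inside this interval; the conservative rule reports the whole interval, which is why it can yield partially determinate decisions.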