
    BC Bootstrap Confidence Intervals for Random Effects Panel Data Models

    We study the application of bootstrap procedures to the problem of constructing confidence intervals for the coefficients of random effects panel data models, based on GLS point estimation. The central problem is that of adequately resampling from the estimated residuals of the model while avoiding violations of the structural features of the random shocks.
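
    As a minimal sketch of the kind of structure-preserving resampling at issue (assuming the usual error-components decomposition u_it = mu_i + eps_it; the paper's actual scheme may differ), one can resample the two estimated components separately rather than the composite residuals:

```python
import numpy as np

rng = np.random.default_rng(0)

def resample_panel_residuals(mu_hat, eps_hat):
    """One bootstrap draw of error-components residuals.

    mu_hat:  (N,)   estimated individual effects
    eps_hat: (N, T) estimated idiosyncratic residuals

    Drawing the two components separately preserves the variance-components
    structure; naively resampling the composite residuals u_it would destroy
    the within-individual correlation that the random-effects model imposes.
    """
    N, T = eps_hat.shape
    mu_star = rng.choice(mu_hat, size=N, replace=True)                 # level-2 draws
    eps_star = rng.choice(eps_hat.ravel(), size=(N, T), replace=True)  # level-1 draws
    return mu_star[:, None] + eps_star
```

    Each draw is added to the GLS fitted values to form a pseudo-sample, the model is re-estimated, and the BC interval is then built from the bootstrap distribution of the coefficient estimates.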

    Statistical Calibration: a simplification of Foster's Proof

    Consider the following problem: at each date in the future, a given event may or may not occur, and you will be asked to forecast, at each date, the probability that the event will occur at the next date. Unless you make degenerate forecasts (zero or one), the fact that the event does or does not occur does not prove your forecast wrong. But, in the long run, if your forecasts are accurate, the conditional relative frequencies of occurrence of the event should approach your forecasts. [4] presented an algorithm that, whatever the sequence of realizations of the event, will meet the long-run accuracy criterion, even though it is completely ignorant of the real probabilities of occurrence of the event and of the reasons why the event occurs or fails to occur. It is an adaptive algorithm that reacts to the history of forecasts and occurrences, but it does not learn anything from the history about the future: indeed, the past need not say anything about the future realizations of the event. The algorithm only looks at its own past inaccuracies and tries to make up for them in the future. The amazing result is that this (making up for past inaccuracies) can be done with arbitrarily high probability! Alternative arguments for this result have been proposed in the literature, notably by [3], where a very simple algorithm was proved to work using a classical result in game theory: Blackwell's approachability theorem, [1]. Very recently, [2] specialized Blackwell's theorem in a way that (under a minor modification of the algorithm) simplifies the argument of [3]. Here I present that modification and the resulting argument.
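
    A minimal sketch of a forecaster in the spirit of the simple algorithm discussed above (the grid size and the bracket-and-randomize rule are my rendering; the precise scheme in [2]/[3] may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def calibrated_forecaster(outcomes, m=20):
    """Forecasts on the grid {0, 1/m, ..., 1}; at each date, find adjacent
    grid points whose empirical miscalibration brackets zero and randomize
    between them so that the expected miscalibration does not grow.

    outcomes: iterable of 0/1 realizations (possibly adversarial).
    Returns the list of forecasts issued, one per date.
    """
    grid = np.arange(m + 1) / m
    count = np.zeros(m + 1)  # times each grid forecast has been used
    ysum = np.zeros(m + 1)   # occurrences observed after each grid forecast
    forecasts = []
    for y in outcomes:
        # e[j] = (empirical frequency after forecasting grid[j]) - grid[j]
        e = np.where(count > 0, ysum / np.maximum(count, 1.0) - grid, 0.0)
        # e[0] >= 0 and e[m] <= 0 always, so a bracketing pair exists.
        k = next(j for j in range(m) if e[j] >= 0.0 >= e[j + 1])
        # Mix so the expected one-step addition to miscalibration is zero.
        p_hi = e[k] / (e[k] - e[k + 1]) if e[k] - e[k + 1] > 0 else 0.0
        j = k + 1 if rng.random() < p_hi else k
        forecasts.append(grid[j])
        count[j] += 1
        ysum[j] += y
    return forecasts
```

    The randomization makes exactly the point made above: the algorithm never predicts the future, it only offsets its own accumulated inaccuracies, and the approachability argument shows that the offsetting succeeds with arbitrarily high probability.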

    Incentive-compatible Fiscal Constitutions

    It is a fact that in most countries (both currently and historically), the wealthier a person is, the more he or she has to pay in taxes. This feature of a tax system was actually considered by Adam Smith to be the first maxim of a good tax regime. In The Wealth of Nations (book 5, chapter 2, part 2), Smith stated this equality principle as: “The subjects of every state ought to contribute towards the support of the government... in proportion to the revenue which they respectively enjoy under the protection of the state.” On the other hand, the literature on the economics of the state proposes that:
    • One important reason for the existence of states is that society needs an agent who can protect it from foreign predation.
    • A good characterization of the behavior of the state must treat it as a private enterprise, interested in maximizing the rents of the ruling elite.
    • The focus of the research of constitutional political economists should be the design of constitutions that give the state the incentives to do the job it is supposed to do.
    In this paper, we attempt to tie all of these ideas together. We analyze the problem in a very simple principal-agent framework. Under imperfect information, we assume that the problem of the society is precisely to design a fiscal constitution such that the state has the incentive to protect the wealth of the society. The interesting result is that, under mild assumptions, such an incentive-compatible constitution implies a strictly increasing dependence of taxes on the wealth of the taxpayer. As a by-product, we also find that such a constitution cannot be Pareto efficient. The paper is organized as follows. In the next section, some related literature is reviewed. Then, the next two sections set up and solve the problem. As a comparison benchmark, a Pareto-efficient constitution is derived. Those are the two most important parts of the paper. The model's assumptions range from purely technical to strong economic simplifications. For that reason, before stating the concluding remarks, we discuss the assumptions and main weaknesses of the paper. An appendix includes a related problem and the proof of one of the results.
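
    As a stylized illustration of why incentive compatibility can force taxes to increase with wealth (the notation is mine, not the paper's): suppose that by protecting society the state collects t(w) from a taxpayer of wealth w, while by predating it could seize a fraction \(\gamma \in (0,1)\) of that wealth. The state then protects only if

\[
t(w) \;\geq\; \gamma\, w \qquad \text{for every wealth level } w,
\]

    so a binding constraint of this form immediately yields a tax schedule that is strictly increasing in wealth.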

    What does fairness imply?

    Decades ago, John Rawls, one of the most prominent philosophers of justice, put forth a clear definition of fairness in problems of social choice. Decision theory, which studies individual rather than social choice, has provided axiomatizations of decision rules in many settings, most prominently in settings where individuals face uncertainty (and not just risk). It turns out that there exists an analytical connection between these two branches of thought. This note exploits that connection by reading the social choice problem in terms of decision theory and (partially) exploiting the existing axiomatization. The purpose of the note is more to raise new and interesting questions than to answer them, so it concludes by proposing a research problem.
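
    The analytical connection alluded to is plausibly the formal kinship between Rawls's maximin criterion and maxmin expected utility in decision theory (notation mine):

\[
W(x) \;=\; \min_{i} u_i(x)
\qquad\text{and}\qquad
V(f) \;=\; \min_{p \in C} \mathbb{E}_p\!\left[\, u \circ f \,\right].
\]

    The first criterion evaluates a social alternative x by the welfare of the worst-off individual; the second evaluates an uncertain act f by its worst expected utility over a set C of priors. Reading individuals as states of the world, as behind the veil of ignorance, maps the first problem into the second, which is what allows the note to import the decision-theoretic axiomatization.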

    Individually Rational Collective Choice Under Random Preferences

    In this paper I consider the following problem: there is a collection of exogenously given, socially feasible sets and, for each one of them, each member of a group of individuals chooses from an individually feasible set. Although the product of the individually feasible sets is larger than the socially feasible set, no conflict arises between individuals. Assuming that individual preferences are random, I characterize collective choices in terms of the way in which individual preferences must co-vary in order to explain them. I do this by combining standard revealed preference theory and its counterpart under random preferences. I also argue that there exist collective choices that cannot be rationalized, and hence that the individual rationality assumption can be refuted.
    Keywords: revealed preference; random utility; collective choice; individual rationality.
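
    For the standard (non-random) ingredient of the argument, a minimal sketch of a revealed-preference consistency check may help fix ideas (the GARP test below is the textbook version for price data, not necessarily the exact condition used in the paper):

```python
import numpy as np

def satisfies_garp(prices, bundles):
    """Check the Generalized Axiom of Revealed Preference on observed choices.

    prices, bundles: arrays of shape (T, K), one row per observation.
    Observation s is directly revealed (weakly) preferred to t when the bundle
    chosen at t was affordable at s's prices: p_s . x_s >= p_s . x_t.
    GARP fails when the transitive closure of weak revealed preference admits
    a cycle containing a strict comparison.
    """
    p = np.asarray(prices, float)
    x = np.asarray(bundles, float)
    cost = p @ x.T                  # cost[s, t] = p_s . x_t
    own = np.diag(cost)             # own[s] = p_s . x_s
    weak = own[:, None] >= cost     # direct weak revealed preference
    strict = own[:, None] > cost    # direct strict revealed preference
    reach = weak.copy()             # transitive closure via Warshall's algorithm
    for k in range(len(own)):
        reach |= reach[:, [k]] & reach[[k], :]
    # Violation: s revealed preferred to t while t strictly preferred to s.
    return not np.any(reach & strict.T)
```

    The random-preference counterpart replaces a single choice per feasible set with a distribution over choices and asks whether some distribution over rational preference profiles, with the appropriate co-variation across individuals, reproduces it.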

    Statistical calibration: a simplification of Foster’s proof

    Foster (1999) has given a proof of the Calibration Theorem of Foster and Vohra (1998), using the Approachability Theorem of Blackwell (1956). This note presents a simplified version of Foster's argument, invoking the specialization of Blackwell's theorem given by Greenwald et al. (2006).

    Belief Non-Equivalence and Financial Trade: A Comment on a Result by Araujo and Sandroni

    Aloisio Araujo and Alvaro Sandroni have shown in [1] that, in a complete-markets economy in which there are no exogenous bounds to financial trade, existence of equilibrium requires agents with prior beliefs that agree on zero-probability events and, therefore, with asymptotically homogeneous posteriors. This note illustrates the extent to which the result depends on market completeness: in general, equilibrium requires compatibility of beliefs only up to the revenue transfer opportunities allowed by the market; when the market is sufficiently incomplete, generically on the space of asset returns, even individuals who disagree on zero-probability events meet that "constrained-compatibility" requirement.
    Keywords: general equilibrium; heterogeneous beliefs; existence.
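
    In symbols (notation assumed here, not taken from the note): with complete markets, existence of equilibrium forces mutual absolute continuity of priors,

\[
P_i(E) = 0 \iff P_j(E) = 0 \qquad \text{for all events } E,
\]

    whereas with incomplete markets beliefs need only be compatible on the revenue transfers attainable through the asset span, a strictly weaker requirement that, generically on asset returns, even priors disagreeing on null events satisfy.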

    The Identification of Preferences from Market Data under Uncertainty

    We show that, even under incomplete markets, the equilibrium manifold identifies aggregate demand and individual demands everywhere in their domains. Moreover, under partial observation of the equilibrium manifold, we construct maximal domains of identification. For this, we assume conditions of smoothness, interiority and regularity, but avoid implausible observational requirements. It is crucial that there be date-zero consumption. As a by-product, we develop some duality theory under incomplete markets.
    Keywords: identification.
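
    In standard notation (assumed here), the object in question is the set of price–endowment pairs at which markets clear,

\[
\mathcal{E} \;=\; \Bigl\{\, (p,\omega) \;:\; \sum_{i} z_i(p,\omega_i) = 0 \,\Bigr\},
\]

    and the identification claim is that knowledge of \(\mathcal{E}\) (or, under partial observation, of a piece of it) pins down aggregate demand and each individual demand z_i on a maximal domain.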

    A nonparametric analysis of the Cournot model

    An observer makes a number of observations of an industry producing a homogeneous good. Each observation consists of the market price, the output of individual firms and perhaps information on each firm's production cost. We provide various tests (typically, linear programs) with which the observer can determine whether the data set is consistent with the hypothesis that firms in this industry are playing a Cournot game at each observation. When cost information is wholly or partially unavailable, these tests could potentially be used to derive cost information on the firms. This paper is a contribution to the literature that aims to characterize (in various contexts) the restrictions that a data set must satisfy for it to be consistent with Nash outcomes in a game. It is also inspired by the seminal result of Afriat (and the subsequent literature), which addresses similar issues in the context of consumer demand, though one important technical difference from most of these results is that the objective functions of firms in a Cournot game are not necessarily quasiconcave.
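
    When cost information is unobserved, one natural rendering of such a test is a linear feasibility program in the unknown demand slopes (a sketch under assumptions I am supplying: interior Cournot play, a common convex cost function per firm across observations, and substitution of the first-order condition p_t + q_it * B_t = d_it to eliminate the marginal costs d_it; the paper's exact programs may differ):

```python
import numpy as np
from scipy.optimize import linprog

def cournot_consistent(prices, quantities, eps=1e-6):
    """Feasibility test: do there exist demand slopes B_t < 0 and nonnegative,
    quantity-monotone marginal costs d_it = p_t + q_it * B_t satisfying the
    Cournot first-order conditions at every observation?

    prices:     (T,)   observed market prices
    quantities: (T, n) observed firm outputs
    All restrictions are linear in the slopes B_t, hence a linear program.
    """
    p = np.asarray(prices, float)
    q = np.asarray(quantities, float)
    T, n = q.shape
    A_ub, b_ub = [], []
    # Nonnegative marginal costs: -(p_t + q_it * B_t) <= 0.
    for t in range(T):
        for i in range(n):
            row = np.zeros(T)
            row[t] = -q[t, i]
            A_ub.append(row)
            b_ub.append(p[t])
    # Convex costs: q_is <= q_it implies d_is <= d_it for each firm i.
    for i in range(n):
        for s in range(T):
            for t in range(T):
                if s != t and q[s, i] <= q[t, i]:
                    row = np.zeros(T)
                    row[s] += q[s, i]
                    row[t] -= q[t, i]
                    A_ub.append(row)
                    b_ub.append(p[t] - p[s])
    # Pure feasibility: zero objective; slopes bounded strictly below zero.
    res = linprog(c=np.zeros(T), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, -eps)] * T, method="highs")
    return res.status == 0  # 0 = optimal, i.e. the program is feasible
```

    When the program is feasible, the recovered values d_it = p_t + q_it * B_t are candidate marginal costs, which is the sense in which tests of this kind can be used to derive cost information on the firms.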