14,076 research outputs found

    Bayesian approaches to cointegration

    The degree of empirical support for a priori plausible structures on the cointegration vectors has a central role in the analysis of cointegration. Villani (2000) and Strachan and van Dijk (2003) have recently proposed finite sample Bayesian procedures to calculate the posterior probability of restrictions on the cointegration space, using the existence of a uniform prior distribution on the cointegration space as the key ingredient. The current paper extends this approach to the empirically important case with different restrictions on the individual cointegration vectors. Prior distributions are proposed and posterior simulation algorithms are developed. Consumers' expenditure data for the US are used to illustrate the robustness of the results to variations in the prior. A simulation study shows that the Bayesian approach performs remarkably well in comparison to other more established methods for testing restrictions on the cointegration vectors.
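    The core computation behind posterior probabilities of competing restrictions can be sketched in a few lines. This is a minimal illustration, not the paper's procedure: the log marginal likelihoods and equal prior model probabilities below are assumed numbers, and the model names are hypothetical.

```python
import math

# Assumed (illustrative) log marginal likelihoods for a restricted and
# an unrestricted specification of the cointegration vectors.
log_ml = {"restricted": -102.3, "unrestricted": -104.1}
prior = {"restricted": 0.5, "unrestricted": 0.5}  # equal prior model probs

# Posterior model probability: p(M | y) proportional to p(y | M) * p(M).
# Subtract the max log marginal likelihood for numerical stability.
m = max(log_ml.values())
weights = {k: math.exp(v - m) * prior[k] for k, v in log_ml.items()}
total = sum(weights.values())
post = {k: w / total for k, w in weights.items()}
print(post)
```

    With these assumed inputs, a 1.8-nat gap in log marginal likelihood already puts roughly 86% of posterior probability on the restricted model; the hard part in practice is computing the marginal likelihoods themselves.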

    How much foreign stocks? : Bayesian approaches to asset allocation can explain the home bias of US investors

    US investors hold much less foreign stock than mean/variance analysis applied to historical data predicts. In this article, we investigate whether this home bias can be explained by Bayesian approaches to international asset allocation. In contrast to mean/variance analysis, Bayesian approaches employ different techniques for obtaining the set of expected returns: they shrink sample means towards a reference point that is inferred from economic theory. We also show that one of the Bayesian approaches leads to the same implications for asset allocation as the mean-variance/tracking-error criterion. In both cases, the optimal portfolio is a combination of the market portfolio and the mean/variance efficient portfolio with the highest Sharpe ratio. Applying the Bayesian approaches to the subject of international diversification, we find that substantial home bias can be explained when a US investor has a strong belief in the global mean/variance efficiency of the US market portfolio and a high regret aversion to falling behind the US market portfolio. We also find that the current level of home bias can be justified whenever regret aversion is significantly higher than risk aversion. Finally, we compare the Bayesian approaches to mean/variance analysis in an empirical out-of-sample study. The Bayesian approaches prove superior to mean/variance optimized portfolios in terms of higher risk-adjusted performance and lower turnover. However, they do not systematically outperform the US market portfolio or the minimum-variance portfolio.
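    The shrinkage idea described in the abstract can be sketched as a convex combination of sample means and a theory-implied reference point. All numbers below, and the shrinkage weight `w`, are illustrative assumptions, not values from the paper.

```python
import numpy as np

sample_means = np.array([0.08, 0.12, 0.05])   # historical mean returns (assumed)
equilibrium  = np.array([0.07, 0.09, 0.06])   # theory-implied reference point (assumed)
w = 0.7                                        # shrinkage weight (assumed)

# Shrink the noisy sample means toward the economically motivated
# reference point; w = 1 would discard the data entirely, w = 0 would
# reproduce plain mean/variance analysis on historical data.
shrunk = w * equilibrium + (1 - w) * sample_means
print(shrunk)
```

    In a full Bayesian treatment the weight is not fixed by hand but follows from the relative precision of the prior and the data; this sketch only shows why the resulting expected returns sit between history and theory.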

    Bayesian Approaches to the Precautionary Principle


    Bayesian approaches to technology assessment and decision making

    Until the mid-1980s, most economic analyses of healthcare technologies were based on decision theory and used decision-analytic models. The goal was to synthesize all relevant clinical and economic evidence for the purpose of assisting decision makers to efficiently allocate society's scarce resources. This was true of virtually all the early cost-effectiveness evaluations sponsored and/or published by the U.S. Congressional Office of Technology Assessment (OTA) (15), the Centers for Disease Control and Prevention (CDC), the National Cancer Institute, other elements of the U.S. Public Health Service, and of healthcare technology assessors in Europe and elsewhere around the world. Methodologists routinely espoused, or at minimum assumed, that these economic analyses were based on decision theory (8;24;25). Since decision theory is rooted in—in fact, an informal application of—Bayesian statistical theory, these analysts were conducting studies to assist healthcare decision making by appealing to a Bayesian rather than a classical, or frequentist, inference approach. But their efforts were not so labeled. Oddly, the statistical training of these decision analysts was invariably classical, not Bayesian. Many were not—and still are not—conversant with Bayesian statistical approaches.

    Testing the Multiverse: Bayes, Fine-Tuning and Typicality

    Theory testing in the physical sciences has been revolutionized in recent decades by Bayesian approaches to probability theory. Here, I will consider Bayesian approaches to theory extensions, that is, theories like inflation which aim to provide a deeper explanation for some aspect of our models (in this case, the standard model of cosmology) that seem unnatural or fine-tuned. In particular, I will consider how cosmologists can test the multiverse using observations of this universe.
    Comment: 19 pages, 3 figures. Conference proceedings: to appear in "The Philosophy of Cosmology", edited by Khalil Chamcham, Joseph Silk, John D. Barrow, and Simon Saunders. Cambridge University Press, 201
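    Bayesian theory comparison of the kind the abstract describes reduces to updating the odds between two theories by the ratio of the likelihoods they assign to the observation. The following sketch uses purely assumed numbers; the two theory labels and all probabilities are hypothetical, chosen only to show the mechanics.

```python
# Posterior odds = prior odds * likelihood ratio (Bayes' rule in odds form).
prior_odds = 1.0        # no initial preference between the two theories (assumed)
likelihood_M1 = 0.5     # p(observation | extended theory), assumed
likelihood_M0 = 0.01    # p(observation | baseline theory), assumed "fine-tuned"

posterior_odds = prior_odds * likelihood_M1 / likelihood_M0
print(posterior_odds)
```

    The fine-tuning debate enters through `likelihood_M0` and `likelihood_M1`: how probable our observations are under each theory, which for multiverse proposals depends on contested typicality assumptions.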

    Bayesian approaches to cointegration

    The purpose of this paper is to survey and critically assess the Bayesian cointegration literature. In one sense, Bayesian analysis of cointegration is straightforward. The researcher can combine the likelihood function with a prior and do Bayesian inference with the resulting posterior. However, interesting and empirically important issues of global and local identification (and, as a result, prior elicitation) arise from the fact that the matrix of long run parameters is potentially of reduced rank. As we shall see, these identification problems can cause serious problems for Bayesian inference. For instance, a common noninformative prior can lead to a posterior distribution which is improper (i.e. is not a valid p.d.f. since it does not integrate to one), thus precluding valid statistical inference. This issue was brought forward by Kleibergen and Van Dijk (1994, 1998). The development of the Bayesian cointegration literature reflects an increasing awareness of these issues, and this paper is organized to reflect this development. In particular, we begin by discussing early work, based on VAR or Vector Moving Average (VMA) representations, which ignored these issues. We then proceed to a discussion of work based on the ECM representation, beginning with a simple specification using the linear normalization and normal priors before moving on to the recent literature which develops methods for sensible treatment of the identification issues.
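    The impropriety problem the abstract mentions can be made concrete with a toy kernel. A posterior kernel proportional to 1/x on (0, ∞) is a standard textbook example of a density that cannot be normalized (it is not the paper's model); the code below just checks numerically that its mass keeps growing with the integration bound.

```python
import math

# Integral of 1/x from 1 to U equals ln(U), which grows without bound
# as U increases, so the kernel 1/x has no finite normalizing constant
# and is not a valid p.d.f.
for U in [10.0, 1e3, 1e6, 1e12]:
    print(U, math.log(U))
```

    A proper posterior's mass would level off at a finite constant as the bound grows; here it increases without limit, which is exactly what "does not integrate to one" means.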