
    Model Uncertainty and Policy Evaluation: Some Theory and Empirics

    This paper explores ways to integrate model uncertainty into policy evaluation. We first describe a general framework for incorporating model uncertainty into standard econometric calculations; this framework employs Bayesian model averaging methods that have begun to appear in a range of economic studies. Second, we illustrate these general ideas by assessing simple monetary policy rules for several standard New Keynesian specifications, which vary in their treatment of expectations as well as in the dynamics of output and inflation. We conclude that the Taylor rule has good robustness properties, but that its overall stabilization performance may reasonably be challenged by alternative simple rules that also condition on lagged interest rates, even though the parameters of those rules are set without accounting for model uncertainty.
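
    A minimal sketch of the model-averaging calculation behind this kind of exercise (standard Bayesian model averaging notation, not taken verbatim from the paper): given data d and a space of candidate models M, a policy rule p is scored by the posterior-probability-weighted average of its model-specific expected losses,

        E[L(p) \mid d] = \sum_{m \in M} \pi(m \mid d)\, E[L(p) \mid m, d], \qquad \pi(m \mid d) \propto \pi(d \mid m)\, \pi(m),

    so a rule such as the Taylor rule is judged by how well it stabilizes the economy on average across the New Keynesian specifications rather than within any single one of them.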

    Policy Evaluation in Uncertain Economic Environments

    This paper develops a decision-theoretic approach to policy analysis. We argue that policy evaluation should be conducted on the basis of two factors: the policymaker's preferences, and the conditional distribution of the outcomes of interest given a policy and available information. From this perspective, the common practice of conditioning on a particular model is often inappropriate, since model uncertainty is an important element of policy evaluation. We advocate the use of model averaging to account for model uncertainty and show how it may be applied to policy evaluation exercises. We illustrate our approach with applications to monetary policy and to growth policy.
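
    A minimal numerical sketch of the model-averaging step advocated here (illustrative only: the model labels, posterior weights, and loss values below are hypothetical placeholders, not numbers from the paper):

        # Rank candidate policies by posterior-probability-weighted expected loss.
        # All numbers are hypothetical placeholders for illustration.
        posterior_weights = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}  # sums to 1

        expected_loss = {
            "taylor_rule":      {"model_A": 1.0, "model_B": 1.4, "model_C": 0.9},
            "lagged_rate_rule": {"model_A": 0.8, "model_B": 1.9, "model_C": 0.7},
        }

        def averaged_loss(policy):
            losses = expected_loss[policy]
            return sum(posterior_weights[m] * losses[m] for m in posterior_weights)

        for p in expected_loss:
            print(f"{p}: model-averaged loss = {averaged_loss(p):.3f}")
        print("preferred policy:", min(expected_loss, key=averaged_loss))

    Note that the model-averaged ranking can differ from the ranking obtained by conditioning on any single model, which is the paper's central point about why conditioning on one model is often inappropriate.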

    Policy Evaluation in Uncertain Economic Environments

    This paper develops a general framework for economic policy evaluation. Using ideas from statistical decision theory, it argues that conventional approaches fail to appropriately integrate econometric analysis into evaluation problems. Further, it is argued that evaluation of alternative policies should explicitly account for uncertainty about the appropriate model of the economy. The paper shows how to develop an explicitly decision-theoretic approach to policy evaluation and how to incorporate model uncertainty into such an analysis. The theoretical implications of model uncertainty are explored in a set of examples, with a specific focus on how to design policies that are robust against such uncertainty. Finally, the framework is applied to the evaluation of monetary policy rules and to the analysis of tariff reductions as a way to increase aggregate economic growth. Keywords: macroeconomics, policy evaluation, uncertain economic environments

    M87, Globular Clusters, and Galactic Winds: Issues in Giant Galaxy Formation

    New VRI photometry is presented for the globular clusters in the innermost 140'' of the M87 halo. The results are used to discuss several issues concerning the formation and evolution of globular cluster systems in supergiant ellipticals like M87. (1) We find no significant change in the globular cluster luminosity function (GCLF) with galactocentric radius, for cluster masses M < 10^5 solar masses, indicating that the main effects of dynamical evolution may be only on lower-mass clusters. (2) Within the core radius (1') of the globular cluster system, the metallicity distribution is uniform, but at larger radii the mean metallicity declines steadily as Z ~ r^-0.9. (3) The various options for explaining the existence of high specific frequency galaxies like M87 are evaluated, and scaling laws for the GCSs in these galaxies are given. Interpretations involving secondary evolution (formation of many globular clusters during mergers, intergalactic globular clusters, etc.) are unlikely to be the primary explanation for high-S_N galaxies. (4) We suggest that central-supergiant E galaxies may have formed in an exceptionally turbulent or high-density environment in which an early, powerful galactic wind drove out a high fraction of the protogalactic gas, thus artificially boosting the specific frequency. Comment: 67 pages, 17 figures. To appear in the Astronomical Journal, in press for May 1998. Preprints also available from W. Harris; send e-mail request to [email protected]
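
    For reference, the specific frequency S_N used above is the standard normalized globular cluster count (the usual Harris and van den Bergh definition, not restated in this abstract):

        S_N = N_{GC} \times 10^{\,0.4\,(M_V + 15)},

    where N_GC is the total number of globular clusters and M_V is the absolute visual magnitude of the host galaxy; M87's unusually large S_N is what points (3) and (4) set out to explain.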

    Indoor Residual Spraying in Combination with Insecticide-Treated Nets Compared to Insecticide-Treated Nets Alone for Protection against Malaria: A Cluster Randomised Trial in Tanzania.

    Insecticide-treated nets (ITNs) and indoor residual spraying (IRS) of houses provide effective malaria transmission control. There is conflicting evidence about whether it is more beneficial to provide both interventions in combination. A cluster randomised controlled trial was conducted to investigate whether the combination provides added protection compared to ITNs alone. In northwest Tanzania, 50 clusters (village areas) were randomly allocated to ITNs only or ITNs and IRS. Dwellings in the ITN+IRS arm were sprayed with two rounds of bendiocarb in 2012. Plasmodium falciparum prevalence rate (PfPR) in children 0.5-14 y old (primary outcome) and anaemia in children <5 y old (secondary outcome) were compared between study arms using three cross-sectional household surveys in 2012. Entomological inoculation rate (secondary outcome) was compared between study arms. IRS coverage was approximately 90%. ITN use ranged from 36% to 50%. In intention-to-treat analysis, mean PfPR was 13% in the ITN+IRS arm and 26% in the ITN only arm, odds ratio = 0.43 (95% CI 0.19-0.97, n = 13,146). The strongest effect was observed in the peak transmission season, 6 mo after the first IRS. Subgroup analysis showed that ITN users were additionally protected if their houses were sprayed. Mean monthly entomological inoculation rate was non-significantly lower in the ITN+IRS arm than in the ITN only arm, rate ratio = 0.17 (95% CI 0.03-1.08). This is the first randomised trial to our knowledge that reports significant added protection from combining IRS and ITNs compared to ITNs alone. The effect is likely to be attributable to IRS providing added protection to ITN users as well as compensating for inadequate ITN use. Policy makers should consider deploying IRS in combination with ITNs to control transmission if local ITN strategies on their own are insufficiently effective. Given the uncertain generalisability of these findings, it would be prudent for malaria control programmes to evaluate the cost-effectiveness of deploying the combination.
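
    As a rough consistency check (a crude calculation from the reported prevalences; the trial's own estimate additionally adjusts for the cluster design), the unadjusted odds ratio implied by 13% versus 26% prevalence is

        \mathrm{OR} = \frac{0.13/(1-0.13)}{0.26/(1-0.26)} \approx \frac{0.149}{0.351} \approx 0.43,

    which matches the reported intention-to-treat estimate of 0.43.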

    Deweyan tools for inquiry and the epistemological context of critical pedagogy

    This article develops the notion of resistance as articulated in the literature of critical pedagogy as being both culturally sponsored and cognitively manifested. To do so, the authors draw upon John Dewey's conception of tools for inquiry. Dewey provides a way to conceptualize student resistance not as a form of willful disputation, but instead as a function of socialization into cultural models of thought that actively truncate inquiry. In other words, resistance can be construed as the cognitive and emotive dimensions of the ongoing failure of institutions to provide ideas that help individuals both recognize social problems and imagine possible solutions. Focusing on Dewey's epistemological framework, specifically tools for inquiry, provides a way to grasp this problem. It also affords some innovative solutions; for instance, it helps conceive of possible links between the regular curriculum and the study of specific social justice issues, a relationship that is often under-examined. The aims of critical pedagogy depend upon students developing dexterity with the conceptual tools they use to make meaning of the evidence they confront; these are background skills that the regular curriculum can be made to serve even outside social justice-focused curricula. Furthermore, the article concludes that because such inquiry involves the exploration and potential revision of students' world-ordering beliefs, developing flexibility in how one thinks may be better achieved within academic subjects and topics that are not so intimately connected to students' current social lives, especially where students may be directly implicated.

    A Common Allele in FGF21 Associated with Sugar Intake Is Associated with Body Shape, Lower Total Body-Fat Percentage, and Higher Blood Pressure

    Summary: Fibroblast growth factor 21 (FGF21) is a hormone that has insulin-sensitizing properties. Some trials of FGF21 analogs show weight loss and lipid-lowering effects. Recent studies have shown that a common allele in the FGF21 gene alters the balance of macronutrients consumed, but there was little evidence of an effect on metabolic traits. We studied a common FGF21 allele (A:rs838133) in 451,099 people from the UK Biobank study, aiming to use the human allele to inform potential adverse and beneficial effects of targeting FGF21. We replicated the association between the A allele and higher percentage carbohydrate intake. We then showed that this allele is more strongly associated with higher blood pressure and waist-hip ratio, despite an association with lower total body-fat percentage, than it is with BMI or type 2 diabetes. These human phenotypes of variation in the FGF21 gene will inform research into FGF21’s mechanisms and therapeutic potential. Drugs targeting the hormone FGF21 may have beneficial health effects, and variations in human DNA in the FGF21 gene provide an indication of what those effects may be. Here, we show that variation in the FGF21 gene is associated with higher blood pressure and altered body shape, despite lower total body-fat percentage. Keywords: FGF21, BMI, waist-hip ratio, blood pressure, body fat, allele, genetic variant, UK Biobank

    The Baryon Oscillation Spectroscopic Survey of SDSS-III

    The Baryon Oscillation Spectroscopic Survey (BOSS) is designed to measure the scale of baryon acoustic oscillations (BAO) in the clustering of matter over a larger volume than the combined efforts of all previous spectroscopic surveys of large scale structure. BOSS uses 1.5 million luminous galaxies as faint as i=19.9 over 10,000 square degrees to measure BAO to redshifts z<0.7. Observations of neutral hydrogen in the Lyman alpha forest in more than 150,000 quasar spectra (g<22) will constrain BAO over the redshift range 2.15<z<3.5. Early results from BOSS include the first detection of the large-scale three-dimensional clustering of the Lyman alpha forest and a strong detection from the Data Release 9 data set of the BAO in the clustering of massive galaxies at an effective redshift z = 0.57. We project that BOSS will yield measurements of the angular diameter distance D_A to an accuracy of 1.0% at redshifts z=0.3 and z=0.57 and measurements of H(z) to 1.8% and 1.7% at the same redshifts. Forecasts for Lyman alpha forest constraints predict a measurement of an overall dilation factor that scales the highly degenerate D_A(z) and H^{-1}(z) parameters to an accuracy of 1.9% at z~2.5 when the survey is complete. Here, we provide an overview of the selection of spectroscopic targets, planning of observations, and analysis of data and data quality of BOSS. Comment: 49 pages, 16 figures, accepted by AJ
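
    The "overall dilation factor" referred to in the Lyman alpha forest forecast is conventionally expressed through the spherically averaged BAO distance D_V (the standard definition from the BAO literature, not spelled out in this abstract):

        D_V(z) = \left[ (1+z)^2 D_A(z)^2 \, \frac{c\,z}{H(z)} \right]^{1/3},

    which combines the transverse scale D_A(z) and the line-of-sight scale c/H(z) into the single quantity that an isotropic BAO fit constrains at z ~ 2.5.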