
    Sovereign Net Worth: An Analytical Framework

    The Fiscal Responsibility Act requires the Crown to articulate targets for a series of fiscal variables, including net worth. Given the dramatic improvement in the fiscal position in recent years, a critical policy question relates to how (and which) measures of Crown net worth should be targeted. This paper sets out a framework for targeting Crown net worth. It does so by supplementing the GAAP-based measure with forward-looking information about spending and tax revenue. The paper argues that targeting net worth for the Crown requires the estimation of a path, rather than a static level.

    Fiscal Transparency, Gubernatorial Popularity, and the Scale of Government: Evidence from the States.

    We explore the effect of transparency of fiscal institutions on the scale of government and gubernatorial popularity using a formal model of accountability. We construct an index of fiscal transparency for the American states from detailed budgetary information. With cross-section data for 1986-1995, we find that, on average and controlling for other influential factors, fiscal transparency increases both the scale of government and gubernatorial popularity. The results, subjected to extensive robustness checks, imply that more transparent budget institutions induce greater effort by politicians, to which voters respond with higher job approval, on average. Voters also respond by entrusting greater resources to politicians where institutions are more transparent, leading to a larger government.

    Nested sampling for Potts models

    Nested sampling is a new Monte Carlo method by Skilling [1] intended for general Bayesian computation. Nested sampling provides a robust alternative to annealing-based methods for computing normalizing constants. It can also generate estimates of other quantities such as posterior expectations. The key technical requirement is the ability to draw samples uniformly from the prior subject to a constraint on the likelihood. We provide a demonstration with the Potts model, an undirected graphical model.
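    The core loop of nested sampling can be sketched compactly. The following toy implementation (our own illustration, not code from the paper) estimates the evidence of a one-dimensional Gaussian likelihood under a uniform prior, satisfying the constrained-prior requirement by plain rejection sampling, which is viable only for simple low-dimensional problems:

```python
import math
import random

def logaddexp(a, b):
    """Numerically stable log(exp(a) + exp(b))."""
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(loglike, prior_sample, n_live=100, n_iter=600, seed=0):
    """Minimal nested sampling (Skilling): estimates log Z = log ∫ L(θ) dπ(θ).
    The worst live point is replaced by a fresh prior draw constrained to
    L > L_min -- here via plain rejection sampling (toy problems only)."""
    rng = random.Random(seed)
    live = [prior_sample(rng) for _ in range(n_live)]
    logL = [loglike(t) for t in live]
    logZ = -math.inf
    for i in range(1, n_iter + 1):
        worst = min(range(n_live), key=lambda k: logL[k])
        # prior volume shrinks geometrically: X_i ≈ exp(-i / n_live)
        logw = math.log(math.exp(-(i - 1) / n_live) - math.exp(-i / n_live))
        logZ = logaddexp(logZ, logL[worst] + logw)
        while True:  # draw from the prior subject to the likelihood constraint
            theta = prior_sample(rng)
            lt = loglike(theta)
            if lt > logL[worst]:
                live[worst], logL[worst] = theta, lt
                break
    # remaining prior volume is shared among the surviving live points
    logX = -n_iter / n_live - math.log(n_live)
    for l in logL:
        logZ = logaddexp(logZ, l + logX)
    return logZ

# Toy check: uniform prior on [0, 1], Gaussian likelihood centred at 0.5.
loglike = lambda x: -0.5 * ((x - 0.5) / 0.1) ** 2
logZ = nested_sampling(loglike, lambda rng: rng.random())
logZ_true = math.log(0.1 * math.sqrt(2 * math.pi))  # tails outside [0,1] negligible
```

    The estimate can be compared against the analytic value; the dominant error is the statistical scatter of the prior-volume shrinkage, of order sqrt(H / n_live) in log Z.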

    Model selection in cosmology

    Model selection aims to determine which theoretical models are most plausible given some data, without necessarily considering preferred values of model parameters. A common model selection question is to ask when new data require the introduction of an additional parameter, describing a newly discovered physical effect. We review model selection statistics, then focus on the Bayesian evidence, which implements Bayesian analysis at the level of models rather than parameters. We describe our CosmoNest code, the first computationally efficient implementation of Bayesian model selection in a cosmological context. We apply it to recent WMAP satellite data, examining the need for a perturbation spectral index differing from the scale-invariant (Harrison–Zel'dovich) case.

    Application of Bayesian model averaging to measurements of the primordial power spectrum

    Cosmological parameter uncertainties are often stated assuming a particular model, neglecting the model uncertainty, even when Bayesian model selection is unable to identify a conclusive best model. Bayesian model averaging is a method for assessing parameter uncertainties in situations where there is also uncertainty in the underlying model. We apply model averaging to the estimation of the parameters associated with the primordial power spectra of curvature and tensor perturbations. We use CosmoNest and MultiNest to compute the model evidences and posteriors, using cosmic microwave background data from WMAP, ACBAR, BOOMERanG and CBI, plus large-scale structure data from the SDSS DR7. We find that the model-averaged 95% credible interval for the spectral index using all of the data is 0.940 < n_s < 1.000, where n_s is specified at a pivot scale of 0.015 Mpc^{-1}. For the tensors, model averaging can tighten the credible upper limit, depending on prior assumptions.
    Comment: 7 pages with 7 figures included
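    The averaging step itself is simple once per-model evidences and posterior draws are in hand. A minimal sketch (all evidences and draws below are invented illustrative numbers, not the paper's results) that mixes a Harrison-Zel'dovich model, with n_s fixed at 1, and a tilted model in proportion to their posterior model probabilities:

```python
import math
import random

rng = random.Random(1)
# Hypothetical evidences and posterior draws of n_s for two models
# (illustrative numbers only, not the paper's results).
models = {
    "HZ":     {"logZ": -0.5, "draws": [1.0] * 5000},  # n_s fixed at 1
    "tilted": {"logZ":  0.0, "draws": [rng.gauss(0.965, 0.012) for _ in range(5000)]},
}
# Posterior model probabilities are proportional to the evidences
# (equal prior model odds assumed).
top = max(v["logZ"] for v in models.values())
weights = {k: math.exp(v["logZ"] - top) for k, v in models.items()}
total = sum(weights.values())
probs = {k: w / total for k, w in weights.items()}
# Model-averaged posterior: resample each model's draws in proportion
# to its model probability.
mixed = []
for name, model in models.items():
    mixed.extend(rng.choices(model["draws"], k=round(probs[name] * 10000)))
mixed.sort()
# Model-averaged 95% credible interval from the quantiles of the mixture.
lo, hi = mixed[round(0.025 * len(mixed))], mixed[round(0.975 * len(mixed))]
```

    Note how the point mass at n_s = 1 contributed by the HZ model pulls the model-averaged upper limit to 1, mirroring the interval quoted above.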

    Wetland mitigation banking in Tennessee: an assessment of three banks' performance

    Each year, the nation as a whole loses nearly 58,000 acres of wetlands. The national goal of both the EPA and the Army Corps of Engineers is to achieve no net loss of wetlands. Wetlands law allows for mitigation of wetland impacts with the implied assumption that the mitigation will adequately compensate for the wetland loss. Mitigation banks are becoming a popular mitigation option. Mitigation banks, due to their large size and ample mitigation ratios, make a significant contribution to the national goal of no net loss. This study evaluates three mitigation banks currently operating in Tennessee. The evaluation is based on the success criteria established in the Memorandums of Agreement specific to each bank, in order to determine whether each bank is successfully achieving no net loss of wetlands through the mitigation it provides. This study found that wetland mitigation banking in Tennessee is achieving the national goal of no net loss through generous mitigation ratios and well-planned restoration. Despite the current level of success, the mitigation banking process in Tennessee needs changes that would make it even more successful and efficient. Monitoring reports need to be completed and submitted in accordance with firm timetables. Additionally, regulators should establish clearly defined standards for the material content of all monitoring reports. The critical element for monitoring reports should be consistency of content and quality. These changes, along with the establishment of smaller, more regional mitigation banks, would greatly improve mitigation banking in Tennessee.

    Optimizing future dark energy surveys for model selection goals

    We demonstrate a methodology for optimizing the ability of future dark energy surveys to answer model selection questions, such as `Is acceleration due to a cosmological constant or a dynamical dark energy model?'. Model selection Figures of Merit are defined, exploiting the Bayes factor, and surveys are optimized over their design parameter space via a Monte Carlo method. As a specific example we apply our methods to generic multi-fibre baryon acoustic oscillation spectroscopic surveys, comparable to that proposed for SuMIRe PFS, and present implementations based on the Savage-Dickey Density Ratio that are both accurate and practical for use in optimization. We show that whilst the optimal surveys under model selection agree with those found using the Dark Energy Task Force (DETF) Figure of Merit, they provide better-informed flexibility of survey configuration and an absolute scale for performance; for example, we find survey configurations with close to optimal model selection performance despite their corresponding DETF Figure of Merit being at only 50% of its maximum. The Bayes factor approach allows us to identify survey configurations that will be good enough for the task at hand, which is especially valuable when adding extra science goals or dealing with time restrictions or multiple probes within the same project.
    Comment: 12 pages, 16 figures
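    The Savage-Dickey Density Ratio mentioned above expresses the Bayes factor between nested models as the ratio of posterior to prior density at the nested parameter value. A minimal sketch in a conjugate Gaussian setting (all numbers illustrative, not the paper's), where the identity can be checked against the analytic marginal-likelihood ratio:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Nested setup: M0 fixes theta = 0; M1 gives theta a prior N(0, tau^2).
# The data enter through a sample mean xbar with known noise sigma.
tau, sigma, xbar = 1.0, 0.2, 0.3            # illustrative numbers
s2 = 1.0 / (1.0 / tau**2 + 1.0 / sigma**2)  # conjugate posterior variance
mu_post = s2 * xbar / sigma**2              # conjugate posterior mean
# Savage-Dickey: B01 = p(theta=0 | data, M1) / p(theta=0 | M1)
B01 = normal_pdf(0.0, mu_post, math.sqrt(s2)) / normal_pdf(0.0, 0.0, tau)
```

    In this Gaussian case the ratio reproduces the exact Bayes factor N(xbar; 0, sigma) / N(xbar; 0, sqrt(sigma^2 + tau^2)), so no multidimensional evidence integral is needed; this is what makes density-ratio estimates practical inside a survey-optimization loop.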

    Bayesian Methods for Exoplanet Science

    Exoplanet research is carried out at the limits of the capabilities of current telescopes and instruments. The studied signals are weak, and often embedded in complex systematics from instrumental, telluric, and astrophysical sources. Combining repeated observations of periodic events, simultaneous observations with multiple telescopes, different observation techniques, and existing information from theory and prior research can help to disentangle the systematics from the planetary signals, and offers synergistic advantages over analysing observations separately. Bayesian inference provides a self-consistent statistical framework that addresses both the necessity for complex systematics models and the need to combine prior information and heterogeneous observations. This chapter offers a brief introduction to Bayesian inference in the context of exoplanet research, with a focus on time series analysis, and finishes with an overview of a set of freely available programming libraries.
    Comment: Invited review

    Model selection and parameter estimation in structural dynamics using approximate Bayesian computation

    This paper introduces the use of the approximate Bayesian computation (ABC) algorithm for model selection and parameter estimation in structural dynamics. ABC is a likelihood-free method, typically used when the likelihood function is either intractable or unavailable in closed form. To circumvent evaluation of the likelihood function, simulation from a forward model is at the core of the ABC algorithm. The algorithm allows different metrics and summary statistics representative of the data to be used to carry out Bayesian inference. The efficacy of the algorithm in structural dynamics is demonstrated through three illustrative examples of nonlinear system identification: cubic and cubic-quintic models, the Bouc-Wen model and the Duffing oscillator. The results suggest that ABC is a promising alternative for model selection and parameter estimation, particularly for systems with complex behaviours.
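    The rejection form of ABC described above takes only a few lines. In this sketch (our own illustration; the forward model and summary statistic are toy stand-ins for a structural-dynamics simulator, not the paper's models) the unknown mean of a noisy process plays the role of the model parameter:

```python
import random
import statistics

def abc_rejection(observed, simulate, prior_sample, distance, eps,
                  n_draws=20000, seed=0):
    """Likelihood-free rejection ABC: keep prior draws whose simulated summary
    statistic lies within eps of the observed one under the chosen distance."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if distance(simulate(theta, rng), observed) < eps:
            accepted.append(theta)
    return accepted

# Toy forward model: 50 noisy Gaussian observations with unknown mean;
# the summary statistic is the sample mean.
true_mean = 2.0
data_rng = random.Random(42)
obs_summary = statistics.fmean(data_rng.gauss(true_mean, 1.0) for _ in range(50))

simulate = lambda mu, rng: statistics.fmean(rng.gauss(mu, 1.0) for _ in range(50))
posterior = abc_rejection(obs_summary, simulate,
                          prior_sample=lambda rng: rng.uniform(-5.0, 5.0),
                          distance=lambda a, b: abs(a - b),
                          eps=0.1)
estimate = statistics.fmean(posterior)
```

    Accepted draws approximate the posterior; shrinking eps trades acceptance rate for accuracy, and the free choice of summary statistic and distance is exactly the flexibility the abstract refers to.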

    Microwave observations of spinning dust emission in NGC6946

    We report new cm-wave measurements at five frequencies between 15 and 18 GHz of the continuum emission from the reportedly anomalous "region 4" of the nearby galaxy NGC6946. We find that the emission in this frequency range is significantly in excess of that measured at 8.5 GHz, but has a spectrum from 15-18 GHz consistent with optically thin free-free emission from a compact HII region. In combination with previously published data we fit four emission models containing different continuum components using the Bayesian spectrum analysis package radiospec. These fits show that, in combination with data at other frequencies, a model with a spinning dust component is slightly preferred to those that possess only better-established emission mechanisms.
    Comment: submitted to MNRAS