117 research outputs found

    Sovereign Net Worth: An Analytical Framework

    The Fiscal Responsibility Act requires the Crown to articulate targets for a series of fiscal variables, including net worth. Given the dramatic improvement in the fiscal position in recent years, a critical policy question relates to how (and which) measures of Crown net worth should be targeted. This paper sets out a framework for targeting Crown net worth. It does so by supplementing the GAAP-based measure with forward-looking information about spending and tax revenue. The paper argues that targeting net worth for the Crown requires the estimation of a path, rather than a static level.

    Fiscal Transparency, Gubernatorial Popularity, and the Scale of Government: Evidence from the States

    We explore the effect of the transparency of fiscal institutions on the scale of government and gubernatorial popularity using a formal model of accountability. We construct an index of fiscal transparency for the American states from detailed budgetary information. With cross-section data for 1986-1995, we find that - on average and controlling for other influential factors - fiscal transparency increases both the scale of government and gubernatorial popularity. The results, subjected to extensive robustness checks, imply that more transparent budget institutions induce greater effort by politicians, to which voters give higher job approval, on average. Voters also respond by entrusting greater resources to politicians where institutions are more transparent, leading to a larger size of government.

    Nested sampling for Potts models

    Nested sampling is a Monte Carlo method introduced by Skilling [1] for general Bayesian computation. Nested sampling provides a robust alternative to annealing-based methods for computing normalizing constants. It can also generate estimates of other quantities such as posterior expectations. The key technical requirement is an ability to draw samples uniformly from the prior subject to a constraint on the likelihood. We provide a demonstration with the Potts model, an undirected graphical model.
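    The idea can be sketched on a toy one-dimensional problem. The setup below (uniform prior, unit Gaussian likelihood) and the plain rejection step used to satisfy the likelihood constraint are illustrative choices, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: uniform prior on [-5, 5], unit Gaussian likelihood, so the
# evidence is analytically Z = (1/10) * integral N(x; 0, 1) dx ~ 0.1.
def log_likelihood(x):
    return -0.5 * x**2 - 0.5 * np.log(2.0 * np.pi)

def sample_prior(n):
    return rng.uniform(-5.0, 5.0, size=n)

def nested_sampling(n_live=200, n_iter=1000):
    live = sample_prior(n_live)
    live_logl = log_likelihood(live)
    log_z, log_x = -np.inf, 0.0  # running evidence, log prior volume left
    for i in range(n_iter):
        worst = np.argmin(live_logl)
        log_x_new = -(i + 1) / n_live          # E[log X] shrinks each step
        log_w = np.log(np.exp(log_x) - np.exp(log_x_new))
        log_z = np.logaddexp(log_z, log_w + live_logl[worst])
        # Replace the worst live point with a prior draw subject to the
        # likelihood constraint (plain rejection here; production codes
        # use much smarter constrained moves).
        threshold = live_logl[worst]
        while True:
            cand = sample_prior(1)[0]
            if log_likelihood(cand) > threshold:
                live[worst], live_logl[worst] = cand, log_likelihood(cand)
                break
        log_x = log_x_new
    # Add the prior volume still carried by the final set of live points.
    return np.logaddexp(
        log_z, log_x - np.log(n_live) + np.logaddexp.reduce(live_logl))

print(np.exp(nested_sampling()))  # close to the analytic value 0.1
```

    The estimate converges on the analytic evidence up to the usual O(1/sqrt(n_live)) scatter in log Z; for harder likelihoods the rejection step is the part that must be replaced by a cleverer constrained sampler.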

    Model selection in cosmology

    Model selection aims to determine which theoretical models are most plausible given some data, without necessarily considering preferred values of model parameters. A common model selection question is to ask when new data require introduction of an additional parameter, describing a newly discovered physical effect. We review model selection statistics, then focus on the Bayesian evidence, which implements Bayesian analysis at the level of models rather than parameters. We describe our CosmoNest code, the first computationally efficient implementation of Bayesian model selection in a cosmological context. We apply it to recent WMAP satellite data, examining the need for a perturbation spectral index differing from the scale-invariant (Harrison–Zel'dovich) case.

    Application of Bayesian model averaging to measurements of the primordial power spectrum

    Cosmological parameter uncertainties are often stated assuming a particular model, neglecting the model uncertainty, even when Bayesian model selection is unable to identify a conclusive best model. Bayesian model averaging is a method for assessing parameter uncertainties in situations where there is also uncertainty in the underlying model. We apply model averaging to the estimation of the parameters associated with the primordial power spectra of curvature and tensor perturbations. We use CosmoNest and MultiNest to compute the model evidences and posteriors, using cosmic microwave background data from WMAP, ACBAR, BOOMERanG and CBI, plus large-scale structure data from the SDSS DR7. We find that the model-averaged 95% credible interval for the spectral index using all of the data is 0.940 < n_s < 1.000, where n_s is specified at a pivot scale 0.015 Mpc^{-1}. For the tensors, model averaging can tighten the credible upper limit, depending on prior assumptions. Comment: 7 pages with 7 figures included.
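    The averaging step itself is simple once per-model evidences and posterior samples are in hand: the averaged posterior is a mixture of the per-model posteriors weighted by the posterior model probabilities. A minimal sketch with entirely hypothetical evidences and posterior draws (not the paper's numbers):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical inputs: two candidate models, each with a log-evidence
# and a set of posterior draws of the spectral index n_s.
log_z = np.array([-10.0, -10.7])
samples = [rng.normal(0.965, 0.012, 5000),   # model A posterior for n_s
           rng.normal(0.975, 0.015, 5000)]   # model B posterior for n_s

# Posterior model probabilities under equal model priors.
w = np.exp(log_z - log_z.max())
w /= w.sum()

# Model-averaged posterior = mixture of per-model posteriors, weighted
# by the model probabilities.
counts = rng.multinomial(20_000, w)
averaged = np.concatenate([rng.choice(s, size=c)
                           for s, c in zip(samples, counts)])

lo, hi = np.percentile(averaged, [2.5, 97.5])
print(f"model-averaged 95% interval: {lo:.3f} < n_s < {hi:.3f}")
```

    The mixture interval is typically wider than either single-model interval when the models disagree, which is exactly the model uncertainty the abstract warns is usually neglected.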

    Optimizing future dark energy surveys for model selection goals

    We demonstrate a methodology for optimizing the ability of future dark energy surveys to answer model selection questions, such as `Is acceleration due to a cosmological constant or a dynamical dark energy model?'. Model selection Figures of Merit are defined, exploiting the Bayes factor, and surveys are optimized over their design parameter space via a Monte Carlo method. As a specific example we apply our methods to generic multi-fibre baryon acoustic oscillation spectroscopic surveys, comparable to that proposed for SuMIRe PFS, and present implementations based on the Savage-Dickey Density Ratio that are both accurate and practical for use in optimization. It is shown that whilst the optimal surveys using model selection agree with those found using the Dark Energy Task Force (DETF) Figure of Merit, they provide better informed flexibility of survey configuration and an absolute scale for performance; for example, we find survey configurations with close to optimal model selection performance despite their corresponding DETF Figure of Merit being at only 50% of its maximum. This Bayes factor approach allows us to identify the survey configurations that will be good enough for the task at hand, which is vital when adding extra science goals and when dealing with time restrictions or multiple probes within the same project. Comment: 12 pages, 16 figures.
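    The Savage-Dickey Density Ratio gives the Bayes factor between nested models as the extended model's posterior density at the nested parameter value divided by its prior density there. A minimal check on a conjugate Gaussian toy problem (all numbers hypothetical, not the paper's survey setup), where the same Bayes factor is also available analytically:

```python
import numpy as np

def norm_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu)**2 / var) / np.sqrt(2 * np.pi * var)

# Hypothetical nested setup: the extended model has one extra parameter
# theta with Gaussian prior N(0, tau^2); the simpler model fixes theta = 0.
tau2 = 1.0        # prior variance for theta
s2 = 0.04         # variance of the measurement xbar given theta
xbar = 0.3        # observed summary statistic

# Conjugate Gaussian posterior for theta under the extended model.
v_post = 1.0 / (1.0 / tau2 + 1.0 / s2)
m_post = v_post * xbar / s2

# Savage-Dickey: B_01 = posterior density at theta=0 / prior density at 0.
b01_sddr = norm_pdf(0.0, m_post, v_post) / norm_pdf(0.0, 0.0, tau2)

# Direct evidence ratio for comparison (both evidences are analytic here).
b01_direct = norm_pdf(xbar, 0.0, s2) / norm_pdf(xbar, 0.0, tau2 + s2)

print(b01_sddr, b01_direct)  # the two estimates agree
```

    In realistic applications the posterior density at the nested value is estimated from MCMC or nested-sampling output rather than computed in closed form; the attraction is that no separate evidence integral is needed.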

    Bayesian Methods for Exoplanet Science

    Exoplanet research is carried out at the limits of the capabilities of current telescopes and instruments. The studied signals are weak, and often embedded in complex systematics from instrumental, telluric, and astrophysical sources. Combining repeated observations of periodic events, simultaneous observations with multiple telescopes, different observation techniques, and existing information from theory and prior research can help to disentangle the systematics from the planetary signals, and offers synergistic advantages over analysing observations separately. Bayesian inference provides a self-consistent statistical framework that addresses both the necessity for complex systematics models, and the need to combine prior information and heterogeneous observations. This chapter offers a brief introduction to Bayesian inference in the context of exoplanet research, with a focus on time series analysis, and finishes with an overview of a set of freely available programming libraries. Comment: Invited review.
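    The simplest instance of combining heterogeneous observations is two instruments measuring the same quantity with different noise levels: under Gaussian errors and a flat prior, Bayesian updating reduces to inverse-variance weighting. A toy illustration with made-up numbers (not from the chapter):

```python
# Hypothetical measurements of a transit depth from two telescopes,
# each with its own 1-sigma uncertainty.
d1, s1 = 0.0102, 0.0004   # depth and error from telescope A
d2, s2 = 0.0097, 0.0006   # depth and error from telescope B

# Posterior precision adds; the posterior mean is precision-weighted.
prec = 1 / s1**2 + 1 / s2**2
mean = (d1 / s1**2 + d2 / s2**2) / prec
sigma = prec ** -0.5

print(f"combined depth = {mean:.5f} +/- {sigma:.5f}")
```

    The combined uncertainty is smaller than either input, which is the quantitative form of the synergy the abstract describes; realistic systematics models replace the independent-Gaussian assumption with a joint likelihood.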

    Model selection and parameter estimation in structural dynamics using approximate Bayesian computation

    This paper introduces the use of the approximate Bayesian computation (ABC) algorithm for model selection and parameter estimation in structural dynamics. ABC is a likelihood-free method typically used when the likelihood function is either intractable or cannot be approached in a closed form. To circumvent the evaluation of the likelihood function, simulation from a forward model is at the core of the ABC algorithm. The algorithm offers the possibility to use different metrics and summary statistics representative of the data to carry out Bayesian inference. The efficacy of the algorithm in structural dynamics is demonstrated through three different illustrative examples of nonlinear system identification: cubic and cubic-quintic models, the Bouc-Wen model and the Duffing oscillator. The obtained results suggest that ABC is a promising alternative to deal with model selection and parameter estimation issues, specifically for systems with complex behaviours.
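    The core loop of rejection ABC can be shown on a deliberately simple structural toy problem (the mass-spring model, prior range, and tolerance below are illustrative, not the paper's examples): sample from the prior, run the forward model, and keep draws whose summary statistic lands close to the observation, with no likelihood ever written down.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical toy: recover the stiffness k of a linear mass-spring
# system from a noisy measurement of its natural frequency.
k_true, m = 4.0, 1.0
observed = np.sqrt(k_true / m) + rng.normal(0, 0.02)  # measured omega

def simulate(k):
    """Forward model: natural frequency of a mass-spring system."""
    return np.sqrt(k / m)

# Rejection ABC: sample k from the prior, keep draws whose simulated
# summary statistic lies within eps of the observation.
prior_draws = rng.uniform(1.0, 9.0, size=200_000)
distance = np.abs(simulate(prior_draws) - observed)
eps = 0.05
posterior = prior_draws[distance < eps]

print(f"accepted {posterior.size} draws, "
      f"posterior mean k = {posterior.mean():.2f}")
```

    Shrinking eps trades acceptance rate for accuracy; the paper's examples replace this scalar summary with metrics and summary statistics suited to nonlinear response data.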

    The SWELLS survey. III. Disfavouring "heavy" initial mass functions for spiral lens galaxies

    We present gravitational lens models for 20 strong gravitational lens systems observed as part of the Sloan WFC Edge-on Late-type Lens Survey (SWELLS) project. Fifteen of the lenses are taken from Paper I, while five are newly discovered systems. The systems are galaxy-galaxy lenses where the foreground deflector has an inclined disc, with a wide range of morphological types, from late-type spiral to lenticular. For each system, we compare the total mass inside the critical curve inferred from gravitational lens modelling to the stellar mass inferred from stellar population synthesis (SPS) models, computing the stellar mass fraction f* = M(SPS)/M(lens). We find that, for the lower mass SWELLS systems, adoption of a Salpeter stellar initial mass function (IMF) leads to estimates of f* that exceed 1. This is unphysical, and provides strong evidence against the Salpeter IMF being valid for these systems. Taking the lower-mass end of the SWELLS sample (sigma_SIE < 230 km/s), we find that the IMF is lighter (in terms of stellar mass-to-light ratio) than Salpeter with 98% probability, and consistent with the Chabrier IMF and IMFs between the two. This result is consistent with previous studies of spiral galaxies based on independent techniques. In combination with recent studies of massive early-type galaxies that have favoured a heavier Salpeter-like IMF, this result strengthens the evidence against a universal stellar IMF. Comment: Accepted for publication in MNRAS. Some changes (none major) to address the referee's comments. 18 pages, 8 figures.

    When can the Planck satellite measure spectral index running?

    We use model selection forecasting to assess the ability of the Planck satellite to make a positive detection of spectral index running. We simulate Planck data for a range of assumed cosmological parameter values, and carry out a three-way Bayesian model comparison of a Harrison-Zel'dovich model, a power-law model, and a model including running. We find that Planck will be able to strongly support running only if its true value satisfies |dn/d ln k| > 0.02. Comment: 5 pages with 7 figures included. Full resolution PDF at http://astronomy.susx.ac.uk/~andrewl/planckev2D.pdf. Minor updates to match version accepted by MNRAS.