
    Polling bias and undecided voter allocations: US Presidential elections, 2004 - 2016

    Accounting for undecided and uncertain voters is a challenging issue for predicting election results from public opinion polls. Undecided voters typify the uncertainty of swing voters in polls but are often ignored or allocated to each candidate in a simple, deterministic manner. Historically this may have been adequate because the undecided were comparatively few, so it could be assumed that they did not affect the relative proportions of the decided voters. However, in the presence of high numbers of undecided voters, these static rules may in fact bias election predictions from election poll authors and meta-poll analysts. In this paper, we examine the effect of undecided voters in the 2016 US presidential election compared to the previous three presidential elections. We show there was a relatively high number of undecided voters over the campaign and on election day, and that the allocation of undecided voters in this election was not consistent with two-party proportional (or even) allocations. We find evidence that static allocation regimes are inadequate for election prediction models and that probabilistic allocations may be superior. We also estimate the bias attributable to polling agencies, often referred to as "house effects". Comment: 32 pages, 9 figures, 6 tables
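
    To make the distinction concrete, here is a minimal Python sketch (not from the paper) contrasting a static two-party proportional allocation of the undecided share with a probabilistic allocation; the poll figures and the Beta concentration parameter are purely hypothetical.

```python
import numpy as np

# Hypothetical two-party poll: 44% vs 41%, with 15% undecided.
dem, rep, undecided = 0.44, 0.41, 0.15

# Static (two-party proportional) allocation: split the undecided
# in proportion to the decided shares.
dem_static = dem + undecided * dem / (dem + rep)
rep_static = rep + undecided * rep / (dem + rep)

# Probabilistic allocation: draw the undecided split from a distribution
# (here a Beta centred on the proportional split), propagating the
# uncertainty carried by the undecided into the predicted margin.
rng = np.random.default_rng(0)
centre = dem / (dem + rep)
kappa = 20.0                      # assumed concentration; smaller = more uncertainty
splits = rng.beta(centre * kappa, (1 - centre) * kappa, size=10_000)
dem_prob = dem + undecided * splits
rep_prob = rep + undecided * (1 - splits)

print(f"static margin: {dem_static - rep_static:+.3f}")
margins = dem_prob - rep_prob
print(f"probabilistic margin: {margins.mean():+.3f} "
      f"(90% interval {np.percentile(margins, 5):+.3f} to {np.percentile(margins, 95):+.3f})")
```

    The static rule returns a single margin, while the probabilistic rule yields a distribution of margins whose spread reflects how the undecided could plausibly break.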

    Cosmology with the lights off: Standard sirens in the Einstein Telescope era

    We explore the prospects for constraining cosmology using gravitational-wave (GW) observations of neutron-star binaries by the proposed Einstein Telescope (ET), exploiting the narrowness of the neutron-star mass function. Double neutron-star (DNS) binaries are expected to be one of the first sources detected after "first-light" of Advanced LIGO and are expected to be detected at a rate of a few tens per year in the advanced era. However, the proposed ET could catalog tens of thousands per year. Combining the measured source redshift distributions with GW-network distance determinations will permit not only the precision measurement of background cosmological parameters, but will also provide insight into the astrophysical properties of these DNS systems. Of particular interest will be to probe the distribution of delay times between DNS-binary creation and subsequent merger, as well as the evolution of the star-formation rate density within ET's detection horizon. Keeping $H_0$, $\Omega_{m,0}$ and $\Omega_{\Lambda,0}$ fixed and investigating the precision with which the dark-energy equation-of-state parameters could be recovered, we found that with $10^5$ detected DNS binaries we could constrain these parameters to an accuracy similar to forecasted constraints from future CMB+BAO+SNIa measurements. Furthermore, modeling the merger delay-time distribution as a power law, and the star-formation rate (SFR) density as a parametrized version of the Porciani and Madau SF2 model, we find that the associated astrophysical parameters are constrained to within ~10%. All parameter precisions scaled as $1/\sqrt{N}$, where N is the number of cataloged detections. We also investigated how precisions varied with the intrinsic underlying properties of the Universe and with the distance reach of the network (which may be affected by the low-frequency cutoff of the detector). Comment: 24 pages, 11 figures, 6 tables. Minor changes to reflect published version. References updated and corrected.
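
    For orientation, the distance-redshift relation that standard-siren measurements exploit can be written down in a few lines; the sketch below uses a generic flat $w_0$-$w_a$ (CPL) dark-energy model with illustrative fiducial values and is not code from the paper.

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458  # speed of light in km/s

def luminosity_distance(z, H0=70.0, Om=0.3, w0=-1.0, wa=0.0):
    """Luminosity distance in Mpc for a flat Universe with CPL dark energy,
    w(z) = w0 + wa * z / (1 + z)."""
    def E(zp):
        de = (1 - Om) * (1 + zp) ** (3 * (1 + w0 + wa)) * np.exp(-3 * wa * zp / (1 + zp))
        return np.sqrt(Om * (1 + zp) ** 3 + de)
    integral, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    return (1 + z) * C_KM_S / H0 * integral

# A GW network measures d_L directly; combined with the redshift
# distribution of the sources, this constrains (H0, Om, w0, wa).
for z in (0.1, 0.5, 1.0, 2.0):
    print(f"z = {z:.1f}: d_L = {luminosity_distance(z):8.1f} Mpc")
```

    With N cataloged events supplying independent distance measurements, the quoted $1/\sqrt{N}$ scaling of parameter precisions follows from the usual averaging of independent errors.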

    Domains and naïve theories

    Human cognition entails domain‐specific cognitive processes that influence memory, attention, categorization, problem‐solving, reasoning, and knowledge organization. This article examines domain‐specific causal theories, which are of particular interest for permitting an examination of how knowledge structures change over time. We first describe the properties of commonsense theories, and how commonsense theories differ from scientific theories, illustrating with children's classification of biological and nonbiological kinds. We next consider the implications of domain‐specificity for broader issues regarding cognitive development and conceptual change. We then examine the extent to which domain‐specific theories interact, and how people reconcile competing causal frameworks. Future directions for research include examining how different content domains interact, the nature of theory change, the role of context (including culture, language, and social interaction) in inducing different frameworks, and the neural bases for domain‐specific reasoning. WIREs Cogn Sci 2011, 2, 490–502. DOI: 10.1002/wcs.124. For further resources related to this article, please visit the WIREs website. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/87128/1/124_ftp.pd

    Matter formed at the BNL relativistic heavy ion collider

    We suggest that the "new form of matter" found just above $T_c$ by RHIC is made up of tightly bound quark-antiquark pairs, essentially 32 chirally restored (more precisely, nearly massless) mesons with the quantum numbers of $\pi$, $\sigma$, $\rho$ and $a_1$. Taking the results of lattice gauge simulations (LGS) for the color Coulomb potential from the work of the Bielefeld group and feeding this into a relativistic two-body code, after modifying the heavy-quark lattice results so as to include the velocity-velocity interaction, we find that all ground-state eigenvalues of the 32 mesons go to zero at $T_c$, just as they do from below $T_c$ as predicted by the vector manifestation (VM in short) of hidden local symmetry. This could explain the rapid rise in entropy up to $T_c$ found in LGS calculations. We argue that how the dynamics work can be understood from the behavior of the hard and soft glue. Comment: Final version

    Radiative neutron capture on a proton at BBN energies

    The total cross section for radiative neutron capture on a proton, $np \to d\gamma$, is evaluated at big bang nucleosynthesis (BBN) energies. The electromagnetic transition amplitudes are calculated up to next-to-leading order within the framework of pionless effective field theory with dibaryon fields. We also calculate the $d\gamma \to np$ cross section and the photon analyzing power for the $d\vec{\gamma} \to np$ process from the amplitudes. The values of the low-energy constants that appear in the amplitudes are estimated by a Markov Chain Monte Carlo analysis using the relevant low-energy experimental data. Our result agrees well with those of other theoretical calculations, except for the $np \to d\gamma$ cross section at some energies estimated by an R-matrix analysis. We also study the uncertainties in our estimation of the $np \to d\gamma$ cross section at the relevant BBN energies and find that the estimated cross section is reliable to within $\sim$1% error. Comment: 21 pages and 12 eps figures; 6 eps figures and 2 references added, and accepted for publication in Phys. Rev.
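
    The parameter-estimation step described above can be illustrated, in a generic way, with a short Metropolis-Hastings sampler; the data points, errors, and one-parameter "cross-section model" below are entirely hypothetical stand-ins for the paper's EFT amplitudes and low-energy data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: cross sections measured at a few energies, with Gaussian
# errors (illustrative numbers only, not the np -> d gamma data set).
E = np.array([0.5, 1.0, 2.0, 4.0])            # MeV
sigma_obs = np.array([0.38, 0.26, 0.18, 0.12])
err = np.array([0.02, 0.02, 0.01, 0.01])

def model(E, L1):
    """Toy one-parameter model standing in for the EFT amplitude;
    L1 plays the role of a low-energy constant."""
    return L1 / np.sqrt(E)

def log_post(L1):
    resid = (sigma_obs - model(E, L1)) / err
    return -0.5 * np.sum(resid**2)             # flat prior assumed

# Metropolis-Hastings random walk over the low-energy constant.
chain, L1 = [], 0.3
lp = log_post(L1)
for _ in range(20_000):
    prop = L1 + 0.01 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        L1, lp = prop, lp_prop
    chain.append(L1)

chain = np.array(chain[5_000:])                # discard burn-in
print(f"L1 = {chain.mean():.3f} +/- {chain.std():.3f}")
```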

    Half a million excess deaths in the Iraq war: Terms and conditions may apply

    Hagopian et al. (2013) published a headline-grabbing estimate for the Iraq war of half a million excess deaths, i.e. deaths that would not have happened without the war. We reanalyse the data from the University Collaborative Iraq Mortality Study and refute their dramatic claim. The Hagopian et al. (2013) estimate has four main defects: i) most importantly, it conflates non-violent deaths with violent ones; ii) it fails to account for the stratified sampling design of the UCIMS; iii) it fully includes all reported deaths regardless of death certificate backing, even when respondents say they have a death certificate but cannot produce one when prompted; iv) it adds approximately 100,000 speculative deaths not supported by data. Thus, we reject the 500,000 estimate. Indeed, we find that the UCIMS data cannot even support a claim that the number of non-violent excess deaths in the Iraq war has been greater than zero. We recommend that future research follow our methodological lead in two main directions: supplementing traditional excess-death estimates with estimates for non-violent deaths alone, and using difference-in-differences estimates to uncover the relationship between violence and non-violent death rates.
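
    As a minimal illustration of the difference-in-differences comparison recommended above, the sketch below uses made-up non-violent death rates for a "high-violence" and a "low-violence" group before and after the war; none of the numbers come from the UCIMS.

```python
# Difference-in-differences on (made-up) non-violent death rates per 1,000
# person-years, comparing high-violence and low-violence areas before and
# after the onset of the war. All numbers are illustrative.
rates = {
    ("high_violence", "pre"):  4.8,
    ("high_violence", "post"): 5.9,
    ("low_violence",  "pre"):  4.6,
    ("low_violence",  "post"): 5.0,
}

change_exposed = rates[("high_violence", "post")] - rates[("high_violence", "pre")]
change_control = rates[("low_violence", "post")] - rates[("low_violence", "pre")]

# The DiD estimate attributes to violence only the excess change in the
# exposed group beyond the common trend seen in the comparison group.
did = change_exposed - change_control
print(f"difference-in-differences estimate: {did:+.2f} deaths per 1,000 person-years")
```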

    Harold Jeffreys's Theory of Probability Revisited

    Published exactly seventy years ago, Jeffreys's Theory of Probability (1939) has had a unique impact on the Bayesian community and is now considered to be one of the main classics in Bayesian Statistics as well as the initiator of the objective Bayes school. In particular, its advances on the derivation of noninformative priors as well as on the scaling of Bayes factors have had a lasting impact on the field. However, the book reflects the characteristics of the time, especially in terms of mathematical rigor. In this paper we point out the fundamental aspects of this reference work, especially the thorough coverage of testing problems and the construction of both estimation and testing noninformative priors based on functional divergences. Our major aim here is to help modern readers navigate this difficult text and concentrate on passages that are still relevant today. Comment: This paper is commented on in [arXiv:1001.2967], [arXiv:1001.2968], [arXiv:1001.2970], [arXiv:1001.2975], [arXiv:1001.2985], [arXiv:1001.3073]. Rejoinder in [arXiv:0909.1008]. Published at http://dx.doi.org/10.1214/09-STS284 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org)
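
    As a standard worked example of the kind of noninformative prior the book is known for (textbook material, not a passage from the paper), the Jeffreys prior for a Bernoulli success probability $\theta$ follows from the Fisher information:

```latex
% Jeffreys prior for X ~ Bernoulli(theta): proportional to the square root
% of the Fisher information.
I(\theta) = -\mathbb{E}\!\left[\frac{\partial^2}{\partial\theta^2}\log p(X\mid\theta)\right]
          = \frac{1}{\theta(1-\theta)},
\qquad
\pi_J(\theta) \propto \sqrt{I(\theta)} = \theta^{-1/2}(1-\theta)^{-1/2}
```

    i.e. a Beta(1/2, 1/2) distribution, which is invariant under reparametrization of $\theta$.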

    Bayesian coherent analysis of in-spiral gravitational wave signals with a detector network

    The present operation of the ground-based network of gravitational-wave laser interferometers in "enhanced" configuration brings the search for gravitational waves into a regime where detection is highly plausible. The development of techniques that allow us to discriminate a signal of astrophysical origin from instrumental artefacts in the interferometer data and to extract the full range of information is among the primary goals of the current work. Here we report the details of a Bayesian approach to the problem of inference for gravitational wave observations using a network of instruments, for the computation of the Bayes factor between two hypotheses and the evaluation of the marginalised posterior density functions of the unknown model parameters. The numerical algorithm to tackle the notoriously difficult problem of the evaluation of large multi-dimensional integrals is based on a technique known as Nested Sampling, which provides an attractive alternative to more traditional Markov-chain Monte Carlo (MCMC) methods. We discuss the details of the implementation of this algorithm and its performance against a Gaussian model of the background noise, considering the specific case of the signal produced by the in-spiral of binary systems of black holes and/or neutron stars, although the method is completely general and can be applied to other classes of sources. We also demonstrate the utility of this approach by introducing a new coherence test to distinguish between the presence of a coherent signal of astrophysical origin in the data of multiple instruments and the presence of incoherent accidental artefacts, and we examine the effects on the estimation of the source parameters as a function of the number of instruments in the network. Comment: 22 pages
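
    The nested-sampling idea at the heart of the numerical approach can be demonstrated with a self-contained toy; the Python sketch below is a textbook-style implementation on a two-dimensional Gaussian, using simple rejection sampling for the constrained prior draws, and is not the paper's pipeline.

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(2)

# Toy problem: unit 2-D Gaussian likelihood under a uniform prior on [-5, 5]^2.
# Nested sampling estimates the evidence Z = integral of L over the prior;
# here Z is analytically ~ 1/100 since the prior density is 1/100 on the box.
def log_like(x):
    return -0.5 * np.sum(x**2) - np.log(2 * np.pi)

n_live, n_iter = 400, 3000
live = rng.uniform(-5, 5, size=(n_live, 2))
live_logL = np.array([log_like(x) for x in live])

log_Z, log_X = -np.inf, 0.0          # running evidence and remaining prior volume
for i in range(n_iter):
    worst = int(np.argmin(live_logL))
    logL_star = live_logL[worst]
    # The prior volume enclosed by the likelihood contour shrinks by ~1/n_live
    # per iteration; the discarded point carries the weight of the shell.
    log_X_new = -(i + 1) / n_live
    log_w = np.log(np.exp(log_X) - np.exp(log_X_new))
    log_Z = np.logaddexp(log_Z, logL_star + log_w)
    log_X = log_X_new
    # Replace the worst point with a prior draw above the likelihood threshold
    # (plain rejection sampling; production codes use smarter constrained moves).
    while True:
        cand = rng.uniform(-5, 5, size=2)
        if log_like(cand) > logL_star:
            live[worst], live_logL[worst] = cand, log_like(cand)
            break

# Add the contribution of the surviving live points over the final volume.
log_Z = np.logaddexp(log_Z, logsumexp(live_logL) + log_X - np.log(n_live))
print(f"log-evidence estimate: {log_Z:.2f}  (analytic: {np.log(0.01):.2f})")
```

    The evidence bookkeeping above is what makes Bayes-factor computations such as the coherence test practical; real implementations differ mainly in how they draw new points inside the likelihood constraint.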