72 research outputs found

    Covariant quantization of infinite spin particle models, and higher order gauge theories

    Full text link
    Further properties of a recently proposed higher-order infinite spin particle model are derived. Infinitely many classically equivalent but different Hamiltonian formulations are shown to exist, which leads to a uniqueness condition in the quantization process. A consistent covariant quantization is shown to exist. A recently proposed supersymmetric version for half-odd-integer spins is also quantized. A general algorithm to derive gauge invariances of higher-order Lagrangians is given and applied to the infinite spin particle model, to a new higher-order model for a spinning particle proposed here, and to a previously given higher-order rigid particle model. The latter two models are also covariantly quantized. Comment: 38 pages, LaTeX

    String-localized Quantum Fields and Modular Localization

    Full text link
    We study free, covariant, quantum (Bose) fields that are associated with irreducible representations of the Poincaré group and localized in semi-infinite strings extending to spacelike infinity. Among these are fields that generate the irreducible representations of mass zero and infinite spin that are known to be incompatible with point-like localized fields. For the massive representation and the massless representations of finite helicity, all string-localized free fields can be written as an integral, along the string, of point-localized tensor or spinor fields. As a special case we discuss the string-localized vector fields associated with the point-like electromagnetic field and their relation to the axial gauge condition in the usual setting. Comment: minor correction

    Performance of novel VUV-sensitive Silicon Photo-Multipliers for nEXO

    Full text link
    Liquid xenon time projection chambers are promising detectors in the search for neutrinoless double beta decay (0νββ) due to their response uniformity, monolithic sensitive volume, scalability to large target masses, and suitability for extremely low-background operation. The nEXO collaboration has designed a tonne-scale time projection chamber that aims to search for 0νββ of ¹³⁶Xe with a projected half-life sensitivity of 1.35×10²⁸ yr. To reach this sensitivity, the design goal for nEXO is ≤1% energy resolution at the decay Q-value (2458.07±0.31 keV). Reaching this resolution requires the efficient collection of both the ionization and the scintillation produced in the detector. The nEXO design employs Silicon Photo-Multipliers (SiPMs) to detect the vacuum ultraviolet (175 nm) scintillation light of liquid xenon. This paper reports on the characterization of the newest vacuum-ultraviolet-sensitive Fondazione Bruno Kessler VUVHD3 SiPMs specifically designed for nEXO, as well as measurements on new test samples of previously characterized Hamamatsu VUV4 Multi Pixel Photon Counters (MPPCs). SiPM and MPPC parameters such as dark noise, gain, direct crosstalk, correlated avalanches, and photon detection efficiency were measured as a function of the applied over-voltage and wavelength at liquid xenon temperature (163 K). The results from this study are used to provide updated estimates of the achievable energy resolution at the decay Q-value for the nEXO design.
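To put the resolution goal in absolute terms, a 1% resolution at the Q-value corresponds to a spread of roughly 25 keV. A back-of-envelope conversion (the assumption that "resolution" here means σ/E rather than FWHM/E is ours, not the abstract's):

```python
# Back-of-envelope: convert nEXO's relative resolution goal to an absolute spread.
# Assumption (ours): "energy resolution" means sigma/E, not FWHM/E.
Q_VALUE_KEV = 2458.07        # 136Xe 0vbb Q-value quoted in the abstract
RESOLUTION_GOAL = 0.01       # <= 1% design goal

sigma_kev = RESOLUTION_GOAL * Q_VALUE_KEV
fwhm_kev = 2.355 * sigma_kev  # FWHM = 2*sqrt(2*ln 2) * sigma for a Gaussian peak
print(f"sigma = {sigma_kev:.1f} keV, FWHM = {fwhm_kev:.1f} keV")
```

Under that reading, the goal amounts to a ~24.6 keV sigma (~58 keV FWHM) Gaussian peak at the Q-value.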

    Taxonomy based on science is necessary for global conservation

    Get PDF
    Peer reviewed

    The Physics of the B Factories

    Get PDF

    A model-averaging approach to replication: the case of p_rep

    No full text
    The purpose of the recently proposed p_rep statistic is to estimate the probability of concurrence, that is, the probability that a replicate experiment yields an effect of the same sign (Killeen, 2005a). The influential journal Psychological Science endorses p_rep and recommends its use over that of traditional methods. Here we show that p_rep overestimates the probability of concurrence. This is because p_rep was derived under the assumption that all effect sizes in the population are equally likely a priori. In many situations, however, it is advisable to also entertain a null hypothesis of no, or approximately no, effect. We show how the posterior probability of the null hypothesis is sensitive to a priori considerations and to the evidence provided by the data; the higher the posterior probability of the null hypothesis, the smaller the probability of concurrence. When the null hypothesis and the alternative hypothesis are equally likely a priori, p_rep may overestimate the probability of concurrence by 30% or more. We conclude that p_rep provides an upper bound on the probability of concurrence, a bound that brings with it the danger of having researchers believe that their experimental effects are much more reliable than they actually are.
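The overestimation argument can be illustrated with a small Monte Carlo sketch. Under a mixture prior that puts half its mass on a point null (a hypothetical prior chosen purely for illustration; the abstract does not specify one), the average p_rep, computed under Killeen's flat-prior assumption, sits above the actual rate at which a replicate reproduces the sign of the original effect:

```python
import math
import random

random.seed(0)

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n = 200_000
prep_sum = 0.0   # running sum of p_rep values
concur = 0       # count of replicates with the same sign as the original

for _ in range(n):
    # Mixture prior: point null (delta = 0) with probability 0.5,
    # otherwise delta ~ N(0, 1). Unit sampling variance throughout.
    delta = 0.0 if random.random() < 0.5 else random.gauss(0.0, 1.0)
    x1 = random.gauss(delta, 1.0)   # observed effect, original experiment
    x2 = random.gauss(delta, 1.0)   # observed effect, replicate
    # Killeen's p_rep for the observed effect, derived under a flat prior
    # on the effect size: Phi(|z_obs| / sqrt(2)).
    prep_sum += phi(abs(x1) / math.sqrt(2.0))
    concur += (x1 > 0) == (x2 > 0)

print(f"mean p_rep            : {prep_sum / n:.3f}")
print(f"empirical concurrence : {concur / n:.3f}")
```

With this many draws the gap is clearly visible: the mean p_rep exceeds the empirical concurrence rate by a wide margin, consistent with the abstract's claim that p_rep is only an upper bound on the probability of concurrence.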

    The random effects p_rep continues to mispredict the probability of replication

    No full text
    In their reply, Lecoutre and Killeen (2010) argue for a random effects version of p_rep, in which the observed effect from one experiment is used to predict the probability that an effect from a different but related experiment will have the same sign. They present a figure giving the impression that this version of p_rep accurately predicts the probability of replication. We show that their results are incorrect and conceptually limited, even when corrected. We then present a meaningful evaluation of the random effects p_rep as a predictor and find that, as with the fixed effects p_rep, it performs very poorly.