72 research outputs found
Covariant quantization of infinite spin particle models, and higher order gauge theories
Further properties of a recently proposed higher order infinite spin particle
model are derived. Infinitely many classically equivalent but different
Hamiltonian formulations are shown to exist. This leads to a condition of
uniqueness in the quantization process. A consistent covariant quantization is
shown to exist. Also a recently proposed supersymmetric version for half-odd
integer spins is quantized. A general algorithm to derive gauge invariances of
higher order Lagrangians is given and applied to the infinite spin particle
model, and to a new higher order model for a spinning particle which is
proposed here, as well as to a previously given higher order rigid particle
model. The latter two models are also covariantly quantized.
Comment: 38 pages, LaTeX
String-localized Quantum Fields and Modular Localization
We study free, covariant, quantum (Bose) fields that are associated with
irreducible representations of the Poincaré group and localized in
semi-infinite strings extending to spacelike infinity. Among these are fields
that generate the irreducible representations of mass zero and infinite spin
that are known to be incompatible with point-like localized fields. For the
massive representation and the massless representations of finite helicity, all
string-localized free fields can be written as an integral, along the string,
of point-localized tensor or spinor fields. As a special case we discuss the
string-localized vector fields associated with the point-like electromagnetic
field and their relation to the axial gauge condition in the usual setting.
Comment: minor correction
Performance of novel VUV-sensitive Silicon Photo-Multipliers for nEXO
Liquid xenon time projection chambers are promising detectors to search for
neutrinoless double beta decay (0νββ), due to their response
uniformity, monolithic sensitive volume, scalability to large target masses,
and suitability for extremely low background operations. The nEXO collaboration
has designed a tonne-scale time projection chamber that aims to search for
0νββ of ¹³⁶Xe with a projected half-life sensitivity of
approximately 10²⁸ yr. To reach this sensitivity, the design goal for nEXO is
1% energy resolution at the decay Q-value (≈2458 keV).
Reaching this resolution requires the efficient collection of both the
ionization and scintillation produced in the detector. The nEXO design employs
Silicon Photo-Multipliers (SiPMs) to detect the vacuum ultra-violet, 175 nm
scintillation light of liquid xenon. This paper reports on the characterization
of the newest vacuum ultra-violet sensitive Fondazione Bruno Kessler VUVHD3
SiPMs specifically designed for nEXO, as well as on new test
samples of the previously characterised Hamamatsu VUV4 Multi Pixel Photon Counters
(MPPCs). Various SiPM and MPPC parameters, such as dark noise, gain, direct
crosstalk, correlated avalanches and photon detection efficiency were measured
as a function of the applied over voltage and wavelength at liquid xenon
temperature (163 K). The results from this study are used to provide updated
estimates of the achievable energy resolution at the decay Q-value for the
nEXO design.
A model-averaging approach to replication: the case of p_rep
The purpose of the recently proposed p_rep statistic is to estimate the probability of concurrence, that is, the probability that a replicate experiment yields an effect of the same sign (Killeen, 2005a). The influential journal Psychological Science endorses p_rep and recommends its use over that of traditional methods. Here we show that p_rep overestimates the probability of concurrence. This is because p_rep was derived under the assumption that all effect sizes in the population are equally likely a priori. In many situations, however, it is advisable also to entertain a null hypothesis of no or approximately no effect. We show how the posterior probability of the null hypothesis is sensitive to a priori considerations and to the evidence provided by the data; the higher the posterior probability of the null hypothesis, the smaller the probability of concurrence. When the null hypothesis and the alternative hypothesis are equally likely a priori, p_rep may overestimate the probability of concurrence by 30% or more. We conclude that p_rep provides an upper bound on the probability of concurrence, a bound that brings with it the danger of having researchers believe that their experimental effects are much more reliable than they actually are.
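The overestimation the abstract describes can be illustrated with a short Monte Carlo sketch (the parameter choices and mixture prior here are illustrative assumptions, not the paper's own simulation): when half of the true effects come from a point null, the average p_rep, computed from the observed effect as Φ(|d|/(√2·SE)), exceeds the empirically observed same-sign replication rate.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(0)
n_sim = 100_000          # number of simulated experiment pairs (illustrative)
n, sigma = 20, 1.0       # per-experiment sample size and noise (illustrative)
se = sigma / np.sqrt(n)  # standard error of the observed effect

# Mixture prior: H0 (true effect = 0) and H1 (effect ~ N(0, 1)), equally likely
delta = np.where(rng.random(n_sim) < 0.5, 0.0,
                 rng.normal(0.0, 1.0, n_sim))

d_obs = rng.normal(delta, se)   # observed effect in the original experiment
d_rep = rng.normal(delta, se)   # observed effect in the replicate

# p_rep = Phi(|d| / (sqrt(2) * se)); Phi(x) = 0.5 * (1 + erf(x / sqrt(2))),
# so the erf argument simplifies to |d| / (2 * se)
p_rep = 0.5 * (1.0 + np.vectorize(erf)(np.abs(d_obs) / (2.0 * se)))

# Empirical concurrence: does the replicate have the same sign?
concur = np.sign(d_rep) == np.sign(d_obs)

print(f"mean p_rep:            {p_rep.mean():.3f}")
print(f"empirical concurrence: {concur.mean():.3f}")
```

Under the null component the true same-sign rate is exactly 0.5, yet sampling noise still produces nonzero observed effects and hence p_rep values well above 0.5, which is the mechanism behind the upper-bound result.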
The random-effects p_rep continues to mispredict the probability of replication
In their reply, Lecoutre and Killeen (2010) argue for a random-effects version of p_rep, in which the observed effect from one experiment is used to predict the probability that an effect from a different but related experiment will have the same sign. They present a figure giving the impression that this version of p_rep accurately predicts the probability of replication. We show that their results are incorrect and conceptually limited, even when corrected. We then present a meaningful evaluation of the random-effects p_rep as a predictor and find that, as with the fixed-effects p_rep, it performs very poorly.