The Probability Density of the Higgs Boson Mass
The LEP Collaborations have reported a small excess of events in their
combined Higgs boson analysis at center of mass energies up to about 208 GeV.
In this communication, I present the result of a calculation of the probability
distribution function of the Higgs boson mass which can be rigorously obtained
if the validity of the Standard Model is assumed. It arises from the
combination of the most recent set of precision electroweak data and the
current results of the Higgs searches at LEP 2.
Comment: 3 pages, 2 figures
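As a rough illustration of the combination described above, one can multiply an indirect (electroweak-fit) likelihood for the Higgs mass by a direct-search likelihood and normalize the product. All numerical shapes below (a Gaussian in log m_H, a logistic turn-on near 114 GeV standing in for the published search likelihood) are placeholder assumptions, not the paper's inputs:

```python
import numpy as np

# Sketch: probability density of m_H from combining indirect and direct
# information. All shapes and numbers below are illustrative assumptions.

m_H = np.linspace(100.0, 300.0, 2000)  # GeV

# Indirect constraint: approximately Gaussian in log10(m_H) (assumed shape).
log_m = np.log10(m_H)
chi2_ew = ((log_m - np.log10(120.0)) / 0.2) ** 2
L_indirect = np.exp(-0.5 * chi2_ew)

# Direct searches: low masses excluded, with a smooth turn-on modeled here
# by a logistic function (a stand-in for the published likelihood ratio).
L_direct = 1.0 / (1.0 + np.exp(-(m_H - 114.0) / 1.0))

# Normalized pdf, assuming a flat prior in m_H over the plotted range.
pdf = L_indirect * L_direct
pdf /= pdf.sum() * (m_H[1] - m_H[0])

peak = m_H[np.argmax(pdf)]  # most probable mass in this toy combination
```

The qualitative feature this reproduces is generic: the direct-search exclusion truncates the low-mass tail favored by the indirect fit, pushing the peak of the pdf just above the exclusion boundary.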
Fitting Parton Distribution Data with Multiplicative Normalization Uncertainties
We consider the generic problem of performing a global fit to many
independent data sets each with a different overall multiplicative
normalization uncertainty. We show that the methods in common use to treat
multiplicative uncertainties lead to systematic biases. We develop a method
which is unbiased, based on a self-consistent iterative procedure. We
demonstrate the use of this method by applying it to the determination of
parton distribution functions with the NNPDF methodology, which uses a Monte
Carlo method for uncertainty estimation.
Comment: 33 pages, 5 figures: published version
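The bias and its iterative cure can be sketched on a toy example: two measurements of the same quantity sharing a 10% multiplicative normalization uncertainty. Building the multiplicative part of the covariance from the data biases the average low; rebuilding it self-consistently from the current fit value, in the spirit of the iterative procedure described in the abstract, removes the bias. The numbers are invented:

```python
import numpy as np

y = np.array([10.0, 10.4])      # two measurements of the same quantity
sig = np.array([0.1, 0.1])      # uncorrelated statistical errors
s = 0.10                        # 10% common multiplicative normalization error

def best_fit(norm_scale):
    """Weighted average with covariance V_ij = delta_ij sig_i^2 + s^2 n_i n_j,
    where n_i is the scale used to build the multiplicative part."""
    V = np.diag(sig**2) + s**2 * np.outer(norm_scale, norm_scale)
    w = np.linalg.solve(V, np.ones_like(y))
    return w @ y / w.sum()

# Naive approach: multiplicative errors built from the *data*.
# The result lands below both measurements -- the systematic bias.
t_naive = best_fit(y)

# Iterative approach: rebuild the covariance from the current *fit* value
# until it converges; the bias disappears.
t = y.mean()
for _ in range(20):
    t = best_fit(np.full_like(y, t))

print(t_naive, t)
```

With these numbers the naive average comes out near 9.44, below both data points, while the self-consistent iteration converges to the unbiased average 10.2.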
Testing for rational speculative bubbles in the Brazilian residential real-estate market
Speculative bubbles have been occurring periodically in local or global real
estate markets and are considered a potential cause of economic crises. In this
context, the detection of explosive behaviors in the financial market and the
implementation of early warning diagnosis tests are of critical importance. The
recent increase in Brazilian housing prices has raised concerns that the
Brazilian economy may have a speculative housing bubble. In the present paper,
we employ a recently proposed recursive unit root test in order to identify
possible speculative bubbles in data from the Brazilian residential real-estate
market. The empirical results show evidence for speculative price bubbles both
in Rio de Janeiro and Sao Paulo, the two main Brazilian cities.
Evidence for Neutrinoless Double Beta Decay
The data of the Heidelberg-Moscow double beta decay experiment for the
measuring period August 1990 - May 2000 (54.9813 kg y or 723.44 molyears),
published recently, are analyzed using the potential of the Bayesian method for
low counting rates. First evidence for neutrinoless double beta decay is
observed giving first evidence for lepton number violation. The evidence for
this decay mode is 97% (2.2\sigma) with the Bayesian method, and 99.8% c.l.
(3.1\sigma) with the method recommended by the Particle Data Group. The
half-life of the process is found with the Bayesian method to be T_{1/2}^{0\nu}
= (0.8 - 18.3) x 10^{25} y (95% c.l.) with a best value of 1.5 x 10^{25} y. The
deduced value of the effective neutrino mass, using the nuclear matrix
elements from [Sta90,Tom91], is (0.11 - 0.56) eV (95% c.l.), with a best
value of 0.39 eV. Uncertainties in the nuclear matrix elements may widen the
range given for the effective neutrino mass by at most a factor of 2. Our
observation, which at the same time provides evidence that the neutrino is a
Majorana particle, will be of fundamental importance for neutrino physics.
PACS. 14.69.Pq Neutrino mass and mixing; 23.40.Bw Weak-interaction and lepton
(including neutrino) aspects 23.40.-s Beta decay; double beta decay; electron
and muon capture.
Comment: 14 pages, psfile, 7 figures, Published in Modern Physics Letters A,
Vol. 16, No. 37 (2001) 2409-2420, World Scientific Publishing Company, Home
Page: http://ejournals.wspc.com.sg/mpla/16/1637/S0217732301005825.html, Home
Page of Heidelberg Non-Accelerator Particle Physics Group:
http://www.mpi-hd.mpg.de/non_acc
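The generic Bayesian treatment of a low-count Poisson signal on top of a known background, as described in the abstract, can be sketched as follows. The counts and background below are invented, not the Heidelberg-Moscow values:

```python
import numpy as np

n_obs = 12          # counts observed in the signal region (assumed)
b = 6.0             # expected background counts (assumed known)

# Posterior for the signal expectation s, with a flat prior on s >= 0:
#   p(s | n) proportional to (s + b)^n * exp(-(s + b))
s = np.linspace(0.0, 30.0, 3001)
log_post = n_obs * np.log(s + b) - (s + b)
post = np.exp(log_post - log_post.max())
post /= post.sum() * (s[1] - s[0])          # normalize on the grid

cdf = np.cumsum(post) * (s[1] - s[0])
lo95 = s[np.searchsorted(cdf, 0.025)]       # 95% credible interval
hi95 = s[np.searchsorted(cdf, 0.975)]
best = s[np.argmax(post)]
print(best, (lo95, hi95))
```

A signal estimate s obtained this way would then be converted to a half-life through T_{1/2} = ln 2 · N · ε · t / s, for N source nuclei, detection efficiency ε and measuring time t, which is how a count-level result maps onto the half-life range quoted in the abstract.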
Search for correlation between GRB's detected by BeppoSAX and gravitational wave detectors EXPLORER and NAUTILUS
Data obtained during five months of 2001 with the gravitational wave (GW)
detectors EXPLORER and NAUTILUS were studied in correlation with the gamma ray
burst data (GRB) obtained with the BeppoSAX satellite. During this period
BeppoSAX was the only GRB satellite in operation, while EXPLORER and NAUTILUS
were the only GW detectors in operation.
No correlation between the GW data and the GRB bursts was found. The
analysis, performed over 47 GRBs, excludes the presence of signals of
amplitude h >= 1.2 * 10^{-18}, with 95% probability, if we allow a time delay
between GW bursts and GRBs within +-400 s, and h >= 6.5 * 10^{-19} if the
time delay is within +-5 s. The result is also provided in the form of a
scaled likelihood, for unbiased interpretation and easier use in further
analysis.
Comment: 14 pages, 7 figures. Latex file, compiled with cernik.cls (provided
in the package)
The Experimental Status of the Standard Electroweak Model at the End of the LEP-SLC Era
A method is proposed to calculate the confidence level for agreement of data
with the Standard Model (SM) by combining information from direct and indirect
Higgs Boson searches. Good agreement with the SM is found for
GeV using the observables most sensitive to : and . In
particular, quantum corrections, as predicted by the SM, are observed with a
statistical significance of forty-four standard deviations. However, apparent
deviations from the SM at the level of 3.7 and 2.8 standard deviations are found for the Z and right-handed Zb couplings, respectively. The
maximum confidence level for agreement with the SM of the entire data set
considered is for GeV. The reason why
confidence levels about an order of magnitude higher than this have been
claimed for global fits to similar data sets is explained.
Comment: 47 pages, 8 figures, 24 tables. An in-depth study of statistical
issues related to the comparison of precision EW data to the SM
MaxEnt power spectrum estimation using the Fourier transform for irregularly sampled data applied to a record of stellar luminosity
The principle of maximum entropy is applied to the spectral analysis of a
data signal with general variance matrix and containing gaps in the record. The
role of the entropic regularizer is to prevent one from overestimating
structure in the spectrum when faced with imperfect data. Several arguments are
presented suggesting that the arbitrary prefactor should not be introduced to
the entropy term. The introduction of that factor is not required when a
continuous Poisson distribution is used for the amplitude coefficients. We
compare the formalism for when the variance of the data is known explicitly to
that for when the variance is known only to lie in some finite range. The
result of including the entropic measure factor is to suggest a spectrum
consistent with the variance of the data which has less structure than that
given by the forward transform. An application of the methodology to example
data is demonstrated.
Comment: 15 pages, 13 figures, 1 table, major revision, final version,
accepted for publication in Astrophysics & Space Science
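A compact sketch of entropy-regularized spectral estimation on an irregularly sampled record: fit sinusoid amplitudes on a trial frequency grid by minimizing chi^2 plus an entropic penalty that discourages spectral structure the data do not demand. The frequency grid, the regularization weight, the Skilling form of the entropy, and the test signal are all illustrative choices, not those of the paper:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 100.0, 120))   # irregular sample times (gaps)
sigma = 0.3
d = np.sin(2 * np.pi * 0.1 * t) + rng.normal(0.0, sigma, t.size)

freqs = np.linspace(0.02, 0.3, 15)          # trial frequency grid (assumed)
C = np.cos(2 * np.pi * np.outer(t, freqs))
S = np.sin(2 * np.pi * np.outer(t, freqs))
m = 1e-3                                     # default spectral level (assumed)
alpha = 5.0                                  # entropy weight (assumed)

def objective(x):
    a, b = x[:freqs.size], x[freqs.size:]
    resid = d - C @ a - S @ b
    chi2 = resid @ resid / sigma**2
    p = a**2 + b**2 + 1e-12                  # power per trial frequency
    # Negative Skilling entropy: zero at p = m, grows with extra structure.
    neg_entropy = np.sum(p * np.log(p / m) - p + m)
    return 0.5 * chi2 + alpha * neg_entropy

res = minimize(objective, np.zeros(2 * freqs.size), method="L-BFGS-B")
a, b = res.x[:freqs.size], res.x[freqs.size:]
power = a**2 + b**2
peak_freq = freqs[np.argmax(power)]          # recovered signal frequency
```

The entropic term plays exactly the role stated in the abstract: frequencies the data constrain poorly (because of the gaps) are pulled toward the default level m rather than fitted to noise.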
Fitting a sum of exponentials to lattice correlation functions using a non-uniform prior
Excited states are extracted from lattice correlation functions using a
non-uniform prior on the model parameters. Models for both a single exponential
and a sum of exponentials are considered, as well as an alternate model for the
orthogonalization of the correlation functions. Results from an analysis of
torelon and glueball operators indicate the Bayesian methodology compares well
with the usual interpretation of effective mass tables produced by a
variational procedure. Applications of the methodology are discussed.
Comment: 12 pages, 8 figures, 8 tables, major revision, final version
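The Bayesian fit described above amounts to minimizing an augmented chi^2, in which Gaussian priors on (the logs of) the amplitudes and energies are added to the data term. A sketch with invented data and priors standing in for lattice correlators:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic two-state correlator C(t) = sum_n A_n exp(-E_n t) (invented).
t = np.arange(1, 16)
A_true, E_true = np.array([1.0, 0.6]), np.array([0.5, 1.1])
rng = np.random.default_rng(2)
C_true = (A_true[:, None] * np.exp(-E_true[:, None] * t)).sum(axis=0)
sig = 0.01 * C_true
C_data = C_true + rng.normal(0.0, sig)

# Non-uniform prior: Gaussian in the log-parameters [A0, A1, E0, E1]
# (widths are assumed for illustration).
prior_mu = np.log(np.array([1.0, 1.0, 0.5, 1.0]))
prior_sig = np.array([1.0, 1.0, 0.5, 0.5])

def aug_chi2(p):
    """Data chi^2 plus the Gaussian prior term on the log-parameters."""
    A, E = np.exp(p[:2]), np.exp(p[2:])
    model = (A[:, None] * np.exp(-E[:, None] * t)).sum(axis=0)
    chi2 = np.sum(((C_data - model) / sig) ** 2)
    return chi2 + np.sum(((p - prior_mu) / prior_sig) ** 2)

res = minimize(aug_chi2, prior_mu, method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-10, "fatol": 1e-12})
E0 = np.exp(res.x[2])    # ground-state energy estimate
```

The log parametrization enforces positive amplitudes and energies, and the prior stabilizes the otherwise ill-conditioned multi-exponential fit, which is the practical benefit the abstract reports.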
Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction
With the unprecedented photometric precision of the Kepler Spacecraft,
significant systematic and stochastic errors on transit signal levels are
observable in the Kepler photometric data. These errors, which include
discontinuities, outliers, systematic trends and other instrumental signatures,
obscure astrophysical signals. The Presearch Data Conditioning (PDC) module of
the Kepler data analysis pipeline tries to remove these errors while preserving
planet transits and other astrophysically interesting signals. The completely
new noise and stellar variability regime observed in Kepler data poses a
significant problem to standard cotrending methods such as SYSREM and TFA.
Variable stars are often of particular astrophysical interest so the
preservation of their signals is of significant importance to the astrophysical
community. We present a Bayesian Maximum A Posteriori (MAP) approach where a
subset of highly correlated and quiet stars is used to generate a cotrending
basis vector set which is in turn used to establish a range of "reasonable"
robust fit parameters. These robust fit parameters are then used to generate a
Bayesian Prior and a Bayesian Posterior Probability Distribution Function (PDF)
which when maximized finds the best fit that simultaneously removes systematic
effects while reducing the signal distortion and noise injection which commonly
afflicts simple least-squares (LS) fitting. A numerical and empirical approach
is taken where the Bayesian Prior PDFs are generated from fits to the light
curve distributions themselves.
Comment: 43 pages, 21 figures, Submitted for publication in PASP. Also see
companion paper "Kepler Presearch Data Conditioning I - Architecture and
Algorithms for Error Correction in Kepler Light Curves" by Martin C. Stumpe
et al.
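With a Gaussian likelihood and a Gaussian prior on the cotrending coefficients, the MAP fit reduces to ridge-like normal equations. The sketch below uses simulated basis vectors and an assumed ensemble prior, not the actual PDC products:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
time = np.linspace(0.0, 1.0, n)

# Two assumed cotrending basis vectors (systematic trends shared by many stars)
B = np.column_stack([time, np.sin(6 * np.pi * time)])
true_coeffs = np.array([2.0, 0.5])
intrinsic = 0.3 * np.sin(40 * time)              # astrophysical variability
sigma = 0.05
flux = B @ true_coeffs + intrinsic + rng.normal(0.0, sigma, n)

# Prior on the coefficients, standing in for the distribution of robust fits
# to an ensemble of quiet stars (values assumed).
mu_prior = np.array([1.8, 0.4])
S_prior_inv = np.linalg.inv(np.diag([0.5**2, 0.5**2]))

# MAP solution of Gaussian likelihood x Gaussian prior: ridge-like normal eqs.
A = B.T @ B / sigma**2 + S_prior_inv
rhs = B.T @ flux / sigma**2 + S_prior_inv @ mu_prior
coeffs_map = np.linalg.solve(A, rhs)

corrected = flux - B @ coeffs_map                # systematics removed,
                                                 # intrinsic signal preserved
```

The prior is what distinguishes this from plain least squares: a light curve dominated by stellar variability cannot drag the cotrending coefficients far from the ensemble-derived values, so the variability survives the correction.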
Transverse Enhancement Model and MiniBooNE Charge Current Quasi-Elastic Neutrino Scattering Data
The recently proposed Transverse Enhancement Model of nuclear effects in
Charge Current Quasi-Elastic neutrino scattering [A. Bodek, H. S. Budd, and
M. E. Christy, Eur. Phys. J. C {\bf 71} (2011) 1726] is confronted with the
high-statistics MiniBooNE experimental data. It is shown that the {\it
effective} large axial mass model leads to better agreement with the data.
Comment: 4 pages, 6 figures