Prospects For Identifying Dark Matter With CoGeNT
It has previously been shown that the excess of events reported by the CoGeNT
collaboration could be generated by elastically scattering dark matter
particles with a mass of approximately 5-15 GeV. This mass range is very
similar to that required to generate the annual modulation observed by
DAMA/LIBRA and the gamma rays from the region surrounding the Galactic Center
identified within the data of the Fermi Gamma Ray Space Telescope. To
confidently conclude that CoGeNT's excess is the result of dark matter,
however, further data will likely be needed. In this paper, we make projections
for the first full year of CoGeNT data, and for its planned upgrade. Not only
will this body of data more accurately constrain the spectrum of nuclear recoil
events, and corresponding dark matter parameter space, but will also make it
possible to identify seasonal variations in the rate. In particular, if the
CoGeNT excess is the product of dark matter, then one year of CoGeNT data will
likely reveal an annual modulation with a significance of 2-3 sigma. The
planned CoGeNT upgrade will not only detect such an annual modulation with high
significance, but will be capable of measuring the energy spectrum of the
modulation amplitude. These measurements will be essential to irrefutably
confirming a dark matter origin of these events.
Comment: 6 pages, 6 figures
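The modulation signal these projections target is conventionally parameterised as a mean rate plus an annual cosine term. A minimal sketch follows; the rate values are illustrative placeholders, not CoGeNT results:

```python
import math

def modulated_rate(t_days, R0=1.0, Rm=0.05, t0=152.5):
    """Standard annual-modulation parameterisation
    R(t) = R0 + Rm * cos(2*pi*(t - t0)/365.25),
    with phase t0 near June 2 (day ~152) expected for halo dark matter.
    R0 (mean rate) and Rm (modulation amplitude) are hypothetical values."""
    return R0 + Rm * math.cos(2 * math.pi * (t_days - t0) / 365.25)

# The rate peaks in early June and reaches its minimum six months later;
# an experiment looks for this few-percent seasonal variation.
june_rate = modulated_rate(152.5)
december_rate = modulated_rate(152.5 + 365.25 / 2)
```

With this parameterisation, the significance of a detected modulation grows with exposure, which is why the full year of data and the upgrade matter.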
From Tetraquark to Hexaquark: A Systematic Study of Heavy Exotics in the Large N_c Limit
A systematic study of multiquark exotics with one or two heavy quarks in
the large N_c limit is presented. By binding a chiral soliton to a heavy
meson, either a normal or an exotic heavy-quark baryon is obtained. By
replacing the heavy quark with heavy antiquarks, exotic heavy mesons are
obtained. When N_c = 3, these are the normal triquark baryon, the exotic
pentaquark baryon, the tetraquark di-meson and the hexaquark di-baryon,
respectively. Their stabilities and decays are also discussed. In
particular, it is shown that the ``heavy to heavy'' semileptonic decays are
described by the Isgur--Wise form factors of the normal baryons.
Comment: 14 pages in REVTeX, no figures
Dependence of direct detection signals on the WIMP velocity distribution
The signals expected in WIMP direct detection experiments depend on the
ultra-local dark matter distribution. Observations probe the local density,
circular speed and escape speed, while simulations find velocity distributions
that deviate significantly from the standard Maxwellian distribution. We
calculate the energy, time and direction dependence of the event rate for a
range of velocity distributions motivated by recent observations and
simulations, and also investigate the uncertainty in the determination of WIMP
parameters. The dominant uncertainties are the systematic error in the local
circular speed and whether or not the Milky Way has a high density dark disc. In both
cases there are substantial changes in the mean differential event rate and the
annual modulation signal, and hence exclusion limits and determinations of the
WIMP mass. The uncertainty in the shape of the halo velocity distribution is
less important; however, it leads to a 5% systematic error in the WIMP mass. The
detailed direction dependence of the event rate is sensitive to the velocity
distribution. However the numbers of events required to detect anisotropy and
confirm the median recoil direction do not change substantially.
Comment: 21 pages, 7 figures; v2 to appear in JCAP, minor changes
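The velocity distribution enters the event-rate calculations above through the mean inverse speed. A minimal numerical sketch for the standard Maxwellian benchmark follows; the parameter values are conventional choices assumed here for illustration, not results from the paper:

```python
import math

V0 = 220.0    # local circular speed in km/s (conventional SHM value)
VESC = 544.0  # Galactic escape speed in km/s (illustrative)

def f_maxwellian(v):
    """Isotropic Maxwellian speed distribution, truncated at VESC
    (normalisation neglects the small truncation correction)."""
    if v >= VESC:
        return 0.0
    norm = (math.pi * V0 ** 2) ** 1.5
    return 4.0 * math.pi * v ** 2 * math.exp(-(v / V0) ** 2) / norm

def eta(vmin, steps=20000):
    """Mean inverse speed eta(vmin) = integral of f(v)/v above vmin;
    the differential event rate dR/dE is proportional to eta(vmin(E))."""
    dv = (VESC - vmin) / steps
    total = 0.0
    for i in range(steps):
        v = vmin + (i + 0.5) * dv
        total += f_maxwellian(v) / v * dv
    return total

# Fewer halo particles are fast enough to produce high-energy recoils,
# so eta falls monotonically with vmin.
```

Replacing `f_maxwellian` with a distribution drawn from observations or simulations, as the paper does, changes `eta` and hence the predicted spectrum, modulation, and inferred WIMP parameters.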
The design, construction and performance of the MICE scintillating fibre trackers
Copyright © 2011 Elsevier. This is the pre-print version of the article.
Charged-particle tracking in the international Muon Ionisation Cooling Experiment (MICE) will be performed using two solenoidal spectrometers, each instrumented with a tracking detector based on scintillating fibres. The design and construction of the trackers are described, along with the quality-assurance procedures, photon-detection system, readout electronics, reconstruction and simulation software, and the data-acquisition system. Finally, the performance of the MICE tracker, determined using cosmic rays, is presented.
This work was supported by the Science and Technology Facilities Council under grant numbers PP/E003214/1, PP/E000479/1, PP/E000509/1, PP/E000444/1, and through SLAs with STFC-supported laboratories. This work was also supported by the Fermi National Accelerator Laboratory, which is operated by the Fermi Research Alliance under contract No. DE-AC02-76CH03000 with the U.S. Department of Energy, and by the U.S. National Science Foundation under grants PHY-0301737, PHY-0521313, PHY-0758173 and PHY-0630052. The authors also acknowledge the support of the World Premier International Research Center Initiative (WPI Initiative), MEXT, Japan.
Results of Prevention of REStenosis with Tranilast and its Outcomes (PRESTO) trial
BACKGROUND: Restenosis after percutaneous coronary intervention (PCI) is a major problem affecting 15% to 30% of patients after stent placement. No oral agent has shown a beneficial effect on restenosis or on associated major adverse cardiovascular events. In limited trials, the oral agent tranilast has been shown to decrease the frequency of angiographic restenosis after PCI.
METHODS AND RESULTS: In this double-blind, randomized, placebo-controlled trial of tranilast (300 and 450 mg BID for 1 or 3 months), 11 484 patients were enrolled. Enrollment and drug were initiated within 4 hours after successful PCI of at least 1 vessel. The primary end point was the first occurrence of death, myocardial infarction, or ischemia-driven target vessel revascularization within 9 months and was 15.8% in the placebo group and 15.5% to 16.1% in the tranilast groups (P=0.77 to 0.81). Myocardial infarction was the only component of major adverse cardiovascular events to show some evidence of a reduction with tranilast (450 mg BID for 3 months): 1.1% versus 1.8% with placebo (P=0.061 for intent-to-treat population). The primary reason for not completing treatment was > or =1 hepatic laboratory test abnormality (11.4% versus 0.2% with placebo, P<0.01). In the angiographic substudy composed of 2018 patients, minimal lumen diameter (MLD) was measured by quantitative coronary angiography. At follow-up, MLD was 1.76+/-0.77 mm in the placebo group, which was not different from MLD in the tranilast groups (1.72 to 1.78 +/- 0.76 to 0.80 mm, P=0.49 to 0.89). In a subset of these patients (n=1107), intravascular ultrasound was performed at follow-up. Plaque volume was not different between the placebo and tranilast groups (39.3 versus 37.5 to 46.1 mm(3), respectively; P=0.16 to 0.72).
CONCLUSIONS: Tranilast does not improve the quantitative measures of restenosis (angiographic and intravascular ultrasound) or its clinical sequelae.
Towards Machine Wald
The past century has seen a steady increase in the need of estimating and
predicting complex systems and making (possibly critical) decisions with
limited information. Although computers have made possible the numerical
evaluation of sophisticated statistical models, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} in the way \emph{humans} can
when faced with uncertainty is challenging in several major ways:
(1) Finding optimal statistical models remains to be formulated as a well posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations and a limited description of the distribution of input
random variables. (2) The space of admissible scenarios along with the space of
relevant information, assumptions, and/or beliefs, tend to be infinite
dimensional, whereas calculus on a computer is necessarily discrete and finite.
To this end, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification, and Information-Based Complexity.
Comment: 37 pages
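The Wald-style selection of an optimal estimator that the paper formalises can be pictured in miniature: among a finite set of candidate estimators, choose the one minimising worst-case risk over the admissible scenarios. The estimators, scenarios, and risk values below are hypothetical, chosen only to make the principle concrete:

```python
# Toy instance of Wald's minimax principle: among candidate estimators,
# choose the one whose worst-case risk over admissible scenarios is
# smallest. All names and numbers here are hypothetical.
risk = {
    "sample_mean":   {"gaussian": 1.0, "heavy_tail": 9.0},
    "sample_median": {"gaussian": 1.6, "heavy_tail": 2.5},
}

def minimax_choice(risk_table):
    """Return the estimator minimising the maximum risk across scenarios."""
    return min(risk_table, key=lambda est: max(risk_table[est].values()))

best = minimax_choice(risk)  # worst case 2.5 beats worst case 9.0
```

The difficulty the paper addresses is that in practice the scenario space and the information constraints are infinite dimensional, so this finite table must be replaced by a well-posed, computable optimisation.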
Detector Description and Performance for the First Coincidence Observations between LIGO and GEO
For 17 days in August and September 2002, the LIGO and GEO interferometer
gravitational wave detectors were operated in coincidence to produce their
first data for scientific analysis. Although the detectors were still far from
their design sensitivity levels, the data can be used to place better upper
limits on the flux of gravitational waves incident on the earth than previous
direct measurements. This paper describes the instruments and the data in some
detail, as a companion to analysis papers based on the first data.
Comment: 41 pages, 9 figures. 17 Sept 03: author list amended, minor editorial changes
Search for single top quarks in the tau+jets channel using 4.8 fb^-1 of ppbar collision data
We present the first direct search for single top quark production using tau
leptons. The search is based on 4.8 fb^-1 of integrated luminosity
collected in ppbar collisions at sqrt(s)=1.96 TeV with the D0 detector
at the Fermilab Tevatron Collider. We select events with a final state
including an isolated tau lepton, missing transverse energy, two or three jets,
one or two of them b-tagged. We use a multivariate technique to discriminate
signal from background. The number of events observed in data in this final
state is consistent with the signal plus background expectation. We set in the
tau+jets channel an upper limit on the single top quark cross section of
\TauLimObs pb at the 95% C.L. This measurement allows a gain of 4% in expected
sensitivity for the observation of single top production when combining it with
electron+jets and muon+jets channels already published by the D0 collaboration
with 2.3 fb^-1 of data. We measure a combined cross section of
\SuperCombineXSall pb, which is the most precise measurement to date.
Comment: 12 pages, 5 figures
Measurement of Z/gamma*+jet+X angular distributions in ppbar collisions at sqrt{s}=1.96 TeV
We present the first measurements at a hadron collider of differential cross
sections for Z+jet+X production in delta phi(Z, jet), |delta y(Z, jet)| and
|y_boost(Z, jet)|. Vector boson production in association with jets is an
excellent probe of QCD and constitutes the main background to many small cross
section processes, such as associated Higgs production. These measurements are
crucial tests of the predictions of perturbative QCD and current event
generators, which have varied success in describing the data. Using these
measurements as inputs in tuning event generators will increase the
experimental sensitivity to rare signals.
Comment: Published in Physics Letters B 682 (2010), pp. 370-380. 15 pages, 6 figures
Search for the standard model Higgs boson in tau final states
We present a search for the standard model Higgs boson using hadronically
decaying tau leptons, in 1 inverse femtobarn of data collected with the D0
detector at the Fermilab Tevatron ppbar collider. We select two final states:
tau plus missing transverse energy and b jets, and tau+ tau- plus jets. These
final states are sensitive to a combination of associated W/Z boson plus Higgs
boson, vector boson fusion and gluon-gluon fusion production processes. The
observed ratio of the combined limit on the Higgs production cross section at
the 95% C.L. to the standard model expectation is 29 for a Higgs boson mass of
115 GeV.
Comment: publication version