Efficient HTTP based I/O on very large datasets for high performance computing with the libdavix library
Remote data access for data analysis in high performance computing is
commonly done with specialized data access protocols and storage systems. These
protocols are highly optimized for high throughput on very large datasets,
multi-stream transfers, high availability, low latency, and efficient parallel
I/O. The
purpose of this paper is to describe how we have adapted a generic protocol,
the Hypertext Transfer Protocol (HTTP), to make it a competitive alternative
for high performance I/O and data analysis applications in a global computing
grid: the Worldwide LHC Computing Grid. In this work, we first analyze the
design differences between the HTTP protocol and the most common high
performance I/O protocols, pointing out the main performance weaknesses of
HTTP. Then, we describe in detail how we solved these issues. Our solutions
have been implemented in a toolkit called davix, available through several
recent Linux distributions. Finally, we describe the results of our benchmarks
where we compare the performance of davix against an HPC-specific protocol for a
data analysis use case.
Comment: Presented at: Very Large Data Bases (VLDB) 2014, Hangzho
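One of the parallel-I/O weaknesses mentioned above is typically addressed over HTTP with byte-range requests, which let a client split one large read across several concurrent streams. A minimal sketch of that idea in Python; the helper name is hypothetical and this is not the davix API:

```python
def make_range_headers(offset, length, n_streams):
    """Split one logical read of `length` bytes starting at `offset`
    into HTTP Range headers, one per parallel stream.
    Hypothetical helper for illustration, not the davix API."""
    chunk = -(-length // n_streams)  # ceiling division
    headers = []
    for i in range(n_streams):
        start = offset + i * chunk
        end = min(offset + length, start + chunk) - 1
        if start > end:
            break  # fewer chunks than streams for small reads
        headers.append({"Range": f"bytes={start}-{end}"})
    return headers
```

Each resulting header can be sent on its own connection; the responses are then reassembled in offset order, which is how HTTP clients commonly emulate the vector reads of HPC protocols.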
The Spread of Bluetongue Virus Serotype 8 in Great Britain and Its Control by Vaccination
Bluetongue (BT) is a viral disease of ruminants transmitted by Culicoides biting midges and has the ability to spread rapidly over large distances. In the summer of 2006, BTV serotype 8 (BTV-8) emerged for the first time in northern Europe, resulting in over 2000 infected farms by the end of the year. The virus subsequently overwintered and has since spread across much of Europe, causing tens of thousands of livestock deaths. In August 2007, BTV-8 reached Great Britain (GB), threatening the large and valuable livestock industry. A voluntary vaccination scheme was launched in GB in May 2008 and, in contrast with elsewhere in Europe, there were no reported cases in GB during 2008.

Here, we use carefully parameterised mathematical models to investigate the spread of BTV in GB and its control by vaccination. In the absence of vaccination, the model predicted severe outbreaks of BTV, particularly for warmer temperatures. Vaccination was predicted to reduce the severity of epidemics, with the greatest reduction achieved for high levels (95%) of vaccine uptake. However, even at this level of uptake the model predicted some spread of BTV. The sensitivity of the predictions to vaccination parameters (time to full protection in cattle, vaccine efficacy), the shape of the transmission kernel, and temperature dependence in the transmission of BTV between farms was assessed.

A combination of lower temperatures and high levels of vaccine uptake (>80%) in the previously affected areas is likely to be the major contributing factor in the control achieved in England in 2008. However, low levels of vaccination against BTV-8 or the introduction of other serotypes could result in further, potentially severe outbreaks in future.
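The qualitative effect of vaccine uptake described above can be illustrated with a deliberately simplified, non-spatial SIR model of farm-to-farm transmission. The paper's actual model is spatially explicit and temperature dependent, so this is only a sketch with assumed parameter values:

```python
def sir_outbreak(n_farms=1000, uptake=0.95, beta=0.3, gamma=0.1,
                 days=200, dt=1.0):
    """Minimal non-spatial SIR sketch: a fraction `uptake` of farms is
    fully protected by vaccination; the rest are susceptible. Returns
    the cumulative number of affected farms (outbreak size proxy).
    Parameter values are illustrative, not from the paper."""
    s = n_farms * (1.0 - uptake) - 1.0  # unvaccinated susceptible farms
    i = 1.0                             # one seed infection
    r = 0.0
    for _ in range(int(days / dt)):     # forward-Euler integration
        new_inf = beta * s * i / n_farms * dt
        new_rec = gamma * i * dt
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return r
```

With high uptake the effective reproduction number drops below one and the seed infection fades out, mirroring the qualitative prediction that 95% uptake greatly reduces, but does not always eliminate, spread.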
The acceleration and storage of radioactive ions for a neutrino factory
The term beta-beam has been coined for the production of a pure beam of
electron neutrinos or their antiparticles through the decay of radioactive ions
circulating in a storage ring. This concept requires radioactive ions to be
accelerated to a Lorentz gamma of 150 for 6He and 60 for 18Ne. The neutrino
source itself consists of a storage ring for this energy range, with long
straight sections in line with the experiment(s). Such a decay ring does not
exist at CERN today, nor does a high-intensity proton source for the production
of the radioactive ions. Nevertheless, reusing the existing CERN accelerator
infrastructure would represent an important saving for a beta-beam facility.
This paper outlines the first study, while some of the more speculative ideas
will need further investigation.
Comment: Accepted for publication in proceedings of Nufact02, London, 200
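The energies implied by the quoted Lorentz factors follow from E = gamma * m * c^2. A quick estimate in Python, approximating the mass per nucleon by one atomic mass unit (binding-energy and electron-mass corrections ignored, so the figures are rough):

```python
AMU_GEV = 0.931494  # atomic mass unit in GeV/c^2

def energy_per_nucleon(gamma, amu_gev=AMU_GEV):
    """Approximate total energy per nucleon, E = gamma * m c^2, taking
    the mass per nucleon as one atomic mass unit. Rough estimate only."""
    return gamma * amu_gev

e_he6 = energy_per_nucleon(150)   # 6He at gamma = 150, ~140 GeV/u
e_ne18 = energy_per_nucleon(60)   # 18Ne at gamma = 60, ~56 GeV/u
```

These per-nucleon energies are what the acceleration chain feeding the decay ring would have to deliver.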
Progression of MRI markers in cerebral small vessel disease: sample size considerations for clinical trials.
Detecting treatment efficacy using cognitive change in trials of cerebral small vessel disease (SVD) has been challenging, making the use of surrogate markers such as magnetic resonance imaging (MRI) attractive. We determined the sensitivity of MRI to change in SVD and used this information to calculate sample size estimates for a clinical trial. Data from the prospective SCANS (St George's Cognition and Neuroimaging in Stroke) study of patients with symptomatic lacunar stroke and confluent leukoaraiosis were used (n=121). Ninety-nine subjects returned at one or more time points. Multimodal MRI and neuropsychologic testing were performed annually over 3 years. We evaluated the change in brain volume, T2 white matter hyperintensity (WMH) volume, lacunes, and white matter damage on diffusion tensor imaging (DTI). Over 3 years, change was detectable in all MRI markers but not in cognitive measures. WMH volume and DTI parameters were most sensitive to change and therefore had the smallest sample size estimates. MRI markers, particularly WMH volume and DTI parameters, are more sensitive to SVD progression over short time periods than cognition. These markers could significantly reduce the size of trials to screen treatments for efficacy in SVD, although further validation from longitudinal and intervention studies is required.
Journal of Cerebral Blood Flow & Metabolism advance online publication, 3 June 2015; doi:10.1038/jcbfm.2015.113
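The link between marker sensitivity and trial size can be illustrated with the standard two-arm sample-size formula for a continuous outcome: a marker with a larger annual change relative to its variability (effect/SD) needs fewer subjects. The paper's actual estimates come from the observed SCANS progression rates; this is only a generic sketch:

```python
import math

def sample_size_per_arm(effect, sd, z_alpha=1.959964, z_beta=0.841621):
    """Standard two-arm sample size for a continuous outcome:
    n per arm = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * (sd/effect)^2.
    Defaults correspond to alpha = 0.05 (two-sided) and 80% power.
    Generic illustration, not the paper's calculation."""
    n = 2.0 * (z_alpha + z_beta) ** 2 * (sd / effect) ** 2
    return math.ceil(n)
```

Because n scales with (sd/effect)^2, halving the detectable effect quadruples the required sample, which is why the most change-sensitive markers (WMH volume, DTI) give the smallest trials.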
J/psi suppression at forward rapidity in Au+Au collisions at sqrt(s_NN)=39 and 62.4 GeV
We present measurements of the J/psi invariant yields in sqrt(s_NN)=39 and
62.4 GeV Au+Au collisions at forward rapidity (1.2<|y|<2.2). Invariant yields
are presented as a function of both collision centrality and transverse
momentum. Nuclear modifications are obtained for central relative to peripheral
Au+Au collisions (R_CP) and for various centrality selections in Au+Au relative
to scaled p+p cross sections obtained from other measurements (R_AA). The
observed suppression patterns at 39 and 62.4 GeV are quite similar to those
previously measured at 200 GeV. This similar suppression presents a challenge
to theoretical models that contain various competing mechanisms with different
energy dependencies, some of which cause suppression and others enhancement.
Comment: 365 authors, 10 pages, 11 figures, 4 tables. Submitted to Phys. Rev.
C. Plain text data tables for the points plotted in figures for this and
previous PHENIX publications are (or will be) publicly available at
http://www.phenix.bnl.gov/papers.htm
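The two nuclear-modification factors used above have standard definitions: R_AA compares the A+A yield with the N_coll-scaled p+p cross section, and R_CP compares N_coll-scaled central and peripheral yields. A minimal sketch of both (function and parameter names are illustrative, not PHENIX analysis code):

```python
def r_aa(yield_aa, ncoll, yield_pp):
    """Nuclear modification relative to p+p:
    R_AA = Y_AA / (<N_coll> * Y_pp)."""
    return yield_aa / (ncoll * yield_pp)

def r_cp(yield_cent, ncoll_cent, yield_periph, ncoll_periph):
    """Central-to-peripheral nuclear modification:
    R_CP = (Y_cent / <N_coll>_cent) / (Y_periph / <N_coll>_periph)."""
    return (yield_cent / ncoll_cent) / (yield_periph / ncoll_periph)
```

A value of 1 means the yield scales with the number of binary collisions; values below 1 indicate the suppression discussed in the abstract.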
Nuclear matter effects on J/psi production in asymmetric Cu+Au collisions at sqrt(s_NN)=200 GeV
We report on J/psi production from asymmetric Cu+Au heavy-ion collisions
at sqrt(s_NN)=200 GeV at the Relativistic Heavy Ion Collider at both
forward (Cu-going direction) and backward (Au-going direction) rapidities. The
nuclear modification of J/psi yields in Cu+Au collisions in the Au-going
direction is found to be comparable to that in Au+Au collisions when plotted
as a function of the number of participating nucleons. In the Cu-going
direction, J/psi production shows a stronger suppression. This difference is
comparable in magnitude and has the same sign as the difference expected from
shadowing effects due to stronger low-x gluon suppression in the larger Au
nucleus. The relative suppression is opposite to that expected from hot nuclear
matter dissociation, since a higher energy density is expected in the Au-going
direction.
Comment: 349 authors, 10 pages, 4 figures, and 4 tables. Submitted to Phys.
Rev. C. For v2, fixed LaTeX error in 3rd-to-last sentence. Plain text data
tables for the points plotted in figures for this and previous PHENIX
publications are (or will be) publicly available at
http://www.phenix.bnl.gov/papers.htm
Measurement of higher cumulants of net-charge multiplicity distributions in Au+Au collisions at sqrt(s_NN)=7.7-200 GeV
We report the measurement of cumulants (C_n) of the net-charge
distributions measured within pseudorapidity (|eta|<0.35) in Au+Au
collisions at sqrt(s_NN)=7.7-200 GeV with the PHENIX experiment at the
Relativistic Heavy Ion Collider. The ratios of cumulants (e.g. C_1/C_2,
C_3/C_2) of the net-charge distributions, which can be related to volume
independent susceptibility ratios, are studied as a function of centrality and
energy. These quantities are important to understand the quantum-chromodynamics
phase diagram and possible existence of a critical end point. The measured
values are very well described by expectation from negative binomial
distributions. We do not observe any nonmonotonic behavior in the ratios of the
cumulants as a function of collision energy. The measured values of
C_1/C_2 and C_3/C_2 can be directly compared to lattice
quantum-chromodynamics calculations and thus allow extraction of both the
chemical freeze-out temperature and the baryon chemical potential at each
center-of-mass energy.
Comment: 512 authors, 8 pages, 4 figures, 1 table. v2 is version accepted for
publication in Phys. Rev. C as a Rapid Communication. Plain text data tables
for the points plotted in figures for this and previous PHENIX publications
are (or will be) publicly available at http://www.phenix.bnl.gov/papers.htm
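Cumulants of a measured multiplicity distribution can be computed from its central moments (C_1 = mean, C_2 = variance, C_3 = mu_3, C_4 = mu_4 - 3*C_2^2), after which ratios such as C_3/C_2 cancel the volume dependence. A small sketch of that bookkeeping, not the PHENIX analysis code:

```python
def cumulants(samples):
    """First four cumulants of a sample, via central moments:
    C1 = mean, C2 = variance, C3 = mu3, C4 = mu4 - 3*C2**2."""
    n = len(samples)
    mean = sum(samples) / n
    mu = lambda k: sum((x - mean) ** k for x in samples) / n
    c2 = mu(2)
    return mean, c2, mu(3), mu(4) - 3.0 * c2 * c2
```

For a symmetric distribution the odd central moments vanish, so C_3 = 0; deviations of measured ratios from baseline expectations (such as the negative binomial) are what a critical-point search looks for.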
Lévy-stable two-pion Bose-Einstein correlations in sqrt(s_NN)=200 GeV Au+Au collisions
We present a detailed measurement of charged two-pion correlation functions
in 0%-30% centrality sqrt(s_NN)=200 GeV Au+Au collisions by the
PHENIX experiment at the Relativistic Heavy Ion Collider. The data are well
described by Bose-Einstein correlation functions stemming from Lévy-stable
source distributions. Using a fine transverse momentum binning, we extract the
correlation strength parameter lambda, the Lévy index of stability alpha,
and the Lévy length scale parameter R as a function of average
transverse mass of the pair m_T. We find that the positively and the
negatively charged pion pairs yield consistent results, and their correlation
functions are represented, within uncertainties, by the same Lévy-stable
source functions. The measurements indicate a decrease of the lambda
strength of the correlations at low m_T. The Lévy length scale parameter
R decreases with increasing m_T, following a hydrodynamically
predicted type of scaling behavior. The values of the Lévy index of stability
alpha are found to be significantly lower than the Gaussian case of
alpha=2, but also significantly larger than the conjectured value that may
characterize the critical point of a second-order quark-hadron phase
transition.
Comment: 448 authors, 25 pages, 11 figures, 4 tables, 2010 data. v2 is version
accepted for publication in Phys. Rev. C. Plain text data tables for the
points plotted in figures for this and previous PHENIX publications are (or
will be) publicly available at http://www.phenix.bnl.gov/papers.htm
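For a Lévy-stable source, the two-pion correlation function takes the form C(q) = 1 + lambda * exp(-|q R|^alpha), with alpha = 2 recovering the Gaussian case and alpha = 1 the exponential one. A minimal sketch of this functional form, for illustration only (the measured parameters come from fits to the PHENIX data):

```python
import math

def levy_correlation(q, lam, alpha, R):
    """Bose-Einstein correlation function for a Levy-stable source:
    C(q) = 1 + lam * exp(-|q*R|**alpha).
    alpha = 2 is the Gaussian limit, alpha = 1 the exponential limit."""
    return 1.0 + lam * math.exp(-abs(q * R) ** alpha)
```

At zero relative momentum the function reaches its intercept 1 + lambda, and it falls to 1 at large q; the fitted alpha governs how heavy-tailed the source is.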
Cross sections and double-helicity asymmetries of midrapidity inclusive charged hadrons in p+p collisions at sqrt(s)=62.4 GeV
Unpolarized cross sections and double-helicity asymmetries of
single-inclusive positive and negative charged hadrons at midrapidity from p+p
collisions at sqrt(s)=62.4 GeV are presented. The PHENIX measurements for 1.0 <
p_T < 4.5 GeV/c are consistent with perturbative QCD calculations at
next-to-leading order in the strong coupling constant, alpha_s. Resummed pQCD
calculations including terms with next-to-leading-log accuracy, yielding
reduced theoretical uncertainties, also agree with the data. The
double-helicity asymmetry, sensitive at leading order to the gluon polarization
in a momentum-fraction range of 0.05 ~< x_gluon ~< 0.2, is consistent with
recent global parameterizations disfavoring large gluon polarization.
Comment: PHENIX Collaboration. 447 authors, 12 pages, 5 figures, 5 tables.
Submitted to Physical Review
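Double-helicity asymmetries of this kind are conventionally formed from helicity-sorted yields, corrected for the relative luminosity of same- and opposite-helicity bunch crossings and divided by the product of the beam polarizations. A sketch of that standard RHIC-style estimator (parameter names are illustrative):

```python
def a_ll(n_same, n_opp, rel_lumi, pol_blue, pol_yellow):
    """Double-helicity asymmetry from helicity-sorted yields:
    A_LL = [(N_same - R*N_opp) / (N_same + R*N_opp)] / (P_B * P_Y),
    where R is the relative luminosity of same- vs opposite-helicity
    crossings and P_B, P_Y are the two beam polarizations.
    Illustrative sketch, not PHENIX analysis code."""
    raw = (n_same - rel_lumi * n_opp) / (n_same + rel_lumi * n_opp)
    return raw / (pol_blue * pol_yellow)
```

Dividing the raw asymmetry by the polarization product rescales the measured value to what fully polarized beams would give, which is the quantity compared with the pQCD predictions.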