MC generator TAUOLA: implementation of Resonance Chiral Theory for two and three meson modes. Comparison with experiment
We present a partial upgrade of the Monte Carlo event generator TAUOLA with
the two and three hadron decay modes using the theoretical models based on
Resonance Chiral Theory. These modes account for 88% of the total hadronic
width of the tau lepton. First results for the model parameters have been
obtained using BaBar data for the three-pion mode.
Comment: 5 pages, 1 figure, contribution to the Proceedings of the QCD@Work12 Conference
Theoretical inputs and errors in the new hadronic currents in TAUOLA
The new hadronic currents implemented in the TAUOLA library are obtained in
the unified and consistent framework of Resonance Chiral Theory: a Lagrangian
approach in which the resonances exchanged in the hadronic tau decays are
active degrees of freedom included in a way that reproduces the low-energy
results of Chiral Perturbation Theory. The short-distance QCD constraints on
the imaginary part of the spin-one correlators yield relations among the
couplings that render the theory predictive.
In this communication, the derivation of the two- and three-meson form factors
is sketched. One criticism of our framework is that the error may be as
large as 1/3, since it is a realization of the large-N_C limit of QCD in a
meson theory. A number of arguments are given which disfavor that claim and
point to smaller errors, which would explain the phenomenological success of
our description of these decays. Finally, other minor sources of error and
current improvements of the code are discussed.
Comment: 5 pages, no figures, contribution to the Proceedings of the QCD@Work12 Conference
In-vivo validity of proximal caries detection in primary teeth, with histological validation.
BACKGROUND: Detection and diagnosis of proximal caries in primary molars is challenging.
AIM: The aim of this in-vivo study was to assess the validity and reproducibility of four methods of proximal caries detection in primary molar teeth.
DESIGN: Eighty-two children (5-10 yrs) were recruited. Initially, 1030 proximal surfaces were examined using meticulous visual examination (ICDAS) (VE1), bitewing radiographs (RE), and a laser fluorescence pen device (LF1). Temporary tooth separation (TTS) was achieved for 447 surfaces, and these were re-examined visually (VE2) and using the LF-pen (LF2). Three hundred and fifty-six teeth (542 surfaces) were subsequently extracted and provided histological validation.
RESULTS: At the D1 (enamel and dentine caries) diagnostic threshold, the sensitivities of the VE1, RE, VE2, LF1 and LF2 examinations were 0.52, 0.14, 0.75, 0.58 and 0.60, and the specificities were 0.89, 0.97, 0.88, 0.85 and 0.77, respectively. At the D3 (dentine caries) threshold, the sensitivities were 0.42, 0.71, 0.49, 0.63 and 0.65, respectively, while specificity was 0.93 for VE1 and VE2, and 0.98, 0.87 and 0.88 for the RE, LF1 and LF2 examinations, respectively. ROC analysis showed radiographic examination to be superior at D3.
CONCLUSION: Meticulous caries diagnosis (ICDAS) should be supported by radiographs for the detection of dentinal proximal caries in primary molars.
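The sensitivity and specificity values reported above follow the standard 2x2 definitions scored against the histological gold standard. A minimal sketch of those definitions (the counts used here are illustrative only, not taken from the study):

```python
# Standard sensitivity/specificity definitions for a detection method
# scored against ground truth. Counts below are invented for illustration.

def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly carious surfaces the method flags: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of sound surfaces the method correctly clears: TN / (TN + FP)."""
    return tn / (tn + fp)

# e.g. a method flagging 75 of 100 carious surfaces, clearing 88 of 100 sound ones
print(round(sensitivity(75, 25), 2))  # 0.75
print(round(specificity(88, 12), 2))  # 0.88
```

The trade-off visible in the results (radiographs: low D1 sensitivity but very high specificity) is exactly what these two ratios, taken together, are designed to expose.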
Resonance Chiral Lagrangian Currents and Experimental Data for
In this paper we document the modifications introduced to the previous
version of the Resonance Chiral Lagrangian current ({\it Phys.Rev.} {\bf D86}
(2012) 113008) of the decay
which enable the one-dimensional distributions measured by the BaBar
collaboration to be well modeled. The main change required to model the data is
the addition of the resonance. Systematic errors, both theoretical and
experimental, the limitations of fitting one-dimensional distributions only,
and the resulting difficulties and statistical/systematic uncertainties on the
fitted parameters are addressed. The current and the fitting environment are
ready for comparisons with the fully exclusive experimental data. The present
result for
is encouraging for work on other decay modes and Resonance Chiral Lagrangian based currents.
Comment: 16 pages, 2 figures
Biomolecular imaging and electronic damage using X-ray free-electron lasers
Proposals to determine biomolecular structures from diffraction experiments
using femtosecond X-ray free-electron laser (XFEL) pulses involve a conflict
between the incident brightness required to achieve diffraction-limited atomic
resolution and the electronic and structural damage induced by the
illumination. Here we show that previous estimates of the conditions under
which biomolecular structures may be obtained in this manner are unduly
restrictive, because they are based on a coherent diffraction model that is not
appropriate to the proposed interaction conditions. A more detailed imaging
model derived from optical coherence theory and quantum electrodynamics is
shown to be far more tolerant of electronic damage. The nuclear density is
employed as the principal descriptor of molecular structure. The foundations of
the approach may also be used to characterize electrodynamical processes by
performing scattering experiments on complex molecules of known structure.
Comment: 16 pages, 2 figures
Local influence of boundary conditions on a confined supercooled colloidal liquid
We study confined colloidal suspensions as a model system which approximates
the behavior of confined small molecule glass-formers. Dense colloidal
suspensions become glassier when confined between parallel glass plates. We use
confocal microscopy to study the motion of confined colloidal particles. In
particular, we examine the influence particles stuck to the glass plates have
on nearby free particles. Confinement appears to be the primary influence
slowing free particle motion, and proximity to stuck particles causes a
secondary reduction in the mobility of free particles. Overall, particle
mobility is fairly constant across the width of the sample chamber, but a
strong asymmetry in boundary conditions results in a slight gradient of
particle mobility.
Comment: For conference proceedings, "Dynamics in Confinement", Grenoble, March 201
Comparison and relative utility of inequality measurements: as applied to Scotland’s child dental health
This study compared and assessed the utility of tests of inequality on a series of very large population caries datasets. National cross-sectional caries datasets for Scotland's 5-year-olds in 1993/94 (n = 5,078); 1995/96 (n = 6,240); 1997/98 (n = 6,584); 1999/00 (n = 6,781); 2002/03 (n = 9,747); 2003/04 (n = 10,956); 2005/06 (n = 10,945) and 2007/08 (n = 12,067) were obtained. Outcomes were based on the d3mft metric (i.e. the number of decayed, missing and filled teeth). An area-based deprivation category (DepCat) measured the subjects' socioeconomic status (SES).

Simple absolute and relative inequality, Odds Ratios, and the Significant Caries Index (SIC) as advocated by the World Health Organization were calculated. The measures of complex inequality applied to the data were the Slope Index of Inequality (absolute) and a variety of relative inequality tests: the Gini coefficient; the Relative Index of Inequality; the concentration curve; Koolman and van Doorslaer's transformed Concentration Index; the Receiver Operating Characteristic curve; and the Population Attributable Risk (PAR). Additional tests used were plots of SIC deciles (SIC10) and a Scottish Caries Inequality Metric (SCIM10).

Over the period, mean d3mft improved from 3.1 (95%CI 3.0-3.2) to 1.9 (95%CI 1.8-1.9), and the proportion of children with d3mft = 0 improved from 41.1% (95%CI 39.8-42.3) to 58.3% (95%CI 57.8-59.7). Absolute simple and complex inequality decreased, while relative simple and complex inequality remained comparatively stable.

Our results support the use of the SII and RII to measure complex absolute and relative SES inequalities, alongside additional tests of complex relative inequality such as PAR and Koolman and van Doorslaer's transformed CI; the latter two have clear interpretations which may influence policy makers. Specialised dental metrics (i.e. SIC, SIC10 and SCIM10) permit the exploration of other important inequalities not determined by SES, and could be applied to many other types of disease where ranking of morbidity is possible, e.g. obesity. More generally, the approaches described may be applied to study patterns of health inequality affecting worldwide populations.
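Of the complex relative measures listed, the Gini coefficient is the simplest to state. A hedged sketch of one common discrete formulation (the input values here are invented for illustration, not drawn from the Scottish datasets):

```python
# One common discrete formulation of the Gini coefficient: the mean
# absolute difference over all pairs, normalised by twice the mean.
# Input could be, e.g., mean d3mft per deprivation category (values invented).

def gini(values):
    """Gini coefficient in [0, 1): 0 means perfect equality."""
    n = len(values)
    mean = sum(values) / n
    mad = sum(abs(a - b) for a in values for b in values) / (n * n)
    return mad / (2 * mean)

print(gini([2.0, 2.0, 2.0]))       # 0.0 -- identical caries burden in every group
print(gini([0.5, 1.0, 2.0, 4.0]))  # larger value indicates greater inequality
```

This pairwise form makes the "relative" character of the measure explicit: scaling every group's burden by the same factor leaves the coefficient unchanged, which is why relative inequality can stay stable even as mean d3mft falls.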
Postextubation pulmonary edema: A case series and review
Summary: We report a series of patients with postextubation pulmonary edema who had no obvious risk factors for the development of this syndrome.
Methods: Patients identified by the pulmonary consultation service at an academic medical center were reviewed.
Results: Fourteen cases were collected and analyzed. The average age was 34.5 years; 12 patients were male. The average BMI was 25.5. None had documented previous lung disease. Most operations were scheduled as outpatient procedures, and the type of surgery ranged from an incision and drainage of a bite wound to an open reduction-internal fixation of the radius. None of the patients had upper airway surgery. The length of surgeries ranged from 27 to 335 min. Laryngospasm was the most commonly identified obstructing event postextubation. Treatment involved airway support when needed, supplemental oxygen, and diuretics.
Conclusions: It would appear that all patients, especially young men, are at risk for the development of this syndrome, and that the pathogenesis remains uncertain in many cases.
SDWFS-MT-1: A Self-Obscured Luminous Supernova at z~0.2
We report the discovery of a six-month-long mid-infrared transient,
SDWFS-MT-1 (aka SN 2007va), in the Spitzer Deep, Wide-Field Survey of the NOAO
Deep Wide-Field Survey Bootes field. The transient, located in a z=0.19 low
luminosity (M_[4.5]~-18.6 mag, L/L_MilkyWay~0.01) metal-poor (12+log(O/H)~7.8)
irregular galaxy, peaked at a mid-infrared absolute magnitude of M_[4.5]~-24.2
in the 4.5 micron Spitzer/IRAC band and emitted a total energy of at least
10^51 ergs. The optical emission was likely fainter than the mid-infrared,
although our constraints on the optical emission are poor because the transient
peaked when the source was "behind" the Sun. The Spitzer data are consistent
with emission by a modified black body with a temperature of ~1350 K. We rule
out a number of scenarios for the origin of the transient such as a Galactic
star, AGN activity, GRB, tidal disruption of a star by a black hole and
gravitational lensing. The most plausible scenario is a supernova exploding
inside a massive, optically thick circumstellar medium, composed of multiple
shells of previously ejected material. If the proposed scenario is correct,
then a significant fraction (~10%) of the most luminous supernovae may be
self-enshrouded by dust not only before but also after the supernova occurs.
The spectral energy distribution of the progenitor of such a supernova would be
a slightly cooler version of eta Carinae, peaking at 20-30 microns.
Comment: 26 pages, 5 figures, 1 table, accepted for publication in Ap
ASCR/HEP Exascale Requirements Review Report
This draft report summarizes and details the findings, results, and
recommendations derived from the ASCR/HEP Exascale Requirements Review meeting
held in June 2015. The main conclusions are as follows. 1) Larger, more
capable computing and data facilities are needed to support HEP science goals
in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of
the demand at the 2025 timescale is at least two orders of magnitude greater --
and in some cases more -- than what is available currently. 2) The growth rate of data
produced by simulations is overwhelming the current ability, of both facilities
and researchers, to store and analyze it. Additional resources and new
techniques for data analysis are urgently needed. 3) Data rates and volumes
from HEP experimental facilities are also straining the ability to store and
analyze large and complex data volumes. Appropriately configured
leadership-class facilities can play a transformational role in enabling
scientific discovery from these datasets. 4) A close integration of HPC
simulation and data analysis will aid greatly in interpreting results from HEP
experiments. Such an integration will minimize data movement and facilitate
interdependent workflows. 5) Long-range planning between HEP and ASCR will be
required to meet HEP's research needs. To best use ASCR HPC resources the
experimental HEP program needs a) an established long-term plan for access to
ASCR computational and data resources, b) an ability to map workflows onto HPC
resources, c) the ability for ASCR facilities to accommodate workflows run by
collaborations that can have thousands of individual members, d) to transition
codes to the next-generation HPC platforms that will be available at ASCR
facilities, e) to build up and train a workforce capable of developing and
using simulations and analysis to support HEP scientific research on
next-generation systems.
Comment: 77 pages, 13 figures; draft report, subject to further revision