Non-Commutative Gauge Theories and the Cosmological Constant
We discuss the issue of the cosmological constant in non-commutative
non-supersymmetric gauge theories. In particular, in orbifold field theories
non-commutativity acts as a UV cut-off. We suggest that in these theories
quantum corrections give rise to a vacuum energy \rho that is controlled by
the non-commutativity parameter \theta, \rho \sim 1/\theta^2 (only a soft
logarithmic dependence on the Planck scale survives). We demonstrate our claim
in a two-loop computation in field theory and by certain higher loop examples.
Based on general expressions from string theory, we suggest that the vacuum
energy is controlled by non-commutativity to all orders in perturbation theory.
Comment: 11 pages, RevTex, 4 eps figures. v2: Typos corrected. To appear in
Phys. Rev.
Energy Dependence of Nuclear Transparency in C(p,2p) Scattering
The transparency of carbon for (p,2p) quasi-elastic events was measured at
beam energies ranging from 6 to 14.5 GeV at 90 degrees c.m. The four-momentum
transfer squared q^2 ranged from 4.8 to 16.9 (GeV/c)^2. We present the
observed energy dependence of the ratio of the carbon to hydrogen cross
sections. We also apply a model for the nuclear momentum distribution of carbon
to normalize this transparency ratio. We find a sharp rise in transparency as
the beam energy is increased to 9 GeV and a reduction to approximately the
Glauber level at higher energies.
Comment: 4 pages, 2 figures, submitted to PR
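The transparency ratio discussed above can be written schematically as follows (a generic definition of nuclear transparency, not necessarily the paper's exact normalization):

```latex
% Nuclear transparency: measured quasi-elastic (p,2p) cross section
% on carbon divided by the free-space expectation from hydrogen,
T \;=\; \frac{\sigma_{\mathrm{C}(p,2p)}}{Z\,\sigma_{pp}},
% with Z the number of protons in carbon. T \to 1 would signal the
% absence of initial- and final-state absorption, while a Glauber
% calculation predicts a roughly energy-independent T < 1.
```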
Interpretation and the Constraints on International Courts
This paper argues that methodologies of interpretation do not do what they promise: they do not constrain interpretation by providing neutral steps that one can follow to find the meaning of a text. Nevertheless, they do their constraining work by being part of what can be described as the legal practice.
Dynamic Control of Laser Produced Proton Beams
The emission characteristics of intense laser driven protons are controlled
using ultra-strong (of the order of 10^9 V/m) electrostatic fields varying on a
few ps timescale. The field structures are achieved by exploiting the high
potential of the target (reaching multi-MV during the laser interaction).
Suitably shaped targets result in a reduction in the proton beam divergence,
and hence an increase in proton flux while preserving the high beam quality.
The peak focusing power and its temporal variation are shown to depend on the
target characteristics, allowing for the collimation of the inherently highly
divergent beam and the design of achromatic electrostatic lenses.
Comment: 9 pages, 5 figures
Can a matter-dominated model with constant bulk viscosity drive the accelerated expansion of the universe?
We test a cosmological model in which the only component is a pressureless
fluid with a constant bulk viscosity as an explanation for the present accelerated
expansion of the universe. We classify all the possible scenarios for the
universe predicted by the model according to their past, present and future
evolution, and we test its viability by performing a Bayesian statistical
analysis using the SCP "Union" data set (307 SNe Ia), imposing the second law of
thermodynamics on the dimensionless constant bulk viscous coefficient \zeta, and
comparing the predicted age of the universe by the model with the constraints
coming from the oldest globular clusters.
The best estimated values found for \zeta and the Hubble constant Ho are:
\zeta=1.922 \pm 0.089 and Ho=69.62 \pm 0.59 km/s/Mpc with a \chi^2=314. The age
of the universe is found to be 14.95 \pm 0.42 Gyr. We see that the estimated
values of Ho and \chi^2 are very similar to those obtained from the LCDM
model using the same SNe Ia data set. The estimated age of the universe is in
agreement with the constraints coming from the oldest globular clusters.
Moreover, the estimated value of \zeta is positive in agreement with the second
law of thermodynamics (SLT).
On the other hand, we perform different forms of marginalization over the
parameter Ho in order to study the sensitivity of the results to the way Ho
is marginalized. We found that the dependence of the best estimated values of
the free parameters of this model on the way Ho is marginalized is almost
negligible. Therefore, this simple model might be a viable candidate to explain
the present acceleration in the expansion of the universe.
Comment: 31 pages, 12 figures and 2 tables. Accepted to be published in the
Journal of Cosmology and Astroparticle Physics. Analysis using the new SCP
"Union" SNe Ia dataset instead of the Gold 2006 and ESSENCE datasets and
without changes in the conclusions. Added references. Related works:
arXiv:0801.1686 and arXiv:0810.030
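The bulk-viscous mechanism the model relies on can be summarized in standard Eckart-theory form (a sketch of the generic setup only; the paper's units and normalization of \zeta may differ):

```latex
% Pressureless fluid with constant bulk viscosity \zeta:
p_{\rm eff} \;=\; p - 3\zeta H \;=\; -3\zeta H \qquad (p = 0)
% Continuity equation with the effective viscous pressure:
\dot{\rho} + 3H\left(\rho + p_{\rm eff}\right) = 0
% Acceleration equation: the expansion accelerates once 9\zeta H > \rho,
\frac{\ddot{a}}{a} \;=\; -\frac{4\pi G}{3}\left(\rho + 3p_{\rm eff}\right)
\;=\; -\frac{4\pi G}{3}\left(\rho - 9\zeta H\right)
```

This is why a single pressureless component can mimic dark energy: the viscous term supplies the negative effective pressure that \rho alone lacks.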
Multiple-Scattering Series For Color Transparency
Color transparency (CT) depends on the formation of a wavepacket of small
spatial extent. It is useful to interpret experimental searches for CT with a
multiple-scattering series based on wavepacket-nucleon scattering
instead of the standard one using nucleon-nucleon scattering. We develop
several new techniques which are valid for differing ranges of energy. These
techniques are applied to verify some early approximations; study new forms of
the wave-packet-nucleon interaction; examine effects of treating wave packets
of non-zero size; and predict the production of 's in electron scattering
experiments.
Comment: 26 pages, U.Wa. preprint 40427-23-N9
Testing of CP, CPT and causality violation with the light propagation in vacuum in presence of the uniform electric and magnetic fields
We have considered the structure of the fundamental symmetry violating part
of the photon refractive index in vacuum in the presence of constant electric
and magnetic fields. This part of the refractive index can, in principle,
contain CPT symmetry breaking terms. Some of the terms violate Lorentz
invariance, whereas the others violate locality and causality. Estimates of
these effects using laser experiments are considered.
Comment: 12 pages
Noncommutative Quantum Mechanics and rotating frames
We study the effect of noncommutativity of space on the physics of a quantum
interferometer located in a rotating disk in a gauge field background. To this
end, we develop a path-integral approach which allows defining an effective
action from which relevant physical quantities can be computed as in the usual
commutative case. For the specific case of a constant magnetic field, we are
able to compute, exactly, the noncommutative Lagrangian and the associated
shift on the interference pattern for any value of the noncommutativity
parameter.
Comment: 17 pages, presentation improved, references added. To appear in
Physical Review
Obstructive sleep apnoea in obese adolescents and cardiometabolic risk markers
WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT: In paediatric patients, obstructive sleep apnoea is associated with adiposity, especially visceral adiposity. In adults, obstructive sleep apnoea is also associated with a higher prevalence of cardiovascular disease and type 2 diabetes. There are limited and conflicting paediatric studies examining the association between obstructive sleep apnoea and biomarkers of risk for cardiovascular disease and type 2 diabetes in youth.
WHAT THIS STUDY ADDS: Obstructive sleep apnoea is linked with greater cardiometabolic risk markers in obese adolescents. Fasting insulin and homeostasis model assessment-insulin resistance may be especially linked with obstructive sleep apnoea among obese male Hispanic adolescents. The relationship between obstructive sleep apnoea and cardiometabolic abnormalities in obese adolescents should be considered when evaluating patients found to have obstructive sleep apnoea.
BACKGROUND: Paediatric studies examining the association between obstructive sleep apnoea (OSA) and insulin sensitivity/cardiometabolic risk are limited and conflicting.
OBJECTIVE: This study aims to determine if cardiometabolic risk markers are increased among obese youth with obstructive sleep apnoea as compared with their equally obese peers without OSA.
METHODS: We performed a retrospective analysis of 96 patients (age 14.2 ± 1.4 years) who underwent polysomnography for suspected OSA. Fasting lipids, glucose, insulin and haemoglobin A1c (HbA1c) were measured as part of routine clinical evaluation. Patients were categorized into two groups by degree of OSA as measured by the apnoea-hypopnoea index (AHI): none or mild OSA (AHI < 5) and moderate or severe OSA (AHI ≥ 5).
RESULTS: Despite the similar degrees of obesity, patients with moderate or severe OSA had higher fasting insulin (P = 0.037) and homeostasis model assessment-insulin resistance (HOMA-IR [P = 0.0497]) as compared with those with mild or no OSA. After controlling for body mass index, there was a positive association between the AHI and log HOMA-IR (P = 0.005). There was a positive relationship between arousals plus awakenings during the polysomnography and fasting triglycerides.
CONCLUSIONS: OSA is linked with greater cardiometabolic risk markers in obese youth.
ProbCD: enrichment analysis accounting for categorization uncertainty
As in many other areas of science, systems biology makes extensive use of statistical association and significance estimates in contingency tables, a type of categorical data analysis known in this field as enrichment (also over-representation or enhancement) analysis. In spite of efforts to create probabilistic annotations, especially in the Gene Ontology context, and to deal with uncertainty in high-throughput datasets, current enrichment methods largely ignore this probabilistic information, since they are mainly based on variants of the Fisher Exact Test. We developed ProbCD, an open-source R package for probabilistic categorical data analysis that does not require a static contingency table. The contingency table for the enrichment problem is built using the expectation of a Bernoulli-scheme stochastic process given the categorization probabilities. An on-line interface was created to allow usage by non-programmers and is available at: http://xerad.systemsbiology.net/ProbCD/. We present an analysis framework and software tools to address the issue of uncertainty in categorical data analysis. In particular, for enrichment analysis, ProbCD can accommodate: (i) the stochastic nature of high-throughput experimental techniques and (ii) probabilistic gene annotation.
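The two ingredients the abstract contrasts can be sketched in a few lines: the classical Fisher-style enrichment p-value that existing methods rely on, and an expected contingency table built from per-gene annotation probabilities, i.e. the expectation of the Bernoulli process the abstract describes. This is a minimal illustration, not ProbCD's actual API; the gene names and probabilities below are made up.

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """One-sided Fisher exact / hypergeometric p-value: probability of
    observing >= k annotated genes when n genes are drawn from a universe
    of N genes, K of which carry the annotation."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

def expected_contingency(p_annot, study_list):
    """Expected 2x2 table when annotation is probabilistic: each cell is
    the sum of Bernoulli success (or failure) probabilities, rather than
    a hard count. p_annot maps gene -> P(gene has the category);
    study_list is the set of genes in the list of interest."""
    a = sum(p for g, p in p_annot.items() if g in study_list)      # in list, in category
    b = sum(1 - p for g, p in p_annot.items() if g in study_list)  # in list, not in category
    c = sum(p for g, p in p_annot.items() if g not in study_list)  # not in list, in category
    d = sum(1 - p for g, p in p_annot.items() if g not in study_list)
    return [[a, b], [c, d]]

# Deterministic toy example: universe of 10 genes, 4 annotated,
# study list of 3 genes with an overlap of 2.
print(hypergeom_enrichment_p(10, 4, 3, 2))  # prints 0.3333333333333333
```

Note the design point: with hard 0/1 annotations `expected_contingency` reduces to the ordinary count table, so the probabilistic construction strictly generalizes the classical one.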