5,378 research outputs found
Background Dependent Lorentz Violation: Natural Solutions to the Theoretical Challenges of the OPERA Experiment
To explain both the OPERA experiment and all the known phenomenological
constraints/observations on Lorentz violation, the Background Dependent Lorentz
Violation (BDLV) has been proposed. We study the BDLV in a model-independent
way, and conjecture that there may exist a "Dream Special Relativity Theory",
where all the Standard Model (SM) particles can be subluminal due to the
background effects. Assuming that Lorentz violation on the Earth is much
larger than that on interstellar scales, we automatically evade all the
astrophysical constraints on Lorentz violation. For the BDLV from the effective
field theory, we present a simple model and discuss the possible solutions to
the theoretical challenges of the OPERA experiment such as the Bremsstrahlung
effects for muon neutrinos and pion decays. We also address the Lorentz
violation constraints from the LEP and KamLAND experiments. For the BDLV from
the Type IIB string theory with D3-branes and D7-branes, we point out that the
D3-branes are flavour blind, and all the SM particles are the conventional
particles as in the traditional SM when they do not interact with the
D3-branes. Thus, we can not only naturally avoid all the known phenomenological
constraints on Lorentz violation, but also naturally explain all the
theoretical challenges. Interestingly, the energy-dependent photon velocities
may be tested experimentally.

Comment: RevTex4, 14 pages, minor corrections, references added
High-Energy gamma-ray Astronomy and String Theory
There have been observations, first from the MAGIC Telescope (July 2005) and
quite recently (September 2008) from the FERMI Satellite Telescope, on
non-simultaneous arrival of high-energy photons from distant celestial sources.
In each case, the highest energy photons were delayed, as compared to their
lower-energy counterparts. Although the astrophysics at the source of these
energetic photons is still not understood, and such non-simultaneous arrival
might be due to non-simultaneous emission as a result of conventional physics
effects, nevertheless, rather surprisingly, the observed time delays can also
fit excellently some scenarios in quantum gravity, predicting Lorentz violating
space-time "foam" backgrounds with a non-trivial subluminal vacuum refractive
index suppressed linearly by a quantum gravity scale of the order of the
reduced Planck mass. In this pedagogical talk, I discuss the MAGIC and FERMI
findings in this context and argue that a theoretical model of space-time foam
in string/brane theory can accommodate the findings of those experiments
in agreement with all other stringent tests of Lorentz invariance. However, I
stress the current ambiguities/uncertainties on the source mechanisms, which
need to be resolved first before definite conclusions are reached regarding
quantum gravity foam scenarios.

Comment: 34 pages latex, 12 eps figures incorporated, uses special macros.
Based on invited plenary talk at the DICE 2008 Conference (Castiglioncello,
Italy), September 22-26, 2008
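A linearly suppressed refractive index of this kind implies a photon delay growing with energy and distance. As a rough illustration (my numbers, not the talk's, and ignoring cosmological expansion, which the full analyses include):

```python
# Back-of-the-envelope photon delay for a linearly suppressed
# vacuum refractive index n(E) ~ 1 + E/M_QG (subluminal case).
# Cosmological expansion is ignored; real analyses integrate over
# redshift. All numbers below are illustrative, not from the talk.

M_QG_GEV = 2.4e18          # ~ reduced Planck mass, in GeV
MPC_IN_KM = 3.086e19       # 1 Mpc in km
C_KM_S = 2.998e5           # speed of light, km/s

def delay_seconds(e_gev: float, distance_mpc: float) -> float:
    """Extra light-travel time of a photon of energy e_gev (GeV)
    over distance_mpc (Mpc), to first order in E/M_QG."""
    travel_time_s = distance_mpc * MPC_IN_KM / C_KM_S
    return (e_gev / M_QG_GEV) * travel_time_s

# A ~10 GeV photon from a source ~500 Mpc away arrives a fraction
# of a second late relative to a low-energy photon.
dt = delay_seconds(10.0, 500.0)
```

The delay is linear in energy, which is why comparing arrival times of the highest- and lowest-energy photons from the same flare is the natural observable.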
Stringy Space-Time Foam and High-Energy Cosmic Photons
In this review, I discuss briefly stringent tests of Lorentz-violating
quantum space-time foam models inspired by String/Brane theories, provided by
studies of high-energy photons from intense celestial sources, such as Active
Galactic Nuclei or Gamma Ray Bursts. The theoretical models predict
modifications to the radiation dispersion relations, which are quadratically
suppressed by the string mass scale, and time delays in the arrival times of
photons (assumed to be emitted more or less simultaneously from the source),
which are proportional to the photon energy, so that the more energetic photons
arrive later. Although the astrophysics at the source of these energetic
photons is still not understood, and such non-simultaneous arrivals, which have
been observed recently, might well be due to non-simultaneous emission as a
result of conventional physics effects, nevertheless, rather surprisingly, the
observed time delays can also fit excellently the stringy space-time foam
scenarios, provided the space-time defect foam is inhomogeneous. The key
features of the model, that allow it to evade a plethora of astrophysical
constraints on Lorentz violation, in sharp contrast to other field-theoretic
Lorentz-violating models of quantum gravity, are: (i) transparency of the foam
to electrons and in general charged matter, (ii) absence of birefringence
effects and (iii) a breakdown of the local effective Lagrangian formalism.

Comment: 26 pages Latex, 4 figures, uses special macros. Keynote Lecture at
the International Conference "Recent Developments in Gravity" (NEB14),
Ioannina (Greece), June 8-11 201
Mortality benefit of beta-blockade after successful elective percutaneous coronary intervention
Objectives: The goal of this study was to evaluate the mortality benefit of beta-blockers after successful percutaneous coronary intervention (PCI).

Background: Beta-blockers reduce mortality after myocardial infarction (MI), though limited data are available regarding their role after successful PCI.

Methods: Each year from 1993 through 1999, the first 1,000 consecutive patients undergoing PCI were systematically followed up. Patients presenting with acute or recent MI, shock, or unsuccessful revascularization procedures were excluded from the analysis. Clinical, procedural, and follow-up data of beta-blocker-treated and non-beta-blocker-treated patients were compared. A multivariate survival analysis model using propensity analysis was used to adjust for heterogeneity between the two groups.

Results: Of the 4,553 patients, 2,056 (45%) were treated with beta-blockers at the time of the procedure. Beta-blocker therapy was associated with a mortality reduction from 1.3% to 0.8% at 30 days (p = 0.13) and from 6.0% to 3.9% at one year (p = 0.0014). This survival benefit was independent of left ventricular function, diabetic status, history of hypertension, and history of MI. Using propensity analysis, beta-blocker therapy remained an independent predictor of one-year survival after PCI (hazard ratio, 0.63; 95% confidence interval, 0.46 to 0.87; p = 0.0054).

Conclusions: Within this large prospective registry, beta-blocker use was associated with a marked long-term survival benefit among patients undergoing successful elective percutaneous coronary revascularization.
Probing quantum gravity using photons from a flare of the active galactic nucleus Markarian 501 observed by the MAGIC telescope
We analyze the timing of photons observed by the MAGIC telescope during a
flare of the active galactic nucleus Mkn 501 for a possible correlation with
energy, as suggested by some models of quantum gravity (QG), which predict a
vacuum refractive index \simeq 1 + (E/M_{QGn})^n, n = 1, 2. Parametrizing the
delay between gamma-rays of different energies as \Delta t = \pm\tau_l E or
\Delta t = \pm\tau_q E^2, we find \tau_l = (0.030 \pm 0.012) s/GeV at the 2.5-sigma
level, and \tau_q = (3.71 \pm 2.57) x 10^{-6} s/GeV^2, respectively. We use these
results to establish lower limits M_{QG1} > 0.21x10^{18} GeV and M_{QG2} >
0.26x10^{11} GeV at the 95% C.L. Monte Carlo studies confirm the MAGIC
sensitivity to propagation effects at these levels. Thermal plasma effects in
the source are negligible, but we cannot exclude the importance of some other
source effect.

Comment: 12 pages, 3 figures, Phys. Lett. B, reflects published version
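For a source at distance D, the linear parametrization above corresponds to \tau_l \simeq D/(c M_{QG1}), so the fitted \tau_l translates into a lower bound on M_{QG1}. A rough order-of-magnitude check, using an approximate distance to Mkn 501 and ignoring the cosmological corrections included in the paper's likelihood analysis:

```python
# Order-of-magnitude check of the linear quantum-gravity scale limit
# implied by the tau_l parametrization Delta t = tau_l * E.
# For a source at distance D, tau_l ~ D / (c * M_QG1), so an upper
# bound on tau_l gives a lower bound on M_QG1. The distance used
# for Mkn 501 and the neglect of cosmological corrections make this
# a rough sketch, not the paper's full analysis.

MPC_IN_S = 1.029e14        # light-travel time across 1 Mpc, in seconds
D_MKN501_MPC = 150.0       # approximate distance to Mkn 501 (z ~ 0.034)

tau_l = 0.030              # s/GeV, best fit from the abstract
sigma = 0.012              # s/GeV, 1-sigma uncertainty
tau_l_95 = tau_l + 1.64 * sigma   # one-sided 95% upper bound

m_qg1_lower = D_MKN501_MPC * MPC_IN_S / tau_l_95   # GeV
```

This lands at a few times 10^17 GeV, the same order of magnitude as the quoted limit M_QG1 > 0.21 x 10^18 GeV.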
An Alternative Interpretation of Statistical Mechanics
In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a consequence which suggests interesting possibilities for developing non-equilibrium statistical mechanics and investigating inter-theoretic answers to the foundational questions of statistical mechanics.
Quantifying loopy network architectures
Biology presents many examples of planar distribution and structural networks
having dense sets of closed loops. An archetype of this form of network
organization is the vasculature of dicotyledonous leaves, which showcases a
hierarchically-nested architecture containing closed loops at many different
levels. Although a number of methods have been proposed to measure aspects of
the structure of such networks, a robust metric to quantify their hierarchical
organization is still lacking. We present an algorithmic framework, the
hierarchical loop decomposition, that allows mapping loopy networks to binary
trees, preserving in the connectivity of the trees the architecture of the
original graph. We apply this framework to investigate computer generated
graphs, such as artificial models and optimal distribution networks, as well as
natural graphs extracted from digitized images of dicotyledonous leaves and
vasculature of rat cerebral neocortex. We calculate various metrics based on
the Asymmetry, the cumulative size distribution and the Strahler bifurcation
ratios of the corresponding trees and discuss the relationship of these
quantities to the architectural organization of the original graphs. This
algorithmic framework decouples the geometric information (exact location of
edges and nodes) from the metric topology (connectivity and edge weight) and it
ultimately allows us to perform a quantitative statistical comparison between
predictions of theoretical models and naturally occurring loopy graphs.

Comment: 17 pages, 8 figures. During preparation of this manuscript the
authors became aware of the work of Mileyko et al., concurrently submitted
for publication
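The Strahler bifurcation ratios mentioned above are built from the Strahler orders of the binary trees produced by the decomposition. A minimal sketch of the order computation, on an illustrative tuple-based tree representation (not the paper's data structure):

```python
# Strahler order of a binary tree, one of the quantities the paper
# computes on trees produced by hierarchical loop decomposition.
# Representation is illustrative: a leaf is any non-tuple value,
# an internal node is a pair (left, right).

def strahler(node) -> int:
    """Strahler number: leaves have order 1; an internal node takes
    the max of its children's orders, +1 if the two are equal."""
    if not isinstance(node, tuple):      # leaf
        return 1
    left, right = node
    a, b = strahler(left), strahler(right)
    return a + 1 if a == b else max(a, b)

# A perfectly balanced tree of depth d has Strahler order d + 1,
# while a maximally asymmetric "caterpillar" tree stays at order 2.
balanced = ((0, 0), (0, 0))      # depth 2 -> order 3
caterpillar = (((0, 0), 0), 0)   # order 2
```

The contrast between the two example trees is exactly the kind of asymmetry signal that bifurcation ratios summarize across a whole hierarchy.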
GLAST and Lorentz violation
We study possible Lorentz violations by means of gamma-ray bursts (GRBs) with
special focus on the Large Area Telescope (LAT) of GLAST. We simulate bursts
with gtobssim and introduce a Lorentz violating term in the arrival times of
the photons. We further perturb these arrival times and energies with
Gaussian distributions corresponding to the time and energy resolution of
GLAST. We then vary the photon flux in gtobssim in order to derive a relation
between the photon number and the standard deviation of the Lorentz violating
term. We conclude that the maximum-likelihood method first developed in [1]
can determine whether Nature breaks Lorentz symmetry, provided the number of
bursts with known redshifts is of the order of 100.

Comment: 13 pages, 8 figures and 2 tables. Accepted for publication by JCAP
after a couple of revisions
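The perturbation step described above (a linear Lorentz-violating delay plus Gaussian smearing of times and energies) can be sketched as a toy simulation; gtobssim itself is not reproduced, and all parameter values are illustrative:

```python
# Toy version of the perturbation step: add a linear Lorentz-violating
# delay tau_l * E to each photon's arrival time, then smear times and
# energies with Gaussians mimicking detector resolution. gtobssim is
# not reproduced here, and the numbers are made up.

import random

def perturb(photons, tau_l, sigma_t, sigma_e_frac, rng):
    """photons: list of (t_arrival_s, e_gev) pairs. Returns the list
    with the LV delay and Gaussian detector smearing applied."""
    out = []
    for t, e in photons:
        t_lv = t + tau_l * e                    # Lorentz-violating delay
        t_obs = rng.gauss(t_lv, sigma_t)        # timing resolution
        e_obs = rng.gauss(e, sigma_e_frac * e)  # energy resolution
        out.append((t_obs, e_obs))
    return out

rng = random.Random(42)
# A fake burst: 1000 photons over 10 s, energies 0.1-50 GeV.
burst = [(rng.uniform(0.0, 10.0), rng.uniform(0.1, 50.0)) for _ in range(1000)]
observed = perturb(burst, tau_l=0.03, sigma_t=1e-5, sigma_e_frac=0.1, rng=rng)
```

A likelihood analysis would then fit tau_l from the energy-time correlation in `observed`, averaging over many such simulated bursts to relate photon counts to the achievable sensitivity.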
A dual-chamber method for quantifying the effects of atmospheric perturbations on secondary organic aerosol formation from biomass burning emissions
Biomass burning (BB) is a major source of atmospheric pollutants. Field and laboratory studies indicate that secondary organic aerosol (SOA) formation from BB emissions is highly variable. We investigated sources of this variability using a novel dual-smog-chamber method that directly compares the SOA formation from the same BB emissions under two different atmospheric conditions. During each experiment, we filled two identical Teflon smog chambers simultaneously with BB emissions from the same fire. We then perturbed the smoke with UV lights, UV lights plus nitrous acid (HONO), or dark ozone in one or both chambers. These perturbations caused SOA formation in nearly every experiment, with an average organic aerosol (OA) mass enhancement ratio of 1.78 ± 0.91 (mean ± 1σ). However, the effects of the perturbations were highly variable, with OA mass enhancement ratios ranging from 0.7 (a 30% loss of OA mass) to 4.4 across the set of perturbation experiments. There was no apparent relationship between OA enhancement and perturbation type, fuel type, or modified combustion efficiency. To better isolate the effects of different perturbations, we report the dual-chamber enhancement (DUCE), which quantifies the effect of a perturbation relative to a reference condition. DUCE values were also highly variable, even for the same perturbation and fuel type. Gas measurements indicate substantial burn-to-burn variability in the magnitude and composition of SOA precursor emissions, even in repeated burns of the same fuel under nominally identical conditions. Therefore, the effects of different atmospheric perturbations on SOA formation from BB emissions appear to be less important than burn-to-burn variability.
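The OA mass enhancement ratio and DUCE bookkeeping can be sketched as follows. Taking DUCE as the ratio of the perturbed chamber's enhancement to the reference chamber's is an assumption made for illustration (the paper defines the quantity precisely), and all concentrations below are invented:

```python
# Sketch of the enhancement-ratio bookkeeping described above.
# OA enhancement ratio = OA mass after perturbation / OA mass before.
# DUCE is taken here as the ratio of the perturbed chamber's
# enhancement to the reference chamber's; this functional form is an
# assumption for illustration, and the concentrations are invented.

def enhancement_ratio(oa_before_ug_m3: float, oa_after_ug_m3: float) -> float:
    """OA mass enhancement ratio; values below 1 mean net OA loss."""
    return oa_after_ug_m3 / oa_before_ug_m3

def duce(er_perturbed: float, er_reference: float) -> float:
    """Dual-chamber enhancement: perturbed relative to reference."""
    return er_perturbed / er_reference

# Example: chamber A (UV lights) grows OA from 20 to 35 ug/m^3,
# while chamber B (dark reference) drifts from 20 to 21 ug/m^3.
er_a = enhancement_ratio(20.0, 35.0)   # 1.75, within the reported range
er_b = enhancement_ratio(20.0, 21.0)   # ~1.05
duce_ab = duce(er_a, er_b)
```

Normalizing against the paired reference chamber is what lets the method separate the perturbation's effect from whatever the same smoke would have done on its own.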
Multi-Messenger Gravitational Wave Searches with Pulsar Timing Arrays: Application to 3C66B Using the NANOGrav 11-year Data Set
When galaxies merge, the supermassive black holes in their centers may form
binaries and, during the process of merger, emit low-frequency gravitational
radiation. In this paper we consider the galaxy 3C66B, which was
used as the target of the first multi-messenger search for gravitational waves.
Due to the observed periodicities present in the photometric and astrometric
data of the source, it has been theorized to contain a
supermassive black hole binary. Its apparent 1.05-year orbital period would
place the gravitational wave emission directly in the pulsar timing band. Since
the first pulsar timing array study of 3C66B, revised models of the source have
been published, and timing array sensitivities and techniques have improved
dramatically. With these advances, we further constrain the chirp mass of the
potential supermassive black hole binary in 3C66B to less than using data from the NANOGrav 11-year data set. This
upper limit provides a factor of 1.6 improvement over previous limits, and a
factor of 4.3 over the first search. Nevertheless, the most recent orbital
model for the source is still consistent with our limit from pulsar timing
array data. In addition, we are able to quantify the improvement made by the
inclusion of source properties gleaned from electromagnetic data to `blind'
pulsar timing array searches. With these methods, it is apparent that it is not
necessary to obtain exact a priori knowledge of the period of a binary to gain
meaningful astrophysical inferences.

Comment: 14 pages, 6 figures. Accepted by Ap
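The claim that a 1.05-year orbital period places the emission in the pulsar timing band follows because a circular binary radiates gravitational waves at twice its orbital frequency, which lands in the roughly 1-100 nHz range probed by PTAs:

```python
# Quick check that a 1.05-year orbital period lands in the pulsar
# timing band: a circular binary radiates gravitational waves at
# twice its orbital frequency, and PTAs are sensitive to roughly
# 1-100 nHz.

YEAR_S = 3.156e7            # seconds per year

p_orb_s = 1.05 * YEAR_S     # apparent orbital period of 3C66B
f_gw_hz = 2.0 / p_orb_s     # GW frequency for a circular orbit

f_gw_nhz = f_gw_hz * 1e9    # tens of nHz, inside the PTA band
```

That the signal frequency is fixed by electromagnetic observations is what distinguishes this targeted search from the `blind' searches the paper compares against.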