Efficient CSL Model Checking Using Stratification
For continuous-time Markov chains, the model-checking problem with respect to
continuous-time stochastic logic (CSL) has been introduced and shown to be
decidable by Aziz, Sanwal, Singhal and Brayton in 1996. Their proof can be
turned into an approximation algorithm with worse than exponential complexity.
In 2000, Baier, Haverkort, Hermanns and Katoen presented an efficient
polynomial-time approximation algorithm for the sublogic in which only binary
until is allowed. In this paper, we propose such an efficient polynomial-time
approximation algorithm for full CSL. The key to our method is the notion of
stratified CTMCs with respect to the CSL property to be checked. On a
stratified CTMC, the probability to satisfy a CSL path formula can be
approximated by a transient analysis in polynomial time (using uniformization).
We present a measure-preserving, linear-time and -space transformation of any
CTMC into an equivalent, stratified one. This makes the present work the
centerpiece of a broadly applicable full CSL model checker. Recently, the
decision algorithm by Aziz et al. was shown to work only for stratified CTMCs.
As an additional contribution, our measure-preserving transformation can be
used to ensure decidability for general CTMCs.
Comment: 18 pages, preprint for LMCS. An extended abstract appeared in ICALP 201
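The transient analysis mentioned above is the standard uniformization procedure. The following is a minimal sketch of that step on a toy generator matrix; the matrix Q, initial distribution, time horizon and tolerance are illustrative and not taken from the paper.

```python
# Minimal sketch of uniformization-based transient analysis for a CTMC.
# Q, pi0, t and eps below are illustrative placeholders.
import numpy as np

def transient_distribution(Q, pi0, t, eps=1e-10):
    """Approximate pi(t) = pi0 * exp(Q*t) via uniformization."""
    Q = np.asarray(Q, dtype=float)
    pi0 = np.asarray(pi0, dtype=float)
    lam = -Q.diagonal().min()          # uniformization rate Lambda = max_i |Q_ii|
    if lam == 0.0:                     # every state absorbing: nothing moves
        return pi0.copy()
    P = np.eye(Q.shape[0]) + Q / lam   # DTMC subordinated to a Poisson(Lambda) clock
    v = pi0.copy()
    weight = np.exp(-lam * t)          # Poisson weight for k = 0 jumps
    total = weight
    result = weight * v
    k = 0
    # Note: for large Lambda*t the Poisson weights should be computed in a
    # numerically stable way (e.g. Fox-Glynn); this plain loop is fine for toys.
    while 1.0 - total > eps:           # stop once the neglected Poisson tail < eps
        k += 1
        v = v @ P
        weight *= lam * t / k
        total += weight
        result += weight * v
    return result

# Toy 3-state chain: 0 -> 1 at rate 2, 1 -> 2 at rate 1, state 2 absorbing.
Q = [[-2.0,  2.0, 0.0],
     [ 0.0, -1.0, 1.0],
     [ 0.0,  0.0, 0.0]]
print(transient_distribution(Q, [1.0, 0.0, 0.0], t=1.5))
```

On a stratified CTMC, the probability of a CSL path formula is approximated by such transient computations.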
Suspension of the fiber mode-cleaner launcher and measurement of the high extinction-ratio (10^{-9}) ellipsometer for the Q & A experiment
The Q & A experiment, first proposed and started in 1994, provides a feasible
way of exploring the quantum vacuum through the detection of the vacuum
birefringence effect generated by QED loop diagrams and the detection of the
polarization rotation effect generated by photon-interacting (pseudo-)scalar
particles. Three main parts of the experiment are: (1) Optics System (including
associated Electronic System) based on a suspended 3.5-m high finesse
Fabry-Perot cavity, (2) Ellipsometer using ultra-high extinction-ratio
polarizer and analyzer, and (3) Magnetic Field Modulation System for generating
the birefringence and the polarization rotation effect. In 2002, the Q & A
experiment achieved its Phase I sensitivity goal. During Phase II, we set out (i)
to improve the control system of the cavity mirrors to suppress the
relative motion noise, (ii) to enhance the birefringence signal by setting up a
60-cm long 2.3 T transverse permanent magnet rotatable to 10 rev/s, (iii) to
reduce geometrical noise by inserting a polarization-maintaining optical fiber
(PM fiber) as a mode cleaner, and (iv) to use ultra-high extinction-ratio
(10^{-9}) polarizer and analyzer for ellipsometry. Here we report on (iii) &
(iv); specifically, we present the properties of the PM-fiber mode-cleaner, the
transfer function of its suspension system, and the result of our measurement
of the high extinction-ratio polarizer and analyzer.
Comment: 8 pages, 6 figures; presented at the 6th Edoardo Amaldi Conference on Gravitational Waves, Okinawa, Japan, June 2005, and accepted by "Journal of Physics: Conference Series". Modifications from version 2 were made based on the referees' comments on figures. Ref. [31] was updated.
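For context, the QED vacuum birefringence that such an ellipsometer targets is commonly written as follows; this is the standard single-pass expression for light polarized at an angle θ to a transverse field B over a path of length L, and the Fabry-Perot cavity enhances the induced ellipticity by roughly 2F/π with F the finesse (general textbook relations, not figures from the Q & A apparatus):

\[
\Delta n = 3 A_e B^2, \qquad
A_e = \frac{2\alpha^2 \hbar^3}{45\, m_e^4 c^5} \approx 1.32\times 10^{-24}\ \mathrm{T}^{-2}, \qquad
\psi \simeq \frac{\pi L}{\lambda}\, \Delta n \, \sin 2\theta .
\]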
Ghost Busting: PT-Symmetric Interpretation of the Lee Model
The Lee model was introduced in the 1950s as an elementary quantum field
theory in which mass, wave function, and charge renormalization could be
carried out exactly. In early studies of this model it was found that there is
a critical value of g^2, the square of the renormalized coupling constant,
above which g_0^2, the square of the unrenormalized coupling constant, is
negative. Thus, for g^2 larger than this critical value, the Hamiltonian of the
Lee model becomes non-Hermitian. It was also discovered that in this
non-Hermitian regime a new state appears whose norm is negative. This state is
called a ghost state. It has always been assumed that in this ghost regime the
Lee model is an unacceptable quantum theory because unitarity appears to be
violated. However, in this regime while the Hamiltonian is not Hermitian, it
does possess PT symmetry. It has recently been discovered that a non-Hermitian
Hamiltonian having PT symmetry may define a quantum theory that is unitary. The
proof of unitarity requires the construction of a new time-independent operator
called C. In terms of C one can define a new inner product with respect to
which the norms of the states in the Hilbert space are positive. Furthermore,
it has been shown that time evolution in such a theory is unitary. In this
paper the C operator for the Lee model in the ghost regime is constructed
exactly in the V/N-theta sector. It is then shown that the ghost state has a
positive norm and that the Lee model is an acceptable unitary quantum field
theory for all values of g^2.
Comment: 20 pages, 9 figures
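For reference, the general defining conditions on the C operator and the inner product it induces in PT-symmetric quantum theory are the standard ones below (they are not the Lee-model-specific construction carried out in the paper):

\[
\mathcal{C}^2 = 1, \qquad [\mathcal{C},\, PT] = 0, \qquad [\mathcal{C},\, H] = 0,
\]
\[
\langle \psi \,|\, \chi \rangle_{\mathcal{C}PT} = \int \mathrm{d}x\, \big[\mathcal{C}PT\,\psi(x)\big]\, \chi(x),
\]

with respect to which all state norms are positive and the time evolution e^{-iHt} is unitary.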
Search for WW and WZ production in lepton plus jets final state at CDF
We present a search for WW and WZ production in final states that contain a charged lepton (electron or muon) and at least two jets, produced in sqrt(s) = 1.96 TeV ppbar collisions at the Fermilab Tevatron, using data corresponding to 1.2 fb^{-1} of integrated luminosity collected with the CDF II detector. Diboson production in this decay channel has yet to be observed at hadron colliders due to the large single W plus jets background. An artificial neural network has been developed to increase signal sensitivity, as compared with an event selection based on conventional cuts. We set a 95% confidence level upper limit of sigma_{WW}*BR(W->lnu,W->jets) + sigma_{WZ}*BR(W->lnu,Z->jets) < 2.88 pb, which is consistent with the standard model next-to-leading-order cross section calculation for this decay channel of 2.09 ± 0.12 pb.
Peer reviewed
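As a rough illustration of the neural-network approach, a small classifier can combine several kinematic variables into a single discriminant; the features, toy event samples and network below are hypothetical stand-ins, not the CDF analysis.

```python
# Illustrative only: a small neural network separating a "diboson-like" toy
# signal from a "W+jets-like" toy background using two made-up kinematic
# features (a dijet-mass-like and a transverse-mass-like variable).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 5000
sig = np.column_stack([rng.normal(85.0, 10.0, n),        # peaked near the W/Z mass
                       rng.normal(80.0, 15.0, n)])
bkg = np.column_stack([rng.exponential(60.0, n) + 30.0,  # broad falling spectrum
                       rng.normal(70.0, 25.0, n)])
X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(X, y)
# The network output can then serve as a single discriminant (for a cut or a
# binned fit) instead of rectangular cuts on each variable separately.
print("training accuracy:", clf.score(X, y))
```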
Relationship between models of care and key rehabilitation milestones following unilateral transtibial amputation: a national cross-sectional study
General analysis of signals with two leptons and missing energy at the Large Hadron Collider
A signal of two leptons and missing energy is challenging to analyze at the
Large Hadron Collider (LHC) since it offers only a few kinematic handles. This
signature generally arises from pair production of heavy charged particles
which each decay into a lepton and a weakly interacting stable particle. Here
this class of processes is analyzed with minimal model assumptions by
considering all possible combinations of spin 0, 1/2 or 1, and of weak
iso-singlets, -doublets or -triplets for the new particles. Adding to existing
work on mass and spin measurements, two new variables for spin determination
and an asymmetry for the determination of the couplings of the new particles
are introduced. It is shown that these observables allow one to independently
determine the spin and the couplings of the new particles, except for a few
cases that turn out to be indistinguishable at the LHC. These findings are
corroborated by results of an alternative analysis strategy based on an
automated likelihood test.
Comment: 18 pages, 3 figures, LaTeX
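As a generic example of the kind of counting asymmetry used for coupling determinations (not necessarily the specific observable defined in the paper), one compares event counts above and below a reference value x_0 of a suitable kinematic quantity x, such as a lepton energy or angle:

\[
A = \frac{N(x > x_0) - N(x < x_0)}{N(x > x_0) + N(x < x_0)},
\]

whose value is sensitive to how the new particles couple.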
Current conservation in two-dimensional AC-transport
The conservation of electric current in a two-dimensional quantum wire under a
time-dependent field is investigated. Current conservation is obtained when the
global density of states contribution to the emittance is balanced by the
contribution due to the internal charge response inside the sample. However,
when the global partial density of states is calculated approximately using the
scattering matrix alone, correction terms are needed to obtain precise current
conservation. We have derived these corrections analytically using a specific
two-dimensional system. We found that when the incident energy is near the
first subband, our result reduces to the one-dimensional result. As the
incident energy approaches the n-th subband with n > 1, the correction term
diverges. This explains the systematic deviation from precise current
conservation observed in a previous numerical calculation.
Comment: 12 pages, LaTeX; submitted to Phys. Rev.
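For reference, the scattering-matrix approximation referred to here is the standard relation between the global density of states and the scattering matrix S(E),

\[
\frac{\mathrm{d}N(E)}{\mathrm{d}E} \approx \frac{1}{2\pi i}\,
\mathrm{Tr}\!\left[ S^{\dagger}(E)\, \frac{\partial S(E)}{\partial E} \right],
\]

which, as the abstract notes, is only approximate; the corrections derived in the paper quantify the difference.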
On the Schoenberg Transformations in Data Analysis: Theory and Illustrations
The class of Schoenberg transformations, embedding Euclidean distances into
higher dimensional Euclidean spaces, is presented, and derived from theorems on
positive definite and conditionally negative definite matrices. Original
results on the arc lengths, angles and curvature of the transformations are
proposed, and visualized on artificial data sets by classical multidimensional
scaling. A simple distance-based discriminant algorithm illustrates the theory,
which is intimately connected to the Gaussian kernels of machine learning.
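A minimal sketch of this pipeline, assuming one particular member of the Schoenberg class, φ(d) = 1 − exp(−a d), applied componentwise to squared Euclidean distances (the data set and the parameter a are illustrative):

```python
# Schoenberg-transform a matrix of squared Euclidean distances and embed the
# result by classical multidimensional scaling (MDS).
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))                    # artificial data in R^3

# Squared Euclidean distances D_ij = ||x_i - x_j||^2
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)

a = 0.5
D_tilde = 1.0 - np.exp(-a * D)                  # Schoenberg-transformed dissimilarities

# Classical MDS: double centering followed by an eigendecomposition.
n = D_tilde.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ D_tilde @ J
eigval, eigvec = np.linalg.eigh(B)
top = np.argsort(eigval)[::-1][:2]              # two leading coordinates
Y = eigvec[:, top] * np.sqrt(np.maximum(eigval[top], 0.0))
print(Y.shape)                                  # (50, 2) embedding, ready for plotting
```

Since exp(−a d) with d a squared Euclidean distance is exactly a Gaussian kernel, this particular transformation makes the link to kernel methods explicit.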
A multi-year methane inversion using SCIAMACHY, accounting for systematic errors using TCCON measurements
This study investigates the use of total column CH4 (XCH4) retrievals
from the SCIAMACHY satellite instrument for quantifying large-scale emissions
of methane. A unique data set from SCIAMACHY is available spanning almost a
decade of measurements, covering a period when the global CH4 growth rate
showed a marked transition from stable to increasing mixing ratios. The TM5
4DVAR inverse modelling system has been used to infer CH4 emissions from a
combination of satellite and surface measurements for the period 2003–2010.
In contrast to earlier inverse modelling studies, the SCIAMACHY retrievals
have been corrected for systematic errors using the TCCON network of ground-based Fourier transform spectrometers. The aim is to further investigate the
role of bias correction of satellite data in inversions. Methods for bias
correction are discussed, and the sensitivity of the optimized emissions to
alternative bias correction functions is quantified. It is found that the use
of SCIAMACHY retrievals in TM5 4DVAR increases the estimated inter-annual
variability of large-scale fluxes by 22% compared with the use of only
surface observations. The difference in global methane emissions between 2-year periods before and after July 2006 is estimated at 27–35 Tg yr^{-1}. The use
of SCIAMACHY retrievals causes a shift in the emissions from the
extra-tropics to the tropics of 50 ± 25 Tg yr^{-1}. The large uncertainty in
this value arises from the uncertainty in the bias correction functions.
Using measurements from the HIPPO and BARCA aircraft campaigns, we show that
systematic errors in the SCIAMACHY measurements are a main factor limiting
the performance of the inversions. To further constrain tropical emissions of
methane using current and future satellite missions, extended validation
capabilities in the tropics are of critical importance.
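For orientation, 4D-Var systems of this kind estimate the emissions (the state x) by minimizing a cost function of the generic variational form below; the bias correction discussed above enters through the term mapping the state to the satellite observations (this is the standard form, not the exact TM5 4DVAR configuration):

\[
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
+ \tfrac{1}{2}\sum_{i}\big(\mathbf{y}_i - H_i(\mathbf{x})\big)^{\mathsf T}\mathbf{R}_i^{-1}\big(\mathbf{y}_i - H_i(\mathbf{x})\big),
\]

where x_b is the prior emission estimate with error covariance B, y_i are the surface and SCIAMACHY observations in time window i with error covariance R_i, and H_i is the observation operator mapping emissions to the observed mixing ratios.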
Mean Interplanetary Magnetic Field Measurement Using the ARGO-YBJ Experiment
The sun blocks cosmic ray particles arriving from outside the solar system,
forming a detectable shadow in the sky map of cosmic rays detected by the
ARGO-YBJ experiment in Tibet. Because the cosmic ray particles are positively
charged, the magnetic field between the sun and the earth deflects them from
straight trajectories and shifts the shadow away from the true location of the
sun. Here we show that this shift measures the intensity of the field
transported by the solar wind from the sun to the earth.
Comment: 6 pages, 3 figures
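The scaling behind this measurement is the small-angle magnetic deflection accumulated between the sun and the earth; in the idealized expression below (not the full treatment in the paper), a particle of charge Ze and momentum p is deflected by

\[
\delta\theta \simeq \frac{Ze}{p}\int B_{\perp}\,\mathrm{d}l \;\propto\; \frac{1}{R},
\qquad R \equiv \frac{pc}{Ze},
\]

so the displacement of the shadow falls off with rigidity R, and a shift measured at the median rigidity of the detected cosmic rays determines the line-integrated transverse field along the sun-earth path.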