4,836 research outputs found
Quantum Bit Commitment with a Composite Evidence
Entanglement-based attacks, which are subtle and powerful, are usually
believed to render quantum bit commitment insecure. We point out that the no-go
argument leading to this view implicitly assumes the evidence-of-commitment to
be a monolithic quantum system. We argue that more general evidence structures,
allowing for a composite, hybrid (classical-quantum) evidence, conduce to
improved security. In particular, we present and prove the security of the
following protocol: Bob sends Alice an anonymous state. She inscribes her
commitment by measuring part of it in the + (for 0) or × (for 1) basis. She
then communicates to him the (classical) measurement outcome and the
part-measured anonymous state, interpolated into other, randomly prepared
qubits, as her evidence-of-commitment.
Comment: 6 pages, minor changes, journal reference added
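A minimal toy sketch of the commitment-encoding step described above, assuming the "+" basis means {|0>, |1>} and the "×" basis means {|+>, |->}; the state preparation, helper names, and the omission of the interpolation/evidence step are simplifications for illustration, not the paper's protocol.

# Toy sketch: Alice encodes her commitment bit by her choice of measurement
# basis on a qubit Bob sent her. Names and state preparation are illustrative.
import numpy as np

rng = np.random.default_rng()

def random_qubit():
    """Haar-random single-qubit state, standing in for Bob's anonymous state."""
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    return v / np.linalg.norm(v)

def measure(state, commit_bit):
    """Measure in the Z ('+') basis for bit 0, or the X ('x') basis for bit 1."""
    if commit_bit == 0:
        basis = np.eye(2, dtype=complex)                                  # {|0>, |1>}
    else:
        basis = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # {|+>, |->}
    probs = np.abs(basis.conj() @ state) ** 2
    outcome = rng.choice(2, p=probs / probs.sum())
    return outcome, basis[outcome]   # classical outcome + post-measurement state

psi = random_qubit()
outcome, post_state = measure(psi, commit_bit=1)
print("classical outcome sent to Bob:", outcome)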
Quantum Communications with Compressed Decoherence Using Bright Squeezed Light
We propose a scheme for long-distance distribution of quantum entanglement in
which the entanglement between qubits at intermediate stations of the channel
is established by using bright light pulses in squeezed states coupled to the
qubits in cavities with a weak dispersive interaction. The fidelity of the
entanglement between qubits at the neighbor stations (10 km apart from each
other) obtained by postselection through the balanced homodyne detection of 7
dB squeezed pulses can reach F=0.99 without using entanglement purification; at
the same time, the probability of successful generation of entanglement is 0.34.
Comment: 4 pages, 2 figures
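For orientation (a standard conversion, not a result of the paper): the quadrature noise variance of an S dB squeezed pulse relative to the vacuum level is 10^(-S/10), so 7 dB corresponds to roughly a five-fold noise reduction.

# Reference conversion: quadrature variance of an S-dB squeezed state
# relative to vacuum, V = 10**(-S/10).
def squeezed_variance(db):
    return 10 ** (-db / 10)

print(squeezed_variance(7.0))   # ~0.20, i.e. about 5x below the vacuum noise level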
Minimum-error discrimination between symmetric mixed quantum states
We provide a solution to the problem of finding the optimal measurement
strategy for distinguishing between symmetric mixed quantum states. It is
assumed that the matrix elements of at least one of the symmetric quantum
states are all real and nonnegative in the basis of the eigenstates of the
symmetry operator.
Comment: 10 pages
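The following is an illustrative two-state special case rather than the paper's general construction for N symmetric states: the minimum-error (Helstrom) measurement for two mixed qubit states. The example states, priors, and the choice of Z as symmetry operator are arbitrary choices for this sketch.

# Two-state minimum-error discrimination: project onto the positive part of
# Gamma = p0*rho0 - p1*rho1. Example states are related by the symmetry
# operator Z, and rho0 has real, nonnegative elements in Z's eigenbasis.
import numpy as np

def helstrom(p0, rho0, p1, rho1):
    gamma = p0 * rho0 - p1 * rho1
    vals, vecs = np.linalg.eigh(gamma)
    pi0 = sum((np.outer(v, v.conj()) for w, v in zip(vals, vecs.T) if w > 0),
              np.zeros_like(gamma))
    pi1 = np.eye(len(rho0)) - pi0
    p_success = p0 * np.trace(pi0 @ rho0).real + p1 * np.trace(pi1 @ rho1).real
    return pi0, pi1, p_success

plus = np.array([[0.5, 0.5], [0.5, 0.5]])
minus = np.array([[0.5, -0.5], [-0.5, 0.5]])
rho0 = 0.8 * plus + 0.2 * np.eye(2) / 2
rho1 = 0.8 * minus + 0.2 * np.eye(2) / 2
print("optimal success probability:", helstrom(0.5, rho0, 0.5, rho1)[2])  # 0.9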
Light atom quantum oscillations in UC and US
High energy vibrational scattering in the binary systems UC and US is
measured using time-of-flight inelastic neutron scattering. A clear set of
well-defined peaks equally separated in energy is observed in UC, corresponding
to harmonic oscillations of the light C atoms in a cage of heavy U atoms. The
scattering is much weaker in US and only a few oscillator peaks are visible. We
show how the difference between the materials can be understood by considering
the neutron scattering lengths and masses of the lighter atoms. Monte Carlo ray
tracing is used to simulate the scattering, with near quantitative agreement
with the data in UC, and some differences with US. The possibility of observing
anharmonicity and anisotropy in the potentials of the light atoms is
investigated in UC. Overall, the observed data are well accounted for by
treating each light atom as a single-atom isotropic quantum harmonic
oscillator.
Comment: 10 pages, 8 figures
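A back-of-the-envelope comparison (not taken from the paper) of why the light-atom oscillator peaks are so much stronger in UC than in US: the intensity of a mode dominated by the light atom scales roughly with its neutron scattering cross-section divided by its mass. The cross-sections below are approximate standard tabulated values.

# Rough intensity weight sigma/M for the light atoms; cross-sections in barns
# (approximate tabulated values), masses in atomic mass units.
sigma = {"C": 5.55, "S": 1.03, "U": 8.9}
mass = {"C": 12.01, "S": 32.06, "U": 238.03}

weight = {el: sigma[el] / mass[el] for el in sigma}
print("C weight / S weight ~", weight["C"] / weight["S"])   # roughly 14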
Universally valid reformulation of the Heisenberg uncertainty principle on noise and disturbance in measurement
The Heisenberg uncertainty principle states that the product of the noise in
a position measurement and the momentum disturbance caused by that measurement
should be no less than the limit set by Planck's constant, hbar/2, as
demonstrated by Heisenberg's thought experiment using a gamma-ray microscope.
Here I show that this common assumption is false: a universally valid trade-off
relation between the noise and the disturbance has an additional correlation
term, which is redundant when the intervention brought by the measurement is
independent of the measured object, but which allows the noise-disturbance
product much below Planck's constant when the intervention is dependent. A
model of measuring interaction with dependent intervention shows that
Heisenberg's lower bound for the noise-disturbance product is violated even by
a nearly nondisturbing, precise position measuring instrument. An experimental
implementation is also proposed to realize the above model in the context of
optical quadrature measurement with currently available linear optical devices.
Comment: RevTeX, 6 pages
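For reference, the universally valid relation established in this work is usually quoted (in the notation of the later literature) in the three-term form below, where \epsilon(Q) is the root-mean-square noise of the position measurement, \eta(P) the root-mean-square momentum disturbance, and \sigma(\cdot) the standard deviation in the pre-measurement state:

\[
  \epsilon(Q)\,\eta(P) \;+\; \epsilon(Q)\,\sigma(P) \;+\; \sigma(Q)\,\eta(P) \;\ge\; \frac{\hbar}{2}.
\]

The additional terms are what allow the product \epsilon(Q)\eta(P) itself to drop below \hbar/2 when the intervention of the apparatus depends on the measured object, which is the situation exploited by the model described above.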
Feynman's Decoherence
Gell-Mann's quarks are coherent particles confined within a hadron at rest,
but Feynman's partons are incoherent particles which constitute a hadron moving
with a velocity close to that of light. It is widely believed that the quark
model and the parton model are two different manifestations of the same
covariant entity. If this is the case, the question arises whether the Lorentz
boost destroys coherence. It is pointed out that this is not the case, and it
is possible to resolve this puzzle without inventing new physics. It is shown
that this decoherence is due to measurement processes that are less than
complete.
Comment: RevTeX, 15 pages including 6 figures; presented at the 9th Int'l
Conference on Quantum Optics (Raubichi, Belarus, May 2002), to be published in
the proceedings
Global-local visual processing impacts risk taking behaviors, but only at first
We investigated the impact of early visual processing on decision-making during unpredictable, risky situations. Participants undertook Navon’s (1977) task and attended to either global letters or local letters only, following which they completed the Balloon Analogue Risk Task (BART). It was observed that global-focused individuals made more balloon pumps during the BART (i.e., took more risk), whereas local-focused individuals took less risk, albeit only initially. The theory of predictive and reactive control systems (PARCS) provides an excellent account of the data. Implications and future directions are discussed.
Quantum Detection with Unknown States
We address the problem of distinguishing among a finite collection of quantum
states, when the states are not entirely known. For completely specified
states, necessary and sufficient conditions on a quantum measurement minimizing
the probability of a detection error have been derived. In this work, we assume
that each of the states in our collection is a mixture of a known state and an
unknown state. We investigate two criteria for optimality. The first is
minimization of the worst-case probability of a detection error. For the second
we assume a probability distribution on the unknown states, and minimize the
expected probability of a detection error.
We find that under both criteria, the optimal detectors are equivalent to the
optimal detectors of an "effective ensemble". In the worst case, the effective
ensemble consists of the known states with altered prior probabilities, and in
the average case it consists of altered states with the original prior
probabilities.
Comment: Refereed version. Improved numerical examples and figures. A few
typos fixed
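A minimal sketch of the average-case observation, assuming each state has the form rho_i = (1-eps)*sigma_i + eps*tau_i with tau_i random: by linearity, the expected error probability depends only on the mean states, so the optimal detectors coincide with those of the averaged ("effective") ensemble with the original priors. The mixing parameter eps, the example states, and the uniform-average assumption E[tau] = I/2 below are illustrative, not taken from the paper.

# Average-case effective ensemble for two mixed qubit states, plus the
# resulting minimum expected error via the Helstrom bound
# (1 - ||p0*rho0 - p1*rho1||_1) / 2.
import numpy as np

eps = 0.2
sigma0 = np.array([[1.0, 0.0], [0.0, 0.0]])   # known parts: |0><0| and |+><+|
sigma1 = np.array([[0.5, 0.5], [0.5, 0.5]])
tau_mean = np.eye(2) / 2                      # assumed average of the unknown parts

rho0_eff = (1 - eps) * sigma0 + eps * tau_mean
rho1_eff = (1 - eps) * sigma1 + eps * tau_mean

gamma = 0.5 * rho0_eff - 0.5 * rho1_eff
trace_norm = np.abs(np.linalg.eigvalsh(gamma)).sum()
print("minimum expected error probability:", 0.5 * (1 - trace_norm))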
Engineering squeezed states in high-Q cavities
While it has been possible to build fields in high-Q cavities with a high
degree of squeezing for some years, the engineering of arbitrary squeezed
states in these cavities has only recently been addressed [Phys. Rev. A 68,
061801(R) (2003)]. The present work examines the question of how to squeeze any
given cavity-field state and, particularly, how to generate the squeezed
displaced number state and the squeezed macroscopic quantum superposition in a
high-Q cavity.
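As an illustration of the target state only (one common convention, S(z)D(alpha)|n>; the operator ordering, truncation dimension, and parameter values are assumptions, and this is not the paper's cavity-QED engineering scheme), a numerical sketch in a truncated Fock basis:

# Squeezed displaced number state S(z) D(alpha) |n> built with matrix
# exponentials in a truncated Fock space.
import numpy as np
from scipy.linalg import expm

N = 40                                              # Fock-space truncation
a = np.diag(np.sqrt(np.arange(1, N)), k=1)          # annihilation operator
adag = a.conj().T

def displace(alpha):
    return expm(alpha * adag - np.conj(alpha) * a)

def squeeze(z):
    return expm(0.5 * (np.conj(z) * a @ a - z * adag @ adag))

n = 2                                               # initial number state |2>
fock_n = np.zeros(N); fock_n[n] = 1.0
psi = squeeze(0.5) @ displace(1.0 + 0.5j) @ fock_n  # squeezed displaced number state

x = (a + adag) / np.sqrt(2)                         # quadrature operator
var_x = (psi.conj() @ x @ x @ psi - (psi.conj() @ x @ psi) ** 2).real
print("quadrature variance:", var_x)  # ~ e^{-2r}(n + 1/2) ~ 0.92 here, vs 2.5 unsqueezed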
Risk for hepatocellular carcinoma with respect to hepatitis B virus genotypes B/C, specific mutations of enhancer II/core promoter/precore regions and HBV DNA levels
Background/aim: To examine the risks for hepatocellular carcinoma (HCC) with respect to hepatitis B virus (HBV) genotypes, specific viral mutations (MT), serum HBV DNA levels, and cirrhosis.
Methods: HBV genotypes, 1653/1753/core promoter (CP)/precore MT and HBV DNA levels were determined in 248 HBV patients with HCC and 248 HBV controls.
Results: Patients with genotype C, CP-MT, T1653, HBV DNA levels ≥4 log10 copies/ml and cirrhosis had a higher risk for HCC than patients with genotype B (p = 0.001, OR 1.9), CP wild-type (WT) (p < 0.001, OR 4.1), C1653 (p = 0.028, OR 2.4), HBV DNA <4 log10 copies/ml (p = 0.003, OR 2.1) and without cirrhosis (p < 0.001, OR 4.0), respectively. Multivariate analysis showed that CP-MT, T1653, HBV DNA ≥4 log10 copies/ml and cirrhosis were independent factors for HCC (all p < 0.05). A receiver operating characteristic curve showed no cut-off HBV DNA level associated with a minimal chance of HCC. Patients with CP-MT and cirrhosis had a 22.2-fold increased risk of HCC compared to patients with CP-WT and without cirrhosis. Patients with CP-MT and HBV DNA levels ≥4 log10 copies/ml had a 7.2-fold increased risk of HCC compared to patients with CP-WT and HBV DNA levels <4 log10 copies/ml. Patients with CP-MT and T1653 had a 9.9-fold increased risk of HCC compared to patients with wild-type for both regions.
Conclusions: CP-MT, T1653, HBV DNA levels ≥4 log10 copies/ml and cirrhosis are independent factors for the development of HCC. The risks increased substantially in patients having these factors in combination.
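A minimal sketch of how an odds ratio (OR) such as those quoted above is computed from a 2x2 exposure-outcome table, together with a 95% Wald confidence interval; the counts below are purely hypothetical and are not the study's data.

# Odds ratio and 95% Wald CI from a hypothetical 2x2 table of exposure
# (e.g. CP-MT) vs outcome (HCC). Counts are made up for illustration.
import math

a, b = 150, 98    # HCC cases:  exposed, unexposed   (hypothetical)
c, d = 80, 168    # controls:   exposed, unexposed   (hypothetical)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")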