Bell's inequality and the coincidence-time loophole
This paper analyzes effects of time-dependence in the Bell inequality. A
generalized inequality is derived for the case when coincidence and
non-coincidence [and hence whether or not a pair contributes to the actual
data] are controlled by timing that depends on the detector settings. Needless
to say, this inequality is violated by quantum mechanics and could be violated
by experimental data provided that the loss of measurement pairs through
failure of coincidence is small enough, but the quantitative bound is more
restrictive in this case than in the previously analyzed "efficiency loophole."
Comment: revtex4, 3 figures; v2: epl document class, reformatted with slight change
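The generalized inequality above refines the standard CHSH bound for local models with ideal coincidences. That baseline bound can be checked by brute force, since a local deterministic strategy simply pre-assigns an outcome ±1 to each setting on each side; the enumeration below is an illustrative sketch, not taken from the paper:

```python
from itertools import product

# A local deterministic strategy fixes an outcome +-1 in advance for each
# of Alice's two settings (a1, a2) and each of Bob's two settings (b1, b2).
# The CHSH combination a1*b1 + a1*b2 + a2*b1 - a2*b2 equals
# a1*(b1 + b2) + a2*(b1 - b2), and exactly one bracket is nonzero.
best = max(
    abs(a1 * b1 + a1 * b2 + a2 * b1 - a2 * b2)
    for a1, a2, b1, b2 in product([-1, 1], repeat=4)
)
print(best)  # maximum over all 16 local deterministic strategies: 2
```

Mixtures of deterministic strategies cannot exceed the maximum over the pure ones, so 2 bounds every local hidden-variable model.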
On an Argument of David Deutsch
We analyse an argument of Deutsch which purports to show that the deterministic
part of classical quantum theory, together with deterministic axioms of
classical decision theory, implies that a rational decision maker behaves as if
the probabilistic part of quantum theory (Born's law) is true. We uncover two
missing assumptions in the argument, and show that the
argument also works for an instrumentalist who is prepared to accept that the
outcome of a quantum measurement is random in the frequentist sense: Born's law
is a consequence of functional and unitary invariance principles belonging to
the deterministic part of quantum mechanics. Unfortunately, it turns out that
after the necessary corrections we have done no more than give an easier proof
of Gleason's theorem under stronger assumptions. However, for some special
cases the proof method gives positive results while using different assumptions
to Gleason. This leads to the conjecture that the proof could be improved to
give the same conclusion as Gleason under unitary invariance together with a
much weaker functional invariance condition.
Comment: Revision 28-07-03: added reference. Final revision 28-05-04. To appear in proceedings of "Quantum Probability and Infinite Dimensional Analysis", Greifswald, 2003; World Scientific.
Teleportation into Quantum Statistics
The paper is a tutorial introduction to quantum information theory,
developing the basic model and emphasizing the role of statistics and
probability.
Comment: Been waiting 3 years for math.S
Better Bell inequalities (passion at a distance)
I explain so-called quantum nonlocality experiments and discuss how to
optimize them. Statistical tools from missing-data maximum likelihood are
crucial. New results are given on CGLMP, CH and ladder inequalities. Open
problems are also discussed.
Comment: Published at http://dx.doi.org/10.1214/074921707000000328 in the IMS Lecture Notes Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org).
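As a point of reference for such optimizations, the singlet-state correlation E(α, β) = −cos(α − β), evaluated at the textbook CHSH angles, already attains the quantum maximum 2√2 (Tsirelson's bound). The angles below are the standard choice, not values from the paper:

```python
import math

def E(alpha, beta):
    # Singlet-state correlation at analyzer angles alpha and beta.
    return -math.cos(alpha - beta)

# Textbook optimal settings for the CHSH expression.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)  # 2*sqrt(2) ~= 2.828, above the local bound 2
```

Each of the four correlations contributes ±√2/2 with matching signs, which is why this particular set of angles is optimal.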
Statistics, Causality and Bell's Theorem
Bell's [Physics 1 (1964) 195-200] theorem is popularly supposed to establish
the nonlocality of quantum physics. Violation of Bell's inequality in
experiments such as that of Aspect, Dalibard and Roger [Phys. Rev. Lett. 49
(1982) 1804-1807] provides empirical proof of nonlocality in the real world.
This paper reviews recent work on Bell's theorem, linking it to issues in
causality as understood by statisticians. The paper starts with a proof of a
strong, finite sample, version of Bell's inequality and thereby also of Bell's
theorem, which states that quantum theory is incompatible with the conjunction
of three formerly uncontroversial physical principles, here referred to as
locality, realism and freedom. Locality is the principle that the direction of
causality matches the direction of time, and that causal influences need time
to propagate spatially. Realism and freedom are directly connected to
statistical thinking on causality: they relate to counterfactual reasoning, and
to randomisation, respectively. Experimental loopholes in state-of-the-art Bell
type experiments are related to statistical issues of post-selection in
observational studies, and the missing at random assumption. They can be
avoided by properly matching the statistical analysis to the actual
experimental design, instead of by making untestable assumptions of
independence between observed and unobserved variables. Methodological and
statistical issues in the design of quantum Randi challenges (QRC) are
discussed. The paper argues that Bell's theorem (and its experimental
confirmation) should lead us to relinquish not locality, but realism.
Comment: Published at http://dx.doi.org/10.1214/14-STS490 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
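The roles of locality, realism and freedom can be made concrete in a toy finite-sample simulation: a shared hidden variable fixes both outcomes in advance (realism), settings are drawn independently of it (freedom), and each outcome depends only on the local setting (locality), so the empirical CHSH statistic stays at or below 2 up to sampling noise. The particular hidden-variable model below is an arbitrary illustrative choice, one that happens to saturate the local bound:

```python
import math
import random

random.seed(2024)
N = 200_000

a_settings = (0.0, math.pi / 2)              # Alice's two analyzer angles
b_settings = (math.pi / 4, 3 * math.pi / 4)  # Bob's two analyzer angles

sums = {(i, j): 0 for i in (0, 1) for j in (0, 1)}
counts = {(i, j): 0 for i in (0, 1) for j in (0, 1)}

for _ in range(N):
    lam = random.uniform(0.0, 2.0 * math.pi)  # shared hidden variable (realism)
    i = random.randrange(2)                   # settings drawn independently
    j = random.randrange(2)                   # of lam (freedom)
    # Each outcome depends only on lam and the local setting (locality).
    A = 1 if math.cos(lam - a_settings[i]) >= 0 else -1
    B = -1 if math.cos(lam - b_settings[j]) >= 0 else 1
    sums[(i, j)] += A * B
    counts[(i, j)] += 1

E = {k: sums[k] / counts[k] for k in sums}
S = abs(E[(0, 0)] - E[(0, 1)] + E[(1, 0)] + E[(1, 1)])
print(S)  # hovers around 2, the local bound, up to sampling noise
```

Each setting pair is measured on a different random subsample, mirroring the finite-sample, randomized-design viewpoint of the paper; no single run ever uses both of Alice's settings at once.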
Schr\"odinger's cat meets Occam's razor
We discuss V.P. Belavkin's approach to the measurement problem encapsulated
in his theory of eventum mechanics (as presented in his 2007 survey). In
particular, we show its relation to ideas based on superselection and
interaction with the environment developed by N.P. Landsman (1995, and more
recent papers).
Landsman writes "those believing that the classical world exists
intrinsically and absolutely [such persons later termed by him B-realists] are
advised against reading this [his, 1995] paper". He adopts a milder position,
calling it that of an A-realist: we live in a classical world but to give it
special status is like insisting that the Earth is the centre of the universe.
The B-realists are accused of living under some kind of hallucination. Landsman
presents arguments pointing in a particular direction to a resolution of the
measurement problem which at least would satisfy the A-realists. We point out
in this paper that the theory earlier developed by Belavkin (surveyed in his
2007 paper) seems to complete Landsman's program or at least exhibits a
"realisation" satisfying his desiderata. At the same time it seems that this
completion of the program ends up giving both A- and B-realists equal licence
to accuse the others of living under hallucinations.
Comment: This version: corrected the references, and put the original date of submission on the title page.