Quantum mechanics as a theory of probability
We develop and defend the thesis that the Hilbert space formalism of quantum
mechanics is a new theory of probability. The theory, like its classical
counterpart, consists of an algebra of events and the probability measures
defined on it. The construction proceeds in the following steps: (a) Axioms for
the algebra of events are introduced following Birkhoff and von Neumann. All
axioms, except the one that expresses the uncertainty principle, are shared
with the classical event space. The only models for the set of axioms are
lattices of subspaces of inner product spaces over a field K. (b) Another
axiom, due to Soler, forces K to be the field of real numbers, the field of
complex numbers, or the quaternions. We suggest a probabilistic reading of
Soler's axiom. (c) Gleason's
theorem fully characterizes the probability measures on the algebra of events,
so that Born's rule is derived. (d) Gleason's theorem is equivalent to the
existence of a certain finite set of rays, with a particular orthogonality
graph (Wondergraph). Consequently, all aspects of quantum probability can be
derived from rational probability assignments to finite "quantum gambles". We
apply the approach to the analysis of entanglement, Bell inequalities, and the
quantum theory of macroscopic objects. We also discuss the relation of the
present approach to quantum logic, realism and truth, and the measurement
problem.
Comment: 37 pages, 3 figures. Forthcoming in a Festschrift for Jeffrey Bub,
ed. W. Demopoulos and the author, Springer (Kluwer): University of Western
Ontario Series in Philosophy of Science.
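A minimal numerical sketch of step (c), assuming nothing beyond the standard statement of Gleason's theorem: in dimension d >= 3, every probability measure on the lattice of closed subspaces has the Born form P(E) = tr(rho E) for some density operator rho. The Python snippet below (all names are illustrative) checks that the Born probabilities over a resolution of the identity sum to 1.

```python
import numpy as np

# Illustration (not a proof) of the content of Gleason's theorem: every
# probability measure on the lattice of closed subspaces of a Hilbert
# space of dimension >= 3 has the Born form P(E) = tr(rho E).

rng = np.random.default_rng(0)
d = 3  # dimension >= 3, where Gleason's theorem applies

# Build an arbitrary density operator rho (positive, unit trace).
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# An orthonormal basis defines d mutually orthogonal events (rays).
Q, _ = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))
projectors = [np.outer(Q[:, k], Q[:, k].conj()) for k in range(d)]

probs = [np.trace(rho @ P).real for P in projectors]
print("Born probabilities:", np.round(probs, 4))
print("Sum over a resolution of the identity:", round(sum(probs), 10))  # = 1
```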
Generalizations of Kochen and Specker's Theorem and the Effectiveness of Gleason's Theorem
Kochen and Specker's theorem can be seen as a consequence of Gleason's
theorem and logical compactness. Similar compactness arguments lead to stronger
results about finite sets of rays in Hilbert space, which we also prove by a
direct construction. Finally, we demonstrate that Gleason's theorem itself has
a constructive proof, based on a generic, finite, effectively generated set of
rays, on which every quantum state can be approximated.
Comment: 14 pages, 6 figures, read at the Robert Clifton memorial conference.
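A brute-force sketch of the finite combinatorial structure these compactness arguments turn on, assuming a toy orthogonality graph rather than a genuine Kochen-Specker set: a finite set of rays grouped into orthonormal bases admits a classical truth assignment exactly when some {0,1} valuation gives each basis exactly one ray valued 1.

```python
from itertools import product

# Given rays grouped into orthonormal bases, search for a {0,1}
# assignment with exactly one 1 per basis. A Kochen-Specker set is one
# for which no such assignment exists. The bases below are hypothetical
# toy data (an assignment exists here), not a genuine KS set.

rays = list(range(6))
bases = [(0, 1, 2), (2, 3, 4), (4, 5, 0)]

def ks_colorings(rays, bases):
    """Yield every 0/1 assignment giving each basis exactly one ray valued 1."""
    for values in product((0, 1), repeat=len(rays)):
        if all(sum(values[r] for r in b) == 1 for b in bases):
            yield values

solutions = list(ks_colorings(rays, bases))
print(f"{len(solutions)} admissible assignment(s); "
      "a genuine KS set would yield 0")
```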
The Logic Of Fundamental Processes: Nonmeasurable Sets And Quantum Mechanics
Quantum theory has played a significant role in modern philosophy, both as a source of metaphysical ideas and as an important example of a 'scientific revolution'. In spite of the sixty or so years that have elapsed since its invention, a long-lasting controversy concerning the interpretation and meaning of quantum theory prevails. Almost all authors, however, seem to agree on one major point, namely, that there could be no interpretation of this theory which is both realistic and local.

The purpose of this thesis is to demonstrate that this premise is false, and that a realistic, local, and deterministic interpretation of quantum theory (at least of part of it) does exist, provided that we extend the classical concept of probability.

In order to establish this, a 'quasi-classical' probability theory is developed, based on certain non-Lebesgue-measurable 'events'; it is then applied to account for spin statistics. Finally, I note how this model reflects on the problems of physical realism, locality, the status of probability theory, and the philosophical foundations of mathematics.
Two dogmas about quantum mechanics
We argue that the intractable part of the measurement problem -- the 'big'
measurement problem -- is a pseudo-problem that depends for its legitimacy on
the acceptance of two dogmas. The first dogma is John Bell's assertion that
measurement should never be introduced as a primitive process in a fundamental
mechanical theory like classical or quantum mechanics, but should always be
open to a complete analysis, in principle, of how the individual outcomes come
about dynamically. The second dogma is the view that the quantum state has an
ontological significance analogous to the significance of the classical state
as the 'truthmaker' for propositions about the occurrence and non-occurrence of
events, i.e., that the quantum state is a representation of physical reality.
We show how both dogmas can be rejected in a realist information-theoretic
interpretation of quantum mechanics as an alternative to the Everett
interpretation. The Everettian, too, regards the 'big' measurement problem as a
pseudo-problem, because the Everettian rejects the assumption that measurements
have definite outcomes, in the sense that one particular outcome, as opposed to
other possible outcomes, actually occurs in a quantum measurement process. By
contrast with the Everettians, we accept that measurements have definite
outcomes. By contrast with the Bohmians and the GRW 'collapse' theorists who
add structure to the theory and propose dynamical solutions to the 'big'
measurement problem, we take the problem to arise from the failure to see the
significance of Hilbert space as a new kinematic framework for the physics of
an indeterministic universe, in the sense that Hilbert space imposes kinematic
(i.e., pre-dynamic) objective probabilistic constraints on correlations between
events.
Comment: 25 pages; for 'Everett @ 50,' S. Saunders, J. Barrett, A. Kent, D.
Wallace (eds.), Oxford, 2009. Revised version involves some clarifications in
the formulation and minor corrections.
New Bell inequalities for the singlet state: Going beyond the Grothendieck bound
Contemporary versions of Bell's argument against local hidden variable (LHV)
theories are based on the Clauser, Horne, Shimony, and Holt (CHSH) inequality
and various attempts to generalize it. The amount of violation of these
inequalities cannot exceed the bound set by the Grothendieck constants.
However, if we go back to the original derivation by Bell, and use the perfect
anti-correlation embodied in the singlet spin state, we can go beyond these
bounds. In this paper we derive two-particle Bell inequalities for traceless
two-outcome observables whose violation in the singlet spin state goes beyond
the Grothendieck constants in both the two- and three-dimensional cases.
Moreover, creating a higher-dimensional analog of perfect correlations, and
applying a recent result of Alon and his associates (Invent. Math. 163, 499
(2006)), we prove that there are two-particle Bell inequalities for traceless
two-outcome observables whose violation increases to infinity as the dimension
and the number of measurements grow. Technically, these results are possible
because perfect correlations (or anti-correlations) allow us to transport the
indices of the inequality from the edges of a bipartite graph to those of the
complete graph. Finally, it is shown how to apply these results to mixed
Werner states, provided that the noise does not exceed 20%.
Comment: 18 pages, two figures, some corrections and additional references,
published version.
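A sketch of the baseline the paper goes beyond, assuming the textbook singlet correlation E(a, b) = -a . b and the standard optimal CHSH angles (our illustrative choice, not the paper's construction): the CHSH combination reaches the Tsirelson bound 2*sqrt(2).

```python
import numpy as np

# For the singlet state the two-outcome spin correlation is
# E(a, b) = -a . b for unit measurement directions a, b. We evaluate the
# CHSH combination at standard optimal angles; 2*sqrt(2) is the Tsirelson
# bound that the paper's singlet inequalities go beyond by exploiting
# perfect anti-correlation.

def direction(theta):
    """Unit vector in the measurement plane at angle theta."""
    return np.array([np.cos(theta), np.sin(theta)])

def E(a, b):
    """Singlet correlation for directions a and b."""
    return -np.dot(a, b)

a0, a1 = direction(0.0), direction(np.pi / 2)
b0, b1 = direction(5 * np.pi / 4), direction(3 * np.pi / 4)

chsh = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
print(f"CHSH value: {chsh:.6f}  (Tsirelson bound: {2 * np.sqrt(2):.6f}, "
      "classical bound: 2)")
```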
Macroscopic objects in quantum mechanics: A combinatorial approach
Why do we not see large macroscopic objects in entangled states? There are
two ways to approach this question. The first is dynamical: the coupling of a
large object to its environment causes any entanglement to decrease
considerably. The second approach, which is discussed in this paper,
emphasizes the difficulty of observing large-scale entanglement. As the number
of particles n grows we need an ever more precise knowledge of the state, and
an ever more carefully designed experiment, in order to recognize entanglement.
To develop this point we consider a family of observables, called witnesses,
which are designed to detect entanglement. A witness W distinguishes all the
separable (unentangled) states from some entangled states. If we normalize the
witness W to satisfy |tr(W\rho)| \leq 1 for all separable states \rho, then the
efficiency of W depends on the size of its maximal eigenvalue in absolute
value, that is, its operator norm ||W||. It is known that there are witnesses
on the space of n qubits for which ||W|| is exponential in n. However, we
conjecture that for a large majority of n-qubit witnesses ||W|| \leq
O(\sqrt{n \log n}). Thus, in a non-ideal measurement, which includes errors, the largest
eigenvalue of a typical witness lies below the threshold of detection. We prove
this conjecture for the family of extremal witnesses introduced by Werner and
Wolf (Phys. Rev. A 64, 032112 (2001)).
Comment: RevTeX, 14 pages, some additions to the published version: a second
conjecture added, discussion expanded, and references added.
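A hedged illustration of the exponential case, using a standard example close to the Werner-Wolf family (the choice of the n = 3 Mermin operator is ours, not the paper's): its separable-state bound is 2 while its operator norm is 4, so the normalized witness has norm 2 = 2^{(n-1)/2}, which grows exponentially with n.

```python
import numpy as np
from functools import reduce

# The n = 3 Mermin operator M = XXX - XYY - YXY - YYX has product-state
# bound 2 and operator norm 4 (attained on the GHZ state), so the
# normalized witness W = M/2 has ||W|| = 2 = 2^{(n-1)/2}.

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def pauli_string(ops):
    """Tensor product of single-qubit operators."""
    return reduce(np.kron, ops)

M = (pauli_string([X, X, X]) - pauli_string([X, Y, Y])
     - pauli_string([Y, X, Y]) - pauli_string([Y, Y, X]))

norm = np.max(np.abs(np.linalg.eigvalsh(M)))  # M is Hermitian
print(f"||M|| = {norm:.4f}; separable bound = 2; "
      f"normalized witness norm = {norm / 2:.4f}")
```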
Geometry of quantum correlations
Consider the set Q of quantum correlation vectors for two observers, each
with two possible binary measurements. Quadric (hyperbolic) inequalities which
are satisfied by every vector in Q are proved, and equality holds on a
two-dimensional manifold consisting of the local boxes and all the quantum
correlation vectors that maximally violate the Clauser, Horne, Shimony, and
Holt (CHSH) inequality. The quadric inequalities are closely related to CHSH:
they are its iterated versions (equation 20). Consequently, it is proved that
Q is contained in a hyperbolic cube whose axes lie along the non-local
(Popescu, Rohrlich) boxes. As an application, a tight constraint on the rate of
local boxes that must be present in every quantum correlation is derived. The
inequalities allow testing the validity of quantum mechanics on the basis of
data available from experiments which test the violation of CHSH. It is noted
how these results can be generalized to the case of n sites, each with two
possible binary measurements.
Comment: Published version, slight change in title.
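A sketch of the surrounding geometry, checking only the CHSH facets themselves (the quadric inequalities of equation 20 are specific to the paper and not reproduced here): all 16 deterministic local boxes satisfy the eight CHSH inequalities with bound 2, while the maximally violating quantum point reaches 2*sqrt(2).

```python
import numpy as np
from itertools import product

# The local boxes are the 16 deterministic correlation vectors
# (a0*b0, a0*b1, a1*b0, a1*b1) with a_i, b_j = +/-1. Each satisfies the
# eight CHSH facet inequalities; the maximal quantum point reaches
# 2*sqrt(2).

def chsh_values(v):
    """All eight CHSH combinations (an odd number of minus signs)."""
    e00, e01, e10, e11 = v
    signs = [s for s in product((1, -1), repeat=4) if np.prod(s) == -1]
    return [abs(s[0]*e00 + s[1]*e01 + s[2]*e10 + s[3]*e11) for s in signs]

local_boxes = [np.array([a0*b0, a0*b1, a1*b0, a1*b1])
               for a0, a1, b0, b1 in product((1, -1), repeat=4)]
assert all(max(chsh_values(v)) <= 2 for v in local_boxes)

tsirelson = np.array([1, 1, 1, -1]) / np.sqrt(2)
print("max CHSH over local boxes: 2")
print(f"CHSH at the maximal quantum point: {max(chsh_values(tsirelson)):.4f}")
```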
New optimal tests of quantum nonlocality
We explore correlation polytopes to derive the set of all Boole-Bell type
conditions of possible classical experience, which is both maximal and
complete. These conditions are compared with the respective quantum
expressions for the Greenberger-Horne-Zeilinger (GHZ) case and for two
particles with spin-state measurements along three directions.
Comment: 10 pages.