EPR, Bell, and Quantum Locality
Maudlin has claimed that no local theory can reproduce the predictions of
standard quantum mechanics that violate Bell's inequality for Bohm's version
(two spin-half particles in a singlet state) of the Einstein-Podolsky-Rosen
problem. It is argued that, on the contrary, standard quantum mechanics itself
is a counterexample to Maudlin's claim, because it is local in the appropriate
sense (measurements at one place do not influence what occurs elsewhere)
when formulated using consistent principles in place of the inconsistent
appeals to "measurement" found in current textbooks. This argument sheds light
on the claim of Blaylock that counterfactual definiteness is an essential
ingredient in derivations of Bell's inequality.
Comment: Minor revisions to previous version
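The singlet-state predictions at issue can be checked numerically; a minimal sketch (standard textbook angles and correlation function, not taken from the paper) of the CHSH form of Bell's inequality:

```python
import numpy as np

# Singlet-state spin correlation: E(a, b) = -cos(a - b), where a and b are
# the analyzer angles at the two wings of the experiment.
def E(a, b):
    return -np.cos(a - b)

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Any local hidden-variable model obeying Bell's inequality requires |S| <= 2.
a, a_p = 0.0, np.pi / 2
b, b_p = np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b_p) + E(a_p, b) + E(a_p, b_p)
print(abs(S))  # 2*sqrt(2) ≈ 2.828, exceeding the classical bound of 2
```

The quantum value 2√2 (Tsirelson's bound) is what any purportedly local account of the singlet correlations must reproduce.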
The Definition of Mach's Principle
Two definitions of Mach's principle are proposed. Both are related to gauge
theory, are universal in scope and amount to formulations of causality that
take into account the relational nature of position, time, and size. One of
them leads directly to general relativity and may have relevance to the problem
of creating a quantum theory of gravity.
Comment: To be published in Foundations of Physics as invited contribution to Peter Mittelstaedt's 80th Birthday Festschrift. 30 pages
A quantum logical and geometrical approach to the study of improper mixtures
We study improper mixtures from a quantum logical and geometrical point of
view. Taking into account the fact that improper mixtures do not admit an
ignorance interpretation and must be considered as states in their own right,
we do not follow the standard approach which considers improper mixtures as
measures over the algebra of projections. Instead, we use the convex set
of states in order to construct a new lattice whose atoms are all physical
states: pure states and improper mixtures. This is done in order to overcome
one of the problems which appear in the standard quantum logical formalism,
namely, that for a subsystem of a larger system in an entangled state, the
conjunction of all actual properties of the subsystem does not yield its actual
state. In fact, its state is an improper mixture and cannot be represented in
the von Neumann lattice as a minimal property which determines all other
properties as is the case for pure states or classical systems. The new lattice
also contains all propositions of the von Neumann lattice. We argue that this
extension expresses in algebraic form the fact that, unlike the classical
case, quantum interactions produce non-trivial correlations between the
systems. Finally, we study the maps which can be defined between the extended
lattice of a compound system and the lattices of its subsystems.
Comment: submitted to the Journal of Mathematical Physics
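The motivating problem, that a subsystem of a larger system in an entangled state is left in an improper mixture, can be illustrated numerically; a minimal numpy sketch, using the singlet state as an illustrative example:

```python
import numpy as np

# Two spin-half particles in the singlet state |psi> = (|01> - |10>)/sqrt(2).
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())      # pure state of the compound system

# Reduce to the first subsystem by tracing out the second.
rho4 = rho.reshape(2, 2, 2, 2)       # indices (i, j; k, l) for (A, B; A', B')
rho_A = np.einsum('ijkj->ik', rho4)  # partial trace over the second factor

print(rho_A)  # maximally mixed: I/2
# The reduced state is not a projection, and no pure state of the subsystem
# reproduces it: the improper mixture must be treated as a state in its own
# right, exactly the situation the extended lattice is built to accommodate.
```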
Unsharp Quantum Reality
Positive operator (valued) measures (POMs) allow one to generalize the notion of observable beyond the traditional one based on projection-valued measures (PVMs). Here, we argue that this generalized conception of observable enables a consistent notion of unsharp reality and with it an adequate concept of joint properties. A sharp or unsharp property manifests itself as an element of sharp or unsharp reality by its tendency to become actual or to actualize a specific measurement outcome. This actualization tendency, or potentiality, of a property is quantified by the associated quantum probability. The resulting single-case interpretation of probability as a degree of reality will be explained in detail and its role in addressing the tensions between quantum and classical accounts of the physical world will be elucidated. It will be shown that potentiality can be viewed as a causal agency that evolves in a well-defined way.
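The contrast between a sharp (PVM) and an unsharp (POM) observable can be sketched numerically; the smearing parameter t and the test state below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Sharp spin-z observable: a PVM with two orthogonal projections.
P_up = np.array([[1.0, 0.0], [0.0, 0.0]])
P_dn = np.array([[0.0, 0.0], [0.0, 1.0]])
assert np.allclose(P_up @ P_up, P_up)  # projections are idempotent (sharp)

# An unsharp ("smeared") version: effects E_± = (I ± t*sigma_z)/2, 0 < t < 1.
# For t < 1 the effects are positive but not projections, yet sum to I:
# a POM in the sense of the abstract.
t = 0.6
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
E_up = (np.eye(2) + t * sz) / 2
E_dn = (np.eye(2) - t * sz) / 2
assert np.allclose(E_up + E_dn, np.eye(2))  # normalization
assert not np.allclose(E_up @ E_up, E_up)   # not idempotent: unsharp

# The quantum probability tr(rho E_up) quantifies the degree ("tendency")
# to which the unsharp property is actualized in the state rho.
rho = np.array([[0.8, 0.0], [0.0, 0.2]])
print(np.trace(rho @ E_up))  # 0.68
```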
The development of path integration: combining estimations of distance and heading
Efficient daily navigation is underpinned by path integration, the mechanism by which we use self-movement information to update our position in space. This process is well understood in adulthood, but there has been relatively little study of path integration in childhood, leading to an underrepresentation in accounts of navigational development. Previous research has shown that calculation of distance and heading both tend to be less accurate in children than they are in adults, although there have been no studies of the combined calculation of distance and heading that typifies naturalistic path integration. In the present study, 5-year-olds and 7-year-olds took part in a triangle-completion task, where they were required to return to the start point of a multi-element path using only idiothetic information. Performance was compared to a sample of adult participants, who were found to be more accurate than children on measures of landing error, heading error, and distance error. 7-year-olds were significantly more accurate than 5-year-olds on measures of landing error and heading error, although the difference between groups was much smaller for distance error. All measures were reliably correlated with age, demonstrating a clear development of path integration abilities within the age range tested. Taken together, these data make a strong case for the inclusion of path integration within developmental models of spatial navigational processing.
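The three error measures can be sketched for a toy triangle-completion trial; the leg lengths and the hypothetical landing point below are illustrative assumptions, not data from the study:

```python
import numpy as np

# A walker traverses two legs of a path using only self-motion (idiothetic)
# cues, then attempts to return to the start.
start = np.array([0.0, 0.0])
outbound = [np.array([2.0, 0.0]), np.array([1.0, 2.0])]  # two path legs
position = start + sum(outbound)

true_return = start - position                 # correct homing vector
response_landing = np.array([0.4, -0.3])       # hypothetical (erroneous) landing
resp_vec = response_landing - position         # vector actually walked

# Measures analogous to those reported in the study:
landing_error = np.linalg.norm(response_landing - start)   # miss distance
heading_error = np.degrees(np.arccos(
    resp_vec @ true_return /
    (np.linalg.norm(resp_vec) * np.linalg.norm(true_return))))
distance_error = abs(np.linalg.norm(resp_vec) - np.linalg.norm(true_return))
print(landing_error, heading_error, distance_error)
```

Landing error conflates the other two; separating heading and distance components is what lets the study show their different developmental trajectories.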
Typical local measurements in generalised probabilistic theories: emergence of quantum bipartite correlations
What singles out quantum mechanics as the fundamental theory of Nature? Here
we study local measurements in generalised probabilistic theories (GPTs) and
investigate how observational limitations affect the production of
correlations. We find that if only a subset of typical local measurements can
be made, then all the bipartite correlations produced in a GPT can be simulated
to a high degree of accuracy by quantum mechanics. Our result makes use of a
generalisation of Dvoretzky's theorem for GPTs. The tripartite correlations can
go beyond those exhibited by quantum mechanics, however.
Comment: 5 pages, 1 figure. v2: more details in the proof of the main result
The final stages of slip and volcanism on an oceanic detachment fault at 13°48′N, Mid-Atlantic Ridge
Author Posting. © American Geophysical Union, 2018. This article is posted here by permission of American Geophysical Union for personal use, not for redistribution. The definitive version was published in Geochemistry, Geophysics, Geosystems 19 (2018): 3115-3127, doi:10.1029/2018GC007536.
While processes associated with initiation and maintenance of oceanic detachment faults are becoming better constrained, much less is known about the tectonic and magmatic conditions that lead to fault abandonment. Here we present results from near-bottom investigations using the submersible Alvin and autonomous underwater vehicle Sentry at a recently extinct detachment fault near 13°48′N, Mid-Atlantic Ridge, that allow documentation of the final stages of fault activity and magmatism. Seafloor imagery, sampling, and near-bottom magnetic data show that the detachment footwall is intersected by an ~850 m-wide volcanic outcrop including pillow lavas. Saturation pressures in these vesicular basalts, based on dissolved H2O and CO2, are less than their collection pressures, which could be explained by eruption at a shallower level than their present depth. Sub-bottom profiles reveal that sediment thickness, a loose proxy for seafloor age, is ~2 m greater on top of the volcanic terrain than on the footwall adjacent to the hanging-wall cutoff. This difference could be explained by current-driven erosion in the axial valley or by continued slip after volcanic emplacement, on either a newly formed or pre-existing fault. Since current speeds near the footwall are unlikely to be sufficient to cause significant erosion, we favor the hypothesis that detachment slip continued after the episode of magmatism, consistent with growing evidence that oceanic detachments can continue to slip despite hosting magmatic intrusions.
National Science Foundation (NSF) Grant Numbers: OCE-1259218, OCE-1260578, OCE-1736547. 2019-03-1
Dynamical Semigroup Description of Coherent and Incoherent Particle-Matter Interaction
The meaning of statistical experiments with single microsystems in quantum
mechanics is discussed and a general model in the framework of non-relativistic
quantum field theory is proposed, to describe both coherent and incoherent
interaction of a single microsystem with matter. Compactly developing the
calculations with superoperators, it is shown that the introduction of a time
scale, linked to irreversibility of the reduced dynamics, directly leads to a
dynamical semigroup expressed in terms of quantities typical of scattering
theory. Its generator consists of two terms, the first linked to a coherent
wavelike behaviour, the second related to an interaction having a measuring
character, possibly connected to events the microsystem produces propagating
inside matter. When these events amount to a measurement, an explicit
realization of some concepts of modern quantum mechanics ("effects" and
"operations") arises. The relevance of this description to a recent debate
questioning the validity of ordinary quantum mechanics to account for such
experimental situations as, e.g., neutron interferometry, is briefly discussed.
Comment: 22 pages, latex, no figures
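A dynamical semigroup whose generator splits into a coherent and an incoherent term can be sketched in a far simpler setting than the paper's field-theoretic model: a single qubit in GKSL (Lindblad) form. The Hamiltonian, jump operator, and rates below are illustrative assumptions:

```python
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)
H = 0.5 * sz                                                  # coherent, wave-like term
L = np.sqrt(0.2) * np.array([[0, 1], [0, 0]], dtype=complex)  # incoherent, measuring-type term

def generator(rho):
    """GKSL generator: commutator (coherent) plus dissipator (incoherent)."""
    coherent = -1j * (H @ rho - rho @ H)
    incoherent = (L @ rho @ L.conj().T
                  - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return coherent + incoherent

# Euler-step the semigroup; the generator is traceless, so probability
# (the trace of rho) is preserved along the whole evolution.
rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
dt = 0.01
for _ in range(1000):
    rho = rho + dt * generator(rho)
print(np.trace(rho).real)  # stays 1: trace-preserving reduced dynamics
```

The irreversibility the abstract ties to a time scale shows up here as the decay of the off-diagonal elements under the dissipator.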
A geometrical origin for the covariant entropy bound
Causal diamond-shaped subsets of space-time are naturally associated with
operator algebras in quantum field theory, and they are also related to the
Bousso covariant entropy bound. In this work we argue that the net of these
causal sets, to which the local operator algebras of quantum theories are
assigned, should be taken to be non-orthomodular if there is some lowest scale
for the description of space-time as a manifold. This geometry can be related
to a holographic-type reduction in the degrees of freedom under certain
natural conditions on the local algebras. A non-orthomodular net of causal
sets that implements the cutoff in a covariant manner is constructed. It gives
an explanation, in a simple example, of the non-positive expansion condition
for light-sheet selection in the covariant entropy bound. It also suggests a
different covariant formulation of the entropy bound.
Comment: 20 pages, 8 figures, final version
Quantitative wave-particle duality and non-erasing quantum erasure
The notion of wave-particle duality may be quantified by the inequality
V^2 + K^2 <= 1, relating interference fringe visibility V and path knowledge K.
With a single-photon interferometer in which polarization is used to label the
paths, we have investigated the relation for various situations, including
pure, mixed, and partially-mixed input states. A quantum eraser scheme has been
realized that recovers interference fringes even when no which-way information
is available to erase.
Comment: 6 pages, 4 figures. To appear in Phys. Rev.
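The duality inequality can be checked for the kinds of states mentioned; a minimal sketch, with K identified with which-way predictability from the path populations (an assumption; the paper's K may be defined via an optimal path estimation):

```python
import numpy as np

# For a photon in a two-path interferometer with path state rho:
# path knowledge K = |rho_00 - rho_11|, fringe visibility V = 2|rho_01|.
# Every valid state obeys V^2 + K^2 <= 1, with equality for pure states.
def duality(rho):
    K = abs(rho[0, 0] - rho[1, 1])
    V = 2 * abs(rho[0, 1])
    return V, K

# Pure state with partial which-way information: saturates the bound.
psi = np.array([np.sqrt(0.8), np.sqrt(0.2)])
rho_pure = np.outer(psi, psi.conj())
V, K = duality(rho_pure)
print(V**2 + K**2)  # ≈ 1.0 (pure state saturates the inequality)

# Partially mixed state: strict inequality, as probed in the experiment.
rho_mixed = 0.5 * rho_pure + 0.5 * np.eye(2) / 2
V, K = duality(rho_mixed)
print(V**2 + K**2)  # < 1 for mixed input states
```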
- âŠ