A Stronger Theorem Against Macro-realism
Macro-realism is the position that certain "macroscopic" observables must
always possess definite values: e.g. the table is in some definite position,
even if we don't know what that is precisely. The traditional understanding is
that by assuming macro-realism one can derive the Leggett-Garg inequalities,
which constrain the possible statistics from certain experiments. Since quantum
experiments can violate the Leggett-Garg inequalities, this is taken to rule
out the possibility of macro-realism in a quantum universe. However, recent
analyses have exposed loopholes in the Leggett-Garg argument, which allow many
types of macro-realism to remain compatible with quantum theory, and hence with
violation of the Leggett-Garg inequalities. This paper takes a different
approach to ruling out macro-realism; the result is a no-go theorem for macro-realism in
quantum theory that is stronger than the Leggett-Garg argument. This approach
uses the framework of ontological models: an elegant way to reason about
foundational issues in quantum theory which has successfully produced many
other recent results, such as the PBR theorem.
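(As a gloss on the framework just mentioned, offered as standard background
rather than anything specific to this paper: in an ontological model, a
preparation P fixes a probability distribution mu_P over ontic states lambda,
a measurement M responds to lambda via response functions, and the model
reproduces quantum theory when, in LaTeX notation,

  \Pr(k \mid M, P) = \int_{\Lambda} \xi(k \mid M, \lambda)\, \mu_P(\lambda)\, \mathrm{d}\lambda

holds for every preparation P, measurement M, and outcome k.)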
No ψ-epistemic model can fully explain the indistinguishability of quantum states
According to a recent no-go theorem (M. Pusey, J. Barrett and T. Rudolph,
Nature Physics 8, 475 (2012)), models in which quantum states correspond to
probability distributions over the values of some underlying physical variables
must have the following feature: the distributions corresponding to distinct
quantum states do not overlap. This is significant because if the distributions
do not overlap, then the quantum state itself is encoded by the physical
variables. In such a model, it cannot coherently be maintained that the quantum
state merely encodes information about underlying physical variables. The
theorem, however, considers only models in which the physical variables
corresponding to independently prepared systems are independent. This work
considers models that are defined for a single quantum system of dimension d,
such that the independence condition does not arise. We prove a result in a
similar spirit to the original no-go theorem, in the form of an upper bound on
the extent to which the probability distributions can overlap, consistently
with reproducing quantum predictions. In particular, models in which the
quantum overlap between pure states is equal to the classical overlap between
the corresponding probability distributions cannot reproduce the quantum
predictions in any dimension d ≥ 3. The result is noise tolerant, and an
experiment is motivated to distinguish the class of models ruled out from
quantum theory.
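(A hedged gloss, since conventions vary across this literature: the classical
overlap of the distributions associated with states |ψ⟩ and |φ⟩ is commonly
the total-variation overlap, and a natural quantum counterpart is the optimal
single-shot indistinguishability of the pair, i.e.

  \omega_C(\mu_\psi, \mu_\phi) = \int_{\Lambda} \min\{\mu_\psi(\lambda), \mu_\phi(\lambda)\}\, \mathrm{d}\lambda,
  \qquad
  \omega_Q(\psi, \phi) = 1 - \sqrt{1 - |\langle\psi|\phi\rangle|^2}.

On these conventions, the bound described above constrains how close the
classical overlap \omega_C can come to the quantum overlap \omega_Q.)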
Does a Computer have an Arrow of Time?
In [Sch05a], it is argued that Boltzmann's intuition, that the psychological
arrow of time is necessarily aligned with the thermodynamic arrow, is correct.
Schulman gives an explicit physical mechanism for this connection, based on the
brain being representable as a computer, together with certain thermodynamic
properties of computational processes. [Haw94] presents similar, if briefer,
arguments. The purpose of this paper is to critically examine the support for
the link between thermodynamics and an arrow of time for computers. The
principal arguments put forward by Schulman and Hawking will be shown to fail.
It will be shown that any computational process that can take place in an
entropy-increasing universe can equally take place in an entropy-decreasing
universe. This conclusion does not automatically imply that a psychological
arrow can run counter to the thermodynamic arrow. Some alternative possible
explanations for the alignment of the two arrows will be briefly discussed.
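(For orientation, and as an assumption on our part about which thermodynamic
property of computation is at stake, since the abstract does not name one: the
canonical such property is Landauer's principle, that erasing one bit of
information in an environment at temperature T dissipates at least

  \Delta E \ge k_B T \ln 2

of heat. Whether this is the specific property Schulman's mechanism invokes is
not determined by the abstract.)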
Revisiting the Gaia Hypothesis: Maximum Entropy, Kauffman’s ‘Fourth Law’ and Physiosemeiosis
Recently, Kleidon suggested analyzing Gaia as a non-equilibrium
thermodynamic system that continuously moves away from equilibrium, driven by
maximum entropy production, which materializes in hierarchically coupled
mechanisms of energetic flows via dissipation and physical work. I relate this
view to Kauffman's 'Fourth Law of Thermodynamics', which I interpret as a
proposition about the accumulation of information in evolutionary processes.
The concept of physical work is expanded to include work directed at the
capacity to work: I offer a twofold specification of Kauffman's concept of an
'autonomous agent', one as a 'self-referential heat engine', and the other in
terms of physiosemeiosis, which is a naturalized application of Peirce's theory
of signs. The conjunction of these three theoretical sources, Maximum Entropy,
Kauffman's Fourth Law, and physiosemeiosis, shows that the Kleidon restatement
of the Gaia hypothesis is equivalent to the proposition that the biosphere is
generating, processing and storing information, thus directly treating
information as a physical phenomenon. There is a fundamental ontological
continuity between the biological processes and the human economy, as both are
seen as information processing and entropy producing systems. Knowledge and
energy are not substitutes: energy and information are two aspects of the
same underlying physical process.
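(A minimal formal gloss of 'maximum entropy production', given as standard
background rather than as this paper's formalism: for steady-state fluxes J_i
driven by conjugate thermodynamic forces X_i, the entropy production rate is

  \sigma = \sum_i J_i X_i \ge 0,

and the MEP hypothesis is that, among the steady states permitted by its
constraints, a system selects the one maximizing \sigma.)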
Quantum- vs. macro- realism: what does the Leggett-Garg inequality actually test?
Macroscopic Realism (MR) says that a macroscopic system is always determinately in one or other of the macroscopically distinguishable states available to it. The Leggett-Garg (LG) inequality was derived to allow experimental tests of whether or not this doctrine is true; it is also often thought of as a temporal version of a Bell inequality. Despite recent interest in the inequality, controversy remains regarding what its violation would show. Here we resolve this controversy, which arises from an insufficiently general and model-independent approach to the question so far. We argue that LG's initial characterisation of MR does not pick out a particularly natural realist position, so we articulate an operationally well-defined and well-motivated position in its place. We show that much weaker conditions than LG's are sufficient to derive the inequality: in the first instance, its violation only demonstrates that certain measurements fail to be non-disturbing at the operational level. We articulate three distinct species of macro-realist position, and argue that only the first of these can be refuted by LG inequality violation. This first position is an attractive one, however, so ruling it out remains of interest. A crucial role is played in LG's argument by the assumption of noninvasive measurability. We show that this notion is ambiguous between the weaker notion of disturbance at the operational level and the stronger notion of invasiveness at the ontic level of properties of the system. Ontic noninvasiveness would be required to rule out MR per se, but this property is not entailed by MR, and its presence cannot be established in a model-independent way. It follows that, despite the formal parallels, Bell's and LG's inequalities are not methodologically on a par. We close with some reflections on the implications of our analysis for the pedagogy of quantum superposition.
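(For concreteness, one standard form of the inequality under discussion,
hedged because LG-type inequalities come in several variants: with Q a
dichotomic observable taking values ±1, measured at times t_1 < t_2 < t_3, and
C_{ij} = \langle Q(t_i) Q(t_j) \rangle, macro-realism plus noninvasive
measurability entails

  C_{12} + C_{23} - C_{13} \le 1,

which quantum mechanics can violate, up to the value 3/2 for a two-level
system.)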
Information and entropy in quantum theory
Available from the British Library Document Supply Centre (DSC:DXN060836), United Kingdom.
Jordan’s Derivation of Blackbody Fluctuations
The celebrated Dreimännerarbeit by Born, Heisenberg and Jordan contains a matrix-mechanical derivation by Jordan of Planck’s formula for blackbody fluctuations. Jordan appears to have considered this to be one of his finest contributions to quantum theory, but the status of his derivation is puzzling. In our Dreimenschenarbeit, we show how to understand what Jordan was doing in the double context of a Boltzmannian approach to statistical mechanics and of the early ‘statistical interpretation’ of matrix mechanics.
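(The formula in question, in the form usually quoted, is Einstein's 1909
mean-square energy fluctuation for blackbody radiation in a narrow frequency
band around \nu, which Jordan rederived within matrix mechanics for a
vibrating-string model: with \langle E \rangle the mean energy in the band and
Z the number of modes,

  \langle \Delta E^2 \rangle = h\nu \langle E \rangle + \frac{\langle E \rangle^2}{Z},

the first term being the 'particle' contribution and the second the 'wave'
contribution.)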